The European Union is voting today (September 23) on its controversial chat control legislation, a measure security and privacy experts warn will destroy private messaging in the bloc.
The EU has been engaged in a concerted effort to undermine privacy and security by pushing legislation that would force companies to break end-to-end encryption (E2EE). The bloc has proposed the use of “client-side scanning,” a technology that scans files on users’ devices and alerts the authorities if anything illegal is discovered.
After previous efforts were shot down, the EU has relabeled “client-side scanning” as “upload moderation,” essentially an effort to force users to agree to client-side scanning if they want to be able to send or upload any media files via a messaging platform that otherwise features E2EE. “Upload moderation” is a clever way to essentially render E2EE moot, while still being able to technically tout support for strong encryption.
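In concrete terms, client-side scanning usually means checking content against a database of fingerprints of known-illegal material on the user’s own device, before any encryption is applied. The sketch below is purely illustrative (the function names, the exact SHA-256 matching, and the XOR stand-in for E2EE are all hypothetical; real deployments use perceptual hashing such as PhotoDNA so near-duplicates also match), but it shows why scanning before encryption makes the encryption moot for flagged content:

```python
import hashlib

# Illustrative database of fingerprints of flagged content.
# (Real systems use perceptual hashes, not exact SHA-256 digests.)
FLAGGED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def encrypt_end_to_end(data: bytes) -> bytes:
    """Stand-in for a platform's real E2EE encryption step."""
    return bytes(b ^ 0x42 for b in data)

def upload_with_moderation(file_bytes: bytes) -> str:
    """Sketch of 'upload moderation': the file is inspected on the
    device *before* end-to-end encryption is applied, so a match is
    reported while the content is still plaintext."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in FLAGGED_HASHES:
        # Flagged before encryption: E2EE never protects this content.
        return "reported"
    ciphertext = encrypt_end_to_end(file_bytes)  # only non-flagged data is encrypted
    return "sent"
```

The point of the sketch is that the message is still end-to-end encrypted in transit, yet a third-party-maintained matching list decides, on the device, what gets reported before encryption ever happens.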
Signal President Meredith Whittaker called out the EU for its efforts, slamming the bloc for trying to pull a fast one on users, and ignoring the mathematical reality that there is no way to maintain secure and private communication while simultaneously trying to undermine or circumvent E2EE.
Instead of accepting this fundamental mathematical reality, some European countries continue to play rhetorical games. They’ve come back to the table with the same idea under a new label. Instead of using the previous term “client-side scanning,” they’ve rebranded and are now calling it “upload moderation.” Some are claiming that “upload moderation” does not undermine encryption because it happens before your message or video is encrypted. This is untrue.
Rhetorical games are cute in marketing or tabloid reporting, but they are dangerous and naive when applied to such a serious topic with such high stakes. So let’s be very clear, again: mandating mass scanning of private communications fundamentally undermines encryption. Full stop. Whether this happens via tampering with, for instance, an encryption algorithm’s random number generation, or by implementing a key escrow system, or by forcing communications to pass through a surveillance system before they’re encrypted. We can call it a backdoor, a front door, or “upload moderation.” But whatever we call it, each one of these approaches creates a vulnerability that can be exploited by hackers and hostile nation states, removing the protection of unbreakable math and putting in its place a high-value vulnerability.
We ask that those playing these word games please stop and recognize what the expert community has repeatedly made clear. Either end-to-end encryption protects everyone, and enshrines security and privacy, or it’s broken for everyone. And breaking end-to-end encryption, particularly at such a geopolitically volatile time, is a disastrous proposition.
Patrick Breyer, former Pirate Party Member of the European Parliament and co-negotiator of the Parliament’s critical position on the proposal, says the EU is voting on the revised measure today and describes the problems such a measure would cause if it passes.
“Instead of empowering teens to protect themselves from sextortion and exploitation by making chat services safer, victims of abuse are betrayed by an unrealistic bill that is doomed in court, according to the EU Council’s own legal assessment,” writes Breyer. “Flooding our police with largely irrelevant tips on old, long-known material will fail to save victims from ongoing abuse, and will actually reduce law enforcement capacities for going after predators. Europeans need to understand that they will be cut off from using commonplace secure messengers if this bill is implemented – that means losing touch with your friends and colleagues around the world. Do you really want Europe to become the world leader in bugging our smartphones and mandating untargeted blanket surveillance of the chats of millions of law-abiding Europeans?”
“Regardless of the objective – imagine the postal service simply opened and snooped through every letter without suspicion,” Breyer adds. “It’s inconceivable. Besides, it is precisely the current bulk screening for supposedly known content by Big Tech that exposes thousands of entirely legal private chats, overburdens law enforcement and mass criminalises minors.”
The EU Acknowledges the Measure Is Privacy-Invasive
Interestingly, the EU does not even try to hide the fact that its proposed measures are the most privacy-invasive solution available to it.
The bloc described its proposed solution in 2022:
At the same time, the detection process would be the most intrusive one for users (compared to the detection of known and new CSAM) since it would involve searching text, including in interpersonal communications, as the most important vector for grooming.
Even more telling is the fact that EU ministers want to make sure they are exempt from the chat control legislation, the most damning indication of all that the EU is aware of the privacy implications of its efforts.
“The fact that the EU interior ministers want to exempt police officers, soldiers, intelligence officers and even themselves from chat control scanning proves that they know exactly just how unreliable and dangerous the snooping algorithms are that they want to unleash on us citizens,” said Breyer. “They seem to fear that even military secrets without any link to child sexual abuse could end up in the US at any time. The confidentiality of government communications is certainly important, but the same must apply to the protection of business and of course citizens communications, including the spaces that victims of abuse themselves need for secure exchanges and therapy. We know that most of the chats leaked by today’s voluntary snooping algorithms are of no relevance to the police, for example family photos or consensual sexting. It is outrageous that the EU interior ministers themselves do not want to suffer the consequences of the destruction of digital privacy of correspondence and secure encryption that they are imposing on us.”
Why Is the EU Pushing for Chat Control?
Given the issues surrounding chat control, many may wonder why the EU is hell-bent on passing such legislation, especially when the bloc touts itself as pro-privacy.
In short, chat control is being promoted as a way to combat child sexual abuse material (CSAM). Unfortunately, while such a goal is certainly admirable, trying to tackle it with chat control legislation is problematic at best.
“Let me be clear what that means: to ‘detect grooming’ is not simply searching for known CSAM. It isn’t using AI to detect new CSAM, which is also on the table. It’s running algorithms reading your actual text messages to figure out what you’re saying, at scale.” — Matthew Green (@matthew_d_green), May 10, 2022.
“It is potentially going to do this on encrypted messages that should be private. It won’t be good, and it won’t be smart, and it will make mistakes. But what’s terrifying is that once you open up ‘machines reading your text messages’ for any purpose, there are no limits.” — Matthew Green (@matthew_d_green), May 10, 2022.
Private messaging platform Threema further describes the issues:
Of course, sharing CSAM is an absolutely intolerable, horrific crime that has to be punished. Before CSAM can be shared online, however, a child must have suffered abuse in real life, which is what effective child protection should be trying to prevent (and what Chat Control does not focus on). For this and many other reasons, child protection organizations such as Germany’s Federal Child Protection Association are against Chat Control, arguing that it’s “neither proportionate nor effective.”
Besides, there’s no way of really knowing whether Chat Control would actually be (or remain) limited to CSAM. Once the mass-surveillance apparatus is installed, it could easily be extended to detect content other than CSAM without anyone noticing it. From a service provider’s point of view, the detection mechanism, which is created and maintained by third parties, essentially behaves like a black box.
Experts Say There Are Better Options
In Germany’s arguments against the EU’s efforts, Chief Prosecutor Markus Hartmann, head of the Central and Contact Point Cybercrime North Rhine-Westphalia, said the EU was going too far in its proposals. Instead, he argued, law enforcement agencies should be better funded and supported so they can combat CSAM using traditional investigative techniques. Other experts agree with Chief Prosecutor Hartmann.
“Child protection is not served if the regulation later fails before the European Court of Justice,” said Felix Reda of the Society for Civil Rights (via computer translation). “The damage to the privacy of all people would be immense,” he added. “Indiscriminate surveillance violates the essence of the right to privacy and therefore cannot be justified by any fundamental-rights assessment.”
“The draft regulation fundamentally misses the goal of countering depictions of child abuse,” emphasized computer scientist and Chaos Computer Club spokeswoman Elina Eickstädt (via computer translation). “The draft is based on a gross overestimation of the capabilities of technologies,” especially with regard to the detection of unknown material.
What Happens If the Legislation Passes?
If the EU is successful in passing the legislation, citizens will lose access to private communications platforms, such as Signal and Threema, as both platforms have vowed to pull out of the EU.
In due time, the issue will likely make its way to EU courts, and experts hope the legislation will be struck down there.
In the meantime, EU citizens will have to contend with what Matthew Green calls “the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR.”