Marcos Cesar M. Pereira, co-coordinator of the Encryption Observatory
An old acquaintance has returned to the media spotlight, and for reasons we didn’t expect. The legislative project known as Chat Control has gained prominence in recent days due to a “new proposal” that could threaten end-to-end encryption, called [upload moderation](https://en.wikipedia.org/wiki/Upload_moderation). But first, let’s recall what Chat Control is about.
Recalling Chat Control
Chat Control, as it became known, is a proposal from the European Commission aimed at establishing rules to prevent and combat child sexual abuse. One of the provisions causing concern across various sectors is the requirement that providers deploy solutions to scan users’ private messages, potentially breaking end-to-end encryption.
It’s worth noting that version 1.0 is already in effect, derived from the derogation of the ePrivacy Directive approved in 2021, which allows email and messaging providers to voluntarily scan the personal messages of European citizens. In the proposed version 2.0, scanning would become mandatory for all email and messaging providers, including those offering end-to-end encryption.
The proposal has sparked strong reactions from various sectors of European and global society, since scanning private messages implies breaking end-to-end encryption. In this regard, experts published a joint letter in 2023 urging the European Commission to make changes. Here at ObCrypto, aiming to shed light on the debate, we invited members of the Internet Society Portugal Chapter to highlight the proposal’s issues.
The controversy is not limited to the project’s content. A Dutch researcher discovered that advertisements paid for by the European Commission were targeted at individuals based on religion and political orientation. Besides having specific targets, the microtargeting campaign focused on seven European countries critical of the proposal.
By the end of 2023, a glimmer of hope seemed to emerge. Members of the European Parliament’s Civil Liberties Committee voted against the mass monitoring of private communications. Regarding end-to-end encryption, the Committee’s members took the position that encrypted services should be excluded from the mandatory scanning of communications.
Now, is “upload moderation” the new buzzword?
Despite all the reactions, the European Commission continues to insist on the proposal to scan private messages. The draft presented by Belgium proposes upload moderation as an alternative to mandatory scanning. Under this measure, users must consent to having images and URLs scanned for child sexual abuse material before sending them.
The document’s recital, published by the Netzpolitik website, presents the justifications for this measure. Looking at the arguments governments have historically used to defend scanning private messages, nothing new emerges. The recital follows the “going dark” playbook, asserting that encrypted communications must not become safe havens where criminals and abusers evade investigative forces.
The alternative appears to be a response to the opposing vote from the European Parliament’s Civil Liberties Committee. Instead of mass scanning without the user’s knowledge, users must consent to scanning before sending content. The shift, then, is from mandatory to consent-based.
The proposal has sparked numerous reactions from privacy activists, digital rights advocates, and digital services. Despite its new name, the measure is a form of client-side scanning: content submitted by the user is checked against a database on the device itself, before transmission. The measure therefore not only weakens strong encryption but also creates new areas of vulnerability.
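To make the mechanism concrete, here is a minimal sketch in Python of how such a pre-transmission check might work. Everything in it is illustrative: the hash value is a placeholder, and real deployments would use perceptual hashing (which also matches visually similar images) rather than an exact cryptographic hash. What the sketch demonstrates is structural: the plaintext is inspected on the user’s device before end-to-end encryption is applied.

```python
import hashlib

# Hypothetical database of flagged hashes (placeholder value, for illustration).
# Real client-side scanning systems use perceptual hashes, which also match
# visually similar images, not only byte-identical files.
FLAGGED_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def passes_scan(media: bytes) -> bool:
    """Check outgoing content against the database *before* encryption.

    This runs on the user's device: the plaintext is inspected even though
    the transport itself remains end-to-end encrypted.
    """
    return hashlib.sha256(media).hexdigest() not in FLAGGED_HASHES

def send_media(media: bytes) -> None:
    if not passes_scan(media):
        # In the proposal, a match would trigger blocking and/or reporting,
        # a pipeline that is itself a new attack surface.
        raise PermissionError("content matched the scanning database")
    # ...only now is the content end-to-end encrypted and transmitted...
```

Even in this toy form, the design problem is visible: the confidentiality of a message no longer depends only on the encryption, but also on the integrity and scope of the database and of the scanning code running on every device.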
Among the notable reactions is that of Meredith Whittaker, president of Signal. Whittaker has stated that Signal will leave Europe if forced to lower its security guarantees. Furthermore, in a letter published on June 17, she strongly criticized the project, urging lawmakers to stop playing word games that ultimately threaten encryption.
Members of the European Parliament have also issued an appeal for the proposal not to advance. In the letter, parliamentarians highlight, among other points, the importance of encrypted communications for everyone, including children, adolescents, and victims of sexual abuse, for whom encryption provides a secure channel to seek help.
Another important statement, to which IP.rec is a signatory, was the joint letter from the Global Encryption Coalition (GEC). The document aligns with the previous ones, reinforcing that changing the name does not erase the risks of the measure. It also underscores how problematic the proposal’s forced consent is.
The reactions have had an effect. The vote scheduled for June 20, in which European governments would adopt a position on the proposal, was postponed. The victory is temporary, but it should be celebrated given the danger of the proposal.
When will this stop?
The documents mentioned above reinforce how upload moderation weakens encryption and introduces new vulnerabilities into systems. Beyond these risks, there is a fundamental problem with the quality of the consent involved, a point reinforced in the GEC’s letter, alongside the various risks the proposal poses to human rights.
Regarding consent, the starting point is itself a problem: what is the quality of the consent being given? As indicated in the recital, if the proposal is approved, the user will have to consent to sharing visual content (such as photos, videos, and GIFs) and URLs. Without that authorization, the user may still use the service, but without these functionalities.
First, this proposal presents a choice that is not truly voluntary. If the user faces an option where they must consent in exchange for avoiding a limitation, is that consent fair? This clearly introduces deceptive design patterns: the user wants to do something, in this case share content, but in exchange must accept something they do not want (having the content scanned), a classic forced-action pattern.
Beyond being unethical product design, there is a legal conflict. Deceptive pattern practices are today prohibited under European legislation itself: the approved Digital Services Act, in Article 25, explicitly prohibits designing interfaces in ways that deceive or manipulate users into decisions they would not otherwise make.
Another point of concern is the impact on those who refuse to have their content scanned. As the proposal indicates, those who do not give consent will not be able to use the visual media or URL-sending functionalities. This poses a risk to citizens’ rights, as the transmission of information would be limited by the absence of key functionalities in messaging services. Rights such as freedom of expression, opinion, and information thus appear to be curtailed, along with the right to privacy.
Beyond these points, the proposal does not consider how easily users can bypass its restrictions. Even without consent, a user can still send URLs by altering them so they are no longer detectable as URLs. The simplest and best-known method is adding a space between the domain and the top-level domain (e.g., obcrypto.org would be transmitted as obcrypto. org), as the sketch below illustrates. Furthermore, users could store images in the cloud and share them via URL, combining the two tactics to circumvent the restriction on media sharing.
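Here is a minimal sketch of why this evasion works, assuming the scanner relies on pattern matching to find URLs in outgoing text. The regular expression below is hypothetical, but any detector of this kind faces the same fundamental problem.

```python
import re

# Hypothetical URL detector of the kind a scanner might apply to outgoing
# text. The exact pattern is illustrative; the evasion works the same way
# against more elaborate detectors.
URL_PATTERN = re.compile(r"\b[\w-]+(\.[\w-]+)+(/\S*)?")

def contains_url(text: str) -> bool:
    return URL_PATTERN.search(text) is not None

print(contains_url("see obcrypto.org for details"))   # True  -> gets scanned
print(contains_url("see obcrypto. org for details"))  # False -> slips through
```

A single inserted space defeats the check, while a human recipient reconstructs the link trivially. That asymmetry, trivial for users to exploit and hard for detectors to close, is precisely what the proposal ignores.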
This measure is a risk not only for Europe but also for other regions, especially the Global South, which often looks to the Global North for legislative inspiration. Given the history of the traceability proposal, it would not be surprising to see bills and/or amendments emerge in Brazil compelling messaging services to introduce upload moderation, particularly to combat misinformation. It is therefore important to keep a vigilant eye on how legislators adapt their narratives when it comes to weakening end-to-end encryption.
All of this seems to have been ignored in a proposal that, besides repeating the same errors disguised as an alternative solution, brings new problems to the table. Various sectors have been emphasizing the importance of encryption in defending children and adolescents. At IP.rec, we have contributed to this effort by producing analyses and translating documents, such as the Parents’ Guide to Encryption and the report Privacy and protection: A children’s rights approach to encryption. But what we want to know is: when will they finally understand that encryption is also security for children and adolescents?