The Encryption Observatory (ObCrypto) interviewed Jeremy Malcolm, founder and Executive Director of Prostasia Foundation, a U.S.-based non-profit organization focused on developing solutions to combat child sexual abuse while upholding human and civil rights. The interview addressed public policy proposals in the United States and Europe that emerged over the last year as purported solutions to the dissemination of child sexual abuse material, but that pose threats to encryption and, consequently, to Internet security and the rights of users.
Question: Recently, the European Commission decided to launch a public consultation on effective ways to combat child sexual abuse, both online and offline. One of the ideas is to demand that service providers report child abuse content to public authorities. However, when platforms deploy end-to-end encryption, law enforcement agencies express distrust regarding the role of encryption in combating child sexual abuse. This point of view is reinforced in a leaked document drafted within the European Commission. Framed that way, the debate tends to ask whether encryption is essential to the protection of children or whether it is a barrier when authorities pursue these criminals. Where do you stand in this debate? What is the role of encryption in the protection of children? Could any “exceptional access” be deployed?
Answer: We support the role of platforms in reporting child abuse content to authorities, but weakening privacy should not be a part of this. Children should not grow up in a world in which it is impossible to communicate privately online. Such a world is not safer for them, it is more dangerous. Any European mandate for platforms to weaken privacy (which includes providing “exceptional access”) would be futile, because there will always be encrypted communication platforms that are beyond the reach of European authorities. Child abuse content will simply shift to those platforms, and nothing will have been gained.
Q: Contemporary debates over data protection and child safety on the Internet perhaps give little emphasis to the role of encryption. Analyzing materials such as the positions of multilateral entities (such as UNICEF) on the subject, what we find are still tentative propositions. How do you see the participation of multilateral bodies in this debate? Should we expect firmer positions on encryption?
A: Multilateral entities are giving out mixed messages on encryption, and this is due in part to the significant influence of government-affiliated child safety groups. For example, the UN Committee on the Rights of the Child significantly weakened draft language on encryption in its recently released general comment No. 25 (2021) on children’s rights in relation to the digital environment, in response to submissions such as those of ECPAT, which argued that “end-to-end encryption in its current form poses substantial risks to children that are not proportionate to the privacy pay offs afforded to technology users.”
Q: Some people say that online platforms should strip away their end-to-end encryption features so that law enforcement can track down potential criminals dealing in child sexual abuse material (CSAM). Others counter that once a platform is unencrypted, the creators and distributors of CSAM will simply migrate to other platforms that still offer encryption. How can more resilient solutions be developed? How are platforms currently cooperating in investigations without compromising encryption?
A: Some platforms, such as Facebook, have made effective use of metadata to identify potential CSAM offenders. Traditional police investigative techniques, such as stings and targeted surveillance, have also been successfully used to unmask CSAM offenders. However, the idea that every offender can be caught is illusory, and should not be a goal of rational child protection policy. Much more can be gained by investing in prevention interventions, such as research, education and stigma-free support, all of which are significantly neglected in comparison to enforcement activities.
Q: Current narratives in the U.S. debate are close to the European Commission’s moves when it comes to fighting CSAM. The EARN IT Act’s proposed modifications to the “safe harbor” that Section 230 provides to intermediaries, for instance, are championed loudly by entities such as the National Center for Missing and Exploited Children (NCMEC). Could you explain the terms of the bill, how it impacts the intermediary liability model, and whether/how it relates to the push for reporting and fighting CSAM? How could encryption be affected?
A: The EARN IT Act would make platforms liable for any acts of online child sexual exploitation on their platforms, unless they fulfill an uncertain set of obligations imposed by a government-appointed committee (in the version of the Act that was first introduced), or by state legislatures (in the second version). We know what this will lead to, due to experience with a previous law, FOSTA/SESTA, which took a similar approach: it resulted in the over-removal of content about sex, including sex education and even resources for child sexual abuse prevention. The EARN IT Act could also result in platforms choosing not to deploy encryption technologies, for fear of potential liability. Yet the Act would accomplish little, since under existing U.S. law, platforms are already required to remove and report CSAM that comes to their attention, and they do not have safe harbor protection for such content.
Q: Some entities, such as NCMEC, claim that end-to-end encryption might jeopardize the reports and tips sent to law enforcement authorities regarding child pornography, which is why they consider chat controls important. According to MEP Patrick Breyer, however, the messaging and chat control proposals have a major flaw: the majority of the resulting reports are without merit, flooding authorities with false positives. With that said, are there benefits to this kind of monitoring as a source of reports? What are the risks to privacy, freedom of expression, and security, especially when we consider children’s use of online platforms?
A: There is already a significant backlog of CSAM reports that far outstrips available enforcement resources. It is also true that many existing reports are meritless: when confronted with claims from Swiss authorities that its reports were only about 10% accurate, NCMEC’s response was that its accuracy rate was closer to 63%. That’s still exceptionally poor, when you consider that every false report puts a citizen under police suspicion of being a child abuser! Incentivizing Internet companies to remove and report even more content will only result in a worse accuracy rate, and increase the backlog of reports. Much more could be done by providing additional resources for prevention and enforcement, which is something that the EARN IT Act would not do.
Q: Moving forward, we still have to fight child abuse while preserving Internet stability and security. How could service providers cooperate with law enforcement while observing proportionality, legality, and efficiency safeguards? And beyond judicial and technical solutions, what would you recommend to prevent the problem in terms of social and psychological measures?
A: The most effective method of addressing image-based child abuse online remains user reporting. Since the vast majority of Internet users find such abuse truly abhorrent, CSAM is quickly reported and is rarely encountered by those who are not looking for it. But to stop it from appearing online in the first place requires a prevention-focused approach. Because our society views child sexual abuse primarily as a criminal justice problem rather than as a preventable public health problem, we have barely begun to scratch the surface of what prevention programs could accomplish. Prostasia Foundation is one of the few organizations working with leading experts to research how we can prevent offending by people who have a sexual interest in children, and we have raised over $50,000 to support such research this year. We also host a prevention-focused support forum in conjunction with U.S. hotline Stop It Now.