Reporters Without Borders (RSF) is concerned about the European Union's proposal for countering child sexual abuse. The initiative implies that the content of messaging applications, including encrypted ones, could be continuously scanned. RSF warns that, if adopted, the proposal could jeopardize journalists' work.
According to the draft published on May 11, the European Commission is proposing a new EU legislative act aimed at preventing and countering online child sexual abuse and complementing the Digital Services Act, which is due to be adopted by late June.
In order to protect minors, the Commission recommends systematically monitoring chat content in messaging applications, including encrypted ones. RSF argues that encrypted messaging applications should be exempt from routine surveillance, because journalists rely on these tools precisely to protect their sources.
“This proposal has a very laudable goal but betrays a complete misunderstanding of encryption,” says Vincent Berthier, the head of RSF’s Tech Desk. “It’s simple: scanning end-to-end encrypted messaging services would render them useless and would be tantamount to mass surveillance! Such a demand from the European Commission is unacceptable and dangerous, both for press freedom and for democracies,” the RSF representative adds.
The organization explains that platforms providing end-to-end encrypted messaging services cannot decrypt chat content; they can only identify the sender and the recipient. According to the organization, if the content of these communications becomes accessible to a third party, “the security is irreparably compromised.”
The source mentions that, when presenting the proposed regulation, European Commissioner for Home Affairs Ylva Johansson said the aim was “about detecting child abuse material,” not about undermining data encryption. “The proposal says that, to preserve users’ right to privacy, service providers should monitor and report only problematic content. The intention is commendable but technically impossible, because all chats would have to be scanned in order to identify those that pose a problem,” RSF notes.
The NGO insists that the Commission must weigh the consequences for the protection of journalists’ sources and the confidentiality of their work. A case-by-case approach based on prior investigations “would be much more effective and would not undermine such a fundamental principle as the protection of sources.”
According to a communiqué from the European Commission, “the new rules will help rescue children from further abuse, prevent material from reappearing online, and bring offenders to justice.” The proposed regulation stipulates that providers will be required to detect, report, and remove child sexual abuse material from their services.