Germany's federal states oppose the planned blanket surveillance of private encrypted messages under an EU regulation and are demanding corrections.
The EU Commission's plan to oblige service providers, in the fight against child sexual abuse, to scan their users' private communications for suspicious patterns with technical aids, possibly by undermining encryption, raises "serious fundamental rights concerns". The Bundesrat made this clear on Friday and called on the federal government to push for changes.
Effective and targeted measures required
In their statement adopted on Friday, the federal states point out that providers complying with the detection orders proposed by the Commission would "monitor all Internet-based communication and may also gain knowledge of content belonging to the most intimate sphere of private life". Communication with particularly protected interlocutors and persons bound by professional secrecy, such as lawyers, doctors, journalists and parliamentarians, could also be affected.
The Bundesrat therefore asks the federal government to work on the details of the regulation so that its interventions and its benefits, especially for young people, are "balanced as well as possible". The executive should ensure that "effective and targeted measures are taken to combat sexual abuse, while at the same time the right to confidentiality of private communications continues to be upheld to the greatest possible degree."
"Considerable doubts" about compatibility with EU law
Under the controversial project, providers of end-to-end encrypted messaging and other communication services such as WhatsApp, Apple, Signal and Threema could be obliged by official orders to detect photos and videos of child abuse in their users' messages. They would also have to find indications of potential perpetrators approaching children via the Internet ("cyber grooming").
The European and legal affairs committees of the state chamber had recommended that the state premiers also register "considerable doubts" about the compatibility of the proposal "with higher-ranking Union law". General monitoring obligations, measures to scan private communications and identification obligations, they argued, also raise "general concerns with regard to their proportionality – also in view of the low efficiency and effectiveness and the high error rate when using algorithms". However, the two committees did not find a majority for their motion in the plenary session.
Chat control viewed critically
In other member states, criticism of the project has so far been comparatively muted. In principle, the Bundesrat also supports "the aim and intention of the Commission to improve the protection of young people from sexualized violence through the proposed regulation". It is right, the chamber says, "to combat the sexual abuse of children on the Internet with preventive and repressive measures".
In view of the "increasingly alarming number of cases of child pornography" in the digital age, the federal states welcome the basic aim of "improving the detection and criminal prosecution of child pornography content that is shared via online services or stored there". However, individual concrete proposals such as chat control must be viewed critically and examined.
Freedom of expression as well as freedom of communication and of the media are "among the highest social goods and protected under constitutional law", the Bundesrat argues on this point. The function of the media as a "public watchdog" in a democratic society must not be restricted by "chilling effects", i.e. measures that have an adverse, inhibiting, intimidating or deterrent effect. The protection of informants and sources, on which investigative journalists in particular depend, must also be maintained. Ultimately, regulatory competence in this area also lies with the member states.
Criticism of a cumbersome, time-consuming deletion process
In principle, the chamber welcomes the fact that two articles of the draft "also address the timely removal of depictions of child sexual abuse from the Internet". The fact that such online content is publicly visible and accessible worldwide places a particular burden on the victims, so the swift deletion of such recordings is essential. The federal government established the principle of "deleting instead of blocking" years ago. The Commission nevertheless continues to push for web blocks.
The Bundesrat criticizes the fact that the path to deletion of illegal content outlined by the Commission, "via the coordinating authority at the place of establishment and the competent judicial or administrative authority of the member state, is very cumbersome and therefore time-consuming". Depictions of child sexual abuse are clearly unlawful, so there is no reason to "fear legal misjudgments by the hosting services". The deletion obligation should therefore arise directly and take effect as soon as the provider becomes aware of the content.
(bme)