
A Controversial Plan to Scan Private Messages for Child Abuse Meets Fresh Scandal

An EU government body is pushing a proposal to combat child sexual abuse material that has significant privacy implications. Its lead advocate is making things even messier.

Wired

Danny Mekić, an Amsterdam-based PhD researcher, was studying a proposed European law meant to combat child sexual abuse when he made a rather odd discovery. All of a sudden, he started seeing ads on X, formerly Twitter, that featured young girls and sinister-looking men against a dark background, set to an eerie soundtrack. The advertisements, which displayed statistics from a survey about child sexual abuse and online privacy, were paid for by the European Commission.

Mekić thought the videos were unusual for a governmental organization and decided to dig deeper. The survey findings highlighted in the videos suggested that a majority of EU citizens would support the scanning of all their digital communications. On closer inspection, he found that these findings appeared biased and otherwise flawed. The survey results were gathered by misleading participants, he claims, which in turn may have misled the recipients of the ads; the conclusion that EU citizens were fine with greater surveillance could not be drawn from the survey, and the findings clashed with those of independent polls.

The micro-targeted ad campaign categorized recipients based on their religious beliefs and political orientation—both considered sensitive information under EU data protection law—and also appeared to violate X’s terms of service. Mekić found that the ads were meant to be seen by select targets, such as top ministry officials, while they were concealed from people interested in Julian Assange, Brexit, EU corruption, Eurosceptic politicians (Marine Le Pen, Nigel Farage, Viktor Orban, Giorgia Meloni), the German right-wing populist party AfD, and “anti-Christians.”

Mekić then found out that the ads, which have garnered at least 4 million views, were only displayed in seven EU countries: the Netherlands, Sweden, Belgium, Finland, Slovenia, Portugal, and the Czech Republic.

At first, Mekić could not figure out the country selection, he tells WIRED, until he realized that neither the timing nor the purpose of the campaign was accidental. The Commission’s campaign was launched a day after the EU Council met without securing sufficient support for the proposed legislation Mekić had been studying, and the targeted countries were those that did not support the draft.

The legislation in question is a controversial proposal by the EU Commission known as Chat Control or CSA Regulation (CSAR) that would obligate digital platforms to detect and report any trace of child sexual abuse material on their systems and in their users’ private chats, covering platforms such as Signal, WhatsApp, and other messaging apps. Digital rights activists, privacy regulators, and national governments have strongly criticized the proposal, which the European Data Protection Supervisor (EDPS), Wojciech Wiewiorowski, said would amount to “crossing the Rubicon” in terms of mass surveillance of EU citizens.

“I think it is fair to say that this was an attempt to influence public opinion in countries critical of the indiscriminate scanning of all digital communications of all EU citizens and to put pressure on the negotiators of these countries to agree to the legislation,” says Mekić. “If the European Commission, a significant institution in the EU, can engage in targeted disinformation campaigns, it sets a dangerous precedent.”

The Dutch researcher’s findings add to the controversy surrounding the Commission, which has recently come under fire over allegations that certain AI firms and advocacy groups with significant financial backing had undue influence over the shaping of CSAR—allegations that lead Commissioner Ylva Johansson has countered, asserting that she committed no wrongdoing. Johansson’s office did not respond to WIRED’s request for comment.

“There’s an inexplicable obsession with this file [CSAR] in the Commission. I don’t know where that comes from,” EU lawmaker Sophie in ’t Veld told WIRED after she submitted a priority parliamentary question on the case. “Why are they doing [the campaign] while the legislative process is still ongoing?”

As the pressure on the Commission has mounted, Johansson lashed out at the influential civil rights nonprofit European Digital Rights (EDRi), one of the strongest critics of the legislation and a defender of end-to-end encryption, pointing to its funding from Apple and suggesting a double standard in how her own alleged ties to advocacy groups have been scrutinized. “Apple was accused of moving encryption keys to China, which critics say could endanger customer data. Yet no one asks if these are strange bedfellows, no one assumes Apple is drafting EDRI’s speaking points,” she wrote in a Commission blog post on October 13.

Referring to EDRi’s independence and transparent funding processes, senior policy adviser Ella Jakubowska tells WIRED: “Attempts to delegitimize civil society suggest a worrying effort to silence critical voices, in line with broader trends of shrinking civic space. When both the content and the process regarding this law are so troubling, we need to ask how the European Commission allowed it to reach this point.”

In an ironic twist, Mekić’s discovery of the Commission’s conspicuous ad campaign came on the same day the European Commission formally sent X a request for information under the Digital Services Act, the EU’s sweeping content moderation law, following indications of “spreading of illegal content and disinformation” on the platform.

“The Commission’s ability to enforce the DSA could be undermined if their own services are willing to flagrantly disregard these rules. Whilst the DSA is supposed to rein in the power of big tech, the CSA Regulation would force digital service providers to become a privatized police force in all our online chats, emails, and messages,” says Jakubowska. “It’s hard to see how these regimes can coexist or more broadly, how the EU can uphold fundamental human rights if it passes a law which could violate the essence of certain rights.”

The votes in the European Parliament and EU Council are still pending. “The Commission seems to forget its own role here: It is not a legislator. It only has the prerogative of proposing legislation,” says in ’t Veld. “It has no business interfering in the internal debate, neither in the Council nor in the European Parliament. The Commission is completely out of bounds here.”

Commenting on the negotiations, the outcome of which remains uncertain, EDRi’s Jakubowska observed that “it is reassuring to see that many lawmakers are alert to just how terrifying it would be for the EU to endorse a proposal that in essence amounts to distributed spyware on everyone’s devices.”

As for the contested ad campaign, Wiewiorowski, the European Data Protection Supervisor, initiated a preliminary investigation, requesting that the Commission provide “information related to the described use of microtargeted ads,” with a response due by the end of last week. Wiewiorowski’s office declined WIRED’s request to comment on the potential for a formal investigation.

Ironically, Danny Mekić now believes he’s been shadowbanned by X, alongside other journalists, scientists, and researchers who have been critical of CSAR. X did not respond to WIRED’s request for comment.

“I haven’t received any response from X,” Mekić says. “Steps are currently being taken to find out what caused this and whether it was an algorithmic or human decision or whether it was done at the request of so-called trusted flaggers—such as, for example, the European Commission itself, or one of the organizations involved in the CSAR lobby.”
