Danger: Client-side scanning
European governments are proposing vague legislation that would likely require that messages be scanned for objectionable content before the message is sent (client-side scanning). This is bad. It happens that I am the only one presenting, but the idea for this talk came out of Cryptohagen.
I will present:
- What is being proposed, and why it is bad
- Why people would propose such a bad thing
- Things you should pay attention to in the near future
Proposal and why it is bad
In recent years the European Union has expressed concern about the solicitation of children for sexual purposes (grooming) and about the existence of child sexual abuse material (CSAM) and has proposed increasing surveillance to deal with these things.
- Proposal for a regulation (2020)
- Proposal for a regulation (2021)
- Informal deal with European Parliament on temporary rules
Proposals from 2020 and 2021 have not been passed.
This year the European Union proposed regulation 2022/0155(COD), which would require tech companies to screen their platforms for child sexual abuse material (CSAM) and would establish the EU Centre on Child Sexual Abuse to coordinate the screening. The rules are supposed to rescue children from abuse, prevent CSAM from reappearing online, and bring offenders to justice.
- Europe has a plan to fight online child abuse. Critics fear it may erode privacy
- Fighting child sexual abuse
- EU vil scanne alle dine beskeder og mails som led i kampen mod børneporno ("The EU will scan all your messages and emails as part of the fight against child pornography")
Implementation
Here are some ways this could be implemented. One approach to CSAM detection is to compare any image someone transmits with a database of known CSAM.
- Acquire and validate CSAM.
- Identify the CSAM with a hash function, possibly a perceptual hash function.
- Hash the images being checked (for example, photos in iCloud Photos) and compare them with the hashes of the known CSAM.
You can use a perceptual hash function to match similar images, at the cost of some false positives, or a non-perceptual (cryptographic) hash function to match only exact copies of the same image.
The Canadian Centre for Child Protection and Apple have such systems.
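To make the difference between the two matching styles concrete, here is a minimal Python sketch. It is an illustration, not any real system's code: it assumes the Pillow and imagehash libraries, and the blocklist values are made up.

```python
# Illustrative sketch of exact vs. perceptual image matching.
# Assumes the Pillow and imagehash libraries; blocklist values are made up.
import hashlib

from PIL import Image
import imagehash

# Hypothetical databases of hashes of known CSAM (fake values here).
EXACT_BLOCKLIST = {"5d41402abc4b2a76b9719d911017c592"}              # MD5 digests
PERCEPTUAL_BLOCKLIST = [imagehash.hex_to_hash("8f373714acfcf4d0")]  # pHashes

def matches_exact(path: str) -> bool:
    """Cryptographic hash: flags only byte-identical copies of a known image."""
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest() in EXACT_BLOCKLIST

def matches_perceptual(path: str, max_distance: int = 5) -> bool:
    """Perceptual hash: flags visually similar images, so resized or
    re-encoded copies still match, at the cost of some false positives."""
    h = imagehash.phash(Image.open(path))
    return any(h - known <= max_distance for known in PERCEPTUAL_BLOCKLIST)
```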
Even with a perceptual hash function, such systems find only known images. Researchers at Google, the National Center for Missing and Exploited Children (NCMEC), and Thorn say that, to keep up with the increasing amount of CSAM, we need other methods that can detect images that appear to be CSAM but have not been seen before.
But even the most advanced such system works only on images, not, for example, on text-based grooming. So I figure that any compliant implementation would also require review by humans before matching material is sent to the EU Centre on Child Sexual Abuse.
The name "client-side scanning"
I actually don't like the name "client-side scanning" that I chose for the title. You can do this sort of image matching on any images you can find. Client-side scanning is specifically when some authority checks messages before the message goes to the intended recipient (Internet Society), and this is one way we could address some of the issues while pretending to preserve end-to-end encryption.
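To show where that checking sits in a message flow, here is a minimal sketch. Everything in it is a placeholder of my own invention, not any real messenger's code: the point is only that the scanner sees the plaintext on the sender's device before the message is end-to-end encrypted and delivered.

```python
# Minimal sketch of where client-side scanning sits in a messaging flow.
# The blocklist, the "encryption", and the reporting step are stand-ins.
import hashlib

BLOCKLIST = {"5d41402abc4b2a76b9719d911017c592"}  # hypothetical hash database

def scan(plaintext: bytes) -> bool:
    """The scanner inspects the plaintext on the sender's device."""
    return hashlib.md5(plaintext).hexdigest() in BLOCKLIST

def encrypt(plaintext: bytes) -> bytes:
    """Toy stand-in for real end-to-end encryption."""
    return bytes(b ^ 0x5A for b in plaintext)

def send_message(plaintext: bytes) -> bytes:
    # Scanning happens *before* encryption, so "end-to-end" no longer means
    # that only the sender and the recipient can read the message.
    if scan(plaintext):
        raise RuntimeError("flagged: would be forwarded for human review")
    return encrypt(plaintext)
```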
In practice any implementation will require human review
Fun story: I introduced two people who worked in different organizations on detecting CSAM. In front of me they discussed some techniques for detecting CSAM. At the end of the discussion, one was going to send the other an email with a link to an image-analysis library that he used for detection. (I forget exactly.) He noted that he would not write about why he was sending it, lest the email be incorrectly classified as CSAM. The point is that we need human review for any system like this.
- If we use an artificial intelligence to detect potential grooming and potential CSAM, we surely will have lots of false positives.
- If we use a perceptual hash function just on images, we will have false positives.
- Even if we use a non-perceptual hash function (e.g., MD5), we could still have false positives: What if the EU Centre on Child Sexual Abuse needs to share its database of CSAM with some other group that is working to stop child sexual abuse?
Why it is bad
Logical arguments
Freedom
Privacy, freedom of expression, &c., ignoring issues with end-to-end encryption
- Duh
- Risk of false positives, especially with "artificial intelligence" methods
- Manual review (Breaking encryption myths)
This breaks end-to-end encryption, in varying degrees depending on the exact implementation.
- Keys Under Doormats
- Especially problematic if messages leave the client computer (Ot Van Daalen)
Ineffectiveness
Maybe there is a trade-off between protecting freedom and stopping child sexual abuse.
Illegal online activities already occur in places that would not be covered.
- Scanning has not been shown to be necessary, effective, or proportionate for combatting child abuse
- According to Joachim Türk, a board member of the Child Protection Association of Germany, most illegal online activities already occur via forums and the dark web (John Cody). It would be more effective to focus on the places where the illegal activity in fact occurs.
Existing methods are known to be effective and less invasive.
- Existing methods for detection and removal of CSAM are already effective. Service providers already remove such material. (Robert Bongen, Daniel Moßbrucker)
- The effective CSAM-detection methods are much less invasive than client-side scanning (Patrick Breyer)
Information security
We lose a lot when we break end-to-end encryption.
Also, client-side scanning enables new attacks.
Children
Children need privacy too.
A practical example
Who uses end-to-end encryption?
- Email with GPG
- Signal
- CryptPad
Imagine that these programs were changed to comply with the new law.
Suppose you use one of these to communicate with your spouse, expecting privacy. One day, one of you shares a picture of your dog, and it matches a known CSAM image. (It's not what it looks like, Bugs in our pockets). Another day, the other of you says to meet at a particular place and time, and this is incorrectly marked as solicitation of a child for sex ("grooming").
Your correspondence is sent to a person for manual review. This is already bad; someone is reading correspondence that should have been private. But it can be worse.
- Suppose that you exchanged passwords in this conversation. The reviewer can see your passwords, and the passwords stay in the review center's database.
- Suppose you mentioned something that was legal where you were but is illegal somewhere else or that becomes illegal in the future (being gay, smoking weed, getting an abortion). Maybe you will be referred to the police?
Meanwhile, the people who really are abusing children patch the software to disable the client-side scanning features in order not to be surveilled.
What authorities say
- Data screening is legal only in the presence of concrete danger.
- A general threat does not suffice.
- General screening for terrorists is not allowed.
Regarding a similar proposal from 2020, the European Data Protection Supervisor said that confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life, and that even voluntary measures by private companies constitute an interference with these rights when the measures involve the monitoring and analysis of the content of communications and the processing of personal data.
- Encryption and anonymity play a critical role in securing the rights of freedom of opinion and expression.
- Access to Tor should be protected and promoted.
Why people would propose such a bad thing
Maybe they think what they want is possible
The proposals all emphasize that the screening must be conducted in a way that does not compromise privacy, but they don't say how this can be accomplished.
This report by the civil society group 5Rights exemplifies this lack of understanding. Most of the report explains the various rights that children have, including privacy. A paragraph comments that privacy should not protect child sex abusers nor interfere with child sexual abuse detection systems, but the report does not say how this can be accomplished.
Thorn's lobbying
Ashton Kutcher
- How a Hollywood star lobbies the EU for more surveillance
- Dude, where's my privacy?: How a Hollywood star lobbies the EU for more surveillance
- Ashton Kutcher: Stalling new EU privacy rules would let 'kids be abused in the dark'
Thorn
Thorn's proprietary scanning service Safer
Lobbyist meetings
- Freedom of information request
- FOI request
- FOI request
- FOI request
- FOI request
- FOI request
- List of meetings with Thorn
- Email from Thorn
- Thorn lobbyist registration
Ursula von der Leyen has proposed stuff like this before.
Who is doing something about it
Fortunately we still have freedom of expression.
- European Data Protection Supervisor
- Joint statement on the dangers of the EU’s proposed regulation for fighting child sexual abuse online.
- european-commission-must-uphold-privacy-security-and-free-expression-by-withdrawing-new-law.pdf
- You can sign.
The election in Denmark will be the first big election since this legislation was proposed.