Danger: Client-side scanning

European governments are proposing vague legislation that would likely require messages to be scanned for objectionable content before they are sent (client-side scanning). This is bad.

It happens that I am the only one presenting, but the idea for this talk came out of Cryptohagen.

I will present:

  1. What is being proposed, and why it is bad
  2. Why people would propose such a bad thing
  3. Things you should pay attention to in the near future

Proposal and why it is bad

In recent years the European Union has expressed concern about the solicitation of children for sexual purposes (grooming) and about the existence of child sexual abuse material (CSAM), and it has proposed increasing surveillance to deal with these things.

Proposals from 2020 and 2021 have not been passed.

This year the European Union proposed regulation 2022/0155(COD), which would require tech companies to screen their platforms for child sexual abuse material (CSAM) and would establish the EU Centre on Child Sexual Abuse to coordinate the screening. The rules are supposed to rescue children from abuse, prevent CSAM from reappearing online, and bring offenders to justice.

Implementation

Here are some ways this could be implemented.

One approach to CSAM detection is to compare any image someone transmits with a database of known CSAM.

  1. Acquire and validate CSAM.
  2. Identify the CSAM with a hash function, possibly a perceptual hash function.
  3. Hash each image a user transmits (in Apple's system, each photo uploaded to iCloud Photos) and compare it with the hashes of the known CSAM.

You can use a perceptual hash function to match similar images, at the cost of some false positives, or a cryptographic hash function to match only exact copies of the same image.
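To make this concrete, here is a toy sketch in Python of hash-based matching. The average hash, the example database, and the threshold are all my own invented simplifications for illustration; deployed systems such as Apple's are considerably more sophisticated.

    from PIL import Image  # pip install Pillow

    def average_hash(path, size=8):
        # A toy perceptual hash: shrink the image to 8x8 grayscale and
        # record, for each pixel, whether it is brighter than the mean.
        pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p > mean)

    def hamming_distance(a, b):
        return bin(a ^ b).count("1")

    # Hypothetical database of hashes of known CSAM (made-up values).
    KNOWN_HASHES = {0x81C3E7FF7E3C1800}

    def matches_known_image(path, threshold=5):
        # threshold=0 matches only identical hashes, approximating an
        # exact-match hash; a larger threshold also matches visually
        # similar images, at the cost of false positives.
        h = average_hash(path)
        return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)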

The Canadian Centre for Child Protection and Apple have such systems.

Even with a perceptual hash function, such systems find only known images. Researchers at Google, the National Center for Missing and Exploited Children (NCMEC), and Thorn say that, to keep up with the increasing amount of CSAM, we need other methods that can detect images that appear to be CSAM but have not been seen before.

But even the most advanced such system works only on images, not, for example, on text-based grooming. So I figure any compliant implementation would also require review by humans before matched material is sent to the EU Centre on Child Sexual Abuse.
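To show the shape such a pipeline might take, here is a minimal sketch; the classifier stub, the threshold, and the queue are all invented names of mine, and no real detection model is this simple.

    REVIEW_QUEUE = []
    REVIEW_THRESHOLD = 0.9  # arbitrary cutoff, chosen for illustration

    def classifier_score(image_bytes):
        # Stand-in for a machine-learning model that estimates how
        # likely an unseen image is to be CSAM.
        return 0.0

    def screen_image(image_bytes):
        if classifier_score(image_bytes) >= REVIEW_THRESHOLD:
            # A match alone cannot be trusted, so the material is
            # queued for human review before anything is reported
            # onward (e.g., to the EU Centre on Child Sexual Abuse).
            REVIEW_QUEUE.append(image_bytes)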

The name "client-side scanning"

I actually don't like the name "client-side scanning" that I chose for the title. You can do this sort of image matching on any images you can find. Client-side scanning is specifically when some authority checks a message before it goes to the intended recipient (Internet Society), and it is one way to address some of the issues while pretending to preserve end-to-end encryption.
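Schematically, the flow on the sender's device might look like the sketch below. The function names are invented, and the stubs stand in for real detection, cryptography, and networking; the point is only that the plaintext is inspected before encryption, which is why proponents can claim the ciphertext is never touched.

    def looks_objectionable(plaintext):
        return False  # stand-in for hash matching and/or a classifier

    def report_for_review(plaintext):
        print("flagged for human review")  # stand-in for reporting

    def encrypt(plaintext, recipient_key):
        return plaintext  # stand-in for real end-to-end encryption

    def transmit(ciphertext):
        pass  # stand-in for sending over the network

    def send_message(plaintext, recipient_key):
        # The plaintext is scanned on the sender's device *before* it
        # is encrypted: encryption still formally happens end to end,
        # but the content has already been examined.
        if looks_objectionable(plaintext):
            report_for_review(plaintext)
        transmit(encrypt(plaintext, recipient_key))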

In practice any implementation will require human review

Fun story: I introduced two people who worked at different organizations on detecting CSAM. In front of me they discussed techniques for detecting CSAM. At the end of the discussion, one was going to email the other a link to an image analysis library that he used for detection. (I forget exactly what it was.) He noted that he would not write why he was sending it, lest the email be incorrectly classified as CSAM. The point is that any system like this needs human review.

Why it is bad

Logical arguments

Freedom

Privacy, freedom of expression, &c., ignoring issues with end-to-end encryption

This breaks end-to-end encryption, in varying degrees depending on the exact implementation.

Ineffectiveness

Maybe there is a trade-off between protecting freedom and stopping child sexual abuse.

Illegal online activities already occur in places that would not be covered.

Existing methods are known to be effective and less invasive.

Information security

We lose a lot when we break end-to-end encryption.

Also, client-side scanning enables new attacks.

Children

Children need privacy too.

A practical example

Who uses end-to-end encryption?

Imagine that these programs were changed to comply with the new law.

Suppose you use one of these to communicate with your spouse, expecting privacy. One day, one of you shares a picture of your dog, and it matches a known CSAM image (it's not what it looks like; see Bugs in our pockets). Another day, the other of you proposes meeting at a particular place and time, and this is incorrectly flagged as solicitation of a child for sex ("grooming").

Your correspondence is sent to a person for manual review. This is already bad; someone is reading correspondence that should have been private. But it can get worse.

Meanwhile, the people who really are abusing children patch the software to disable the client-side scanning features in order not to be surveilled.

What authorities say

German Constitutional Court

Regarding a similar proposal from 2020, the European Data Protection Supervisor said that confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life, and that even voluntary measures by private companies constitute an interference with these rights when they involve the monitoring and analysis of the content of communications and the processing of personal data.

UN Human Rights Council

Why people would propose such a bad thing

Maybe they think what they want is possible

The proposals all emphasize that the screening must be conducted in a way that does not compromise privacy, but they do not say how this can be accomplished.

This report by the civil society group 5rights exemplifies this lack of understanding. Most of the report explains the various rights that children have, including privacy. One paragraph comments that privacy should not protect child sex abusers or interfere with child sexual abuse detection systems, but the report does not say how this can be accomplished.

Thorn's lobbying

Ashton Kutcher

Thorn

Thorn's proprietary scanning service Safer

Lobbyist meetings

Ursula von der Leyen has proposed stuff like this before.

Who is doing something about it

Fortunately we still have freedom of expression.

The Danish election will be the first big election since this legislation was proposed.