Apple are monitoring the images on your phone. Dump your iPhone.

UPDATE: Apple backed down, in that they announced a PAUSE, which might be a cancellation, or might be a “we’ll try again later when everyone has forgotten about it”.

This story is based on reporting by Ars Technica and Wired.

Who is affected?

They will be (or are, depending on when you read this) doing two things.

Checking the images on your phone and in your messages against a database of known Child Sexual Abuse Material (CSAM).

This is actually not too much of a problem in principle. CSAM hashes have been used for years (decades, maybe, by now). The database is in effect a list of specific images already known to be CSAM, and if those images turn up on your phone then, to be honest, that does raise the question of “WTF are you doing with CSAM on your phone?”

ISPs, the big cloud storage providers, file transfer companies, many hosts – the places where your files live – already use CSAM hash databases to monitor their systems.
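
To make the mechanism concrete, here is a minimal sketch of what hash-list matching amounts to. Everything in it is invented for illustration: it uses an ordinary SHA-256 where the real systems (PhotoDNA, Apple’s NeuralHash) use perceptual hashes that survive resizing and re-encoding, and the folder and function names are made up.

    import hashlib
    from pathlib import Path

    # Illustrative only: real scanners use PERCEPTUAL hashes
    # (PhotoDNA, Apple's NeuralHash), which still match after an
    # image is resized or re-encoded. SHA-256 stands in here to
    # show the shape of the idea: reduce each image to a
    # fingerprint, then look it up in a list of fingerprints of
    # known-bad images.

    # Hypothetical list of known-image fingerprints (hex strings).
    KNOWN_BAD_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def fingerprint(image_path: Path) -> str:
        """Reduce an image file to a fixed-size fingerprint."""
        return hashlib.sha256(image_path.read_bytes()).hexdigest()

    def is_flagged(image_path: Path) -> bool:
        """True if this image's fingerprint is on the list."""
        return fingerprint(image_path) in KNOWN_BAD_HASHES

    for path in Path("camera_roll").glob("*.jpg"):  # hypothetical folder
        if is_flagged(path):
            print(f"match: {path}")  # a real system would report upstream

Note that the matcher never looks at what the image shows, only at whether its fingerprint is on the list – which matters later.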

Monitoring images coming to phones it THINKS belong to children, and if it THINKS it detects a sexually explicit image, telling the parent.

This is machine-learning based. That’s AI to me. Do you trust AI? (Note: this isn’t about CSAM going to children, it’s about ordinary porn.)
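
For a sense of what “it THINKS it detects” means in practice, here is a hedged sketch of the decision logic: a classifier produces a score, and a threshold decides whether a parent gets an alert. The model, the threshold and all the names below are invented; Apple has not published this code.

    # Illustrative sketch only: the model, threshold and names are
    # invented. The point is that "detects a sexually explicit
    # image" means "a classifier's score crossed a threshold", and
    # every threshold trades missed detections against false alarms.

    EXPLICIT_THRESHOLD = 0.85  # hypothetical cut-off

    def classify_explicit(image_bytes: bytes) -> float:
        """Stand-in for an on-device model returning P(explicit)."""
        raise NotImplementedError("placeholder for a real ML model")

    def should_alert_parent(image_bytes: bytes, account_is_child: bool) -> bool:
        """Alert logic as the articles describe it, in skeleton form."""
        if not account_is_child:
            return False
        score = classify_explicit(image_bytes)
        # A beach photo scoring 0.86 triggers an alert; an explicit
        # image scoring 0.84 does not. The threshold is the policy.
        return score >= EXPLICIT_THRESHOLD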

Again, good intentions, and it might well shut down kids sharing nudes of their classmates. But, for us:

What happens if the image is of YOU? A kid follows you on Twitter (because we all know that’s possible) and grabs an image. If your image is being shared by teens, and parents at a school are getting alerts, could the police jump to the conclusion that you are somehow trying to entice the kids into illegal activity? Even if they soon realise their error, it would certainly spoil your day/week, and it might be months before you see your phone again.

It’s not clear from the articles what ages these children will be. I suppose it will probably default to under-18s and let parents turn it on or off as they judge their kids are growing up.

Why is this a problem?

Because it blows a hole in the privacy that everyone thinks Apple provides. Note: Apple make it clear that they don’t think it does, but what matters to us is the scope for mission creep. If Apple can do this, what might courts and governments decide they must do AS WELL?

A company can say to a court “we CANNOT break privacy, because TECHNICAL MODEL”, experts can agree, and courts can say “OK” and leave it up to the Govt to legislate whether or not that product should be sold. That’s the current Apple position.

A company can’t say “we WON’T break privacy, because we only want to break it for CSAM and not for a ‘FSSW bust at hotel’”. Well, they can say that, but a court can say “Fuck you, you WILL provide the info required”.

Once Apple has shown the courts that they CAN do this, it becomes a lot easier for Governments to say “ah, if you can do that, we will now require you to do this…”. The Govt builds a hashed DB of more images, such as those of suspected SWs, hoovered up from Twitter, AW and escort sites. And suddenly the fact that your image is in that DB, and you were at the hotel, makes you guilty. Even when you’re not.
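
The technical crux of that mission creep fits in a few lines. Continuing the hypothetical sketch from earlier (still with stand-in SHA-256 hashes and invented names), the matching machinery is completely indifferent to what is in the database:

    import hashlib

    # Nothing in the matching code knows or cares whether the
    # fingerprints came from a CSAM clearinghouse or from profile
    # photos scraped off Twitter and escort sites. The database
    # IS the policy.

    def fingerprints_of(image_blobs: list[bytes]) -> set[str]:
        """Build a hash list from any collection of image bytes."""
        return {hashlib.sha256(blob).hexdigest() for blob in image_blobs}

    def is_flagged(image_bytes: bytes, hash_list: set[str]) -> bool:
        """The same matcher as before, parameterised on the list."""
        return hashlib.sha256(image_bytes).hexdigest() in hash_list

    # Hypothetical: a government compiles its own list from images
    # it hoovered up in bulk.
    scraped_images: list[bytes] = []
    GOVT_HASHES = fingerprints_of(scraped_images)

    # Swap one argument, and a child-protection tool becomes a
    # "whose photo is this?" tool:
    #   is_flagged(photo, CSAM_HASHES) -> is_flagged(photo, GOVT_HASHES)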

It can go out of Apple’s control. And it’s centralised, so potentially it’s a very effective surveillance system (in a way that breaking into an individual suspect’s phone is not).

THAT is the problem.

And at the moment, Android provides an alternative.

Maybe this is the way the world is going, but that’s no reason to give up before you have to.
