
Civil rights groups call on Apple to abandon CSAM ‘iPhone surveillance’

More than 90 civil rights groups around the world have signed an open letter objecting to what they call iPhone surveillance capabilities, asking Apple to abandon its plans for CSAM scanning.

They would also like the iPhone maker to drop its plans for iMessage nude detection, arguing that this could place young gay people at risk.

Signatories to the letter include the American Civil Liberties Union (ACLU), the Canadian Civil Liberties Association, Australia’s Digital Rights Watch, the UK’s Liberty, and the global Privacy International.

The letter highlights the primary risk many have raised: misuse by repressive governments.

Once this capability is built into Apple products, the company and its competitors will face enormous pressure – and potentially legal requirements – from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable.

Those images may be of human rights abuses, political protests, images companies have tagged as “terrorist” or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them.

And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.

But it also says that the separate scanning of children’s iMessage accounts for nudes, another form of iPhone surveillance, could put children at risk.

The system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organizer of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk.

It says the organizations respect Apple’s intent, but the company should stand by its privacy values.

We support efforts to protect children and stand firmly against the proliferation of CSAM. But the changes that Apple has announced put children and its other users at risk both now and in the future. We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to more regularly consult with civil society groups, and with vulnerable communities who may be disproportionately impacted by changes to its products and services.

It follows the German parliament writing a similar letter to Apple a couple of days ago.

Via Reuters




Author


Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!

