Apple sued over abandoning CSAM detection for iCloud
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).

The lawsuit argues that by failing to do more to prevent the spread of this material, Apple is forcing victims to relive their trauma, according to The New York Times. The suit describes Apple as announcing “a widely touted improved design aimed at protecting children,” then failing to “implement those designs or take any measures to detect and limit” this material.

Apple first announced the system in 2021, explaining that it would use digital signatures from the National Center for Missing and Exploited Children and other groups to detect known CSAM content in users’ iCloud libraries. However, it appeared to abandon those plans after security and privacy advocates warned that the technology could create a backdoor for government surveillance.

The lawsuit was reportedly filed by a 27-year-old woman who is suing Apple under a pseudonym. She said a relative molested her when she was an infant and shared images of her online, and that she still receives law enforcement notices nearly every day about someone being charged with possessing those images.

Attorney James Marsh, who is involved with the lawsuit, said there’s a potential group of 2,680 victims who could be entitled to compensation in this case.

TechCrunch has reached out to Apple for comment. A company spokesperson told The Times Apple is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”

In August, a 9-year-old girl and her guardian sued Apple, accusing the company of failing to address CSAM on iCloud.
