

Apple sued by West Virginia for alleged failure to stop child sexual abuse material on iCloud, iOS devices

Thu, Feb. 19, 2026

West Virginia’s attorney general has filed a consumer protection lawsuit against Apple, alleging that it has failed to prevent child sexual abuse materials from being stored and shared via iOS devices and iCloud services.

Attorney General John “JB” McCuskey, a Republican, accused Apple of prioritizing its privacy branding and business interests over child safety, arguing that other big tech companies, including Google, Microsoft, and Dropbox, have been more proactive, using systems like PhotoDNA to combat such material.

PhotoDNA, developed by Microsoft and Dartmouth College in 2009, uses “hashing and matching” to automatically detect and block images that have previously been identified as child sexual abuse material (CSAM) and reported to authorities.
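
The “hashing and matching” approach works roughly like the sketch below: each uploaded image is reduced to a compact fingerprint, which is then compared against a database of fingerprints for previously reported material. This is a minimal illustration in Python using an ordinary SHA-256 digest and placeholder hash values; PhotoDNA itself computes a proprietary perceptual hash designed to keep matching after images are resized or re-encoded.

```python
import hashlib
from pathlib import Path

# Placeholder standing in for a database of fingerprints of images that
# were previously reported to and verified by authorities.
KNOWN_CSAM_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def fingerprint(image_path: Path) -> str:
    """Reduce an image file to a fixed-length fingerprint.

    PhotoDNA uses a perceptual hash; a plain SHA-256 digest is used here
    only to show the hash-and-match flow.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def matches_known_material(image_path: Path) -> bool:
    """Return True if the file's fingerprint is in the known-hash database."""
    return fingerprint(image_path) in KNOWN_CSAM_HASHES
```

Because the comparison is made between fingerprints rather than the images themselves, a service can screen uploads against known material without viewing or redistributing that material.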

In 2021, Apple tested its own CSAM-detection features, which could automatically find and remove images of child exploitation and report those uploaded to iCloud in the U.S. to the National Center for Missing & Exploited Children.
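
Apple’s 2021 design, as the company described it at the time, matched photos on-device as they were uploaded to iCloud and flagged an account for human review only after multiple matches accumulated. The sketch below illustrates that threshold idea only; the function names and threshold value are assumptions for illustration, and Apple’s actual system relied on its NeuralHash algorithm and cryptographic threshold secret sharing rather than a simple counter.

```python
from pathlib import Path
from typing import Callable

# Assumed value for illustration; Apple described requiring multiple
# matches before any account could be flagged for review.
MATCH_THRESHOLD = 30


def should_flag_account(photos: list[Path],
                        is_known_match: Callable[[Path], bool]) -> bool:
    """Count matches against known CSAM hashes across an account's iCloud
    uploads and flag the account only if the threshold is crossed.

    `is_known_match` is a hypothetical callable, e.g. the
    matches_known_material() function sketched above.
    """
    matches = sum(1 for photo in photos if is_known_match(photo))
    return matches >= MATCH_THRESHOLD
```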

But the company withdrew its plans after pushback from privacy advocates, who worried that the technology could create a back door for government surveillance and be tweaked and exploited to censor other kinds of content on iOS devices.

The company’s efforts since then have not satisfied a broad array of critics.

In 2024, UK-based watchdog the National Society for the Prevention of Cruelty to Children said Apple failed to adequately monitor, tabulate and report CSAM in its products to authorities.

And in a 2024 lawsuit filed in California’s Northern District, thousands of child sexual abuse survivors sued Apple, alleging that the company never should have abandoned its earlier plans for CSAM-detection features and that, by allowing such material to proliferate online, it had caused survivors to relive their trauma.

Apple has positioned itself as the most privacy-sensitive of the Big Tech companies ever since CEO Tim Cook wrote an open letter on the topic in 2014.

If West Virginia’s suit is successful, it could force the company to make design or data security changes. The state is seeking statutory and punitive damages, and injunctive relief requiring Apple to implement effective CSAM detection.

In an e-mailed statement, a spokesperson for Apple told CNBC that “protecting the safety and privacy of our users, especially children, is central to what we do.”

The company pointed to parental controls and features like Communication Safety, which “automatically intervenes on kids’ devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls,” as evidence of its commitment to providing “safety, security, and privacy” to users.

“We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids,” the spokesperson added.