"Preserving the privateness of kid predators is perfectly inexcusable. And much importantly, it violates West Virginia law. Since Apple has truthful acold refused to constabulary themselves and bash the morally close thing, I americium filing this suit to request Apple travel the law, study these images, and halt re-victimizing children by allowing these images to beryllium stored and shared," Attorney General JB McCuskey said.
According to the lawsuit [PDF], Apple has internally described itself as the "greatest platform for distributing child porn," yet it submits far fewer CSAM reports than peers like Google and Meta.
Back in 2021, Apple announced new child safety features, including a system that would detect known CSAM in images stored in iCloud Photos. After backlash from customers, digital rights groups, child safety advocates, and security researchers, Apple decided to abandon its plans for CSAM detection in iCloud Photos.
"Children tin beryllium protected without companies combing done idiosyncratic data, and we volition proceed moving with governments, kid advocates, and different companies to assistance support young people, sphere their close to privacy, and marque the net a safer spot for children and for america all," Apple said erstwhile announcing that it would not instrumentality the feature.
Apple later explained that building a tool for scanning private iCloud data would "create new threat vectors for data thieves to find and exploit."
West Virginia's Attorney General says that Apple has shirked its responsibility to protect children under the guise of user privacy, and that Apple's decision not to deploy detection technology is a choice, not passive oversight. The lawsuit argues that because Apple has end-to-end control over its hardware, software, and cloud infrastructure, it cannot claim to be an "unknowing, passive conduit of CSAM."
The lawsuit seeks punitive damages and injunctive relief requiring Apple to implement effective CSAM detection measures.
Apple was also sued in 2024 over its decision to abandon CSAM detection. A lawsuit representing a potential group of 2,680 victims said that Apple's failure to implement CSAM monitoring tools has caused ongoing harm to victims. That lawsuit is seeking $1.2 billion in damages.