Apple is once again facing a billion-dollar lawsuit, as thousands of victims come forward against the company for its alleged complicity in spreading child sex abuse material (CSAM).
In a lawsuit filed Dec. 7, the tech giant is accused of reneging on mandatory reporting duties — which require U.S.-based tech companies to report instances of CSAM to the National Center for Missing & Exploited Children (NCMEC) — and allowing CSAM to proliferate. In failing to institute promised safety mechanisms, the lawsuit claims, Apple has sold “defective products” to specific classes of customers (CSAM victims).
Some of the plaintiffs argue they have been continually re-traumatized by the material's ongoing spread long after they were children, as Apple has chosen to focus on preventing new cases of CSAM and the grooming of young users rather than on content already in circulation.
“Thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet. Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims,” wrote lawyer Margaret E. Mabie.
The company has retained tight control over its iCloud product and user libraries as part of its wider privacy promises. In 2022, Apple scrapped plans for a controversial tool that would automatically scan iCloud photo libraries and flag abusive or problematic material, including CSAM. Apple cited growing concern over user privacy and mass surveillance by Big Tech in dropping the scanning feature, and the decision was widely supported by privacy groups and activists around the world. But the new lawsuit argues that the tech giant merely used this cybersecurity defense to skirt its reporting duties.
“Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk,” wrote Apple spokesperson Fred Sainz in response to the lawsuit. “We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts.”
Tech companies have struggled to control the spread of abusive material online. A 2024 report by UK watchdog the National Society for the Prevention of Cruelty to Children (NSPCC) accused Apple of vastly underreporting the amount of CSAM shared across its products, with the company submitting just 267 worldwide reports of CSAM to NCMEC in 2023. Competitors Google and Meta reported more than 1 million and 30 million cases, respectively. Meanwhile, growing concern over the rise of digitally altered or synthetic CSAM has complicated the regulatory landscape, leaving tech giants and social media platforms racing to catch up.
While Apple could face more than a billion dollars in damages should the suit reach a jury and succeed, the decision would have even wider repercussions for the industry and for privacy efforts at large. The court could force Apple to revive its photo library scanning tool or to implement other industry features to remove abusive content, paving a more direct path toward government surveillance and dealing another blow to Section 230 protections.