Apple on Thursday gave its biggest explanation yet for abandoning a controversial plan last year to detect known child sexual abuse material (CSAM) stored in iCloud Photos.
In a statement shared with Wired and reproduced below, Apple was responding to a demand from the child safety organization Heat Initiative that the company "detect, report, and remove" CSAM from iCloud and give users more tools to report such content to the company.
"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, however, that after collaborating with privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with a CSAM-scanning mechanism, even one built specifically to preserve privacy.
"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."
In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the US with iOS 15.2 in December 2021 and has since expanded to the UK, Canada, Australia, and New Zealand, and the Siri resources are also available, but CSAM detection never ended up launching.
Apple originally said that CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." The plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.
Apple's latest response to the issue comes amid a renewed debate over encryption sparked by the UK government, which is considering amendments to surveillance legislation that would require tech companies to disable security features like end-to-end encryption without telling the public.
Apple has said it will pull services such as FaceTime and iMessage from the UK if the legislation is passed in its current form.