Due to privacy concerns, Apple has postponed the launch of its CSAM photo scanning system.
Apple disclosed plans to scan iCloud Photos for suspected Child Sexual Abuse Material (CSAM) earlier this month as part of its efforts to strengthen child safety measures. Following criticism from security professionals and digital rights organizations such as the Electronic Frontier Foundation, Apple has postponed the rollout of CSAM detection.
Apple Postpones the Implementation of CSAM Detection
Apple had planned to launch CSAM detection later this year for iCloud accounts set up as families on iOS 15, iPadOS 15, and macOS Monterey. The Cupertino behemoth has not yet announced a new release date for the feature, nor has it specified which aspects of CSAM detection it intends to improve or how it will approach the feature to strike a better balance between privacy and safety.
“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in an official statement.
To remind you, Apple’s CSAM detection works only on-device and does not scan photos in the cloud. It looks for known CSAM image hashes provided by NCMEC and other child safety groups, and this on-device matching happens just before an image is uploaded to iCloud Photos. However, researchers have recently found hash collisions in the system, which could cause unrelated images to be flagged as false positives. Apple has also stated that it has been scanning iCloud Mail for CSAM since 2019.
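To illustrate the general idea of on-device hash matching, here is a minimal, hypothetical sketch in Swift. It is not Apple’s actual implementation: the real system uses a perceptual “NeuralHash” and private set intersection with blinded hashes, whereas this sketch simply substitutes a SHA-256 digest and a plain lookup table to show the shape of a check that runs before an image leaves the device. The `knownHashes` set and `matchesKnownHash` function are illustrative names, not part of any Apple API.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified illustration of on-device hash matching.
// A cryptographic SHA-256 digest stands in for Apple's perceptual
// NeuralHash, and a plain Set stands in for the blinded hash database.

/// Placeholder set of known hashes (hex strings). In the real system these
/// would be blinded hashes supplied by NCMEC and other child safety groups.
let knownHashes: Set<String> = [
    // SHA-256 of empty data, used here only so the example below matches.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Returns true if the hash of the given image data appears in the known set.
/// This check would run on-device, just before the image is uploaded.
func matchesKnownHash(imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example usage: empty data hashes to the placeholder entry above.
print(matchesKnownHash(imageData: Data()))  // prints "true"
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files, which is why perceptual hashes are used in practice; the trade-off is that perceptual hashing is what makes the hash collisions reported by researchers possible.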
VIA MacRumors