Apple has confirmed its plans to fight child abuse through new technology in iOS, macOS, iMessage, and watchOS.
This technology aims to prevent the spread of child abuse images by scanning all photos before they are saved to iCloud. The scan is performed on the user’s device, and Apple promises it will protect user privacy.
The image scan is based on special cryptography that detects signs of child abuse in photos. The scan itself occurs only when a photo is uploaded to iCloud.
If the cryptographic data identifies a scanned photo as depicting child abuse, the photo is reported to Apple once it meets the child sexual abuse material (CSAM) criteria, as quoted by detikINET from The Verge, Friday (6/8/2021).
For years, Apple has used a hash system to scan emailed child abuse images, much like the systems used by Gmail and other email providers.
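The idea behind such a hash system can be sketched as follows: each image is reduced to a fixed-length fingerprint, which is then checked against a database of fingerprints of known abuse material. This is a minimal toy sketch using an exact cryptographic hash; Apple's actual on-device system uses a perceptual hash designed to survive resizing and re-encoding, and the hash values and function names here are purely illustrative.

```python
import hashlib

# Hypothetical database of fingerprints of known-bad images.
# In a real system these would come from an authority such as NCMEC.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Fingerprint the image and check membership in the known-hash set.

    Only the fixed-length digest is compared, never the image itself,
    so the checker learns nothing about non-matching photos.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_hash(b"example-known-image-bytes"))  # True: exact match
print(matches_known_hash(b"an-unrelated-image"))         # False: no match
```

Note that an exact hash like SHA-256 changes completely if even one pixel differs, which is why production systems rely on perceptual hashing instead.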
However, with this new cryptographic technology, Apple can scan for photos of child abuse before they are sent to other users, or even if they are never sent at all.
Apple assures that it will not receive any photos that do not match the CSAM database, nor any metadata or other visual data from photos that do not meet the criteria.
Users cannot access or view the CSAM database, nor can they know which photos the system has flagged as CSAM.
The system is equipped with various safeguards, which in theory keep the error rate to just one false flag per one trillion users annually.
Any flags raised on photos suspected of falling into the CSAM category will be reviewed by Apple and the National Center for Missing and Exploited Children (NCMEC) before being sent to law enforcement.
This security system will roll out across Apple’s various operating system platforms in the United States starting this fall. A similar feature already exists in iMessage, which scans photos in children’s accounts for pornographic content.