Apple has announced new measures to limit the spread of child sexual abuse material (CSAM). The Cupertino-based tech giant has unveiled a tool to scan for CSAM, or child sexual abuse content, stored on your iPhone. The new CSAM detection features will arrive with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.
The CSAM detection features will work in three areas: Photos, Messages, and Siri and Search. Apple says the measures were developed in collaboration with child protection experts and are designed with user privacy in mind.
The new CSAM detection tool will allow Apple to identify known child abuse images stored in iCloud Photos. Apple says that instead of scanning images in the cloud, the system "performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations."
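For readers curious about what matching against a database of known hashes looks like in principle, here is a minimal Python sketch. It hashes raw file bytes with SHA-256 and checks them against a made-up hash list purely for illustration; Apple's actual system uses a perceptual hash called NeuralHash and a blinded, unreadable hash database, which this toy does not replicate, and all names and data below are hypothetical.

```python
import hashlib

# Hypothetical on-device database of known image hashes (hex digests).
# The single entry below is just SHA-256("test"), used as a placeholder.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(image_bytes: bytes) -> str:
    """Hash the raw bytes; Apple's real system hashes visual content (NeuralHash) instead."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the on-device database."""
    return image_hash(image_bytes) in KNOWN_HASHES

# Hypothetical photos queued for iCloud upload.
queued_photos = {"beach.jpg": b"\xff\xd8fake photo bytes", "flagged.jpg": b"test"}

for name, data in queued_photos.items():
    if matches_known_database(data):
        print(f"{name}: match found, would attach an encrypted safety voucher")
    else:
        print(f"{name}: no match")
```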
To preserve privacy, Apple says the database is transformed into an "unreadable set of hashes that is securely stored on users' devices."
Privacy in mind
Apple explains that this "matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result." The company adds that the device "creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image."
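To give a flavor of the private set intersection idea, the toy Python sketch below uses a classic Diffie-Hellman-style blinding construction: each side raises hashed items to its own secret exponent, and only items held by both sides end up with identical double-blinded values, so the number of matches can be learned without either set being revealed. This is not Apple's protocol, and the small modulus and item names are purely illustrative.

```python
import hashlib
import secrets

P = 2**127 - 1  # toy modulus; real deployments use much larger groups or elliptic curves

def hash_to_group(item: bytes) -> int:
    """Map an item to a number modulo P via SHA-256."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(items, exponent):
    """Raise each hashed item to a secret exponent, hiding the originals."""
    return {pow(hash_to_group(x), exponent, P) for x in items}

a = secrets.randbelow(P - 2) + 1   # device-side secret exponent
b = secrets.randbelow(P - 2) + 1   # server-side secret exponent

device_photos  = [b"holiday.jpg", b"dog.jpg", b"known_bad_1"]
known_database = [b"known_bad_1", b"known_bad_2"]

# Device sends H(x)^a; the server raises each value to b, giving H(x)^(a*b).
device_double_blinded = {pow(v, b, P) for v in blind(device_photos, a)}
# Server sends H(y)^b; the device raises each value to a, giving H(y)^(a*b).
server_double_blinded = {pow(v, a, P) for v in blind(known_database, b)}

# Only items present on both sides produce identical double-blinded values.
print("matches:", len(device_double_blinded & server_double_blinded))  # -> matches: 1
```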
Another technology, called threshold secret sharing, ensures that the contents of the safety vouchers "cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content." "Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images," Apple explains.
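Threshold secret sharing can be illustrated with Shamir's classic scheme, sketched in Python below: a secret (standing in for a decryption key) is split into shares so that it can only be reconstructed once some minimum number of shares exists. The threshold of 3, the total of 10 and the field size here are illustrative choices, not Apple's parameters.

```python
import secrets

PRIME = 2**127 - 1  # toy field modulus; real systems use larger parameters

def make_shares(secret: int, threshold: int, total: int):
    """Split `secret` into `total` shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]

    def evaluate(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME

    return [(x, evaluate(x)) for x in range(1, total + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the polynomial's constant term."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)                    # stand-in for a decryption key
shares = make_shares(key, threshold=3, total=10)  # e.g. one share per matching image

print(recover(shares[:3]) == key)  # True: at the threshold, the key is recoverable
print(recover(shares[:2]) == key)  # False (overwhelmingly): below it, the key stays hidden
```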
More CSAM detection features
The company has also announced a new tool in the Messages app to warn children and their parents when receiving or sending sexually explicit photos. Apple explains that when such an image is received, it "will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo." The tool can also let parents know if their child chooses to view such an image.
Similarly, if a child tries to send a sexually explicit image, they will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it anyway. The company says the feature is designed so that Apple does not get access to the messages.
Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online. Siri will be able to explain how to report CSAM, and both Siri and Search will intervene when users search for CSAM-related queries.