iPhones will be scanned for child sexual abuse images in the USA.
Apple has announced that it will implement a system that scans the iPhones of US customers for child sexual abuse material (CSAM).
Before an image is stored on iCloud Photos, the technology will search for matches against known CSAM.
If an initial match is found, a human reviewer will assess it further and, if the material is confirmed, report the user to law enforcement. But how can Apple ensure that the technology does not violate users' privacy?
Concerns have been raised that the system could be expanded to scan for prohibited or politically sensitive content, violating users' privacy. Experts have expressed concern that the technology could be used by authoritarian governments to spy on their citizens.
Apple announced that new versions of iOS and iPadOS – due to be released later this year – will have “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.
The system will work by comparing pictures to a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.
These images are translated into “hashes”, numerical codes that can then be matched against images stored on an iPhone.
The technology will also identify edited versions of the original images.
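The idea of matching edited copies of a known image can be illustrated with a toy perceptual hash. The sketch below uses a simplified "average hash" (each bit records whether a pixel is brighter than the image's mean) and a Hamming-distance threshold, so a lightly edited copy still matches. This is only an illustration of the general technique: Apple's actual system uses a neural-network hash (NeuralHash) and cryptographic matching, not this scheme.

```python
def average_hash(pixels):
    """Toy perceptual hash for a grayscale image (rows of 0-255 values):
    each bit records whether a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def matches(h1, h2, threshold=2):
    """Treat two images as the 'same' if their hashes differ in few bits."""
    return hamming_distance(h1, h2) <= threshold

# A 4x4 "image" and a lightly edited copy (one pixel brightened a little):
original = [[10, 200, 10, 200],
            [200, 10, 200, 10],
            [10, 200, 10, 200],
            [200, 10, 200, 10]]
edited = [row[:] for row in original]
edited[0][0] = 40  # small edit: the pixel is still well below the mean

h_orig = average_hash(original)
h_edit = average_hash(edited)
print(matches(h_orig, h_edit))  # the edited copy still matches: True
```

A genuinely different image produces a hash many bits away from the original and falls outside the threshold, which is why a simple distance cutoff can separate edited copies from unrelated photos.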
“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple said.
Apple further commented that it will manually review every report before taking action. Apple has said the system will not impact users' privacy, but some experts disagree.
“Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, said.
“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”