
Why is Apple building a tool to scan iPhone photos?

Answer: To look for child abuse imagery.

Apple is reportedly working on a tool in iOS 15, the latest version of the iPhone operating system, that will scan photos on the device for child sexual abuse material (CSAM) when they are uploaded to iCloud. Apple confirmed the tool's existence after Matthew Green, a cryptography professor at Johns Hopkins University, tweeted that he had heard about it.

The tool, called NeuralHash, will convert the photos being uploaded into unique strings of code called hashes. It then compares those hashes to a database of known CSAM hashes. If a user reaches a certain threshold of matches from separate photos, the system will lock down the user’s account and alert Apple’s manual review team. Then, and only then, will the images be decrypted for review.
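
As a rough illustration, the matching step boils down to counting how many uploaded photo hashes appear in a database of known hashes and comparing that count to a threshold. The Swift sketch below is a minimal approximation of that logic under those assumptions; the function name, string hashes, and threshold value are illustrative, not Apple's implementation, which relies on perceptual hashing and cryptographic safeguards rather than plain string comparison.

```swift
import Foundation

// Result of checking a batch of uploads against the known-hash database.
// This is a simplified sketch, not Apple's actual NeuralHash system.
struct MatchResult {
    let matchCount: Int
    let thresholdReached: Bool
}

/// Counts how many uploaded photo hashes appear in the set of known hashes
/// and reports whether the review threshold has been reached.
/// All names and values here are hypothetical.
func evaluateUploads(photoHashes: [String],
                     knownHashes: Set<String>,
                     threshold: Int) -> MatchResult {
    let matches = photoHashes.filter { knownHashes.contains($0) }.count
    return MatchResult(matchCount: matches,
                       thresholdReached: matches >= threshold)
}

// Example: three uploads, two of which match the (made-up) database,
// against a threshold of two.
let known: Set<String> = ["a1b2", "c3d4"]
let result = evaluateUploads(photoHashes: ["a1b2", "ffff", "c3d4"],
                             knownHashes: known,
                             threshold: 2)
print("Matches: \(result.matchCount), flag for review: \(result.thresholdReached)")
```

In the real system, only once that threshold is crossed would the flagged images be decrypted and passed to Apple's manual review team.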

Some security experts have raised concerns about this system, such as attackers flooding an account with images that trigger matches until the user is locked out. Apple has also stated that the feature will be required in order to use iCloud Photos, so the only way to opt out is to stop using iCloud Photos altogether.