Apple has taken a huge step toward protecting children by announcing a new plan to scan iPhone photos for images of child abuse. The company will use a “neural match” system to scan photographs, and if a photo is flagged, a human reviewer at Apple will examine it and contact the authorities if necessary.
According to Apple, the new system will “continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system.”
Apple says the system is designed to protect users’ privacy: photos are checked against known images without making users’ private communications readable by the company.
Julia Cordua, CEO of Thorn, a nonprofit that uses technology to protect children from sexual abuse, said that Apple’s technology balances “the need for privacy with digital safety for children.”
The neural match system was trained to recognize known abuse imagery using a massive database of photos supplied by the National Center for Missing and Exploited Children (NCMEC).
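Apple has not published the internals of its matcher, but systems like this generally rely on perceptual hashing: each photo is reduced to a compact fingerprint that survives resizing and recompression, and that fingerprint is compared against fingerprints of known images. Here is a minimal sketch of that general idea in Python, using the open-source imagehash library as a stand-in for Apple’s proprietary system; the hash values and match threshold below are made up for illustration.

```python
# Illustrative sketch only: Apple has not published NeuralHash, so this
# uses the open-source `imagehash` library (pip install imagehash pillow)
# to show the general perceptual-hash matching idea. The hash values and
# threshold below are hypothetical.
import imagehash
from PIL import Image

# Hypothetical fingerprints of known images, as a clearinghouse such as
# NCMEC might supply. Real systems distribute only these hashes, never
# the images themselves.
KNOWN_HASHES = [
    imagehash.hex_to_hash("d1c4a0b2e5f38790"),
    imagehash.hex_to_hash("9f0e1d2c3b4a5968"),
]

# Maximum Hamming distance still counted as a match (assumed value).
MATCH_THRESHOLD = 4

def is_flagged(photo_path: str) -> bool:
    """Hash a photo and compare it against the known-image list.

    Perceptual hashes change only slightly when an image is resized,
    recompressed, or lightly edited, so matching uses Hamming distance
    (number of differing bits) rather than exact equality.
    """
    photo_hash = imagehash.phash(Image.open(photo_path))
    return any(photo_hash - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if __name__ == "__main__":
    # A flagged photo would be queued for human review, not reported
    # automatically, per the process described above.
    print("flagged" if is_flagged("example.jpg") else "clean")
```

Because only fingerprints are compared, a match can be detected without anyone, human or machine, looking at the contents of non-matching photos, which is the privacy argument Apple is making.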
“The reality is that privacy and child protection can co-exist. We applaud Apple and look forward to working togeth… https://t.co/Isp5NImT5S”
— NCMEC (@MissingKids), August 5, 2021
“Apple’s expanded protection for children is a game-changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have the life-saving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”
Ninety percent of all photos are taken with mobile phones, and the iPhone is the best-selling smartphone in America. That means child abusers will have a much harder time getting away with their heinous acts.
Apple’s decision to scan users’ photos to catch child abusers is a bit of an about-face for the company. In the past, it has steadfastly resisted law enforcement agencies’ requests to use its technology to glean information for criminal investigations.
While organizations that protect children are excited about the new system, some fear the technology will be exploited by bad actors to invade people’s privacy. Worse, it could open the floodgates for governments around the globe to access Apple users’ personal data.
“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,'” Matthew Green, a top cryptography researcher at Johns Hopkins, asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”
“This will break the dam — governments will demand it from everyone,” Green tweeted.
“It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops,” Ross Anderson, professor of security engineering at the University of Cambridge, said, according to the Financial Times.
Some fear that the technology will be used to set people up. A bad actor could send someone a photo that triggers the system, putting an unwitting recipient in serious trouble.
If Apple’s new system works as planned, it will be a powerful tool for catching those who abuse children, and a strong deterrent as well. But if the system’s critics are correct, it could destroy the trust consumers place in Apple and give authoritarians direct access to our private lives.