Apple is backing away from controversial photo scanning plans
In August, Apple detailed several new features intended to stop the spread of child sexual abuse material (CSAM). Backlash from cryptographers, privacy advocates, and Edward Snowden himself was nearly instantaneous, tied largely to Apple's decision not only to scan iCloud photos for CSAM but also to check for matches directly on your iPhone or iPad. After weeks of sustained outcry, Apple is standing down. At least for now.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement Friday. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple did not provide further guidance on what form those improvements might take or how that input process might work. But privacy advocates and security researchers are cautiously optimistic about the pause.
“I think this is a smart move by Apple,” says Alex Stamos, former chief security officer at Facebook and co-founder of the cybersecurity consulting firm Krebs Stamos Group. “There is an incredibly complicated set of trade-offs involved in this problem, and it was unlikely that Apple would arrive at an optimal solution without listening to a wide variety of equities.”
CSAM scanning works by creating cryptographic “hashes” of known abusive images (a kind of digital signature) and then combing through enormous quantities of data for matches. Many companies already do some form of this, including Apple for iCloud Mail. But in planning to extend that scanning to iCloud Photos, the company proposed the additional step of checking those hashes on your device as well, if you have an iCloud account.
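To make the mechanics concrete, here is a minimal, simplified sketch of hash-based matching: compute a digest of a file and check it against a set of known digests. It deliberately uses an exact SHA-256 hash for illustration; Apple's actual system relies on a perceptual hash (NeuralHash) so that resized or re-encoded copies of an image still match, and wraps the comparison in a cryptographic protocol so the device never sees the raw hash list. The hash values and function names below are invented for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-image digests. In a real deployment this would be
# a database of perceptual hashes supplied by NCMEC, not plain SHA-256 values.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: Path) -> str:
    """Return the hex SHA-256 digest of the file's raw bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known(path: Path) -> bool:
    """True if the file's digest appears in the known-hash set."""
    return file_digest(path) in KNOWN_HASHES
```

An exact cryptographic hash changes completely if a single pixel changes, which is why production scanners use perceptual hashing instead; the trade-off is that perceptual hashes can collide, a point that becomes important later in this story.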
Building in the ability to compare images on your phone against a set of known CSAM hashes, provided by the National Center for Missing and Exploited Children, immediately raised concerns that the tool could one day be put to other uses. “Apple would have deployed onto everyone's phones a CSAM-scanning feature that governments could, and would, turn into a surveillance tool to make Apple search people's phones for other material as well,” says Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory.
Apple has in the past resisted multiple US government demands to build a tool that would let law enforcement unlock and decrypt iOS devices. But the company has also made concessions to countries like China, where customer data lives on state-owned servers. At a time when legislators around the world have ramped up efforts to undermine encryption more broadly, the introduction of the CSAM tool felt especially fraught.
“They clearly feel this is politically challenging, which I think shows how untenable their ‘Apple will always refuse government pressure’ position is,” says Johns Hopkins University cryptographer Matthew Green. “If they feel they have to scan, they should scan unencrypted files on their servers,” which is the standard practice for other companies, such as Facebook, which regularly scans not only for CSAM but also for terrorist and other disallowed content. Green also suggests that Apple make iCloud storage end-to-end encrypted, so that it could not view those images even if it wanted to.
The dispute over Apple’s plans was also technical. Hashing algorithms can generate false positives, mistakenly identifying two images as matches even when they are not. Called “collisions,” those errors are especially concerning in the context of CSAM. Not long after Apple’s announcement, researchers began finding collisions in the iOS “NeuralHash” algorithm Apple intended to use. Apple said at the time that the version of NeuralHash available to study was not exactly the same as the one that would be used in the scheme, and that its system was accurate. Collisions may also not have a material impact in practice, says Paul Walsh, founder and CEO of the security firm MetaCert, given that Apple’s system required 30 matching hashes before sounding any alarms, at which point human reviewers would be able to tell what is CSAM and what is a false positive.
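The 30-match threshold Walsh describes can be pictured as a simple per-account counter, as in the sketch below. This is only an illustration of the policy, under the assumption of a plain server-side counter: Apple's actual design enforces the threshold cryptographically (via threshold secret sharing), so nothing can be reviewed until enough matches exist. The class and constant names are invented for the example.

```python
# Minimal sketch of a match-count threshold: nothing is surfaced for human
# review until an account accumulates MATCH_THRESHOLD distinct hash matches.
MATCH_THRESHOLD = 30

class AccountMatchTracker:
    """Tracks hash matches for one account and flags it for review at the threshold."""

    def __init__(self) -> None:
        self.matched_images: set[str] = set()

    def record_match(self, image_id: str) -> bool:
        """Record a matched image; return True once the review threshold is reached."""
        self.matched_images.add(image_id)
        return len(self.matched_images) >= MATCH_THRESHOLD
```

The point of such a threshold is statistical: even if any single comparison has a small false-positive rate, requiring dozens of independent matches before human review makes it far less likely that an innocent account is ever flagged.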