Apple executive Craig Federighi said the company failed to effectively explain the new child safety protections that scan iPhone users' photos for evidence of abuse.
In an interview with the Wall Street Journal, Federighi said the “messages got jumbled” over the photo scanning policy, which has been met with fierce criticism from privacy advocates. Some have said the system, which aims to prevent the material from being uploaded to iCloud, is tantamount to surveillance.
Federighi says Apple wishes it could have been clearer over the policy rollout, which will come with iOS 15 in the United States and will also include tools to restrict the sharing of child sexual abuse material via iMessage.
“It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” Apple’s senior vice president of software engineering told the WSJ. “We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.”
The announcement of the CSAM policy has somewhat damaged Apple’s reputation as a privacy-first company, with many worried about the ramifications if the system were exploited by a government, for instance.
Apple has said that images users are attempting to upload to iCloud are scanned against a list of known CSAM images from the National Center for Missing and Exploited Children (NCMEC) in the United States. All searches take place on the device rather than in the cloud.
Federighi also looked to assure innocent users that they won’t be flagged by false positives and find themselves in trouble with the law through no fault of their own. He said an account will only be flagged if the scans detect around 30 images that are known to the authorities.
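The matching scheme described above can be sketched in simplified form. To be clear, this is an illustrative assumption, not Apple's implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic threshold techniques rather than the plain hash lookup shown here, and the function and variable names below are invented for illustration.

```python
import hashlib

# Stand-in for the NCMEC-derived list of known-image hashes shipped on device.
KNOWN_HASHES = {"d2a84f4b8b650937ec8f73cd8be2c74a" + "d" * 32}

# Per the WSJ interview, an account is flagged only past roughly 30 matches.
MATCH_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    """Hash the image bytes (a real system uses a perceptual hash,
    so visually identical images match despite byte differences)."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes]) -> int:
    """Count how many of the user's uploads match the known-hash list."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)

def should_flag(images: list[bytes]) -> bool:
    """Flag only when matches reach the threshold, limiting false positives."""
    return count_matches(images) >= MATCH_THRESHOLD
```

The threshold is the key safeguard the interview emphasizes: a handful of coincidental matches never triggers a review, because flagging requires dozens of independent hits against the known list.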
There are currently no plans to roll out the system in the UK or other countries, Federighi says in the interview, but it will be considered on a case-by-case basis. The “hashes” used to detect the images will ship with all versions of iOS 15, but they won’t be used for scanning anywhere but the US.
Federighi assured that the system will have “multiple levels of auditability” depending on the country the policy rolls out in. That will mean “you don’t have to trust any one entity, or even any one country, as far as what images are part of this process.”