Apple tries to assuage privacy fears over its new child pornography detection system, but some aren’t convinced
- The tech giant is coaching employees on how to handle questions from concerned consumers
- Privacy advocates such as the Electronic Frontier Foundation warned that the technology could be used to track things other than child pornography

The tech giant is coaching employees on how to handle questions from concerned consumers, and it’s enlisting an independent auditor to oversee the new measures, which attempt to root out so-called CSAM, or child sexual abuse material.
Apple also clarified Friday that the system would only flag cases where users had about 30 or more potentially illicit pictures.
The uproar began earlier this month when the company announced a trio of new features: support in Siri, its digital assistant, for reporting child abuse and accessing resources for fighting CSAM; a feature in Messages that will scan images sent to or from children’s devices for explicit content; and a new feature for iCloud Photos that will analyse a user’s library for explicit images of children.
If a user is found to have such pictures in their library, Apple will be alerted; the company will then conduct a human review to verify the content and report the user to law enforcement.
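To make the “about 30 or more” threshold concrete, the sketch below (in Swift, using hypothetical names, since Apple has not published its implementation) shows the kind of counter-and-threshold check the company describes: nothing is surfaced for human review until roughly 30 potential matches have accumulated.

```swift
// A minimal sketch of the threshold logic described above.
// Names and types are illustrative, not Apple's actual API.
struct CSAMMatchTracker {
    /// Apple said roughly 30 or more potential matches are needed before a case is flagged.
    static let reviewThreshold = 30

    private(set) var potentialMatchCount = 0

    /// Called each time a photo is judged a potential match.
    mutating func recordPotentialMatch() {
        potentialMatchCount += 1
    }

    /// Only once the count crosses the threshold would the case be
    /// escalated for human review, as the article describes.
    var shouldEscalateForHumanReview: Bool {
        potentialMatchCount >= Self.reviewThreshold
    }
}

// Usage sketch:
var tracker = CSAMMatchTracker()
(0..<30).forEach { _ in tracker.recordPotentialMatch() }
print(tracker.shouldEscalateForHumanReview)  // true once ~30 matches accumulate
```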
Privacy advocates such as the Electronic Frontier Foundation (EFF) warned that the technology could be used to track things other than child pornography, opening the door to “broader abuses.”