
Apple to scan US iPhones for images of child sex abuse

  • The plan drew praise from child protection groups but also sparked concerns over potential misuse for government surveillance
  • Apple says its ‘neuralMatch’ tool will flag sensitive content without making private communications readable by the company

The Apple logo is seen at the entrance to a store in Brussels, Belgium in July. Photo: Reuters
Associated Press

Apple unveiled plans to scan US iPhones for images of child abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.

Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company.

The tool Apple calls “neuralMatch” will detect known images of child sexual abuse without decrypting people’s messages. If it finds a match, the image will be reviewed by a human who can notify law enforcement if necessary.
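In schematic form, the matching step described here amounts to a set lookup over image fingerprints. The sketch below is illustrative only: the database contents and function names are hypothetical, and SHA-256 stands in for Apple's proprietary perceptual hash, which (unlike a cryptographic hash) is designed to tolerate resizing and recompression.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse imagery
# (illustrative placeholder values, not real data).
KNOWN_FINGERPRINTS = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real system would use a hash
    # robust to image transformations, not SHA-256.
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(image_bytes: bytes) -> bool:
    # A match only queues the image for human review; per Apple's
    # description, a human then decides whether to notify law enforcement.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

Because the comparison is done against fingerprints rather than the images or messages themselves, the company can check for known material without decrypting a user's private content.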

But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them harmless images engineered to register as matches for child abuse material, fooling Apple’s algorithm and alerting law enforcement.
