Apple Asked to Scrap Plans to Scan Users’ Images

Sometimes, the best of intentions can lead to the worst of circumstances. Such is the case with a recently announced Apple scheme to protect children from abuse.

On August 5, Apple announced a new program to keep children safe from online predators who use mobile devices and other digital communication means to “recruit and exploit them.” In addition, the Big Tech company hoped to curb the distribution of Child Sexual Abuse Material (CSAM).

The proposed system would provide communication tools enabling parents to more easily monitor their children’s online activities. It would also update Siri and Search to provide increased protection from unsafe encounters.

Lastly, the system would use “on-device machine learning” to analyze incoming image attachments to determine if a photograph is “sexually explicit.” Apple claims the system doesn’t access messages, but the potential invasion of privacy is setting off alarm bells in some circles.

On August 19, a coalition of more than 90 organizations worldwide sent an open letter to Apple CEO Tim Cook asking him to abandon the program.

The group acknowledged the program’s intention to protect children from possible exploitation and limit the spread of CSAM. However, they expressed concerns that others might use the system to “censor protected speech” and violate users’ privacy rights.

They also warned the program’s “scan and alert” feature could result in parents receiving information about their children that could compromise their wellbeing, such as LGBTQ youths whose parents are unsupportive.

The concept behind the program may be well-intentioned. However, violating privacy rights can be like opening Pandora’s Box. Once you lift the lid, you can never undo the resulting damage.

Copyright 2021,