Since August, EFF and others have been telling Apple to cancel its new child safety plans. Apple is now changing its tune about one component of its plans: the Messages app will no longer send notifications to parent accounts.

That’s good news. As we’ve previously explained, this feature would have broken end-to-end encryption in Messages, harming the privacy and safety of its users. So we’re glad to see that Apple has listened to privacy and child safety advocates about how to respect the rights of youth. In addition, sample images shared by Apple show the text in the feature has changed from “sexually explicit” to “naked,” a change that LGBTQ+ rights advocates have asked for, as the phrase “sexually explicit” is often used as cover to prevent access to LGBTQ+ material. 

Now, Apple needs to take the next step, and stop its plans to scan photos uploaded to a user’s iCloud Photos library for child sexual abuse material (CSAM). Apple must draw the line at invading people’s private content for the purposes of law enforcement. As Namrata Maheshwari of Access Now pointed out at EFF’s Encryption and Child Safety event, “There are legislations already in place that will be exploited to make demands to use this technology for purposes other than CSAM.” Vladimir Cortés of Article 19 agreed, explaining that governments will “end up using these backdoors to … silence dissent and critical expression.” Apple should sidestep this dangerous and inevitable pressure, stand with its users, and cancel its photo scanning plans.

Apple: Pay attention to the real-world consequences, and make the right choice to protect our privacy.
