This month, Apple announced several new features in the name of expanding its protections for young people, at least two of which seriously walk back the company’s longstanding commitment to protecting user privacy. One of the plans—scanning photos sent to and from child accounts in Messages—breaks Apple’s promise to offer end-to-end encryption in messaging. And breaking such a promise inevitably opens the door to other harms; that’s what makes breaking encryption so insidious.

Apple’s goals are laudable: protecting children from strangers who use communication tools to recruit and exploit them, and limiting the spread of child sexual abuse material. And it’s clear that there are no easy answers when it comes to child endangerment. But scanning and flagging Messages images will, unfortunately, create serious new dangers for children and for partners in abusive households. It both opens a security hole in Messages and ignores the reality of where abuse most often happens, how dangerous communications occur, and what young people actually need to feel safe online.

JOIN THE WORLDWIDE PROTEST

TELL APPLE: DON'T SCAN OUR PHONES

How Messages Scanning Works

In theory, the feature works like this: when photos are sent or received via Messages on a child account under 13, those photos are scanned by a machine learning algorithm. If the algorithm determines that a photo contains “sexually explicit” material, it offers the user a choice: decline to send or receive the photo, and nothing happens; or choose to send or receive it anyway, and the parent account on the Family Sharing plan is notified. The system also scans photos for users between 13 and 17 years old, but it only warns the user that they are sending or receiving an explicit photo; it does not notify the parents.
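
To make that decision flow concrete, here is a minimal sketch of the logic as described above. This is only an illustration of the announced behavior, not Apple’s actual implementation or API: the names (AgeBand, ScanOutcome, handlePhoto, looksExplicit) are hypothetical, and the real on-device classifier is far more complex than a single boolean.

```swift
// Hypothetical sketch of the announced Messages scanning flow. Not Apple's code.
enum AgeBand { case under13, teen13to17, adult }

enum ScanOutcome {
    case deliveredSilently            // classifier found nothing explicit, or adult account
    case blocked                      // user chose not to send/receive the flagged photo
    case deliveredWithWarning         // 13 to 17: user proceeded after a warning
    case deliveredAndParentNotified   // under 13: user proceeded; parent account is notified
}

// `looksExplicit` stands in for the on-device machine learning classifier's verdict.
func handlePhoto(ageBand: AgeBand,
                 looksExplicit: Bool,
                 userChoosesToProceed: Bool) -> ScanOutcome {
    // Photos the classifier does not flag, and photos on adult accounts,
    // are delivered with no prompt at all.
    guard looksExplicit, ageBand != .adult else { return .deliveredSilently }

    // Flagged photo: the user is asked whether to send or receive it anyway.
    guard userChoosesToProceed else { return .blocked }

    switch ageBand {
    case .under13:
        // Under-13 accounts: proceeding triggers a notification to the
        // parent account on the Family Sharing plan.
        return .deliveredAndParentNotified
    case .teen13to17:
        // 13 to 17: the user only sees a warning; parents are not notified.
        return .deliveredWithWarning
    case .adult:
        return .deliveredSilently // unreachable given the guard above
    }
}
```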

Children Need In-App Abuse Reporting Tools Instead

The Messages photo scanning feature has three limitations meant to protect users. The feature requires an opt-in by the parent on the Family Sharing plan; it allows the child account to decide not to send or receive the image; and it applies only to Messages users who are designated as children. But it’s important to remember that Apple could change these protections down the road—and it’s not hard for a Family Sharing plan organizer to create a child account and force (or convince) anyone, child or not, to use it, making it easy to spy on non-children.

Kids do experience bad behavior online—and they want to report it. A recent study by the Center for Democracy and Technology found that user reporting is effective in detecting “problematic content on E2EE [end-to-end encrypted] services, including abusive and harassing messages, spam, mis- and disinformation, and CSAM.” And when given the choice between online reporting tools and reporting to a caregiver offline, young people overwhelmingly prefer the online tools. Creating a better reporting system like this would put users in control—and children are users.

But Apple’s plan doesn’t help with that. Instead, it treats children as victims of technology rather than as users. Apple is offering the worst of both worlds: the company inserts its scanning tool into the private relationships between parents and their children, and between children and their friends, looking for “explicit” material, while ignoring a powerful method for handling the issue. A more robust reporting feature would require real work and a good intake system. But a well-designed system could meet the needs of younger users without violating their privacy expectations.

Research Shows Parents Are a Bigger Danger to Children Than Strangers

Apple’s notification scheme also does little to address the real danger in many cases. Of course, the vast majority of parents have a child’s best interests at heart. But the home and family are statistically the most likely sites of sexual assault, and a variety of research indicates that sexual abuse prevention and online safety education programs can’t assume parents are protective. Parents are, unfortunately, more likely than strangers to be the producers of child sexual abuse material (CSAM).

In addition, giving parents more information about a child’s online activity, without first allowing the child to report it themselves, can lead to mistreatment, especially in situations involving LGBTQ+ children or those in abusive households. Outing youth who are exploring their sexual orientation or gender in ways their parents may not approve of can have disastrous consequences. In one study, half of homeless LGBTQ+ youth said they feared that expressing their LGBTQ+ identity to family members would lead to their being evicted, and a large percentage of homeless LGBTQ+ youth were forced to leave their homes because of their sexual orientation or gender. Leaving it up to the child to decide whether and to whom to report an online encounter lets them choose how to handle the situation, and to determine whether the danger is coming from outside, or inside, the house.

It isn’t hard to think of other scenarios where this notification feature could endanger young people. How will Apple differentiate a ten-year-old’s photo documenting bruises a parent gave them in places normally hidden by clothes—a common way abusers hide their abuse—from a nude photo that could expose them to sextortion?

Children Aren’t the Only Group Endangered by Apple’s Plan

Unfortunately, it’s not only children who will be put in danger by this notification scheme. A person in an abusive household, regardless of age, could be coerced into using a “child” account, opening Messages users up to the kind of tech-enabled abuse more often associated with stalkerware. While Apple’s locked-down approach to apps has made it harder to install such spying tools on another person’s iPhone, this new feature undoes some of that security. Once it is set up, an abusive family member could ensure that a partner or other household member cannot send any photo Apple’s algorithm considers sexually explicit without the abuser being notified.

Finally, if other algorithms meant to find sexually explicit images are any indication, Apple will likely sweep up all sorts of non-explicit content with this feature. Notifying a parent that a child is sending explicit material when they are not could also lead to real danger. And while we are glad that Apple’s notification scheme stops at age twelve, even teenagers who will see only a warning when they send or receive what Apple considers a sexually explicit photo could be harmed. What impact does it have when a young woman receives a warning that a swimsuit photo being shared with a friend is sexually explicit? Or a photo of breastfeeding? Or nude art? Or protest photos?

Young People Are Users, Not Pawns

Apple’s plan is part of a growing, worrisome trend. Technology vendors are inserting themselves more and more regularly into areas of life where surveillance is most accepted and where power imbalances are the norm: in our workplaces, our schools, and in our homes. It’s possible for these technologies to help resolve those power imbalances, but instead, they frequently offer spying, monitoring, and stalking capabilities to those in power. 

This has significant implications for the future of privacy. The more our technology surveils young people, the harder it becomes to advocate for privacy anywhere else. And if we show young people that privacy isn’t something they deserve, it becomes all too easy for them to accept surveillance as the norm, even though it is so often biased, dangerous, and destructive of our rights. Child safety is important. But it’s equally important not to use child safety as an excuse to dangerously limit privacy for every user.

By breaking the privacy promise that your messages are secure, introducing a backdoor that governments will ask to expand, and ignoring the harm its notification scheme will cause, Apple is risking not only its privacy-protective image in the tech world, but also the safety of its young users.

