Apple Expands Its On-Device Nudity Detection to Combat CSAM
Instead of scanning iCloud for illegal content, Apple’s tech will locally flag inappropriate images for kids. And adults are getting an opt-in nudes filter, too.
In December, Apple announced that it was killing a controversial iCloud photo-scanning tool the company had devised to combat child sexual abuse material (CSAM) in what it said was a privacy-preserving way. Apple then said that its anti-CSAM efforts would instead center around its “Communication Safety” features for children, initially announced in August 2021. And at the company’s Worldwide Developers Conference in Cupertino today, Apple debuted expansions to the mechanism, including an additional feature tailored to adults.
Communication Safety scans messages locally on young users’ devices to flag nudity in content that children are receiving or sending on iOS. Apple announced today that the feature is also expanding to FaceTime video messages, Contact Posters in the Phone app, the Photos picker tool where users choose photos or videos to send, and AirDrop. The feature’s on-device processing means that Apple never sees the content being flagged, and beginning this fall, Communication Safety will be turned on by default for all child accounts—kids under 13—in a Family Sharing plan. Parents can disable the feature if they choose.
“The Communication Safety feature is one where we really want to give the child a moment to pause and hopefully get disrupted out of what might be a grooming conversation,” says Apple’s head of user privacy Erik Neuenschwander. “So it’s meant to be high friction. It’s meant to be that there is an answer which we think is likely right in that child’s situation, which is not to move forward, and we really want to make sure they’re educated.”
Apple said in December that it planned to make an application programming interface (API) available so third-party developers could easily integrate Communication Safety into their apps and use it to detect CSAM. The API, known as the Sensitive Content Analysis framework, is available now for developers. Platforms like Discord have already said that they plan to incorporate it into their iOS apps.
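To give a sense of how lightweight the integration is, here is a minimal sketch of how a third-party iOS app might call the framework before displaying a received image. The `shouldBlur` helper name is our own; the `SCSensitivityAnalyzer` type and its calls reflect Apple’s published developer documentation, but exact details may differ by OS version.

```swift
import SensitiveContentAnalysis

// Hedged sketch: decide whether to blur an incoming image using
// Apple's on-device Sensitive Content Analysis framework.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Respect the user's settings: the policy is .disabled unless the
    // user (or a parent) has enabled Communication Safety or the
    // Sensitive Content Warning toggle.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs entirely on-device; the image never leaves it.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // If analysis fails (e.g., the file is unreadable), fall back
        // to showing the image rather than blocking it.
        return false
    }
}
```

An app would typically call this before rendering media from an untrusted sender and overlay a blur with an interstitial warning when it returns `true`.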
A Communication Safety prompt for a child’s account.
Photograph: Apple
One criticism of anti-CSAM initiatives like Communication Safety is that they don’t have full coverage across apps and services to flag content and protect children everywhere, potentially allowing communications with abusers to slip through. And this is precisely why Apple says it invested heavily to develop a robust API that app makers can easily incorporate into their software.
“There was a lot of work that we had to do to create this model,” Neuenschwander says. “We can make that investment once and then have that be leveraged across the platform for potentially all of the applications our users use.”
An opt-in Sensitive Content Warning for an adult’s account.
Photograph: Apple
On top of Communication Safety, Apple says that it has also received feedback about interest in a version of the feature for adults. So today, the company also launched “Sensitive Content Warning,” which similarly uses local scanning to flag and blur images and videos that contain nudity. The feature is optional and controlled in the iOS “Privacy & Security” settings menu. And it’s designed to be more subtle than Communication Safety. The goal is to protect adult users from content they don’t want to see without being intrusive or unhelpful.
Meaningfully addressing and reducing child sexual abuse online is a complicated and difficult problem. And Apple says that it isn’t done exploring and investing in new solutions. But with such high stakes, there’s a real urgency for a company like Apple to roll out features that will have an impact and promote their adoption as widely as possible.