Two years ago, Apple announced a number of new child safety features, including a system that would use on-device processing to scan for child sexual abuse material (CSAM). Despite the privacy-focused implementation, Apple faced enough backlash that it ultimately abandoned its plans.

Now, Apple faces renewed pressure from advocacy groups and activist investors to take stronger action against CSAM.

As first reported by Wired, the child safety advocacy group Heat Initiative is launching a multi-million dollar campaign pressing Apple on the issue. We reported this morning on Apple’s response to the campaign, in which the company acknowledged the precedent that implementing CSAM detection could set.

“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types,” Erik Neuenschwander, Apple’s director of user privacy and child safety, said in an interview with Wired.

Heat Initiative’s full campaign has now officially launched. On the campaign’s website, the group takes an aggressive stance against Apple, using language such as: “Child sexual abuse material is stored on iCloud. Apple allows it.”

The campaign explains:

Apple’s landmark announcement to detect child sexual abuse images and videos in 2021 was silently rolled back, impacting the lives of children worldwide. With every day that passes, there are kids suffering because of this inaction, which is why we’re calling on Apple to deliver on their commitment.

The advocacy group says that it is calling on Apple to “detect, report, and remove child sexual abuse images and videos from iCloud.” It also wants the company to “create a robust reporting mechanism for users to report child sexual abuse images and videos to Apple.”

The campaign’s website also includes several “Case Studies” that graphically detail instances in which iCloud was used to store sexual abuse photos and videos. There is also a button to “Email Apple leadership directly,” which opens a pre-filled email form addressed to Apple’s entire executive team.

Heat Initiative has also sent a letter addressed to Tim Cook in which the group says Apple’s inaction puts “children in harm’s way.”

In our recent research, we have come across hundreds of cases of child sexual abuse that have been documented and spread specifically on Apple devices and stored in iCloud. Had Apple been detecting these images and videos, many of these children would have been removed from their abusive situations far sooner.

That is why the day you make the choice to start detecting such harmful content, children will be identified and will no longer have to endure sexual abuse. Waiting continues to put children in harm’s way, and prevents survivors, or those with lived experience, from healing.

In addition to the pressure from Heat Initiative’s advertising campaign, Apple will soon face pressure from investors on this matter as well. 9to5Mac has learned that Christian Brothers Investment Services is planning to file a shareholder resolution calling on the company to improve its CSAM detection efforts.

Christian Brothers Investment Services describes itself as a “Catholic, socially responsible investment management firm.” The proposal is believed to play a role in Heat Initiative’s advertising campaign as well. Meanwhile, the New York Times reports that Degroof Petercam, a Belgian investment firm, will also back the resolution.

As we’ve explained in the past, this puts Apple between a rock and a hard place. Privacy advocates view the company’s initial implementation of CSAM detection as a dangerous precedent. Child safety advocates, meanwhile, say the company isn’t doing enough.

While Apple did abandon its plans to detect known CSAM images when they are stored in iCloud, the company has implemented a number of other child safety features.
