[Image: A sign outside the concrete Twitter headquarters building reads @twitter, with the blue Twitter bird above it.]

Twitter reportedly has a child porn problem that surpasses its ability to take down accounts advertising child sexual abuse content.
Photo: Justin Sullivan (Getty Images)

Reports have shown that Twitter has struggled to deal with a rash of accounts peddling child sex abuse material on its platform, and now advertisers have made clear they don’t want their ads associated with a platform that can’t police itself for non-consensual sexual material.

Reuters first reported Wednesday that major brands had discovered their ads appearing alongside tweets soliciting child abuse material. The outlet said ads from some 30 advertisers, including the Walt Disney Company, Coca-Cola, Dyson, and NBCUniversal, appeared next to the profile pages of Twitter accounts actively selling and soliciting child sexual abuse material.

Reuters reported that the names of those roughly 30 brands came up in research on online child sex abuse from cybersecurity firm Ghost Data. Some of the advertisers, including chemical company Ecolab, home tech maker Dyson, and carmaker Mazda, have already pulled their ads from the platform, according to the report.

A separate report from Business Insider said Twitter began informing advertisers Wednesday that it had discovered their ads running on these kinds of profiles, apparently after Reuters shared its findings with the company. The emails seen by Insider reportedly said Twitter has banned the accounts that were violating its rules and is investigating how many people the ads may have reached.

In an email statement to Gizmodo, a Twitter spokesperson said, “We are working closely with our clients and partners to investigate the situation and take the appropriate steps to prevent this from happening in the future,” which apparently includes updates to its detection and account suspension methodologies. The spokesperson added that Twitter has suspended all the profiles it found peddling child sexual abuse content, though they did not reveal how many accounts were involved.

The company does publish a twice-yearly transparency report. From July through December 2021, Twitter says it suspended close to 600,000 accounts and took moderation action on another 600,000 for child sexual exploitation.

But despite those numbers, the question remains: just how big is Twitter’s child sex abuse problem? After all, Twitter is not an active pornography hub like Pornhub. That site has been knocked for profiting off of child abuse material and cited for hosting non-consensual material, leading some credit card companies to drop support for the site and top executives to quit after a further string of controversies. Twitter and TikTok received 86,666 and 154,618 tip reports of child sexual abuse material in 2021, respectively, according to the National Center for Missing & Exploited Children. Pornhub, on the other hand, had only a little over 9,000, though the NCMEC noted these tip counts depend on the number of users a site has and the nature of the people using it.

Reuters cited Ghost Data, which noted that 70% of the 500 accounts it identified as dealing in child sexual abuse material were not taken down over a 20-day period in September. At least one of those accounts was soliciting sexual content from those “13+.” The accounts advertised their content on Twitter, then moved on to apps like Telegram or Discord to complete the actual transactions, sharing the content via Mega and Dropbox.

According to recent reports, Twitter has known about such accounts for quite a while. The Verge reported that at one point earlier this year, Twitter seriously considered launching its own version of OnlyFans, essentially giving creators the option to sell paid subscriptions for adult content. What ended the initiative was a dedicated team’s finding that Twitter consistently fails to police “child sexual exploitation and non-consensual nudity at scale.”

The report, based on internal documents and interviews with unnamed staff, notes that employees had been warning the company about its child porn problem for over a year. Launching such an OnlyFans-style operation could have cost Twitter advertising dollars as well.

Other sites like Reddit have also been cited in lawsuits for their failure to police underage sexual content, but Twitter has a lot riding on its advertising dollars.

Considering that 92% of Twitter’s 2021 revenue came from advertising, according to BusinessofApps data, Twitter may need a much stronger banhammer to keep advertisers happy.

Update 9/20/22 at 5:15 p.m. ET: This post was updated to clarify the information around Pornhub and add further examples of the number of CSAM tip reports received by the National Center for Missing & Exploited Children.