
Twitter Is Fighting an Uphill Battle to Censor Sexualised Images of Children

A much overlooked aspect of Twitter is how it is being used to share sexualised images of underage girls.
Image: Clay Patrick McBride/Getty

Twitter has long grappled with offensive or illegal content on its network. Whether it's Islamic State supporters sharing graphic propaganda, or far-right racists harassing high profile users, the site has often been criticised for not doing enough to police its network.

Last week, Twitter introduced new features designed to give users more protections from persistent abuse. But another, much overlooked aspect of Twitter is how it is being used to share sexualised images of children. On Monday, one Twitter user shared what they claimed to be the usernames of a slew of those who post child pornography on the network.


Motherboard recently monitored and analysed content from one hashtag popular with these sorts of users. That monitoring revealed a stream of inappropriate content, and showed that Twitter is playing something of a whack-a-mole game with this corner of its network.

(Warning: Some of the tweets referenced in this story use graphic and disturbing language.)

Motherboard used Python scripts to monitor and analyse users publicly sharing tweets with sexualised images of children in real time over a week-long period. Some of the hashtags were used to share adult pornography, but many of the tweets referenced, and included images of, underage children too.
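
Motherboard's actual scripts aren't public, but a minimal sketch of this kind of real-time collection, assuming tweepy's 3.x streaming interface, might look like the following. The credentials and tracked hashtag are placeholders; the hashtags Motherboard monitored are deliberately not reproduced here.

```python
import tweepy

# Placeholder credentials; a real script would load these from a config file.
CONSUMER_KEY = "..."
CONSUMER_SECRET = "..."
ACCESS_TOKEN = "..."
ACCESS_SECRET = "..."


class HashtagListener(tweepy.StreamListener):
    """Logs each public tweet on the tracked term as it is posted."""

    def on_status(self, status):
        # Tweets with photos attached carry a "media" entity.
        media_urls = [m["media_url"] for m in status.entities.get("media", [])]
        print(status.created_at, status.user.screen_name, status.text, media_urls)

    def on_error(self, status_code):
        # Returning False on a 420 (rate limited) disconnects the stream
        # instead of retrying aggressively.
        return status_code != 420


auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_SECRET)

stream = tweepy.Stream(auth=auth, listener=HashtagListener())
stream.filter(track=["#example_hashtag"])  # stand-in for the tracked hashtags
```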

Not all accounts sharing this kind of material use hashtags, however: Some accounts simply post images with no text, making the tweets more difficult to find.

In one tweet from August, a user posted a photo of a little girl in a cheerleading outfit, who appears to be less than 10 years old, along with the text, "Someone to trade pics with and maybe cumtribute a pic." "Cumtributing" is when people masturbate onto a picture, and then upload a photo of it.

"mmmmm you look young and tasty :) how old are you?" one user replied to the tweet.

"OMG! Completely perfectly beautiful!" a second wrote.

Another tweet from the same month reads, "15 and so pretty," and comes with four photos of a young girl, one of which shows her in a bikini.

"would still fuck her," a user replies.


An example of one tweet sexualising an image of a very young girl. Many of the tweets Motherboard found featured language that was significantly more graphic, which we've declined to reproduce.

Using other Python scripts, it was possible to map the accounts that shared this kind of imagery to the other users they followed. Many of them followed the same selection of accounts. These included cam girls; girls who shared their own social media pictures; accounts that shared adult pornography; and other Twitter users who posted photos of girls, seemingly without their knowledge.
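
Again as a sketch rather than Motherboard's actual code, that follow-graph mapping could be done with tweepy's REST client and a simple counter; the handles below are hypothetical stand-ins for accounts flagged by the monitoring step.

```python
from collections import Counter

import tweepy

# Same placeholder credentials as in the streaming sketch above.
auth = tweepy.OAuthHandler("...", "...")
auth.set_access_token("...", "...")
api = tweepy.API(auth, wait_on_rate_limit=True)

# Hypothetical handles collected by the monitoring script.
flagged_accounts = ["example_user_1", "example_user_2", "example_user_3"]

followed = Counter()
for name in flagged_accounts:
    try:
        # friends_ids returns the IDs of up to 5,000 accounts this user follows.
        followed.update(api.friends_ids(screen_name=name))
    except tweepy.TweepError:
        # Skip accounts that have been suspended or made private.
        continue

# Accounts followed by more than one flagged user stand out in the overlap.
for user_id, count in followed.most_common(20):
    if count > 1:
        print(user_id, count)
```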

The Python script Motherboard used also gathered a handful of public tweets containing images that were likely child pornography: one appeared to be of an underage girl masturbating; another was apparently a young girl exposing her own genitalia in an explicit selfie that had somehow made its way publicly onto Twitter. Some accounts had "CP", a commonly used acronym for child pornography, in their Twitter handles, and another had a nude underage girl as its profile picture.

Most of the images Motherboard found, however, were not sexually explicit in and of themselves: the girls are sometimes fully dressed, in swimwear or their school uniform. But it is the trade, publication, and re-contextualisation of these pictures as sexual objects that is alarming. Some tweets name those pictured, meaning that the individuals could have been specifically targeted, or bullied. Many of the images appear to be sourced from the girls' social media accounts.

Even if many of the images aren't illegal in and of themselves, "then it really is about: who are we choosing to protect? What kind of standards are we setting? What sort of behaviours are we going to normalise, and turn a blind eye to?" said Whitney Phillips, a Humboldt State University professor and author of This Is Why We Can't Have Nice Things, which explores online harassment and trolling.


Phillips drew a parallel between this activity on Twitter and Reddit's infamous creepshots forum and others like it, where users would share images of women taken without their consent, and sexualise them with commentary. Technically, many of these images weren't illegal, but they were still deeply offensive, and could be seen as harassment.

Regardless, Reddit had to act.

"Reddit had to make a choice: were they going to let the law determine what was and what was not appropriate," Phillips said. In 2012, Reddit banned the creepshots forum.

"Twitter now finds itself in a similar situation, where you have a lot of this really egregious behaviour," Phillips told Motherboard. Although the mediums are different, with Reddit dealing in entire subreddits, and Twitter more likely focused on policing individual users, the issue is largely the same: striking a balance between creating a platform for sharing diverse content, while maintaining a healthy environment for users.

Twitter acknowledged multiple requests for comment and an interview, but did not provide a statement.

Twitter has responded, and does respond, to at least some of this content, though.

When it comes to illegal images of child pornography, Twitter has, in many cases, provided information to the National Center for Missing and Exploited Children (NCMEC), which has then helped law enforcement identify people who have shared child pornography on the social network.


In August, a man from Hollywood, Florida, allegedly told investigators he obtained child pornography pictures and videos from Twitter, the Miami Herald reported. He now faces 11 counts of child pornography possession, and is accused of posting illegal material to Twitter himself as well. Earlier the same month, another man charged with 31 counts of possessing child pornography pleaded not guilty. He had allegedly posted images to Twitter of children from the ages of 7 to 15 performing sexual acts.

"[My old account] got banned this is my new account please share"

And as for the legal but inappropriate images, Motherboard found several of the accounts sharing pictures of schoolgirls or other similar material had been suspended days or weeks later, but many remained, and other accounts quickly sprang up too.

One of the shuttered accounts had some 3,000 followers, and even created a poll asking how old its followers were. Over 60 percent of the nearly 300 respondents said they were over 20 years old.

But in that case, the banned user came straight back, and tried to re-establish contact with other accounts. For this type of content, and perhaps for policing its network in general, Twitter is constantly one step behind its users.

"[redacted old account] got banned this is my new account please share," the owner of the account wrote on August 29 after it was suspended by Twitter. Another account that this one interacted with is still online, and boasts over 10,000 followers, and has tweeted over 1000 images.

Even though Twitter has introduced new measures to curb harassment, those features, which allow users to mute keywords or threads, don't really apply to this sort of material. The images likely aren't appearing in the affected parties' notifications anyway, if the girls even use Twitter at all. Instead, the pictures are being traded and gawked at elsewhere on the network. Many of the girls in the photos probably aren't even aware that pictures of them are floating around on Twitter.

When it comes to harassment, bullying, and inappropriate content, many of Twitter's users, and the network's problems, exist in a grey zone, and raise much more difficult questions than if the content was straightforwardly illegal: Does Twitter need to clamp down on these accounts, even if much of what is being shared may not be breaking the law? What is Twitter's responsibility to those pictured in the images? And finally, given its scale, will Twitter simply continue to play whack-a-mole with various categories of accounts, including those that share sexual images of underage children?

"If you allow that kind of behaviour to fester […] ultimately I think that creates a space where people don't feel safe at are less likely to spend much of their time there," Phillips said.
