Image: TikTok logo and broken glass from Wikimedia Commons, edited by Caroline Haskins.

TikTok Can't Save Us from Algorithmic Content Hell

TikTok is not immune to the engagement-hungry algorithms that dominate the internet as we know it.

A young woman wears a grey t-shirt that reads “Back the Blue” and shows the Blue Lives Matter version of the American flag. Using her hands heavily in demonstrative motions that seem to mimic a slam poetry performance, she lip-syncs to the following voiceover, read by a man:

“I’m pro-gun and pro-2a,
how does that make me a bigot or anti-gay?
And I’m pretty sure I was conceived during a threeway.
You look at my guns and say they’re a disgrace,
Tell me how many crimes you stop with your safe space.”

She ends the video by giving the middle finger. The caption includes the hashtags #2ndammendment, #3way, and #backtheblue.

This is one of dozens of conservative-leaning TikToks shown to me by the platform’s recommendation algorithm after training it for just a day as part of an experiment into whether the platform can put people into “filter bubbles,” just like YouTube and Facebook.

This particular rhyming sequence has been lip-synced by over 1,400 people, making it one of the most popular lip-synced sounds among TikTok’s conservative community. TikTok, a successor to musical.ly, is an app for creating and sharing 15-second videos. Like Vine, the app makes revenue partially through ads. Unlike Vine, a central feature of the app is the easy, highly visible option of placing licensed music, or DIY voiceovers and sounds, over videos.

TikTok currently has a reputation for being a wholesome, quirky meme repository—its reputation among journalists stands in stark contrast to, say, Facebook or Twitter, whose battles with misinformation and targeted foreign disinformation campaigns have been well documented, or platforms like YouTube, which has been found to push people toward conspiracy theories and which academics believe can be a radicalizing force.

But, like YouTube, TikTok also relies on a recommendation algorithm that has the same problems as any other recommendation algorithm. Namely, we don’t know exactly how it works, and, anecdotally, it seems to create echo chambers of content that not only allow users to subconsciously construct their own reality, but can also push them toward more radical content.

Content discovery on TikTok is driven by the “For You” page—an algorithmically generated feed of videos that appears automatically when you open the app, unless someone you follow has posted a TikTok. Often, the For You page is a mechanism for finding people to follow. But like other social media platforms, TikTok doesn’t do a good job of explaining how the “For You” page works, in part because even the companies making algorithmic feeds and recommendation engines can’t always explain how they work for any specific person. So what does it take for the app to start pushing users toward far-right, conspiracy-related content in the same way that YouTube is notorious for doing?

According to TikTok’s listing in the iOS App Store, the For You page is based on how a user interacts with videos on TikTok. “A personalized video feed specifically for you based on what you watch, like, and share,” the page reads. “TikTok will quickly adapt to your taste to offer the most relevant, interesting, fun, quirky, head-turning videos that you’ll never want to stop watching.”

In other words, if you loop a video or watch it over and over again, like it, or share it with your friends using the in-app messenger or SMS, videos that are related by unknown criteria and metrics become more likely to appear in your feed. Users can also mark a video as “Not interested” by tapping a button nested in the “Share” menu on a TikTok. TikTok declined to comment to Motherboard on the specifics of how the For You page algorithm works.
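
To make that behavior concrete, here is a minimal sketch of what an engagement-weighted recommender could look like, assuming a simple content-based model in which hashtags and sounds stand in for whatever “related” means internally. TikTok has not disclosed how the For You page actually ranks videos, so the signal names, weights, and scoring function below are hypothetical illustrations, not TikTok’s system.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical per-signal weights: rewatching, liking, and sharing boost related
# content; tapping "Not interested" demotes it. These numbers are invented.
SIGNAL_WEIGHTS = {"rewatch": 1.0, "like": 2.0, "share": 3.0, "not_interested": -4.0}

@dataclass(frozen=True)
class Video:
    video_id: str
    tags: frozenset  # hashtags and sound IDs, standing in for "related" content

@dataclass
class UserHistory:
    tag_affinity: Counter = field(default_factory=Counter)

    def record(self, video: Video, signal: str) -> None:
        """Fold an engagement signal into the user's per-tag affinity."""
        for tag in video.tags:
            self.tag_affinity[tag] += SIGNAL_WEIGHTS[signal]

    def score(self, video: Video) -> float:
        """Score a candidate by how much it overlaps with past engagement."""
        return sum(self.tag_affinity[tag] for tag in video.tags)

def rank_for_you(candidates: list, history: UserHistory) -> list:
    """Order candidate videos so the most 'engaging' ones surface first."""
    return sorted(candidates, key=history.score, reverse=True)

if __name__ == "__main__":
    history = UserHistory()
    history.record(Video("v1", frozenset({"#backtheblue", "#2a"})), "like")
    history.record(Video("v2", frozenset({"#2a", "#maga"})), "share")
    history.record(Video("v3", frozenset({"#dance"})), "not_interested")

    feed = rank_for_you(
        [Video("v4", frozenset({"#maga"})),
         Video("v5", frozenset({"#dance"})),
         Video("v6", frozenset({"#cooking"}))],
        history,
    )
    # Videos sharing tags with liked/shared content rank ahead of demoted ones.
    print([v.video_id for v in feed])
```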

Whitney Phillips, a professor of communication and online media rhetoric at Syracuse University, told Motherboard that people naturally “self-select into filter bubbles” within digital entertainment ecosystems that are reinforced by algorithms.

“This is most striking with white people who tend to be interested in other white people and white celebrities—you replicate the community with which you identify,” Phillips told Motherboard. “People are not that interested in things that don’t immediately connect to them, resonate with them. Those resonance points often have to do with what people are familiar or comfortable with.”

TikTok’s userbase is small compared to platforms like YouTube or Facebook, which have billions of monthly active users (TikTok has 80 million users in the US; up-to-date global user figures are not currently available). But much like YouTube and Facebook, TikTok’s algorithm feeds users content and teaches them which voices to trust. A simple Motherboard experiment—which involved creating a new account and interacting with increasingly fringe videos as recommended by the app’s recommendation algorithm—shows that fringe communities can quickly dominate a person’s feed depending on the videos they’ve previously engaged with. In other words, the social media platform isn’t above the algorithmic hell that has gotten platforms like YouTube into trouble. The reason TikTok seems resistant to it is simply that it’s not as large and established as these other platforms.

Over the course of several days, Motherboard used a new TikTok account and looped, liked, and shared videos featuring subjects common in conservative TikToks (such as uniformed members of the police and the military), tapped “Not Interested” on TikTok content showing young children engaging in apolitical memes or humor bits, and “saved” sounds and hashtags associated with conservative posts (such as #maga, #trump2020, or #conservative). After liking approximately 190 TikToks and spending about a day using the app, the feed was completely filled with conservative and far-right posts that praised Donald Trump, the Second Amendment, building a border wall along the US-Mexico border, and even the QAnon conspiracy theory.

Image: Screenshot of videos that appeared in Motherboard’s experimental TikTok feed.

This is not an unusual or unexpected result. But Motherboard didn’t conduct this experiment in order to prove that TikTok is dominated by far-right users, because it’s not. It also doesn’t prove that TikTok’s algorithm disproportionately promotes conservative content, because it does not.

Rather, this experiment proves that the TikTok recommendation algorithm was working exactly as we understand it to work: it drives users to “engaging” content that they are inclined to like and share. This structure implicitly encourages users to spend as much time as possible on the app by showing them only content that they already like.
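
A toy simulation can illustrate that loop. In the sketch below, the user starts out only slightly more likely to engage with one topic, and the ranker simply boosts whatever was engaged with in previous sessions; the topics, probabilities, and update rule are invented for illustration and say nothing about TikTok’s actual system.

```python
import random

random.seed(0)  # deterministic for the example

TOPICS = ["politics", "dance", "pets", "cooking"]
# Invented engagement probabilities: the user starts only slightly more likely
# to watch or like political videos than anything else.
ENGAGE_PROB = {"politics": 0.6, "dance": 0.3, "pets": 0.3, "cooking": 0.3}

# The ranker's running estimate of what the user wants to see.
affinity = {topic: 1.0 for topic in TOPICS}

def build_feed(size: int = 20) -> list:
    """Sample a feed, weighting topics by current affinity (the 'For You' step)."""
    total = sum(affinity.values())
    weights = [affinity[t] / total for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=size)

for session in range(1, 6):
    feed = build_feed()
    for topic in feed:
        if random.random() < ENGAGE_PROB[topic]:
            affinity[topic] += 1.0  # engagement feeds straight back into ranking
    share = feed.count("politics") / len(feed)
    print(f"session {session}: politics is {share:.0%} of the feed")
```

Run for a handful of simulated sessions, the slight initial lean compounds, and that one topic takes up a growing share of the feed, which is the filter-bubble dynamic the experiment observed.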

"You can just keep getting fed content without thinking of why that content is being placed in front of your eyeballs specifically.”

On platforms like YouTube, this algorithmic design has gained a reputation for “redpilling”—or amplifying and directing users toward far-right, fringe content, and eventually radicalizing viewers. For instance, the site’s autocomplete search suggestions have been known to direct people toward conspiracy theories, and the “Related Channels” feature on individual channels can sometimes direct users toward more fringe, extreme content on other channels. (It’s worth noting that YouTube just implemented a TikTok-like layout for its mobile app that involves swiping through video recommendations.)

Becca Lewis, a political subculture researcher at the internet research group Data & Society, told Motherboard that an app like TikTok actively reinforces social branding that tells users which voices and perspectives are worth trusting.

“I think that one aspect of a platform like TikTok, with shorter videos, is you may not get the full, longer-form ideological content,” Lewis said. “But you get people who are still very much promoting certain ideologies, who then convert it into entertainment. Which then can also [lead] people back onto other platforms where they may get more familiar with the ideological pieces.”

For instance, consider Baked Alaska, the far-right antagonizer who was banned from Twitter in 2017 for hate speech. Baked Alaska doesn’t use TikTok to go on ideological rants. Instead, he makes videos about Fortnite and participates in memes on the platform, like using the “Hit or Miss” soundbite from the song “Mia Khalifa.” In these videos, he casually makes ideological references by showing MAGA gear and mentioning incels.

This social and ideological branding glues together what Lewis calls the “Alternative Influence Network” (AIN). Lewis defined the AIN with respect to YouTube, but she said the social rules that govern the AIN can extend to platforms across the internet. In the AIN, far-right internet micro-celebrities—through cultivating a down-to-earth persona and telling people that the “mainstream media” isn’t trustworthy—teach people to only trust information from their community. In doing so, the AIN creates a digital social contract and helps fortify intractable, ideologically extreme worldviews.

“It’s an internet phenomena more broadly,” Lewis said. “And anywhere you see people engaging in these strategies of microcelebrities building visibility and building an audience, you can find strategies of trust-building taking place.”

This trust is so powerful, according to Lewis, because users can cultivate the sense that they are being more authentic than a heavily-financed mainstream or traditional news or media outlet. Lewis told Motherboard that on video platforms in particular, like YouTube and TikTok, this sense of authenticity can build extremely powerful levels of trust.

“Video is an intimate medium,” Lewis said. “People can broadcast from their bedrooms, they can broadcast from their bathrooms, they can talk live at any moment of the day to their audience. It can broadcast different close ups of their face that maybe seem imperfect in a way that makes the audience feel closer to them.”

In the case of TikTok, its algorithm envelops people in social communities that extend to other internet platforms, and in doing so, teaches them what information to trust.

“I think that there is an element of trust in the algorithms that you’re going to keep feeding content back to you that you want,” Lewis said. “If you keep getting content back in a direction that you’re happy with, then a complacency develops where you can just keep getting fed content without thinking of why that content is being placed in front of your eyeballs specifically.”

But sometimes, you are forced to think about why content is given to you: specifically, when you’re fed “engaging” content that you don’t like. On creator-driven platforms such as TikTok, “engaging” content can sometimes mean content that’s antagonistic, shocking, and extreme.

For instance, in my personal experience on the app, I’ve been recommended videos that show men posting “duets”—side-by-side responses to another person’s video, which the other person cannot consent to—with girls who are young enough to be these men’s daughters or granddaughters.

Ever since I looped one such video several times out of morbid fascination, my For You page semi-frequently feeds me similar TikToks. Though not explicitly threatening, these videos make me deeply uncomfortable—and they make other TikTok users uncomfortable, too.

As highlighted in John Herrman’s piece for the New York Times Magazine, this isn’t a situation that’s unique to TikTok. Rather, TikTok is just another gatekeeper to information and community that’s governed by unknown rules and unknowable algorithms. Users are put in a rather helpless position because companies like Twitter, Facebook, and Google have kept most details about their algorithms secret, and have justified doing so as protecting “trade secrets.”

TikTok’s recommendation algorithms haven’t yet ruined the app, radicalized masses, or contributed to mass misinformation campaigns, but the app still seems to be making the same tech decisions that YouTube and Facebook have.

So how can TikTok users respond to this? For people experiencing unwanted content in their TikTok feeds, it’s worth being cautious about which videos you like, even ironically, as this could affect your recommendations. It’s also worth tapping “Not interested” on content you don’t want to see more of, or fighting against your instincts and swiping past content that triggers a negative reaction.

But any further advice about how to alter your TikTok algorithm is honestly a crapshoot, because we don’t have much insight into how the For You algorithm works. It’s designed to work mindlessly. All you have to do is scroll, and whatever unconscious way you use the app will influence the video that comes next. If you’re unconsciously inclined to engage with more extreme content, that’s the content you’ll get. You don’t get to opt in or opt out of the feed. You’re just supposed to keep scrolling.