
Interview with a Troll Whisperer

An interview with troll researcher Whitney Phillips about her book 'This Is Why We Can't Have Nice Things.'

Whitney Phillips is practically a troll whisperer. She spent years in the bowels of the internet, in places like 4chan, studying the elusive "subcultural troll"—those voracious provocateurs engaging in a form of proactive schadenfreude. Or, as they say online, doing it for "the lulz": making you mad so they can laugh. Her book "This Is Why We Can't Have Nice Things" puts this type of behavior, mostly perpetrated by privileged white males, into broader cultural context.

By decoding their world, she also decodes mainstream society: trolls act as both a mirror and an amplifier of our society's ills. Phillips' book is also notable for a first-hand look at Facebook RIP memorial trolls, a subgenre of antagonism that mocked modern mourning culture and has since died down thanks to better tools on the platform.

We caught up with Phillips to discuss her work, the term "trolling," and how we as a digital society can move forward.

Motherboard: How did you first decide you were going to study trolls?
Whitney Phillips: Originally, when I started my PhD program in 2008, I went in thinking I was going to study political humor. Around the same time, my brother had gotten into 4chan, and as a joke he was encouraging me to go check it out because he said that I would like it.

Since I was already online anyway, I was like "sure, whatever," and I checked it out and it was so… I didn't know what I was looking at, it was confusing, and because I didn't know how to classify or categorize any of the humor I was seeing, it intrigued me. It's been many years of trying to figure out something that just escaped me, that I didn't understand.

Do you feel like you understand what trolls are now?
No, because the sense of the term "trolling" has undergone such a profound transformation since I started researching it. At the time, what I was looking at was subcultural trolling, and yeah, I understood that, but it evolved as subcultures often do, and then media outlets started using the term more and more freely. It became associated with so much stuff outside of subcultural trolling that it's now become really hard for me to even wrap my head around what the term trolling means these days. I'm wary of the term as a behavioral catchall, and I actually believe it causes more problems than it solves, so no, in 2015, "what is trolling" is still a vexing question to me.

Why do you think the media broadened the word "troll" to include cyberbullies and random hate online?
I think that it, as a behavioral catchall, is really catchy. It's short, it's easily recognizable, it has a kind of cool, sinister sound to it.

So why do you define it as "subcultural trolling": people who call themselves "trolls" and coordinate activities "for the lulz"?
I prefaced the book by saying I am defining the word in this way because I can't be responsible for talking about everything that's ever been described as trolling, that would be 27 different books, and that is not what I am doing here. It's hard to make any kinds of statements about why a lot of people do the things that they do online, especially when they are anonymous, but I do think that one thing the word trolling has afforded is it's emboldened people to be really awful and hateful and reveal those terrible sides of themselves.

"A lot of trolling stems from and is based on gender violence, and gendered rhetoric"

And then they get to say, "oh, no, I'm just trolling," as a way of repudiating responsibility for their own actions, and then you're supposed to be forgiven because you're just playing. That's a bullshit cop-out. Even if you don't mean the things you say, that is almost irrelevant, because what kind of impact does your behavior have, regardless of what you might think of your speech or behavior, if it ends up negatively impacting somebody?

I don't buy into the "I was just trolling" excuse, I think that that really functions as an excuse for people who don't want to claim ownership of the fact that they are just sexist assholes, or whatever flavor of asshole they might be.

Whitney Phillips was also interviewed on the Motherboard podcast.

I really liked how you talked about the relationship between trolls and mainstream media and how they utilize the same tools, the same social engineering. Did you get a lot of pushback for this?
That is a subject that I bring up often when I interact with the media, and it does not often appear in stories about trolls. I can't speak to anyone else's editorial process, but I do think that it is a sensitive subject when you work in a news outlet. You've got a job to do, and often your job is to deliver stories that are going to generate the most ratings. So it is a tricky conversation to have with journalists, because I certainly don't want it to seem like I am accusing anybody personally, but the system, and the click-based economy generally, often lends itself to very sensationalist stories.

So you know the media will never, ever stop being sensationalist… does that mean trolls will exist forever? Is that just a thing now, about living online?
I think trolling is symptomatic of so many larger cultural ills that it is not just about media sensationalism. A lot of trolling stems from and is based on gender violence, and gendered rhetoric.

"It's really important to focus on the verbs of online behavior"

There are all these types of things, all these cultural frameworks that trolls occupy really comfortably, and sensationalist media is just one of these frameworks—there are other ways trolls fit real snug in mainstream American culture.

How should the media cover troll activity in a responsible manner?
This answer has changed over the years. The way that I frame it now, I actually am reluctant to use the word "trolls" as a behavioral catchall because I don't think it is helpful. So instead, for example in response to GamerGate stuff, I am constantly being asked to talk about the trolling that was happening, and my response to that was, "that's not trolling, don't call that trolling." Don't give this a degree of playfulness, don't affix the "just trolling" to these behaviors, because it's so much more serious than that. Call it what it is, and in this case, call it gendered violence, or violent misogyny, which is probably a better way to put it.

It's really important to focus on the verbs of online behavior, particularly online aggression and antagonism, focusing on what is actually happening. Well, violent misogyny is happening, so let's call it that, as opposed to this fuzzy noun that has this connotation of playfulness and gives people a way to deflect with "well, I was just trolling." If you accuse someone of being a violent misogynist, what are they going to say? "Oh, I was just being a violent misogynist"? It doesn't provide the same kind of cop-out, and given the preponderance of especially violent misogynist behavior, I don't want to give people rhetorical outs. Avoiding those fuzzy nouns allows us to focus on the actual behavior and think carefully and critically about how we might respond to a specific behavior, as opposed to a word that basically functions as a synonym for asshole.

How are you going to stop people from being assholes on the internet? That's such a vague question. No one would try to seriously answer that question with legislation, but that is essentially what people are doing with the word "trolling," and that causes more problems than it solves.

You mentioned legislation, do we really need some type of anti-trolling law, or some type of similar legislation?
I think because the word trolling has become so fuzzy, an "anti-trolling law" runs the risk of being too broad to be effective. If it really is essentially an anti-asshole law, what does that mean? Something needs to be done, but a broad legislative act, I don't know how much that could possibly do.

So how does feminist trolling or "trolling for good," which you discuss in your book, fit into all of this?
I don't know. I am ambivalent about that. I think that trolling rhetoric—essentially this process of tricking people into revealing their true intentions and then attacking them without them realizing it—can be such an effective rhetorical strategy. It can be really helpful pedagogically, the kind of poking and prodding until you end up having a more significant conversation than one that might have occurred otherwise. To get someone upset and caring about a subject so you can have a more meaningful dialogue. Trolling rhetoric can work really well for that, but the antagonism that is embedded in it, and the fact that trolling rhetoric is often predicated on the lack of consent, really concerns me. So I don't know how to feel about wholly embracing a behavioral practice that is based on not allowing other people to choose whether or not they are participating; they just are participating. I'm not saying that it can't be done, or that I haven't seen it be done, and it can be really satisfying to troll a racist or a misogynist, to get them so mad that they leave the space.

You mentioned in your book that these subcultural trolls use racist tropes and target minorities… is it to make fun of minorities, or our culture?
It's hard to know who the targets of trolling actually are, right, so with the GNAA and the Sandy Loot Crew, in that case they were utilizing racist tropes to fool people but in that case I don't know who they were aiming at [fooling]. Sometimes it is just scattershot.

"Whose speech are you most interested in protecting, privileging? Who do you think the rest of the country should have to hear?"

Another example, in the Facebook memorial page trolling, a young white girl had been murdered and people were all up in arms so this one troll decided to call attention to the fact that a young black girl had been murdered the same day on the opposite side of the country but no one seemed to care. So there they were, utilizing racial tensions for trollish ends.

The primary example is the way that they use the N word, the fact that it is such a prominent word in the trolling lexicon, that kind of racist aggression is so pervasive in those circles.

At the end of your book you talk about educating people about trolls, so people know what they are looking at online. How would you go about doing that?
That's such a slippery question, because I certainly wouldn't want to advocate any kind of solution that is predicated on victim-blaming, like, "well, if you hadn't let yourself get trolled, then we wouldn't be in this situation, so just don't get trolled next time and then no problem." So I'm wary of those kinds of solutions, or any solution that ends with the punchline "don't feed the trolls." I say punchline because it's such a farcical thing to say, as it's actually putting the onus on the person being targeted, and depending on what that person's life experience has been, even something that might appear small to someone else could be absolutely devastating, so it's hard to know how certain insults will land… which is why I am really resistant to those kinds of "just buck up kid, if you can't take the heat stay out of the kitchen/internet" sayings.

Yeah, that is something that I hear all the time from "trolls" themselves: "That person is so sensitive, it's just words on a screen." But our whole lives now play out on screens.
Exactly. So I think instead of putting the onus on the target, which is not fair especially with the kinds of people who end up being targeted, I think it is good to bypass both the troll and the target, and talk about people who own the platform. A lot of these conversations come down to the question of whose speech are you most interested in protecting, privileging? Are you most interested in protecting and privileging the people who are just going to be disgusting towards women and people of color and LGBT people, or, platform owners, are you most interested in trying to allow for a space that will support the widest number of voices, the most democratic conversations?

That's actually what is missing in a lot of the conversations about "free speech" on the internet, which is not actually about preserving the First Amendment but more about censorship—people are so up in arms about protecting the speech of people who are awful, what about the people whose voices are being silenced? It becomes a choice at a certain point: whose voices are you most concerned about protecting, who do you want to hear, who do you think the rest of the country should have to hear?