Do Not Google 'Big Ol' Unless You Want to See Bestiality Porn

Just don't.

A (censored) screenshot of my Google search for 'big ol'. 

It started out like any other day in the office. The Motherboard staff was engaged in a lively discussion over how best to use the term "big ol'" in a headline. In particular, I was curious whether "big ol" or "big ole" would be the more appropriate way to go. Naturally, I googled 'big ol' to find a point of reference. That point of reference turned out to be some graphic woman-on-dog bestiality porn.

I noticed the graphic pictures in the Images section, which I, somewhat regrettably, clicked. The second image returned did indeed depict an act of bestiality. In fact, four of the next 24 results were bestiality porn. Now, I have been using the internet for a while, and I have a Reddit account, so I wasn't exactly shocked by the images themselves; mostly, I was intrigued that they turned up in response to a distinctly non-pornographic search term. Big ol' sounds like slang my grandfather would use.

And yet here, in sequence, are the images that mine eyes came to rest upon: a Toby Keith album cover, a screenshot of a video of a woman fornicating with a large canine, a woman with breasts that have been photoshopped into enormity, an Adventure Time still, two photos of women's butts, a giant dead bear, more butts, more bear, a nude woman, a bottomless woman rolling in snacks, and that same woman from earlier allowing the canine to perform cunnilingus on her. It goes on.

Again, Google searches that cough up potentially offensive images are nothing new. But the unusual combination of the banal search and the aggressively repulsive results (I counted four stills from the bestiality photo shoot in the first few rows of returns) was enough to lead me to reach out to Google.

I asked about the company's image removal policies, and whether Google proactively sought out offensive imagery to remove or waited to receive a complaint. I didn't get a response to that query, or much clarity about how Google's internal system for removing offensive content currently works.

After a multiple-day delay, a Google rep, Jason Freidenfelds, told me that "No filter is perfect, and we're always improving our systems; we appreciate the feedback."

"We've updated things so within a few days the inappropriate results shouldn't show up anymore," he added. He also pointed me to the company's Removal Policies which state that offensive material would be removed, such as images and results depicting:

  • Pornography, including search queries with multiple meanings and search queries that are not offensive but could have offensive results
  • Bodily functions or bodily fluids
  • Vulgar words
  • Graphic content, including: injuries and medical conditions, depictions of death and human body parts
  • Animal cruelty or violence

He also sent me a link to a statement Google sent The Verge in 2012, after allegations arose that the company was censoring pornographic images when it updated its image search to work more like regular search. The truth is, while I wish Google would be a little more transparent about how it works, it is attempting to organize a vast flood of data. That means that if someone was assiduous in labeling the image text for each of the photos they uploaded from a bestiality shoot 'Cheryl and a big ol hound', then, from time to time, we're going to see pictures of a woman touching a dog's genitals when we search for old-timey slang, at least until someone complains.