Wikipedia and Google Identified Wrong Man as a Serial Killer for Years

After a Discovery show displayed his booking photo in an episode about a serial killer, Nathaniel White's image spread across the internet.

The Nathaniel White who killed six women in New York in the early '90s is not the Nathaniel White you’ll see in a Google Images search for his name. Nor is he the man whose photo appeared for two years on the Wikipedia page for the serial killer, alongside descriptions of his victims, who included a 14-year-old and several mothers.

The man in those images isn’t a serial killer and has never even been to New York. But his photo spread online for years, appearing on Wikipedia, in Google results, and in news stories about the murders, and the error has turned a relatively unknown man in Florida into someone who lives in constant fear of revenge.

It started when the Discovery program “Evil Lives Here,” in an episode about the serial killer that aired on August 13, 2018, showed the wrong White’s booking photo. The image appeared only for a moment in the broadcast, but from there it was added to Wikipedia and spread to Google Images. Until Thursday, if you Googled “Nathaniel White,” the wrong man’s face appeared in the information box alongside search results about the killer.

White filed a lawsuit against Discovery last year, alleging invasion of privacy, defamation by libel, intentional infliction of emotional distress and negligent infliction of emotional distress. The complaint also named Google, Facebook, Twitter, Wikipedia Inc., and several other search engines and tech companies. Last month, a judge ruled that under Section 230 of the Communications Decency Act, these companies weren’t liable for damages to White. The case against Discovery is still ongoing. 

White’s attorney Charles Barfield told me that this situation has been “traumatic, emotionally, psychologically” for his client. “His image is everywhere. He’s had to dress incognito. And in a weird way, the COVID situation, we feel, saved his life. Wearing a mask helped with being in disguise.” White was once approached by a neighbor who had seen the Discovery broadcast and thought he was the serial killer. “He had no control over how these defendants put his image out there, not just nationally but internationally.”

Florida’s Sunshine Laws are partially responsible for how easily the two men were mixed up: mugshots are easily accessible to the public in Florida (which is part of the reason “Florida Man” appears in news stories so often). White served time in state prison for statutory rape, grand theft and aggravated battery, which is when his mugshot was taken, but he was released seven years ago. The serial killer White is serving 150 years to life in a New York correctional facility.

The Wikipedia editor who added the wrong White’s mugshot, whose username is Vwanweb, pulled it from a site called crimefeed.com, according to Andreas Kolbe, a Wikipedia editor and former co-editor-in-chief of the Wikipedia newsletter Signpost. Websites like these pull from police databases of booking photos and publish people’s images and names without their consent. Those images follow them around the internet for the rest of their lives, regardless of the crime or whether they were found innocent or guilty.  

Last month, Kolbe wrote an opinion piece for the Signpost detailing how this mistake happened on Wikipedia’s end. 

“I believe that like many other editors, Vwanweb simply followed community practices they had observed here,” he wrote. “In this subject area, this involves widespread use of ‘true crime’ sources that present crime as entertainment, and whose level of reliability is akin to that of tabloids and other types of publications that are banned or deprecated as sources in other parts of Wikipedia.” 

But to Kolbe, the situation is also an indictment of how Wikipedia editors operate. 

“I think there is a great amount of inertia in the Wikipedia community,” Kolbe told me. “Many people have this sense that if they cite a source that's wrong, ‘it's not our fault.’ Wikipedia policy used to have a principle called ‘Verifiability, not truth.’ It was supposed to mean that facts must be both true AND verifiable in a reliable source before they could be included in Wikipedia. But many contributors understood this to mean that ‘truth doesn't really matter because we don't really know what's true or not anyway. We're just copying what sources are saying.’” 

This causes real-life harm if the sources aren’t of good quality, Kolbe said. “With almost everyone being pseudonymous, contributors feel fairly safe from repercussions when they get things wrong. To many, Wikipedia is just a sort of game, an alternative universe where notable people and their lives are fair game for them to write about.”

Kolbe said he’d like to see the Wikimedia Foundation, the nonprofit group that owns and operates Wikipedia, assign paid staff to verify police-sourced photos that appear on the site. “These are photos of real people,” he said. “This case shows that, given Wikipedia’s reach and quasi-online monopoly, a mistake can cause real suffering.”

In response to Kolbe’s Signpost editorial, Stephen LaPorte, the Wikimedia Foundation’s Associate General Counsel, wrote that while the image ideally would have been caught and removed immediately, he believes Wikipedia’s process ultimately worked. “At the same time, we also believe that the overall Wikipedia process for this case worked once the problem was discovered,” LaPorte wrote. “It appears that the broader internet was not aware of the original research mistake until after the Discovery TV program came out. Once the mistake was reported on, Wikipedia’s open structure allowed someone (an IP editor) to remove the image quickly and there was no further problem on Wikipedia.”

While it was once a website students were banned from citing in grade school, Wikipedia is considered a reasonably reliable source of information today. In 2018, for example, YouTube announced that it would try to combat its misinformation problem by including links to Wikipedia pages alongside controversial videos. But it’s still moderated and edited mostly by amateur fact-checkers, all of them volunteers capable of making mistakes. Usually, others catch those mistakes, and sometimes small edits turn into years-long debates over accuracy. In this case, though, the wrong image stayed up for two years and was removed only in September 2020, following news coverage of White’s lawsuit.

As for Google, the information in its knowledge panel, a box that appears alongside some search results with at-a-glance information about a topic, is also sometimes flawed. Nathaniel White’s knowledge panel links to the Wikipedia page where the wrong image was hosted, and for years it showed the wrong man’s mugshot as the first image users saw when searching for the name. White’s attorneys told me that they had tried contacting Google to have the images removed, but Google didn’t act.

After Motherboard contacted Google for comment, the company removed White’s images from the knowledge panel.

“In Google Search knowledge panels, we show factual information about billions of entities, and we have systems in place to help us show high quality, accurate information,” a Google spokesperson said. “When there are instances where an image in a knowledge panel is incorrect or doesn’t represent the entity, we remove it under our policies, which we’ve done here.”

White’s attorneys are frustrated by the ruling that Google and others are off the hook because they’re protected by Section 230. Barfield called it a “legal and factual disconnect” that Congress created the law before these tech giants existed. “When Congress originally implemented 230, none of these technology companies existed,” he said. “The social media images of Mr. White as a serial killer are still out there, and [platforms] think they don’t have to take it down. That’s a problem, and that risks his life every second of every day.”

Whether Section 230 needs to be repealed or amended is an ongoing debate among internet scholars and online communities; some see the law as outdated and harmful, while others argue that getting rid of it would create a whole new set of problems, including endangering free speech online.

Regardless, the wrong Nathaniel White is living in fear because of the viral spread of an image he had no control over, according to his lawyers. 

“Not knowing that any second some crazed person from anywhere around the globe might take issue with you, thinking that you’re a serial killer,” Barfield said. “Someone may think he is that serial killer, and if they want to get revenge... at any second of the day, someone may come and try to take his life.”