Google Wants You to Help Fix the Fake-Fact Problem It Created

Barack Obama is the king of the United States. Republicans are Nazis who clearly hate the Constitution. Dinosaurs are being used to indoctrinate both children and adults into believing that the earth is millions of years old. Women can’t love men. Fire trucks are red because Russians are red.

If you took Google at its word, you’d believe all of these “facts” to be true. Each absurd claim has appeared as a “snippet” on a Google search result—you know, those boxes above the lists of links that try to answer your questions so that you don’t have to actually click through to a website.

Thankfully, Google has changed or removed all of these snippets once they became widely known. But high-profile misfires like these have put pressure on Google to seek new ways to curb inaccurate or offensive snippets before they poison credulous minds—or embarrass the company. Many of these measures, such as algorithm tweaks or new guidelines for the workers who evaluate search results, will happen behind the scenes. But the company will also roll out an expanded feedback form for reporting inappropriate snippets, search results, and autocomplete suggestions.

There’s a lot to like about this plan. It shows that Google is taking the problem of misinformation seriously while offering up a new level of transparency in making public some criteria for removing or changing search suggestions. But these fixes only solve one part of the problem with snippets. Improving snippet accuracy does nothing to address the problem of Google cannibalizing traffic from the sources from which it strips these answers. Nor does it resolve the underlying philosophical question: When should Google try to provide “one true answer” to a question versus just delivering a list of links? After all, the easiest way to get rid of misinformation in snippets is to get rid of snippets altogether, right?

But for the future of Google’s business, the answer is not that simple. Having an authoritative answer to as many search queries as possible is increasingly important to the company as it extends its reach beyond traditional, text-based search results into the world of voice-based personal assistants. When you ask your phone or your web-connected speaker a question, you want an answer, not a list of webpages. Even in the text-based world, you often want a quick answer to settle an argument.

But it turns out that turning search results into pat answers has a cost. Last week, The Outline reported that CelebrityNetWorth.com lost about 65 percent of its traffic after Google started including its data in snippets instead of leaving it to users to click through to the site. Site founder Brian Warner said he had to lay off half his staff. This undermining isn’t just a problem for the sites that Google scrapes for information. It’s a problem for Google itself, because if the companies that gather and publish this data can’t make money and have to close, Google loses its source of data.

Meanwhile, there are some questions Google clearly shouldn’t even try to answer. For example, as of now it doesn’t show a snippet for the query, “Does God exist?” But it also stays out of questions like, “Did the Holocaust actually happen?” and “Is climate change real?”

So where should Google draw the line? Conspiracy theorists claim that because jet fuel doesn’t burn hot enough to melt steel beams, 9/11 was an inside job. When you search “can jet fuel melt steel beams,” as The Outline points out, Google displays an excerpt from a Popular Mechanics article pointing out that although it’s technically true that burning jet fuel won’t melt steel, the beams that held up the World Trade Center buildings didn’t need to melt in order to collapse.

That’s useful information, but why is Google willing to combat 9/11 conspiracy theories but not Holocaust denialism? Perhaps the company would argue that explaining the historical evidence of the Holocaust is too complex to fit into a snippet (and indeed, Google doesn’t try to provide a definitive answer to the broader question “was 9/11 an inside job”). But if Google is going to position itself as the arbiter of truth, it should be willing to state the facts on climate change and the Holocaust.

The feedback it gathers from users may help Google decide when to stay out of a debate entirely, but the questions the company now faces don’t have snippet-sized answers. To succeed on new computing platforms where conventional search results don’t make sense, Google has put itself in the position of becoming an arbiter of facts. That’s not a simple job, nor one it can expect to succeed at simply by offloading the work onto you.
