Google’s aesthetic has always been rooted in a clean look: a homepage free of ads and pop-up clutter, with nothing but the iconic “Doodle” occasionally adorning its name. Part of what many users love about Google is this sleek design and its ability to return highly accurate results. But the simplicity of Google’s homepage is deceptive. Although the page itself looks static, the way the company returns information has shifted steadily over time. These incremental changes have largely gone unnoticed by the millions of users who rely on search engines every day, but they have fundamentally changed the process of searching for information, and not necessarily for the better.
When Google first launched, a query returned a simple list of hyperlinked sites. Slowly, the format changed. First, Google introduced AdWords, which let businesses buy advertising space and customize returns to maximize the placement of their products. By the early 2000s it was correcting our spelling, providing summaries of the news beneath headlines, and anticipating our queries with autocomplete. In 2007 it launched Universal Search, which aggregates relevant information across formats (news, images, videos). And in 2012 it introduced the Knowledge Graph, which supplies a snapshot of information separate from the returns, a source of knowledge that many of us rely on exclusively when doing quick searches.
As research shows, many of these design changes are tied to features Google now touts as making its products better than the competition’s. The goal is no longer simply to display a series of blue links but, increasingly, to “provide direct answers,” according to an official SEC filing by Alphabet. By adding all of these features, Google, along with competitors such as DuckDuckGo and Bing that also summarize content, has effectively transformed the experience from an exploratory search environment into a platform designed around verification, one that operates more like a fact-checking service.
This shift toward answering our questions for us, rather than requiring us to click through and find the answers ourselves, is not particularly problematic if what you are looking for is a straightforward fact, such as how many ounces are in a gallon. The problem is that many people rely on search engines to find information on more complex topics. And, as my research has revealed, this shift can produce incorrect returns that undermine democratic participation, lend credence to unsubstantiated claims, and leave searchers vulnerable to manipulation by those wishing to spread falsehoods.
For example, if someone searched “when is the North Dakota caucus” during the 2020 presidential election, Google highlighted incorrect information, stating that it was Saturday, March 28, 2020. In fact, the firehouse caucus was held on March 10, 2020; it was the Republican convention that took place on the 28th. To make matters worse, when such errors occur, there is no mechanism for users who notice the discrepancy to flag the information for review.
Google’s snippets can also mislead the public on issues central to maintaining our democracy. When Trump supporters stormed the Capitol on January 6, 2021, conservative politicians and pundits quickly tried to characterize the rioters as “anti-Trump,” spreading the lie that antifa (a loosely organized movement of activists who oppose fascism and the far right) was to blame for the violence. On the day of the attack, a Washington Times article headlined “Facial Recognition Identifies Extremists Storming the Capitol” supported the claim, and the story was circulated in the House and on Twitter by elected officials.
Although the FBI found no evidence to support these claims, and The Washington Times eventually corrected the article, the false information is still widely available through a simple Google search. If you search “Washington Times antifa evidence,” the top return (as of this writing) is the original article, “Facial Recognition Identifies Extremists Storming the Capitol.” Beneath it, Google summarizes the inaccurate claim, emphasizing that those identified as extremists were antifa. Perpetuating these lies can have long-term effects, especially since the people in my study described Google as a neutral purveyor of news and information. According to an April 2021 poll, more than 20 percent of Republican voters still blamed antifa for the violence that day.
Part of the problem is that many people still rely on Google to fact-check information, and doing so may reinforce their belief in false claims. This is not only because Google sometimes returns misleading or incorrect information but also because the people in my research considered Google’s top returns to be “more important,” “more relevant,” and “more accurate,” and they trusted Google more than the news; they saw it as a more objective source. Many treated the Knowledge Graph panel as the only source they needed to consult, yet few realized how much Google has changed; it is no longer the search engine it once was. In an effort to “do their own research,” people tend to search for the very phrases they see on Facebook or other social media platforms, but because of the way that content is tagged and categorized, they effectively walk into an information trap.
This leads to what I refer to in my book, The Propagandists’ Playbook, as “the IKEA effect of misinformation.” Business scholars have found that when consumers assemble products themselves, they value them more highly than similar products of equal quality assembled by someone else: they feel more competent and are therefore more satisfied with their purchase. Conspiracy theorists and propagandists draw on the same tactic, giving the information they provide a tangible, do-it-yourself quality. Independently searching a topic makes audiences feel they are engaging in an act of self-discovery, when in fact they are taking part in a scavenger hunt orchestrated by those spreading the lies.
To address this, users will have to recalibrate their thinking about what Google is and how information is returned to them, especially as the contentious midterm elections approach. Rather than assuming the returns validate the truth, we must apply the same scrutiny we have learned to bring to information on social media. Googling the exact same phrase you see on Twitter will probably return the same information you saw on Twitter; just because it comes from a search engine does not make it more reliable. We must pay attention to the keywords we start with, and we should also spend more time exploring the information that is returned to us. Rather than relying on quick answers to tough questions, take the time to click on the links, dig into who is doing the reporting, and read information from a variety of sources. Then rerun your search from a different angle and see how a slight change in syntax changes your results.
After all, considerations we might not even think of could be just a click away.