FACT FOCUS: Google autocomplete results around Trump lead to claims of election interference

Republican presidential candidate former President Donald Trump wraps up a campaign rally, Saturday, July 27, 2024, in St. Cloud, Minn. (AP Photo/Alex Brandon)

With fewer than 100 days until the 2024 election, social media users are claiming that a lack of Google autocomplete results about former President Donald Trump and his attempted assassination is evidence of election interference.

Many posts include screenshots showing what the autocomplete feature, which predicts what users are trying to type, has generated for text such as "attempted assassination of tr" or "president donald." Among the pictured results for the former phrase are references to other assassination attempts, including those on Harry Truman and Gerald Ford, but nothing for Trump. The latter provides two options: "president donald duck" and "president donald regan."

Multiple high-profile figures, including Trump and sitting members of Congress, promoted the claim across social media platforms, collectively amassing more than 1 million likes and shares by Tuesday. Trump did not immediately respond to a request for comment.

Google attributed the situation to existing protections against autocomplete predictions associated with political violence, noting that "no manual action was taken" to suppress information about Trump.

Search engine experts said there are many reasons that could explain why some autocomplete results concerning the former president were not appearing.

Here's a closer look at the facts.

CLAIM: Google is engaging in election interference by censoring autocomplete results about former President Donald Trump, including the assassination attempt at his Pennsylvania rally on July 13.

THE FACTS: It is true that Google's autocomplete feature as of Monday was not finishing certain phrases related to Trump and the assassination attempt as shown in screenshots spreading online, but there is no evidence it was related to election interference.

By Tuesday, some of the same terms were providing relevant autocomplete results. The text "president donald" now also suggests "Donald Trump" as a search option. Similarly, the phrase "attempted assassination of" includes Trump's name in autocomplete predictions. Adding "tr" to the same phrase, though, makes that option disappear.

Completed searches about Trump and the assassination attempt done on both Monday and Tuesday yielded extensive relevant results regardless of what autocomplete predictions came up.

Google told the AP that it has automated protections regarding violent topics, including for searches about theoretical assassination attempts. The company further explained that its systems were out of date even prior to July 13, meaning that the protections already in place couldn't take into account that an actual assassination attempt had occurred.

Additional autocomplete results now appearing about Trump are the result of systemic improvements, rather than targeted manual fixes, that will affect many other topics, according to the company.

"We're rolling out improvements to our Autocomplete systems to show more up-to-date predictions," Google told The Associated Press in a statement. "The issues are beginning to resolve, and we'll continue to make improvements as needed. As always, predictions change over time and there may be some imperfections. Autocomplete helps save people time, but they can always search for whatever they want, and we will continue to connect them with helpful information."

Search engine experts told the AP that they don't see evidence of suspicious activity on Google's part and that there are plenty of other reasons that could explain the lack of autocomplete predictions about Trump.

"It's very plausible that there's nothing nefarious here, that it's other systems that are set up for neutral or good purposes that are causing these query suggestions to not show up," said Michael Ekstrand, an assistant professor at Drexel University who studies AI-powered information access systems. "I don't have a reason not to believe Google's claim that this is just normal systems for other purposes, particularly around political violence."

Thorsten Joachims, a professor at Cornell University who researches machine learning for search engines, explained that autocomplete tools typically work by looking at queries people make frequently over a certain period of time, providing the most frequent completions of those queries. Beyond that, a search engine may automatically prune predictions based on concerns such as safety and privacy.

This means that it's plausible that Google's autocomplete feature wouldn't have accounted for recent searches about the assassination attempt on Trump, especially if its systems indeed had not been updated since before the shooting.

"Depending on how big the window is that they鈥檙e averaging over, that may simply not be a frequent query,鈥 Joachims said. 鈥淎nd it may not be a candidate for autocompletion.鈥 He added that it鈥檚 typical not to update a search model on a daily basis, given the costs and technical risks involved.

A Google blog post about its autocomplete feature describes how the system reflects previous searches and why users may not see certain predictions, including those that are violent in nature. The post also explains that predictions may vary based on variables such as a user's location, the language they speak or rising interest in a topic.

Both Ekstrand and Joachims agreed that proving bias in a complex system like Google's search engine from the outside would be extremely difficult. It would require much more data than just a couple of searches, for example, and would risk setting off the company's protections against data scraping, reverse engineering and fraud.

"In general, claims that platforms are taking particular targeted actions against specific people on political bases are hard to substantiate," Ekstrand said. "They sometimes, I'm sure, happen, but there's so many other explanations that it's difficult to substantiate such claims."

Joachims noted that the demographics of Google's user base could impact the results of such a study if they skewed toward one side of the political aisle or another and therefore searched more for their preferred candidates. In other words, the way the system works makes it difficult to probe from the outside.

Technical issues aside, limiting autocomplete predictions as a method of political influence could simply be bad for business.

"Even if Google would like to do that, I think it would be a very bad decision because they could lose a lot of users," said Ricardo Baeza-Yates, a professor at Northeastern University whose research includes web search and information retrieval.

___

Find AP Fact Checks here: .

The Associated Press. All rights reserved.
