Google says it is making changes to its autocomplete feature ahead of the November presidential election.
Autocomplete suggests possible search terms based on what a user starts typing. In a blog post, Google said it was removing suggestions that could be viewed as being for or against a particular candidate or party.
The search engine also is blocking suggestions that could be viewed as claims about "the integrity or legitimacy of electoral processes."
"What this means in practice is that predictions like 'you can vote by phone' as well as 'you can't vote by phone,' or a prediction that says 'donate to' any party or candidate, should not appear in Autocomplete," Google wrote on Thursday. "Whether or not a prediction appears, you can still search for whatever you'd like and find results."
During an online press event, David Graff, the company's senior director of global policy and standards, acknowledged that some otherwise benign search predictions might get blocked by the policy. But he told reporters that Google is exercising extra caution to keep "bad information" from appearing by way of a suggested search.
"We want to be very careful about the type of information that we highlight in the search feature given its prominence. Given the concern around elections and elections information, we want to be particularly conservative here," Graff said.
Google and social media counterparts Twitter and Facebook have been under increasing pressure to stop the spread of misinformation on their platforms. In the runup to the Nov. 3 election, all three companies have instituted policies intended to curb and fact-check information that may be misleading or flat-out untrue.
On the same day as Google's announcement, Twitter said it was introducing restrictions on election-related posts, including tweets that claim victory for a candidate ahead of official results or that attempt to disrupt the peaceful transfer of power.
Facebook also recently announced stricter policies around election-related posts, as well as a ban on new political ads in the week before Election Day. In that announcement, the company said it would label posts attempting to "delegitimize" the outcome of the election and delete posts claiming that people would get COVID-19 by voting.
Facebook and Twitter changed their policies not long after both tech giants announced that they had deleted Russia-linked accounts, which researchers say were trying to steer left-leaning voters away from Democratic presidential candidate Joe Biden and running mate Kamala Harris.
The tech giants have also been contending with false and disproven information about the COVID-19 pandemic. This included removing a Breitbart News video, published to both social media sites and YouTube, that featured a group of doctors making debunked claims about a controversial treatment for the coronavirus.
Twitter, Facebook and YouTube, which is owned by Google, all removed the video in July. Twitter also temporarily restricted the account of Donald Trump Jr., who had shared the video.
Editor's note: Google and Facebook are among NPR's financial supporters.