Google Search Bias Could “Shift 3 Million Votes Towards Hillary” In Upcoming Election

Published exclusively for Sputnik by Dr. Robert Epstein, a Senior Research Psychologist at the American Institute for Behavioral Research and Technology, in Vista, California.

In this exclusive report, distinguished research psychologist Robert Epstein explains the new study and reviews evidence that Google’s search suggestions are biased in favor of Hillary Clinton. He estimates that biased search suggestions might be able to shift as many as 3 million votes in the upcoming presidential election in the US.

Biased search rankings can swing votes and alter opinions, and a new study shows that Google’s autocomplete can too.

A scientific study I published last year showed that search rankings favoring one candidate can quickly convince undecided and weaker-minded voters to vote for that candidate — as many as 80 percent of voters in some demographic groups. My latest research shows that a search engine could also shift votes and change opinions with another powerful tool: autocomplete.

Because of recent evidence that Google has been deliberately tinkering with search suggestions to make Hillary Clinton look good, this is probably a good time both to examine those claims and to look at my new research. As you will see, there is some cause for concern here.

In June of this year, Sourcefed released a video claiming that Google’s search suggestions — often called “autocomplete” suggestions — were biased in favor of Mrs. Clinton. The video quickly went viral: the full 7-minute version has now been viewed more than a million times on YouTube, and an abridged 3-minute version has been viewed more than 25 million times on Facebook.

The video’s narrator, Matt Lieberman, showed screenshot after screenshot that appeared to demonstrate that searching for just about anything related to Mrs. Clinton generated positive suggestions only. This occurred even though Bing and Yahoo searches produced both positive and negative suggestions, and even though Google Trends data showed that searches on Google that characterize Mrs. Clinton negatively are quite common — far more common in some cases than the search terms Google was suggesting. Lieberman also showed that autocomplete did offer negative suggestions for Bernie Sanders and Donald Trump.

“The intention is clear,” said Lieberman. “Google is burying potential searches for terms that could have hurt Hillary Clinton in the primary elections over the past several months by manipulating recommendations on their site.”

Google responded to the Sourcefed video in an email to the Washington Times, denying everything. According to the company’s spokesperson, “Google Autocomplete does not favor any candidate or cause.” The company explained away the apparently damning findings by saying that “Our Autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person’s name.”

Since then, my associates and I at the American Institute for Behavioral Research and Technology (AIBRT) — a nonprofit, nonpartisan organization based in the San Diego area — have been systematically investigating Lieberman’s claims. What we have learned has generally supported those claims, but we have also learned something new — something quite disturbing — about the power of Google’s search suggestions to alter what people search for.

Lieberman insisted that Google’s search suggestions were biased, but he never explained why Google would introduce such bias. Our new research suggests why — and also why Google’s lists of search suggestions are typically much shorter than the lists Bing and Yahoo show us.

Our investigation is ongoing, but here is what we have learned so far:

Bias in Clinton’s Favor

To test Lieberman’s claim that Google’s search suggestions are biased in Mrs. Clinton’s favor, my associates and I have been looking at the suggestions Google shows us in response to hundreds of different election-related search terms. To minimize the possibility that those suggestions were customized for us as individuals (based on the massive personal profiles Google has assembled for virtually all Americans), we have conducted our searches through proxy servers — even through the Tor network — thus making it difficult for Google to identify us. We also cleared the fingerprints Google leaves on computers (cache and cookies) fairly obsessively.
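The kind of collection described above can be sketched in a few lines of Python. This is not the study's actual tooling; it is a minimal illustration that assumes Google's publicly reachable suggest endpoint (`suggestqueries.google.com`) and its `firefox` client parameter, which returns a simple JSON array of completions. Routing through a proxy, as the investigation did, is shown via an optional opener.

```python
import json
import urllib.parse
import urllib.request

# Assumed endpoint: the informally documented Google suggest service.
SUGGEST_ENDPOINT = "https://suggestqueries.google.com/complete/search"

def build_suggest_url(query, client="firefox"):
    # The "firefox" client returns plain JSON shaped like:
    # [query, [suggestion1, suggestion2, ...]]
    params = urllib.parse.urlencode({"client": client, "q": query})
    return f"{SUGGEST_ENDPOINT}?{params}"

def fetch_suggestions(query, opener=None):
    # To hide the requester's identity, pass an opener built with a
    # urllib.request.ProxyHandler (or one routed through Tor via a
    # SOCKS helper such as PySocks); each fresh request also carries
    # no cookies, mimicking the cleared-cache procedure.
    opener = opener or urllib.request.build_opener()
    with opener.open(build_suggest_url(query)) as resp:
        payload = json.loads(resp.read().decode("utf-8"))
    return payload[1]  # the list of suggested completions

if __name__ == "__main__":
    for suggestion in fetch_suggestions("hillary clinton is"):
        print(suggestion)
```

Comparing the lists returned for the same queries across Google, Bing, and Yahoo — and across many proxied identities — is what lets bias be distinguished from personalization.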

Google says its search bar is programmed to avoid suggesting searches that portray people in a negative light. As far as we can tell, this claim is completely false.

http://www.zerohedge.com/news/2016-09-12/google-bias


