How Google Search Results Work – Or, Technology Writers Can’t Do Philosophy

The AP published an article intended to counter the prevailing view that Google has biased search results. Google denies any bias whatsoever:

“We don’t bias our results toward any political ideology.”

So why do results so often seem biased? The technology writer at the AP promises to make everything clear for us.

Google has software which indexes every site it can find on the internet and keeps track of the most common search terms. So far, there is no opportunity for Google to inject bias of any kind, assuming that indexing and counting is all that happens.
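
To illustrate why that step, taken by itself, is mechanical, here is a minimal, purely hypothetical sketch in Python of an inverted index and a query counter. This is not Google's actual code, and the page contents, URLs, and queries are invented; it only shows that indexing pages and counting search terms involve no editorial judgment.

```python
# Hypothetical sketch of the mechanical indexing step described above.
# Not Google's actual system; the pages and queries are made up.
from collections import defaultdict, Counter

def build_index(pages):
    """Map each word to the set of page URLs that contain it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def track_queries(queries):
    """Count how often each search term is used, with no judgment involved."""
    return Counter(term.lower() for query in queries for term in query.split())

# Example usage with made-up data
pages = {
    "example.com/a": "gender pay gap statistics",
    "example.com/b": "pay negotiation advice",
}
index = build_index(pages)
popular = track_queries(["pay gap", "pay negotiation", "pay gap"])
print(sorted(index["pay"]))  # both URLs contain "pay"
print(popular["pay"])        # "pay" was searched three times
```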

However, the rest of the article reveals a variety of ways Google employees can bias the results, and even ways they are required to bias results.

The technology writer for the AP cannot distinguish between a computer algorithm itself (which is mechanical) and the intent and effect of its design (which is based on the philosophy of its authors).

“Quality” Raters

According to the article, more than 10,000 “quality raters” judge the quality of search results using a 164-page document with such obviously political sections as:

“Using the Upsetting-Offensive Flag”

“Needs Met Rating for Upsetting-Offensive Tolerant Queries”

“Pages that Potentially Deceive Users”

“Lacking Expertise, Authoritativeness, or Trustworthiness”

“Mixed or Mildly Negative Reputation of the Website or Creator of the Main Content”

Who defines what is “upsetting” or “offensive”? Who decides what someone’s reputation is? Who decides if someone lacks expertise, authoritativeness, or trustworthiness?

Google.

It isn’t some mindless, apolitical machine that decides these things. It’s employees at Google, who bring their own beliefs with them. If these employees presume themselves to be neutral observers – as the author of the article seems to imply they are – it’s all the more dangerous.

What is high quality?

The example given for how quality might be determined is by looking at Pulitzer Prizes won by the author of the content. This presumes that the prize itself is neutral, that those who give the prize are neutral, that the authors receiving the prize are neutral, and that those who don’t receive the prize are of poorer quality. Every single one of these assumptions is political and philosophical, meaning that the very example given of how quality is determined is already a clear example of bias.

What is poor quality?

The pages which are given a low rating, on the other hand, are those which “spread hate, cause harm or misinformation, or which deceive users”.

For the Left, suggesting that women make less than men on average because of career decisions and not because of some evil mystical force called “the patriarchy” is considered “hateful” and “harmful” and “misinformation”. Google just fired an employee for suggesting this very thing.

The same people who fired him are the ones who determine what is “hateful” and “harmful” and “misinformation”. Again, for reasons unknown, the technology writer at the AP doesn’t think that this is a place where bias might enter into the design of Google’s algorithms.

Fake News

We are also told that sites are labelled “deceptive” if they “look like a news organization” but “in fact [have] articles to manipulate users in order to benefit a person, business, government, or other organization politically, monetarily, or otherwise”.

This presumes two things:

  1. That far-left employees at Google can determine the hidden motivations behind the authors of articles.
  2. That far-left employees at Google implicitly trust major news companies not to be deceptive in either what they report or what they fail to report.

Both of these things are examples of political bias.

Design and Designer

What the author of the article fails to understand is the difference between a mindless algorithm that does whatever it is programmed to do and the mindful intentions of the authors of that algorithm. Because he likely agrees with the politics of Google engineers, he thinks the algorithm is neutral. After all, his own views are obviously neutral (or so he thinks).
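
To make that distinction concrete, here is a minimal, hypothetical sketch (not Google's actual code) of how designer judgment can sit inside an otherwise mechanical ranking step. The flag names and penalty constants below are invented for illustration; the point is that the function applies its rules blindly, while which flags exist, what counts as “offensive,” and how heavily each flag is penalized are all human choices.

```python
# Hypothetical sketch of designer judgment embedded in a mechanical ranker.
# Not Google's actual code; the flags and weights are invented.

# These constants are chosen by people, not derived by the machine.
OFFENSIVE_PENALTY = 0.5
LOW_REPUTATION_PENALTY = 0.3

def rank_score(relevance, rater_flags):
    """Mechanically combine a relevance score with human-assigned flags."""
    score = relevance
    if "upsetting-offensive" in rater_flags:
        score -= OFFENSIVE_PENALTY
    if "low-reputation" in rater_flags:
        score -= LOW_REPUTATION_PENALTY
    return score

# Two equally relevant pages; one has been flagged by a human rater.
print(rank_score(0.9, set()))                    # unflagged page keeps its score
print(rank_score(0.9, {"upsetting-offensive"}))  # flagged page scores lower, purely because of the human-defined flag and weight
```

The code itself never deviates from its instructions, yet the outcome is whatever the people who defined the flags and weights intended it to be.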

Too many software engineers lack a strong philosophical background and make elementary mistakes in reasoning (like presuming their own neutrality) which, when ignored, lead to things like a far-left bias in the most influential search algorithm in the world. Unfortunately, the tech writer at the AP is similarly unaware of his own biases, or is simply defending Google because of its political leanings.
