Google’s Algorithm Says The World Is Flat


by Laurie Sullivan, Staff Writer @lauriesullivan, January 10, 2018


Leonid Bershidsky, a Berlin-based Russian journalist and columnist for Bloomberg, made an interesting observation in an opinion piece titled “If Google Is Biased, So Are Its Algorithms.” It ran this week in Bloomberg.


Bershidsky makes the point that if Google’s policies are biased, then we need to look at how “its work culture and internal rules” influence the algorithms that serve up most of the world’s information through its search engine. After all, those algorithms are written by engineers who work for the company and follow its rules and culture.


Two of those engineers — James Damore and David Gudeman — filed a discrimination lawsuit against Google, their former employer, this week. Damore alleges he was fired for penning a memo against Google’s diversity policies, and Gudeman alleges he was fired for his conservative views.


The real issue here is not how Google’s work culture and company rules influence its employees, but how that culture and those rules influence the world. And they do, because of the nature of Google’s business. Human opinions are the root cause of bias in machine learning.


And sometimes those opinions are a little too opinionated, and the culture a little too public. The culture within Google at times seems a little creepy, judging from some of the screenshots Damore provides in the lawsuit documentation. What happened to manners and representing yourself with honor and respect? The internet seems to have eliminated that type of culture.

The algorithms, designed by the engineers who build the code to surface the data, influence people who search for information. The culture and rules also influence the decisions made by marketers trying to reach consumers to share products, news and services from their brands. And ultimately, the culture and rules influence consumer decisions.


As Bershidsky points out, Google is the owner of the world’s biggest conduit to information, with a 69% global search market share. Algorithms and artificial intelligence lead people to this information.


Google is aware of the problem of algorithm bias, and published a video on YouTube describing the initial coding process. The narrator in the video says “just because something is based on data doesn’t automatically make it neutral. Even with good intentions it’s impossible to separate ourselves from our own human biases, so our human biases become part of the technology we create in many different ways.”
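To make that point concrete, here is a minimal, hypothetical sketch in Python. It is not Google’s code, and the tiny “dataset” and labels are invented for illustration; it only shows the mechanism the video describes: whatever leanings human labelers bring to the training data are simply replayed by the trained system.

```python
# A minimal sketch (not Google's code) of how human choices in training data
# become model behavior. The example texts and labels below are invented.
from collections import Counter, defaultdict

# Hypothetical labeled snippets: the labeler's own leanings decide the labels.
training_data = [
    ("climate policy report", "newsworthy"),
    ("climate policy report", "newsworthy"),
    ("tax cut analysis", "not_newsworthy"),
    ("tax cut analysis", "newsworthy"),
    ("tax cut analysis", "not_newsworthy"),
]

# "Training": count how often each text received each label.
label_counts = defaultdict(Counter)
for text, label in training_data:
    label_counts[text][label] += 1

def predict(text):
    """Return the most frequent label the human labelers gave this text."""
    counts = label_counts.get(text)
    if not counts:
        return "unknown"
    return counts.most_common(1)[0][0]

# The "model" simply replays the labelers' majority opinion.
print(predict("climate policy report"))   # -> newsworthy
print(predict("tax cut analysis"))        # -> not_newsworthy
```

The data-driven output looks objective, but every prediction is just an echo of the opinions baked into the labels, which is exactly the sense in which data-based does not mean neutral.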


Google does acknowledge that it takes a lot to train algorithms not to perpetuate human bias, but in Bershidsky’s opinion piece, he writes that “if the Damore lawsuit correctly describes an aggressively leftist culture at Google, the human input into the algorithms can be expected to favor the leftist worldview. That may lead to overcorrection — and to the burying of alternative views, noxiously right-wing or even mainstream conservative.”


Bershidsky offers this example: type “the earth is” into Google’s search engine, and the first autocomplete suggestion is “flat.” He calls it a likely “manifestation of algorithmic bias,” but doesn’t want to see it overcorrected.
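Readers who want to reproduce the check programmatically can do so with a short script like the one below. It queries Google’s unofficial autocomplete endpoint (suggestqueries.google.com); that endpoint, its parameters and its response shape are assumptions based on common usage rather than a documented API, and results vary by time, locale and personalization.

```python
# A quick, unofficial way to reproduce Bershidsky's autocomplete check.
# suggestqueries.google.com is undocumented and may change or be rate-limited.
import json
import urllib.parse
import urllib.request

def autocomplete_suggestions(query):
    """Fetch Google autocomplete suggestions for a query string."""
    url = (
        "https://suggestqueries.google.com/complete/search?client=firefox&q="
        + urllib.parse.quote(query)
    )
    with urllib.request.urlopen(url, timeout=10) as response:
        charset = response.headers.get_content_charset() or "utf-8"
        payload = json.loads(response.read().decode(charset))
    # The response is a JSON array: [query, [suggestion1, suggestion2, ...]]
    return payload[1]

print(autocomplete_suggestions("the earth is"))
```

Whatever list comes back on a given day, the point stands: the suggestions are a product of aggregated human behavior and engineering choices, not a neutral readout of facts.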


Again, the biggest issue is not what counts as correct inside Google, but how the company’s rules and culture influence the world.


“[Google’s] mission as a neutral conduit is more important even than workplace cohesion,” Bershidsky wrote. “Silencing any kind of views, no matter how offensive, undermines that mission.”


MediaPost.com: Search Marketing Daily
