Google Artificial Intelligence Becomes RACIST

We knew that the tech giants Google and Microsoft were direct rivals in many areas, such as productivity, the cloud, search engines and even operating systems. But what we would never have imagined is that both would end up competing to build the most racist and homophobic artificial intelligence in the world. First, the Redmond company got there with a chatbot that turned Nazi, and now it is Google that has watched its own systems end up insulting different social groups.

Here is how it works: the Google Cloud Natural Language API lets you run sentiment analysis, in which an algorithm processes a piece of text and assigns it a score from −1.0 (negative) to +1.0 (positive). To make the model accurate, engineers train the system on phrases written in several languages.
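To make the idea concrete, here is a minimal sketch of lexicon-based sentiment scoring. The word list and weights below are invented for illustration only; Google's actual service uses a trained machine-learning model, not a hand-written lexicon, but the output shape (a score where negative numbers mean negative sentiment) is analogous.

```python
# Toy sentiment scorer: average the weights of known words.
# The lexicon is made up for illustration; a real system learns
# these associations from training data.
LEXICON = {
    "love": 0.8, "great": 0.6, "good": 0.4,
    "bad": -0.4, "hate": -0.8, "terrible": -0.7,
}

def sentiment_score(text: str) -> float:
    """Return an average score in [-1.0, 1.0]; 0.0 means neutral/unknown."""
    words = text.lower().split()
    weights = [LEXICON[w] for w in words if w in LEXICON]
    return sum(weights) / len(weights) if weights else 0.0

print(sentiment_score("I love this great phone"))   # positive
print(sentiment_score("I hate this terrible day"))  # negative
print(sentiment_score("I am a person"))             # neutral: 0.0
```

The key point for the rest of the story: a phrase like "I am a person" should score neutral, because identity statements carry no inherent sentiment. Whether they actually do depends entirely on what the model learned during training.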

The problem arose when deciding which values should be associated with which words. For example, should the terms "homosexual", "queer" and "heterosexual" be treated with the same level of respect and importance? The logic of equality says that all sexual orientations should be on the same footing, just as with religious affiliation or skin color.

But that is not what happened: the phrases "I am black" or "I am a Jew" were scored more negatively than "white supremacy", which the tool rated as neutral. The same happened with sexual orientation ("heterosexual" scored better than "homosexual").

Obviously, no Google engineer programmed the tool this way; the application itself learned the behavior as it went along. When processing real phrases, their context shaped the sentiment the algorithm assigned, so it was the texts used in training that turned this artificial intelligence into a genuine homophobe and racist.
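The mechanism described above can be sketched in a few lines. Everything here is invented for illustration: a tiny "corpus" in which one group happens to appear near bad news and another near good news, and a scorer that learns word values purely from co-occurrence with seed words. No one hard-codes the bias; the skewed training text produces it on its own.

```python
# Toy illustration of how biased training text skews a learned
# sentiment model. All sentences and seed words are made up.
POSITIVE_SEEDS = {"wonderful", "praised"}
NEGATIVE_SEEDS = {"attacked", "awful"}

corpus = [
    "the group x people were attacked in awful conditions",
    "group x neighborhoods called awful again",
    "group y leaders praised for wonderful work",
    "group y praised in the press",
]

def learn_score(term: str) -> float:
    """Score a term in [-1, 1] by the seed words it co-occurs with."""
    pos = neg = 0
    for sentence in corpus:
        words = set(sentence.split())
        if term in words:
            pos += len(words & POSITIVE_SEEDS)
            neg += len(words & NEGATIVE_SEEDS)
    total = pos + neg
    return (pos - neg) / total if total else 0.0

print(learn_score("x"))  # negative: "x" only appears near bad news
print(learn_score("y"))  # positive: "y" only appears near good news
```

The group names themselves are neutral, but because the corpus only ever mentions one of them in negative contexts, the learned score comes out negative, which is essentially what happened to identity terms in Google's training data.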
