If the recent events in the US have taught us anything, it’s the unfortunate truth that we live in a world where racism still exists at a worryingly high level.
However, a person does not necessarily have to harbour racist ideologies to display discrimination. As humans, we come with an innate collection of flaws, one of which is our susceptibility to implicit biases.
These biases can have an alarming effect on human behaviour. One previous study showed that an identical CV is 50% more likely to result in an interview invitation if the candidate’s name is European American than if it is African American. The latest results suggest that algorithms, unless explicitly programmed to counteract this, will be plagued by the same social prejudices.
It is unfortunate to see these stereotypes influence the behavior of our technology, although as Richard Sharp, former CTO of predictive marketing company Yieldify and current CTO at Shazam, points out, there is a very clear reason behind this.
“To understand how bias creeps in you first need to understand the difference between programming in the traditional sense and machine learning. With programming in the traditional sense, a programmer analyses a problem and comes up with an algorithm to solve it (basically an explicit sequence of rules and steps). The algorithm is then coded up, and the computer executes the programmer’s defined rules accordingly.”
He adds, “With machine learning, it’s a bit different. Programmers don’t solve a problem directly by analyzing it and coming up with their rules. Instead, they just give the computer access to an extensive real-world dataset related to the problem they want to solve. The computer then figures out how best to solve the problem by itself.”
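To make Sharp’s point concrete, here is a minimal, hypothetical Python sketch of how a learned model inherits bias from its training data. All of the data, group labels and numbers below are invented for illustration: a trivially “trained” screening rule, fit to historical interview decisions that were themselves biased, ends up scoring two otherwise identical CVs differently purely because of the candidate’s name group.

```python
# Hypothetical sketch: a naive screening rule "learns" bias from biased history.
# Nothing here explicitly encodes discrimination; the disparity comes from the data.

from collections import defaultdict

# Invented historical data: (years_experience, name_group, got_interview).
# Qualifications are identical across groups; only past human decisions differ.
history = [
    (5, "european_american", True),
    (5, "european_american", True),
    (5, "european_american", False),
    (5, "african_american", True),
    (5, "african_american", False),
    (5, "african_american", False),
]

# "Training": estimate the interview rate for each (experience, name_group) bucket.
counts = defaultdict(lambda: [0, 0])  # bucket -> [interviews, total applications]
for years, group, interviewed in history:
    bucket = (years, group)
    counts[bucket][0] += int(interviewed)
    counts[bucket][1] += 1

def predicted_interview_rate(years, group):
    """Score a new CV using the rates learned from the historical decisions."""
    interviews, total = counts[(years, group)]
    return interviews / total if total else 0.0

# Two identical CVs, different name groups: the learned rule scores them differently,
# reproducing the bias in the decisions it was fit to.
print(predicted_interview_rate(5, "european_american"))  # ~0.67
print(predicted_interview_rate(5, "african_american"))   # ~0.33
```

No programmer wrote a rule that treats the two groups differently; the model simply optimises against past decisions, and if those decisions were biased, the bias is reproduced at scale.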
This bias has been seen in a number of studies and AI systems. For example, two prominent research image collections, including one supported by Microsoft and Facebook, exhibited a predictable gender bias in how they divided up activities such as cooking and sports: pictures of shopping and washing were linked to women, while coaching and shooting were tied to men.
Furthermore, language-based AI systems such as Siri or Alexa are learning to exclude some African-American voices. While this may seem relatively insignificant compared to the issues of racism that countries like the US are currently experiencing, it could have a huge impact on other areas. Some experts believe that the problem may be more serious than we think, affecting a growing number of decisions in finance, healthcare, and education.
Racism and discrimination are very serious issues in many parts of the world, where people are unfairly treated because of factors such as gender, skin color or ethnic origin. Now more than ever we must focus on tackling the implicit biases that we naturally carry, in order to stop them from feeding into our technology and exacerbating the situation further.
It is important for us to recognize our flaws here and work together to tackle them in all areas of society, or one day you might find yourself walking into a voice-activated lift with two angry Scotsmen, a chaotic situation for even the calmest of people.