Opinion: More diversity needed in AI to fight human biases

By Marcus Thuillier, Class of 2019

The following is an excerpt from an opinion column by Marcus Thuillier, MSiA Class of 2019, for The Daily Northwestern.

As I approach graduation, diversity and representation have been on my mind. They have been a key point of discussion in one of my classes this quarter, Leadership Insights and Skills for Data Scientists. It is of fundamental importance for data scientists, especially in the tech industry, to be aware of diversity, or the lack thereof, in their surroundings. As companies turn to data to answer some of their pressing questions, it is important to take a step back and consider what this data is actually telling us.

The recent data boom has created plenty of opportunities in the tech industry. However, it also comes with drawbacks. The artificial intelligence community has long been portrayed as producing impartial and objective solutions to human problems. That idea is a myth: data-driven solutions often carry bias and subjectivity.

This bias exists in two ways. First, the designer of the algorithm can be biased. At big companies like Facebook or Google, for example, women make up less than 15 percent of AI research staff, Black employees account for less than 4 percent of the workforce, and "older" employees are known to face discrimination. Like many aspects of our social system, the AI industry is dominated by white men between the ages of 18 and 35. The quality, accuracy and reliability of the products created suffer as a result. These employees inject their biases into how they acquire, process and format data, which directly shapes the models and algorithms they develop. A data scientist might exclude certain variables in an effort to confirm a preexisting assumption. They might fail to control for outliers, for example someone whose bank account is dramatically larger than anyone else's in the dataset, which would skew the models. Yet another example is the presence of a confounding variable, an extra variable that influences both of the quantities being compared. A correlation between the murder rate and ice cream sales can be driven by a rise in temperature, not by murders actually encouraging people to buy ice cream. Even worse, these biases perpetuate existing stereotypes and transfer a continued culture of discrimination into AI.
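To make the confounding-variable example concrete, here is a minimal, illustrative Python sketch. It is not from the original column; the data, variable names and coefficients are invented for the example. It simulates temperature driving both ice cream sales and crime, so the two appear strongly correlated until temperature is controlled for.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated daily temperatures (the confounder) over one year.
temperature = rng.normal(loc=60, scale=15, size=365)

# Both ice cream sales and the crime rate rise with temperature,
# but neither causes the other.
ice_cream_sales = 2.0 * temperature + rng.normal(scale=10, size=365)
crime_rate = 0.5 * temperature + rng.normal(scale=5, size=365)

# The raw correlation looks strong because of the shared driver.
raw_corr = np.corrcoef(ice_cream_sales, crime_rate)[0, 1]

# Control for temperature: regress it out of each variable,
# then correlate what is left over.
def residuals(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

adjusted_corr = np.corrcoef(
    residuals(ice_cream_sales, temperature),
    residuals(crime_rate, temperature),
)[0, 1]

print(f"Raw correlation: {raw_corr:.2f}")                    # high
print(f"Controlling for temperature: {adjusted_corr:.2f}")   # near zero
```

A model built on the raw correlation would "learn" a relationship that disappears once the confounder is accounted for, which is exactly the kind of skewed conclusion the column warns about.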

The solution to this problem is, as is often the case, simple in thought but difficult to execute.


Read More: The Daily Northwestern | Opinion

View media coverage of our news story at the following link: https://dailynorthwestern.com/2019/11/13/opinion/thuillier-more-diversity-needed-in-ai-to-fight-human-biases/
