Algorithms are the wrong usual suspect!

Nowadays, when thinking of Artificial Intelligence (AI), most of us first think of killer robots and the loss of tens of millions of jobs.

While it is still critical to consider how AI will impact the job market as well as people’s expertise, we also need to think about an even more important and currently active threat: algorithmic bias.

When developing a digital technology, a bias can be introduced in the algorithm that powers, for instance, an application on your cell phone. This bias can come from the algorithm’s explicit criteria or, in the case of machine learning, from the data on which the algorithm is calibrated or trained. As a result, technology discrimination appears: part of the users, those who are not accurately represented in the algorithm’s criteria or training data, end up excluded.
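To make this concrete, here is a minimal, entirely hypothetical sketch of the training-data mechanism: a toy detector "calibrated" on data where one group of users is heavily over-represented. The feature values, group sizes, and threshold rule are invented for illustration only; the point is that the calibration step silently fits the majority group.

```python
import random
import statistics

random.seed(0)

# Hypothetical setup: a single numeric feature per user.
# Group A is well represented in the training data (95 samples),
# group B is under-represented (5 samples), and the two groups
# happen to have different typical feature values.
group_a_train = [random.gauss(0.8, 0.05) for _ in range(95)]
group_b_train = [random.gauss(0.3, 0.05) for _ in range(5)]

# "Calibration": the detector accepts a user when their feature
# lies within two standard deviations of the training mean.
train = group_a_train + group_b_train
mean = statistics.mean(train)
stdev = statistics.stdev(train)

def detect(x):
    return abs(x - mean) <= 2 * stdev

# Evaluate on fresh samples from each group.
test_a = [random.gauss(0.8, 0.05) for _ in range(100)]
test_b = [random.gauss(0.3, 0.05) for _ in range(100)]
acc_a = sum(detect(x) for x in test_a) / 100
acc_b = sum(detect(x) for x in test_b) / 100

print(f"accuracy on group A: {acc_a:.2f}")
print(f"accuracy on group B: {acc_b:.2f}")
```

Because group A dominates the training set, the learned mean sits close to group A's typical value, so the detector works almost perfectly for group A and fails for group B, even though no one wrote an explicitly discriminatory rule.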

A well-known past example of technology discrimination is the first facial recognition algorithms, which did not recognize faces with dark skin tones. The designers may not have thought of dark skin tones when developing this technology, and/or the data on which the algorithm was trained might have contained only photos of light-skinned faces.

In this article we explain what technology discrimination is, as well as what a bias is and where it originates. In addition, we propose some key solutions to investigate for avoiding the introduction of biases in the technologies we build.

https://medium.com/@aurliejean/algorithms-are-the-wrong-usual-suspect-6f6e73b6408d

Aurélie Jean

The Whiz Co-founder & CTO/CIO

Dr. Aurélie Jean codes to live and lives to code. She sees MixR as a platform to counter gender bias in algorithms. With her Ph.D. in Materials Science & Engineering (École des Mines, Paris), seven years of research on Computational Medicine & Biomechanics at Penn State and MIT, and recent experience applying her skills to finance at Bloomberg in NYC, Aurélie works to inspire everyone (well, ok, women and girls especially) to learn about science and become empowered users of the amazing technologies available to us.

Copyright © 2020 Statierra. All rights reserved.

Los Angeles, California