11/9 – “How I’m fighting bias in algorithms” Deirdre Kelshaw

In the TED Talk “How I’m fighting bias in algorithms,” Joy Buolamwini discusses the prevalence of what she calls the “coded gaze,” or algorithmic bias, and argues that who codes, how we code, and why we code all matter if we want to prevent bias in these algorithms and the unfair outcomes it produces.

Buolamwini grounds this argument in her own experience with social robots. In college, she was tasked with getting a robot to play peek-a-boo with her. However, the robot could not detect her face because of the color of her skin and therefore could not play the game. Once Buolamwini’s white roommate stepped in, the robot played without trouble. Through this experience, we learn how computer vision uses machine learning to do facial recognition: we are the ones who teach computers to recognize faces, by showing them examples. If the training sets aren’t diverse enough, any face that deviates too far from the established norm will be harder to detect. This is troubling given that biased algorithms are increasingly being used to make decisions that affect many aspects of our lives, such as who gets hired or fired, who receives a loan, or who gets into college. Law enforcement is also starting to use machine learning for predictive policing, and some judges are using machine-generated risk scores to determine how long someone spends in prison. If these algorithms are biased, the decisions they make will be unjust.
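To make the training-set point concrete, here is a minimal sketch of how skewed training data can produce skewed detection rates. This is not Buolamwini’s system or any real face-recognition pipeline: the “faces” are synthetic feature vectors, the group labels and sample counts are invented for illustration, and the “detector” is just a distance threshold fit to the training data.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # stand-in for real image features

def sample_faces(center, n):
    """Synthetic 'faces': feature vectors scattered around a group's center."""
    return center + rng.normal(size=(n, dim))

center_a = 3 * rng.normal(size=dim)  # overrepresented group (hypothetical)
center_b = 3 * rng.normal(size=dim)  # underrepresented group (hypothetical)

# Skewed training set: 950 group-A faces, only 50 group-B faces.
train = np.vstack([sample_faces(center_a, 950), sample_faces(center_b, 50)])

# Toy "detector": accept anything within a radius of the training mean,
# choosing the radius so that 95% of the training faces are accepted.
mu = train.mean(axis=0)
radius = np.quantile(np.linalg.norm(train - mu, axis=1), 0.95)

# Evaluate on balanced test sets: how often is each group's face detected?
for name, center in [("group A", center_a), ("group B", center_b)]:
    test = sample_faces(center, 500)
    detected = np.linalg.norm(test - mu, axis=1) <= radius
    print(f"{name}: {detected.mean():.0%} of faces detected")
```

Because the training set is 95% group A, the detector’s notion of a “typical” face centers on group A, so nearly all of group A’s test faces fall inside the accepted radius while group B’s rarely do. The remedy is exactly what Buolamwini calls for: a more inclusive training set.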

Given all of this, it is up to us to make social change a priority in order to bring about greater equality. We need to build platforms that take bias into consideration and to create more inclusive training sets. As Buolamwini puts it, “Technology needs to work for all of us, not just some of us.” Reflecting on this talk, I realized that I had never truly considered how the color of my skin benefits me, even when it comes to technology. If we programmed these algorithms to be more inclusive, could we also shift the mindsets of those whose thinking is stuck in the past and bring about greater equality? I wonder this because technology has become so influential over the past decade or so. If we can change these systems, maybe we can change how people think as well.

3 thoughts on “11/9 – “How I’m fighting bias in algorithms” Deirdre Kelshaw”

  1. I also watched this TED Talk and had a very similar take on it, as well as similar questions after watching. I am curious whether our generation (which is generally more progressive) will have the power to change other people’s mindsets by becoming the developers of these coming technologies. In my opinion, we have the ability to make technology more accessible to people of color, but I am not sure about changing people’s fixed mindsets. I mentioned in my blog post that because coding is such an attractive field to get into, we have the ability to change the technology it produces; therefore, it is important for people our age to watch this talk and become familiar with its ideas.

  2. I thought this was a really good summary; I especially liked the part about changing societal mindsets through machine learning. This is a really interesting topic, considering that ML algorithms practically run our lives through their abundance and range of applications. So I think promoting equality by fixing algorithmic bias is a real possibility, but obviously it is just a start.

  3. I haven’t watched the TED Talk, but you provided a clear summary! The story of the interaction with the social robot was striking, and it vividly illustrated the bias in algorithms. I hadn’t thought deeply about this problem before, and your post reminded me to consider what biases may exist in our daily lives. I also liked the reflection at the end, and I hope algorithms become more and more inclusive in the future.
