When we picture warriors, we rarely associate them with algorithms, computer science, or the Massachusetts Institute of Technology. Yet the same stereotypical biases that live in our minds also live in the software and vision of computers. Computers, of course, don't have minds or eyes of their own; they're programmed by groups of people to perform and see in certain ways, the ways they're told.
Meet Joy Buolamwini, a master's candidate at the MIT Media Lab who has dug into the technical side of bias in artificial intelligence. In an experiment, she found that her face, the face of a black woman, wasn't recognized by something as simple as a web camera, while white faces were. When she put on a white mask, the camera recognized it. That experience led her to coin the term "the coded gaze," her name for algorithmic bias.
Algorithmic bias is baked into technology, and as algorithms become a larger and larger part of our lives, those pre-programmed biases can spread rapidly beyond the tech they inhabit. It can take a team of six to ten people more than six months to create a single app. What Buolamwini noticed was that the teams building these apps code biases into the technical architecture, shaping how machine learning progresses. Put simply, if a camera recognizes more white faces than black faces, or more male faces than female faces, the bias built into the program's framework should be called into question. Again, artificial intelligence and machine learning aren't autonomous; they do what they're programmed to do. So if those ten programmers are 90% white men and the program doesn't recognize Buolamwini's face, what else is happening on the back end?
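To make the mechanism concrete, here is a toy sketch (not Buolamwini's actual experiment; the feature, group labels, and numbers are all invented for illustration) of how a detector tuned on a skewed sample ends up with very different recognition rates for different groups:

```python
# Toy illustration of algorithmic bias: a "face detector" whose decision
# threshold is fit on training data dominated by one group, producing
# unequal detection rates. All values here are hypothetical.
import random

random.seed(0)

def make_samples(n, mean):
    # Hypothetical 1-D image feature for a group of face photos.
    return [random.gauss(mean, 1.0) for _ in range(n)]

# Skewed training set: 90% group A (feature mean 5.0), 10% group B (mean 2.0).
train = make_samples(90, 5.0) + make_samples(10, 2.0)

# Naive detector: "face present" if the feature clears a threshold set
# just below the training average -- an average dominated by group A.
threshold = sum(train) / len(train) - 1.0

def detects_face(x):
    return x >= threshold

def detection_rate(samples):
    return sum(detects_face(x) for x in samples) / len(samples)

# Evaluate on fresh samples from each group.
rate_a = detection_rate(make_samples(1000, 5.0))
rate_b = detection_rate(make_samples(1000, 2.0))
print(f"group A detection rate: {rate_a:.2f}")  # high
print(f"group B detection rate: {rate_b:.2f}")  # low
```

Nothing in the detector mentions group membership; the disparity comes entirely from who was represented in the data the threshold was fit on, which is the point of the argument above.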
Having discovered this in something as simple as a web camera, Buolamwini began to question how bias was affecting other areas of computing. It drove her to fight exclusion in computer programming by launching the Algorithmic Justice League (AJL), whose mission is to "Highlight such bias through provocative media and interactive exhibitions; to provide space for people to voice concerns and experiences with coded discrimination; and to develop practices for accountability during the design, development, and deployment phases of coded systems."
The tech industry is a not-so-secret boys' club, populated predominantly by white men. Seeing brilliant hacktivists and computer science pioneers like Joy Buolamwini call into question an industry worth more than $3 trillion speaks to the power of her voice and to the change that needs to happen. As technology continues to develop at a breakneck pace, we'll be keeping our eyes on the AJL and Buolamwini as they keep up the fight, hacking away at algorithmic bias one line of code at a time.