
Study by U of T Engineering student, MIT takes aim at biased AI facial-recognition technology

A recent study by Deb Raji and researchers at the MIT Media Lab shows the need for stronger evaluation practices for AI products to mitigate gender and racial biases (photo by Liz Do)

A study by Deb Raji, a fourth-year student in the University of Toronto’s Faculty of Applied Science & Engineering, and researchers at the Massachusetts Institute of Technology is underscoring the racial and gender biases found in facial-recognition services.

Raji spent the summer of 2018 as an intern at MIT’s Media Lab, where she audited commercial facial-recognition technologies made by leading companies such as Microsoft, IBM and Amazon. The researchers discovered that all of them had a tendency to mistake darker-skinned women for men.

But one service in particular, Amazon’s Rekognition, showed a higher level of bias than the rest. Although it could identify the gender of light-skinned men with nearly 100 per cent accuracy, it misclassified women as men 29 per cent of the time, and mistook darker-skinned women for men 31 per cent of the time.
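The core of such an audit is disaggregated evaluation: rather than reporting a single overall accuracy figure, error rates are computed separately for each intersection of gender and skin type. Below is a minimal sketch in Python of that idea, assuming the service’s predictions have already been collected; the file name and column names are hypothetical stand-ins for illustration, not the study’s actual benchmark or any vendor’s API.

```python
# Disaggregated audit sketch: compute per-subgroup error rates from
# already-collected predictions. "audit_results.csv" and its columns
# (true_gender, skin_type, predicted_gender) are hypothetical.
import pandas as pd

results = pd.read_csv("audit_results.csv")

# A single overall number can hide a large failure on one subgroup.
overall = (results["predicted_gender"] != results["true_gender"]).mean()
print(f"overall error rate: {overall:.1%}")

# Break the same errors out by gender x skin type.
per_group = (
    results.assign(error=results["predicted_gender"] != results["true_gender"])
           .groupby(["true_gender", "skin_type"])["error"]
           .mean()
)
print(per_group)
```

A gap like the 31 per cent figure only becomes visible in the per-subgroup rows; the aggregate error rate can look acceptable while one group fails badly.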

Rekognition was recently piloted by police in Orlando, Fla., who used the service in policing scenarios such as scanning faces in camera footage and matching them against those in criminal databases.

“The fact that the technology doesn’t characterize Black faces well could lead to misidentification of suspects,” says Raji. “Amazon is due for some public pressure, given the high-stakes scenarios in which they’re using this technology.”

With the rapid advancement and deployment of artificial intelligence (AI) products, the new study emphasizes the need to test systems not only for performance, but also for potential biases against underrepresented groups.

Although algorithms should be neutral, Raji explains that because data sets (the information used to “train” an AI model) are sourced from a society that still grapples with everyday biases, those biases become embedded in the algorithms.

“Let’s say I want examples of what healthy skin looks like. If you Google it now, you will see mostly light-skinned women,” says Raji. “You won’t see a man for pages, and you won’t see a darker-skinned woman until you really scroll down. If you feed that into an AI model, it adopts this world view and adapts its decisions based on those biases.”
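A toy experiment makes that mechanism concrete: train a classifier on synthetic data in which one group supplies 95 per cent of the examples, then evaluate each group separately. This is an illustrative sketch under stated assumptions, not the study’s method or data.

```python
# Toy illustration: a model trained on data dominated by group A
# performs far worse on underrepresented group B, whose examples
# follow a shifted distribution. Entirely synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Features cluster around `shift`, and the label boundary moves
    # with the group, so a fit dominated by group A misses group B.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X.sum(axis=1) + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# Training set: 95 per cent group A, 5 per cent group B.
Xa, ya = make_group(1900, shift=0.0)
Xb, yb = make_group(100, shift=1.5)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Balanced, held-out evaluation per group exposes the gap.
for name, shift in [("A (majority)", 0.0), ("B (minority)", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    print(f"group {name}: accuracy {model.score(X_test, y_test):.1%}")
```

Nothing in the algorithm itself is malicious; the skew comes entirely from what data the model is given, which is the point Raji makes about the image-search results.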

These biases should be called out, just as one would hold a person accountable, says Raji. “There’s this increased danger when you embed that bias into an algorithm versus when a human makes a prejudiced decision. Someone will tell you it’s wrong, whether it’s the public or your boss,” she says.

“With AI, we tend to absolve this responsibility. No one is going to put an algorithm in jail.”

Raji’s passion for the subject of bias in machine learning comes from a work experience placement at Clarifai AI, a research-oriented startup where the topic of AI and ethics was regularly discussed.

“It’s something that the company noticed and was very explicit about addressing, and it’s a subject that personally resonated with me because I’m a visible minority,” she says.

It also stems from her own personal experiences with racially biased technologies. “I’d build something at a hackathon and wonder why it couldn’t detect my face, or why an automated faucet can’t detect my hand,” she says.

Raji shared her experiences with computer scientist and digital activist Joy Buolamwini at MIT’s Media Lab. This led to the internship, and to Raji becoming the lead author of the resulting study.

“I know it looks like I wrote a research paper in three months,” says Raji. “But this issue has been percolating inside of me for much longer.”

Raji is currently finishing her last term in engineering science and running a student-led initiative called Project Include, which trains students to teach computer programming in low-income neighbourhoods in Toronto and Mississauga. She is also a mentee at Google AI. As part of the mentorship program, she is working on a new thesis that focuses on practical solutions to hold companies accountable.

“People sometimes downplay the urgency by saying, ‘Well, AI is just so new,’” says Raji. “But if you’re building a bridge, would the industry allow you to cut corners and make those kinds of excuses?”
