

AI facial analysis demonstrates both racial and gender bias

Swapna Krishna

Researchers from MIT and Stanford University found that three different facial analysis programs demonstrate both gender and skin color biases. The full paper will be presented at the Conference on Fairness, Accountability, and Transparency later this month.

Specifically, the team looked at facial recognition accuracy rates broken down by gender and race. “Researchers at a major U.S. technology company claimed an accuracy rate of more than 97 percent for a face-recognition system they’d designed. But the data set used to assess its performance was more than 77 percent male and more than 83 percent white.” This narrow test base results in higher error rates for anyone who isn’t white or male.
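To see why such a skewed benchmark can hide subgroup failures, note that aggregate accuracy is simply a weighted average of per-group accuracies. The sketch below uses hypothetical shares and accuracies (not figures from the study) to show how a test set dominated by lighter-skinned men can report a high overall number even while one subgroup fares dramatically worse:

```python
# Illustrative only: hypothetical subgroup shares and accuracies,
# not figures from the MIT/Stanford study.
groups = {
    # group: (share of benchmark, accuracy within group)
    "lighter-skinned men":   (0.65, 0.99),
    "lighter-skinned women": (0.18, 0.97),
    "darker-skinned men":    (0.12, 0.94),
    "darker-skinned women":  (0.05, 0.66),
}

overall = sum(share * acc for share, acc in groups.values())
print(f"aggregate accuracy: {overall:.1%}")  # ~96.4%, despite 34% error for one group
```

Reporting only the weighted aggregate makes the worst subgroup's error rate effectively invisible.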

To test these systems, MIT researcher Joy Buolamwini collected over 1,200 images containing a greater proportion of women and people of color, and coded skin color using the Fitzpatrick scale of skin tones, in consultation with a dermatologic surgeon. She then tested the facial recognition systems against this new data set.
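The core of the method is disaggregation: computing error rates per intersectional subgroup rather than one overall number. A minimal sketch of that idea, assuming a hypothetical list of labeled predictions (the record layout and the coarse lighter/darker bucketing of Fitzpatrick types are illustrative, not the study's actual pipeline):

```python
from collections import defaultdict

# Hypothetical records: (true_gender, predicted_gender, skin_group),
# where skin_group coarsely buckets Fitzpatrick types I-III vs. IV-VI.
records = [
    ("female", "female", "lighter"),
    ("female", "male",   "darker"),
    ("male",   "male",   "darker"),
    # ... one record per benchmark image
]

counts = defaultdict(lambda: [0, 0])  # (gender, skin_group) -> [wrong, total]
for true, pred, skin in records:
    counts[(true, skin)][1] += 1
    if pred != true:
        counts[(true, skin)][0] += 1

for (gender, skin), (wrong, total) in sorted(counts.items()):
    print(f"{skin} {gender}: error rate {wrong / total:.1%} (n={total})")
```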

The results were stark in terms of gender classification. “For darker-skinned women . . . the error rates were 20.8 percent, 34.5 percent, and 34.7 percent,” the release says. “But with two of the systems, the error rates for the darkest-skinned women in the data set . . . were worse still: 46.5 percent and 46.8 percent. Essentially, for those women, the system might as well have been guessing gender at random.”
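The “guessing at random” comparison is easy to check: for a binary gender label, a classifier that flips a fair coin is wrong about 50 percent of the time regardless of class balance, so error rates of 46.5 and 46.8 percent sit barely below chance. A quick simulation (illustrative only) confirms that baseline:

```python
import random

random.seed(0)
n = 100_000
labels = [random.choice(["female", "male"]) for _ in range(n)]
guesses = [random.choice(["female", "male"]) for _ in range(n)]
error = sum(l != g for l, g in zip(labels, guesses)) / n
print(f"random-guess error rate: {error:.1%}")  # ~50%, vs. 46.5-46.8% observed
```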

There have certainly been accusations of bias in tech algorithms before, and it’s well known that facial recognition systems often perform worse on darker skin tones. Even with that knowledge, these figures are staggering. It’s important that companies that work on this kind of software account for the full breadth of diversity in their user base, rather than limiting themselves to the white men who often dominate their workforces.

 
