Facial recognition software biased against Asians and black people, major US government study finds

  • Tests on 189 algorithms from 99 manufacturers, which represent most of the industry, found a higher number of incorrect matches for minorities than for white people
  • Use of facial recognition is set to widen at airports worldwide, and travellers may decide it’s worth the trade-off in accuracy if they can save a few minutes

Facial recognition software tests in the United States show it is biased against ethnic minorities. Algorithms produce a higher number of incorrect matches between two photos of Asian and black people relative to white people. Photo: The Washington Post via Getty Images
Associated Press

Facial recognition software has a higher rate of incorrect matches between two photos for Asian and black people relative to white people, a United States government study has found.

The evidence of bias against minorities in the software comes as its use is set to expand at airport security checkpoints in Asia, Europe and the United States.

The US Transportation Security Administration (TSA) and US Customs and Border Protection (CBP) have been testing facial recognition technology at airports across the US, expecting it will become the preferred method to verify a passenger’s identity.

Researchers at the US National Institute of Standards and Technology studied the performance of 189 algorithms from 99 manufacturers, representing most of the industry. Some algorithms performed better than others, they concluded, meaning it is likely the industry can correct the problems.

Facial recognition software has a higher rate of incorrect matches between photos for Asian and black people relative to white people. Photo: AFP

The institute found that US-developed algorithms had the highest rate of incorrect matches, or false positives, for American Indians.
