Human Created Facial Recognition Tools Practice Racial Biases


Background Issue

Facial recognition technology is not new in this technologically advanced era. It has been used in smartphones and in security systems for years. However, recent research has highlighted a serious problem: facial recognition tools exhibit racial bias.

Facial recognition tools struggle to identify certain races!

You read that right! Facial recognition tools frequently fail to correctly identify people of African American, Native American, and Asian descent. The twentieth century saw enough racial discrimination and stigmatization, which significantly hindered the world's progress. Racial bias baked into modern technology is not acceptable to the twenty-first-century generation.

What are the causes?

You must be wondering about the grounds for such a statement. NIST conducted a study of 189 software algorithms submitted by 99 developers, including companies such as Microsoft, Intel, and Panasonic. Notably, Amazon did not submit its algorithm for testing, even though researchers have claimed that it, too, exhibits racial and gender bias.
The research also highlighted significant demographic differentials in false positives. A false positive occurs when the algorithm incorrectly declares that two images of different people show the same person, and the study found that these errors were far more frequent for some demographic groups than for others.
The problem is most pronounced in one-to-many matching, where an algorithm compares one face against a database of many faces and can confuse people of the affected groups with one another. In the case of gender, the algorithms misidentify women more often than men, which highlights the gender bias of these facial recognition tools.
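The disparity described above can be made concrete with a small sketch. Assuming per-group comparison outcomes of the form (match claimed, truly same person), the false positive rate is the share of non-matching pairs the system wrongly flags as matches; the function and data below are purely illustrative, not taken from the NIST study:

```python
# Illustrative sketch of a demographic differential in false positives.
# All data here is made up for demonstration purposes.

def false_positive_rate(results):
    """results: list of (match_claimed, truly_same_person) pairs.
    Returns the fraction of non-matching pairs wrongly flagged as matches."""
    false_pos = sum(1 for claimed, actual in results if claimed and not actual)
    negatives = sum(1 for _, actual in results if not actual)
    return false_pos / negatives if negatives else 0.0

# Hypothetical outcomes for two demographic groups:
group_a = [(True, True), (False, False), (False, False), (False, False)]
group_b = [(True, True), (True, False), (True, False), (False, False)]

fpr_a = false_positive_rate(group_a)  # 0 of 3 non-matches flagged -> 0.0
fpr_b = false_positive_rate(group_b)  # 2 of 3 non-matches flagged -> ~0.67
```

A biased system is one where these rates diverge sharply between groups, as in this toy example: members of group B are flagged as matches to strangers far more often than members of group A.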

Takeaway!

NIST conducted the research on a large scale, drawing on millions of images from FBI databases. Developers need to fix these algorithmic issues to resolve the problem. If even facial recognition tools practice bias, society cannot make progress; it will stay tangled in issues of segregation and stigmatization.