And the research is clear on the biases in these systems:
The systems falsely identified African-American and Asian faces 10 times to 100 times more than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

Another group it isn't good at? Children.
The technology also had more difficulty identifying women than men. And it falsely identified older adults up to 10 times more than middle-aged adults.