In the US, large corporations such as Amazon, Microsoft, IBM and Google have stopped selling facial recognition software to police because of many flaws and weaknesses.
One of them was a ‘racist’ bias: the engineers who developed the facial recognition algorithms were mainly white, so the training database consisted mostly of white faces, and black people were easily misidentified.
Porcha Woodruff is an example of the “racism” of facial recognition technology.
On August 3, Porcha Woodruff (32 years old, residing in Detroit, Michigan) filed a lawsuit against a female investigator and the city of Detroit for illegally arresting and detaining her nearly half a year ago.
At that time, a carjacking had occurred at a gas station. To investigate, the Detroit Police Department retrieved footage from a gas station camera that recorded the suspect and ran it through the facial recognition software DataWorks Plus. The search turned up the name Porcha Woodruff, matched from a photo on file from 2015, when she had been stopped for driving with an expired licence.
On February 16, six police officers came to the house with a warrant to arrest Ms. Woodruff, then eight months pregnant, for carjacking. The police searched her house, handcuffed her, and took her in for questioning; they also seized her iPhone as evidence. After about 11 hours of detention, she was released on bail.
She then went to the hospital for an examination, where she was diagnosed with contractions brought on by stress and a low heart rate due to dehydration. Fifteen days later, the court dropped the charges against her for lack of evidence.
In fact, the facial recognition system was wrong and Ms. Woodruff was mistakenly arrested.
In addition to Ms. Woodruff, five other people claim to have been wrongfully arrested by Detroit police because of mistakes by the facial recognition system. All of them are black.
Three years ago, the chief of the Detroit Police Department revealed that the system would misidentify suspects about 96% of the time if used on its own, without other investigative methods. Even so, Detroit police still used it 125 times last year.
Following a false identification in 2019, the Detroit Police Department restricted the use of the identification system to only violent crime or trespassing investigations.
Facial recognition systems are powered by artificial intelligence (AI), but AI has many weaknesses, often stemming from unrepresentative training data.
In 2019, the US National Institute of Standards and Technology (NIST), after studying dozens of algorithms, concluded that the rate of misidentifying black or Asian faces can be up to 100 times higher than for Caucasian faces. Algorithms developed in the US showed this disparity more clearly than those developed in Asia.
In addition, a number of other factors can cause a facial recognition system to confuse one person with another, such as poor-quality images (low resolution, insufficient lighting) or changes in the face itself (ageing, makeup, glasses).
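To see how such mismatches can arise, here is a minimal, purely illustrative sketch. Typical systems convert each face image into a numeric vector (an "embedding") and declare a match when the similarity between two vectors exceeds a threshold; a degraded image shifts the vector and can push the score above or below that threshold. The vectors and the 0.8 threshold below are hypothetical, not taken from DataWorks Plus or any real system:

```python
import numpy as np

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

THRESHOLD = 0.8  # hypothetical decision threshold

# Illustrative embeddings; real systems derive these from a trained network
enrolled = np.array([0.9, 0.1, 0.4])        # photo on file
good_probe = np.array([0.88, 0.12, 0.42])   # clear, well-lit camera image
degraded_probe = np.array([0.5, 0.6, 0.1])  # low-resolution, poorly lit image

print(cosine_similarity(enrolled, good_probe) >= THRESHOLD)      # True: match
print(cosine_similarity(enrolled, degraded_probe) >= THRESHOLD)  # False: no match
```

The point of the sketch is that the decision is a numeric comparison, not a human judgment: a blurry frame or an unrepresentative training set changes the score, and the system reports a "match" or "no match" with no indication of why.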