Detroit Police Sued—Again—For Reckless Use Of Facial Recognition Technology
The Detroit Police Department’s use of facial recognition technology has come under renewed scrutiny following three wrongful arrests tied to officers’ reliance on the technology. A federal lawsuit has been filed against the city and Det. LaShauntia Oliver.
In the latest case, facial recognition technology was used to identify a pregnant woman, Porcha Woodruff, as a suspect, leading to her arrest.
According to Woodruff’s attorney, Ivan Land, the detective targeted her on Feb. 16, accusing the expectant mother of carjacking and robbery. Oliver knew the suspect in the case was not pregnant but failed to investigate properly.
Woodruff was detained in front of her crying children and questioned for 11 hours, and her phone was searched. Eight months pregnant at the time, she suffered severe dehydration and had contractions brought on by the stress.
She was taken to the hospital for treatment after posting her $100,000 bond. Charges against Woodruff were later dropped, and, according to the lawsuit, Oliver’s police report made no mention that the actual suspect in the case was pregnant.
Land wrote in the federal lawsuit, “The need for reform and more accurate investigative methods by the Detroit Police has become evident as we delve into the troubling implications of facial recognition technology in this case.”
At least six people are known to have been wrongfully arrested after being falsely identified by facial recognition technology, and all six have been Black. That outcome was a primary concern of the ACLU when it warned about the potential for harm in how facial recognition software would be deployed against Black citizens. The group took the Detroit Police Department to court in April 2021 after police wrongfully arrested Robert Williams in 2020 using the same technology the ACLU had warned could be used against Black people with impunity.
Detroit Police Chief James White said the allegations in the lawsuit concerned him: “We are taking this matter very seriously, but we cannot comment further at this time due to the need for additional investigation. We will provide further information once additional facts are obtained and we have a better understanding of the circumstances.”
The use of artificial intelligence in policing has become a broader civil liberties issue. In September, Wired reported that the nonprofit civil liberties group Electronic Privacy Information Center (EPIC) sent a letter to U.S. Attorney General Merrick Garland asking him to investigate whether cities using ShotSpotter are violating the Civil Rights Act. ShotSpotter, a technology designed to detect gunshots, is disproportionately deployed in predominantly Black neighborhoods, yet its compliance with the Civil Rights Act has never been seriously assessed.
According to EPIC’s letter, “State and local police departments around the country have used federal financial assistance to facilitate the purchase of a slew of surveillance and automated decision-making technologies, including ShotSpotter.” Sen. Ron Wyden, who has long focused on privacy issues, told Wired that he would push Garland to accept EPIC’s recommendations, saying, “There is more than enough evidence at this point to conclude that technologies like ShotSpotter do essentially nothing to stop crime, but instead have a well-documented discriminatory impact on marginalized and vulnerable communities.”