
Facial recognition in policing gets state guardrails • Florida Phoenix

In January 2020, Farmington Hills, Michigan, resident Robert Williams spent 30 hours in police custody after an algorithm identified him as a potential match for a suspect in a robbery committed a year and a half earlier.

The city’s police department had run an image from security footage taken at a Detroit watch store through the Michigan State Police’s facial recognition technology. The technology reported that Williams’ photo in the state police database was a possible match.

But Williams had been nowhere near the store on the day of the robbery.

Williams’ case, since resolved through a settlement of the lawsuit filed in 2021 by the American Civil Liberties Union and the University of Michigan Law School’s Civil Rights Litigation Initiative, was the first publicized case of wrongful arrest due to police misuse of facial recognition technology (FRT).

But it does not stand alone. Several more documented cases of wrongful arrests involving FRT came out of Detroit in the years following Williams’ arrest, and across the country, at least seven people are known to have been wrongfully arrested after police found a potential match in the depths of FRT databases.

Williams’ lawsuit was a catalyst for changing how Detroit’s police department may use the technology, and the wrongful arrest cases are cited in proposed legislation addressing it. Though it can be difficult to legislate a technology that is quickly gaining popularity, privacy advocates say unfettered use is a danger to everyone.

“When police rely on it, people’s lives can be turned upside down,” said Nathan Wessler, one of the deputy directors of the Speech, Privacy, and Technology Project at the national ACLU.

How do police use FRT?

Facial recognition technology has become pervasive in Americans’ lives, and can be used for small personal tasks like unlocking a phone, or in large undertakings, like moving thousands of people through airport security checks.

The technology is built to assess a photo, often called a probe image, against a database of public photos. It uses biometric data such as eye scans, facial geometry, or the distance between features to assess potential matches. FRT software converts that data into a unique string of numbers, called a faceprint, and presents a set of ranked potential matches from its database of images.
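
To make that pipeline concrete, here is a minimal illustrative sketch of the probe-image workflow described above. It is not any vendor’s actual system: the fixed random projection standing in for a trained model, the 128-number faceprint, and the cosine-similarity scoring are all simplifying assumptions chosen to keep the example self-contained.

```python
import numpy as np

# Illustration only: real FRT systems derive faceprints from trained neural
# networks. A fixed random projection stands in for the model here.
rng = np.random.default_rng(0)
PROJECTION = rng.normal(size=(64 * 64, 128))  # maps a 64x64 image to 128 numbers

def faceprint(image: np.ndarray) -> np.ndarray:
    """Convert a face image into a unique string of numbers (a 'faceprint')."""
    vec = image.flatten() @ PROJECTION
    return vec / np.linalg.norm(vec)  # unit length, so dot product = cosine similarity

def rank_candidates(probe: np.ndarray, database: dict[str, np.ndarray], top_k: int = 5):
    """Score a probe faceprint against every database entry, highest first."""
    scores = {name: float(probe @ fp) for name, fp in database.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

# Usage: the output is a ranked list of lookalikes, not a confirmed identity.
database = {f"person_{i}": faceprint(rng.random((64, 64))) for i in range(1000)}
probe_image = rng.random((64, 64))
for name, score in rank_candidates(faceprint(probe_image), database):
    print(f"{name}: similarity {score:.3f}")
```

The property that matters for the arrest cases below: a system like this always returns its best-scoring candidates, whether or not the person in the probe image is in the database at all.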

When police use these systems, they are often uploading images from a security camera or a body-worn camera. Clearview AI, which frequently contracts with police and has developed a version of its product specifically for investigations, says its database holds more than 50 billion facial images from public websites, including social media, mugshots, and driver’s license photos.

Katie Kinsey, chief of staff and tech policy counsel for the Policing Project, an organization focused on police accountability, said that if you are an adult in the U.S., your photo is almost certainly included in Clearview’s database and is scanned any time police run an FRT search.

“You don’t have to have any presence on the internet to be in that database,” she said.

Federal law enforcement agencies’ use of the technology goes back more than two decades, Kinsey said, but local police have ramped up their use over the past 10 years.

Police usually use it after a crime has been committed, but civil liberties and privacy concerns are heightened by the idea that the technology could be used to scan faces in real time, with geolocation data attached, she said. Kinsey, who works frequently with law enforcement agencies to develop best practices and legislative proposals, said she believes police departments are wary of real-time use.

Boston police attempted to use it while searching for the suspects in the 2013 Boston Marathon bombing, but the technology failed to identify the culprits, Kinsey said.

Wrongful arrests

FRT’s role in wrongful arrest cases usually stems from situations where police have no evidence tying anyone to the crime other than an image captured by security cameras, said Margaret Kovera, a professor of psychology at the John Jay College of Criminal Justice and an expert on eyewitness identification.

Before the technology was available, police needed investigative leads to tie a suspect to a crime, such as physical evidence like a fingerprint, or an eyewitness statement. But with access to security cameras and facial recognition, police can quickly generate several possible suspects the software rates as highly likely matches.

With millions of faces in a database, the pool of potential suspects feels endless. Because the technology surfaces matches that look so similar to the photo provided, an eyewitness picking a suspect out of a photo array can easily make a misidentification, Kovera said. And without further investigation and traditional police work to connect the match chosen by the technology to the crime scene, the match is useless.

“You’re going to increase the number of innocent people who are appearing as suspects, and you’re going to decrease the number of guilty people,” Kovera said. “And just that act alone is going to mess up the ratio of positive identifications, in terms of how many of them are correct and how many of them are mistaken.”
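
The math behind Kovera’s warning is a base-rate problem, and a back-of-the-envelope calculation shows it. All numbers below are hypothetical, chosen only to illustrate the effect of searching a very large database:

```python
# Hypothetical numbers, for illustration only: even a matcher with a
# 1-in-10,000 false match rate, searched against 10 million faces, is
# expected to surface a thousand innocent lookalikes.
database_size = 10_000_000   # faces compared against the probe image
false_match_rate = 0.0001    # chance an innocent face scores as a "match"
true_match_rate = 0.99       # chance the real perpetrator is flagged, if present

expected_false_matches = database_size * false_match_rate
print(f"Expected innocent matches: {expected_false_matches:,.0f}")    # 1,000

# Even if the perpetrator is in the database, the chance that any one
# flagged face is actually the perpetrator is tiny:
precision = true_match_rate / (true_match_rate + expected_false_matches)
print(f"Chance a flagged face is the perpetrator: {precision:.2%}")   # ~0.10%
```

Under these assumptions, nearly every face the system surfaces belongs to an innocent person, which is why the follow-up investigation described below matters so much.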

In the seven known cases of wrongful arrest following FRT matches, police failed to conduct the follow-up investigation that could have prevented the arrests. One man in Louisiana spent a week in jail despite being 40 pounds lighter than the thief allegedly seen in surveillance footage. A Detroit woman who was eight months pregnant was held in custody for 11 hours after being wrongly arrested for carjacking, despite there being no mention that the carjacker was pregnant.

When Williams was arrested in January 2020, he was only the ninth-best match for the person in the security footage, Michael King, a research scientist at the Florida Institute of Technology’s (FIT) Harris Institute for Assured Information, testified in the ACLU’s lawsuit. And detectives did not investigate his whereabouts before the arrest.

Detroit police used Williams’ expired driver’s license photo in a photo array presented to a loss-prevention contractor who had not been present at the crime scene. The contractor chose Williams as the best match to the security footage. Without investigating Williams’ whereabouts at the time of the October 2018 robbery, Detroit police arrested him and kept him in custody for 30 hours.

The lawsuit says it was only after several rounds of questioning that Williams was told he was there because of a facial recognition match. As part of the settlement Williams reached in the summer of 2024, Detroit police had to change how they use facial recognition technology. The city now follows some of the strictest rules on police use of the technology in the country, which are legally binding under the settlement.

Police can no longer go directly from facial recognition results to a witness identification procedure, and they cannot apply for an arrest warrant based solely on the results of a facial recognition database search, Wessler said. Because errors and bias can occur both in the technology and in its users, such guardrails are important for protecting against wrongful arrests, he said.

Developing laws

As of the beginning of 2025, 15 states (Washington, Oregon, Montana, Utah, Colorado, Minnesota, Illinois, Alabama, Virginia, Maryland, New Jersey, Massachusetts, New Hampshire, Vermont, and Maine) had some legislation in place around police use of FRT. Some states, such as Montana and Utah, require a warrant before police may use facial recognition, while others, such as New Jersey, say defendants must be notified of its use in investigations.

At least seven more states are considering clarifying how and when the technology can be used: legislators in Georgia, Hawaii, Kentucky, Massachusetts, Minnesota, New Hampshire, and West Virginia have introduced legislation.

Like all AI technologies, facial recognition can show bias or produce flawed results. FRT has historically performed worse on Black faces than on white ones, and has shown gender discrepancies as well. AI systems can get better over time, but people seem to think that simply involving humans in the process means we will catch all the problems, Wessler said.

“But humans actually have something called ‘automation bias,’” said Wessler, “a hardwired tendency to believe a computer output is right, no matter how many times you tell someone the algorithm may be wrong.”

So when police rely on facial recognition technology as their primary investigative tool, instead of following long-standing law enforcement practices, it is “especially insidious” when it goes wrong, Wessler said.

“I often say it’s a technology that is dangerous when it works and dangerous when it doesn’t,” Wessler said.

Kinsey said that in her work with the Policing Project, she has found bipartisan support for placing guardrails on police use of the technology. Through several convenings with privacy advocates, police departments, legislators, and academics, the Policing Project developed a legislative checklist.

It outlines how police departments could address transparency, testing and standards, officer training, procedural limits, and disclosure to those accused of crimes. It also says legislation should require vendors to disclose documentation about their FRT systems, and should provide ways to address violations of their use.

The Policing Project makes similar recommendations for congressional consideration, and while Kinsey said she believes federal guidelines are important, federal legislation may not pass anytime soon. In the meantime, states are likely to influence one another, and the recent laws in Maryland and Virginia are examples of a broad approach to regulating FRT across different domains.

Kinsey said that in the project’s meetings with police, officers maintained that these technologies are important to their work. She said she believes there is a place for FRT and other technologies used by police, like license plate readers and security cameras, but that unfettered use can cause great harm.

“We think some of them can absolutely provide benefits for solving crimes, for victims,” Kinsey said. “But using these tools, and using them according to rules that are public, transparent, and accountable, are not mutually exclusive goals. They can actually happen in concert.”
