CNN Business
In recent years, several cities and states have passed laws banning or restricting the use of facial-recognition technology by local police. Now, the controversial technology is making a comeback in a few places across the country.
In 2020, New Orleans passed an ordinance prohibiting its police department from using facial-recognition software. In July, however, the city decided to let officers request permission from a superior to use the technology in violent crime investigations. Virginia, similarly, outlawed the use of facial-recognition technology by local and campus police in 2021, then passed a bill in March allowing police to use it in certain situations. And a 2020 California law that temporarily prohibited state and local law enforcement from using facial-recognition software in body cameras expired at the end of last year; the law was in effect for three years, and attempts to make it permanent failed in the state Senate.
No federal law governs facial-recognition technology, leaving states, cities and counties to regulate it in their own ways, particularly when it comes to how law enforcement agencies can use it.
There are two main types of facial-recognition software. One compares a photograph of a person against faces in a database in search of a match (the kind police might use to investigate a crime, such as Clearview AI's product); the other compares a photo of a person to a single stored image (the kind you use when unlocking your iPhone with your face).
Although the technology has been increasingly used in the United States over the past few years, it has also drawn criticism from privacy and digital rights groups over privacy concerns and the potential for misuse. For example, the technology has been found to be less accurate at identifying people of color, and several Black men have been wrongfully arrested due to facial-recognition errors.
Adam Schwartz, senior staff attorney at the Electronic Frontier Foundation, sees the bans and subsequent reversals as a "pendulum swing."
Since 2019, roughly two dozen bans on facial recognition of various kinds have been enacted in American communities and some states, many of them in 2020. Schwartz pointed out that there was a push to limit police use of surveillance technology in the wake of the death of George Floyd in May of that year, and said the pendulum has been swinging in the law-and-order direction over the past year.
American politics swings between fear of government surveillance and fear of crime, he said. The short-term swing has favored fear of crime, but the EFF is optimistic that the long-term trend is toward limiting government use of surveillance technology.
Six of seven council members voted for New Orleans' ban on facial-recognition technology in late 2020, part of a larger ordinance regulating surveillance technologies in the city; one member was absent. In July, the ordinance allowing police officers to use facial-recognition technology passed with four council members voting in favor and two against; one council member was absent.
The turnaround, two years later, comes after a rise in homicides in the city, which had declined from 2016 to 2019.
The new rule allows city police to request the use of facial-recognition software to assist investigations into a wide range of violent crimes, including murder, rape, kidnapping and robbery.
New Orleans Mayor LaToya Cantrell expressed her gratitude for the city council's July 21 vote in favor of facial-recognition technology.
Lesli Harris, a New Orleans council member who opposed July's ordinance, is concerned about how the legislation might impact civil rights in the city. Harris said facial recognition is a difficult issue for her because she is a woman of color, pointing to studies showing the technology can be less accurate at recognizing people of color, and women of color in particular.
Virginia's facial-recognition ban took effect in July 2021; under it, the technology could be used only with approval from the state legislature. The state's 2022 legislation, which took effect this July, essentially reverses the 2021 rule and allows local and campus police to use the technology in certain situations.
Scott Surovell, a Virginia state senator, introduced the new rule. He said it is intended chiefly as a "lead generator": police would need to obtain their own confirmation before arresting a suspect. He also noted that while the 2021 legislation prohibited local police from using facial-recognition software, it did not prevent Virginia state law enforcement from using it, or from using it on behalf of local police.
The 2022 legislation requires police agencies to publish a report every year about their use of facial-recognition technology.
It is still unclear how widely facial-recognition technology is used in the United States, but law enforcement has embraced it for years. Clearview AI alone claims more than 3,100 US agencies as customers, including the FBI, the Department of Homeland Security and "hundreds of other local agencies."
Surovell hopes more rules will be passed in other states to regulate the technology, similar to how laws govern law enforcement's use of technologies such as radar, breath testing and substance analysis.
He said, "I think it's important that the public have faith in law enforcement's ability to do their job, that these technologies be regulated and that there be transparency about their use, so people are able to assess for themselves whether it's accurate or being abused."
However, recent developments suggest that it may not be an easy ride.
An amendment supported by Harris and two other council members failed to pass in July. It would have established guidelines for how the city's police use facial-recognition technology, such as requiring court approval for each use and monthly reports on how it was used. Chris Kaiser, advocacy director for the ACLU of Louisiana, said he was concerned by the changes to the city's rules on facial-recognition software.
He said, "We don't understand the objection to these safeguards."
On Thursday, the trio of New Orleans council members tried again with an amended version of their proposal. It dropped the requirement for judicial approval before each use of the technology and called for quarterly rather than monthly reports on its use. This time, it passed.