
In West Lafayette, a ban on facial recognition technology failed, but questions remain about its use in Indiana


In West Lafayette, a ban on facial recognition technology failed last December, but supporters say they plan to launch another ban attempt this week.

West Lafayette City Council member David Sanders said he would reintroduce a ban on the technology at Monday night’s council meeting. In an effort to improve the ordinance’s chances of passing after last year’s veto, Sanders said it would include exceptions allowing police to use the technology in investigations of violent crimes.

The question is whether there is adequate oversight of the controversial technology – and of the software companies contracting with the state to provide it.

Facial recognition technology uses computer programs to compare individuals captured on video or camera against a database of faces. Law enforcement officials say it’s an important tool for investigations – but privacy advocates say photo databases can include images taken from social media without a person’s knowledge or consent.
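At its core, this kind of search reduces each face to a numeric “embedding” and ranks the faces in a database by how similar they are to the probe image. The sketch below is a simplified illustration of that idea under assumed details – the 128-dimensional embeddings are randomly generated stand-ins, and nothing here reflects any vendor’s actual system.

```python
# Minimal sketch of the core of a facial recognition search, assuming faces
# have already been converted to numeric embedding vectors by a separate
# model. The random "gallery" and "suspect" vectors are placeholders only.
import numpy as np

def best_matches(probe: np.ndarray, database: np.ndarray, top_k: int = 5):
    """Rank database faces by cosine similarity to the probe face."""
    probe = probe / np.linalg.norm(probe)
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    scores = db @ probe                        # cosine similarity per face
    order = np.argsort(scores)[::-1][:top_k]   # indices of the closest faces
    return list(zip(order.tolist(), scores[order].tolist()))

# Example with made-up embeddings standing in for a real photo gallery.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(10_000, 128))   # e.g., ID photos or scraped images
suspect = rng.normal(size=128)             # face captured on surveillance video
print(best_matches(suspect, gallery))
```

In a real system, the output is a ranked list of candidate faces, which is one reason agencies describe results as leads rather than identifications.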

In Michigan, driver’s license photos are automatically shared with state police. Misidentifications are rare, but they have led to wrongful arrests.

In December, the West Lafayette City Council voted against banning facial recognition technology – after the city’s mayor announced a veto.

During the hearing on the ordinance, Councilman David Sanders asked West Lafayette Police Chief Troy Harris what company the department used to conduct facial recognition searches – specifically asking whether he had heard of a major provider, Clearview AI.

“Clear view? I’ve never heard of that,” Chief Harris replied.

Clearview AI is one of the most controversial companies providing facial recognition technology to law enforcement. The company has been criticized for its practice of scraping photos from places like Facebook without user consent. Those photos are then fed into a database used to identify suspects.

Numerous law enforcement agencies across the country have used Clearview’s software – and privacy advocates fear there has been no proper oversight of the company, the technology or its use.

Jameson Spivak is an associate at Georgetown Law’s Center on Privacy and Technology, a think tank that studies surveillance and privacy law. He said traditional facial recognition databases involve one-to-one matching – a photo of a suspect is matched against a photo ID or driver’s license photo.

“In the case of Clearview AI, what’s truly unprecedented is that this database includes faces that they’ve scraped from the internet,” he said. “So basically if your photo is on the internet and it’s been tagged, you can potentially be in that database. Probably the vast majority of people in this database have no idea they are there.”
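The distinction Spivak draws can be put concretely: one-to-one verification compares a probe photo against a single enrolled image, while a one-to-many search like Clearview’s compares it against every face in a large, scraped gallery. The sketch below illustrates that difference; the similarity function, the 0.6 threshold and the embedding vectors are assumptions for demonstration only, not any agency’s or vendor’s real parameters.

```python
# Illustrative contrast between one-to-one verification and one-to-many search.
# The threshold and cosine() scoring are assumed values for demonstration.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, id_photo: np.ndarray, threshold: float = 0.6) -> bool:
    """One-to-one: is the probe the same person as one known ID photo?"""
    return cosine(probe, id_photo) >= threshold

def search(probe: np.ndarray, gallery: np.ndarray, threshold: float = 0.6) -> list[int]:
    """One-to-many: which faces in a large gallery resemble the probe?"""
    return [i for i, face in enumerate(gallery) if cosine(probe, face) >= threshold]
```

The larger the gallery, the more chances a one-to-many search has to surface look-alikes – which is part of why privacy advocates treat scraped databases differently from ID checks.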

Spivak said the Center on Privacy and Technology supports a moratorium on the use of facial recognition technology.

“In most cases, this technology was deployed without the public really knowing about it,” he said. “Often elected officials don’t even know it’s being used, because most law enforcement agencies get this technology not through the city, state, or county budget, but from the federal government or nonprofit police associations. Basically, the city council might not even have known it was being used.”

In West Lafayette, police officials said facial recognition requests are sent to either the Indiana Bureau of Motor Vehicles or the Indiana Intelligence Fusion Center. But Investigations Lt. Jonathan Eager said facial recognition technology is “rarely” used.

“Normally we don’t get a decent photo of a suspect in an investigation, so that ends up being a resource we don’t use,” he said.

The Indiana Bureau of Motor Vehicles did not respond to WBAA’s request for comment.

Captain Ron Galaviz is the Director of Public Information for the Indiana State Police. He said law enforcement across the state sends requests for facial recognition to the Indiana Intelligence Fusion Center. According to Galaviz, photos sent there for facial recognition search must be “legally obtained”.

“We have a policy on our website which is open to the public,” he said. “At the end of the day, we are very aware of and want to be very respectful of people’s rights. We want to operate within limits that ensure people’s rights are not violated.”


State policy on facial recognition use indicates that the technology can be used on an image if there is a “reasonable suspicion” that the subject is involved in or has knowledge of “possible criminal or terrorist activity”.

Galaviz said requests require a “criminal connection” to be pursued.

“So again, going to a protest or demonstration, a gathering where people have a right to be – if there’s no criminal connection to it, the request won’t be processed,” he said.

Galaviz also pointed out that facial recognition matches are not enough to convict someone of a crime. Fusion Center policy states, in all capitals, that facial recognition search results are “investigative leads…NOT TO BE CONSIDERED A POSITIVE IDENTIFICATION OF ANY SUBJECT.”

“When we talk about using this type of technology, it was definitely up to the Fusion Center to have a detailed, step-by-step policy – what are the parameters and the limits under which this technology can be used,” Galaviz said.

But privacy advocates like Spivak worry that policies guiding law enforcement’s use of technology aren’t enough to hold police accountable.

“Law enforcement everywhere is saying ‘we’re only using this for an investigative lead,'” he said. “But there’s nothing holding them back.”

When first asked if the Fusion Center uses Clearview AI, Galaviz noted that the facial recognition policy only mentioned Vigilant Solutions – a separate facial recognition technology provider. Privacy advocates say Vigilant Solutions is a more established brand, known for its license plate identification software, and, most importantly, has undergone federal accuracy testing – something Clearview had not done until just last year.

In Indiana Intelligence Fusion Center policy documents, the only facial recognition vendor mentioned is Vigilant Solutions, but a 2021 BuzzFeed investigation found state police queried Clearview more than 5,000 times over a period beginning in 2018 and ending in February 2020. The Fusion Center’s privacy policy is dated June 1, 2019.

And, according to Clearview itself, the Indiana State Police were the company’s “first paying customer.”

When asked when Indiana State Police might have moved from Vigilant to Clearview, Galaviz replied via email that “both platforms are used as checks and balances.”

A FOIA request made by WBAA revealed that in 2020 and 2021, Indiana State Police used Vigilant Solutions 373 times. Clearview AI, according to the same request, was used 3,067 times during that period – about eight times as many searches.

When asked why Clearview AI is not mentioned in state policy documents despite being used more, Galaviz said only that “Clearview provides a broader data set.” Pressed further, Galaviz said, “I refer to what I’ve said before about using these platforms, which includes acknowledging the use of both platforms.”

Spivak, of Georgetown Law, said he could only speculate about why Clearview is being used without appearing in state policy documents.

“One of the things I would speculate is that the police are hiding its use because Clearview has become kind of a toxic name,” he said. “Over the past few years, a number of media outlets have found that in developing its system, Clearview violated the terms of service of Facebook, YouTube and a bunch of other social media platforms by harvesting images from those websites.”

And Spivak said the lack of clarity makes it harder to ensure police are following their own policies.

“It leads to confusion and less transparency, and then ultimately less accountability,” he said.

The situation in West Lafayette is not unique. According to Spivak, there has been little effort at the federal level to provide oversight and accountability for facial recognition technology – pushing local governments to take up the issue themselves.

“Into this vacuum where the federal government is doing nothing, states and cities are stepping in to push things through,” he said. “Because the federal government isn’t doing anything about it, and because maybe even the states aren’t doing anything about it, local activists, local politicians have stepped up to try and do something about it.”

Across the country, Spivak said, more than twenty cities and two states have adopted moratoriums on the use of facial recognition technology – though those moratoriums sometimes include exceptions for violent crimes.

West Lafayette council member David Sanders said he hoped the changes to the ordinance would help it pass this time around. He said he continued to worry about police use of the technology.

“The fact that there’s so much interaction with Clearview AI, and yet it’s not present in their public records, that tells me something,” he said. “It says there’s something to hide, and it says they clearly don’t think the public should know they’re using Clearview AI.”