DENVER — As technology develops, facial recognition software is becoming an increasingly common part of daily life, from unlocking an iPhone to aiding complex criminal investigations.
Airports, schools, police departments and even government agencies have begun to use the software to identify people.
Now, Colorado lawmakers are taking a closer look at whether more guardrails are needed for biometric data.
Senate Bill 22-113 calls for the creation of the Task Force for the Consideration of Artificial Intelligence, which would take a closer look at the use of the technology by public agencies and offer recommendations for how it should be regulated.
“It’s really focused on putting in some safeguards and some analysis around the use of artificial intelligence for facial recognition,” said Sen. Chris Hansen, D-Denver, one of the bill’s co-sponsors.
Part of the reason Hansen says the bill is necessary stems from a 2018 study from the Massachusetts Institute of Technology, which found a high error rate when the software attempted to identify women with darker complexions.
The study found a 0.8% error rate when the software identified white men but a 34.7% error rate when identifying dark-skinned women.
“You can imagine the problems with sort of false identification, false positive, false negative issues, and that’s really where we think we need the bill: let’s take some time to carefully evaluate how the public sector is using the technology,” Hansen said.
Advocates for the technology insist it has changed significantly since that 2018 study was conducted and has become more accurate.
However, Kerstin Haring, an assistant professor at the University of Denver’s Ritchie School of Engineering and Computer Science, says issues with the technology still exist.
She breaks the problem of bias down into two parts: a lack of data for artificial intelligence to learn from and build algorithms on, and a lack of diversity among the coders inputting the data.
“Our current data sets, for example, don’t have a lot of Black people in there. So, it’s very hard for a machine-learning algorithm to then correctly identify them,” Haring said. “We can’t check our own biases as a coder when we don’t have a diverse representation in who is creating those algorithms.”
There is also a lack of understanding of how and why artificial intelligence reaches its conclusions.
The task force would meet every four months starting in October and would be repealed in 2032. The bill also calls for state and local government agencies that use or plan to use the software to submit an accountability report with information on how they use the technology.
It would also ban law enforcement agencies from employing the software to identify, observe, or track an individual without prior probable cause.
Schools would also be prohibited from using the technology for the time being.
“This is not about prohibition, it’s really about carefully considering the use and making sure we’re getting the right results and have the human review,” Hansen said.
Some groups, like the County Sheriffs of Colorado, have opposed the bill. During a committee hearing Wednesday, Jake Parker, a representative of the Security Industry Association, spoke in opposition, saying one section of the bill was so vague that it acted as a deterrent to the use of facial recognition technology.
Parker also expressed concerns about making the product available to third parties for testing, saying it would limit the availability of some of this technology. Finally, he opposed the ban on the technology’s use in schools, saying it could serve as an important safety tool.
Sen. Hansen did offer several amendments to the bill, including more clarification around proper uses of the technology by law enforcement. The bill passed the committee and continues its way through the legislative process.
Copyright 2022 Scripps Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.