On Dec. 5, 2023, the Hays County Commissioners Court approved a request from Sheriff Gary Cutler to allow the use of Clearview AI’s facial recognition software. The Hays County Sheriff’s Office will use the technology for “identifying potential suspects from photographs and recorded videos.”
This embrace of powerful AI software by law enforcement ignores the reality of this problematic technology and the biases of those using it.
To start with, it must be pointed out that the software is not true artificial intelligence but rather, according to The University Star, a set of learning models “trained on millions of faces to be able to identify a single person out of a sample of over 12 million individuals.”
However, the software has been proven to have massive problems.
“Face surveillance is the most dangerous of the many new technologies available to law enforcement,” according to a report by the American Civil Liberties Union (ACLU). “Facial analysis algorithms misclassified Black women nearly 35% of the time, while nearly always getting it right for white men.”
An article by The New York Times said “civil liberty advocates believe Clearview’s expansive database of photos violates privacy, because the images, though public on the web, were collected without people’s consent.” Many nations, including “Canada, Australia, Britain, France, Italy and Greece,” have banned the software now approved for use by Hays County due to its privacy violations.
The software, far from being the cutting edge of technology, is instead deeply flawed. Not only does it misidentify individuals at drastically different rates depending on their race, but it also violates basic privacy rights.
It doesn’t help that many Texas law enforcement agencies using AI facial recognition software have deep problems of their own when it comes to treating minorities fairly, with a report by the ACLU stating “police in Texas are not only racially profiling, but are also policing in an inefficient, ineffective way.”
The same report by the ACLU found “six out of every seven law enforcement agencies in Texas reported searching Blacks and Latinos at higher rates.”
These biased law enforcement agencies can, and very often do, make mistakes when investigating cases. Supercharged by AI technology, those mistakes can have horrible effects on innocent individuals.
One example comes from a Harris County man who sued Macy’s and Sunglass Hut after the two companies “relied on error-prone facial recognition technology to falsely accuse him of armed robbery,” according to a report from CBS News Sacramento.
The man was arrested and jailed for two weeks after both corporations’ in-house facial recognition AI matched his face to the actual robber’s. The only problem: he was in Sacramento at the time of the robbery. Not only was an innocent individual held for two weeks for a crime he did not commit, but according to his lawsuit, he was “[assaulted] by three men [in the prison], leaving permanent injuries.”
To put it all in perspective, a law enforcement agency in a state with a well-documented history of biased policing arrested an innocent man, who then allegedly became the victim of a violent assault, all because AI facial recognition software could not match the face of a robber to the correct individual.
There is only one solution to this problem: citizens of San Marcos and greater Hays County must oppose the use of this AI software and the expansion of the technology to other law enforcement agencies. Residents can call the offices of members of the Hays County Commissioners Court and San Marcos City Council and inform them of the problems with AI facial recognition technology.
The use of AI facial recognition by law enforcement is incredibly dangerous, but by opposing its further spread, residents can minimize the dangers it poses.
-James Phillips is an international relations junior
The University Star welcomes Letters to the Editor from its readers. All submissions are reviewed and considered by the Editor in Chief and Opinions Editor for publication. Not all letters are guaranteed for publication.