https://debbiecoffey.substack.com/p/ai-endgame-ai-and-the-risks-of-facial
June 20, 2025
By Debbie Coffey, AI Endgame
As you know, over the last couple of weeks there have been numerous news stories about protests, and a military parade, in our country. I read a Los Angeles Times article about an incident during the protests in downtown Los Angeles on June 8, and I want to bring it to your attention because it relates to AI, facial recognition technology, and risks to our privacy and civil rights.
Facial recognition technology
Facial recognition is a biometric technology used to identify a person or verify their identity based on their facial features. It maps and analyzes the unique characteristics of a person's face, including the distance between the eyes, the shape of the nose, and the contour of the jaw.
This facial data is then used to create a digital “facial template.” A facial template can be compared to millions of images in massive databases containing photos, video, or real-time live feeds from cameras around the world. [1]
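To make the "template" idea concrete, here is a minimal sketch in Python of how a one-to-many comparison works, assuming a face-analysis model has already converted each face image into a fixed-length vector of numbers (an "embedding"). The function names and the 0.6 threshold are illustrative only, not taken from any vendor's actual system:

```python
# Minimal sketch of a one-to-many facial "template" comparison.
# Assumes a face-analysis model has already converted each face image
# into a fixed-length vector of numbers (an "embedding"). The names
# and the 0.6 threshold are illustrative, not any vendor's real system.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(probe: np.ndarray, database: dict[str, np.ndarray],
                 threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return every database identity whose stored template is 'close
    enough' to the probe face, best match first. The threshold is a
    tunable guess: set it too low and strangers match (false positives);
    set it too high and real matches are missed (false negatives)."""
    hits = [(name, cosine_similarity(probe, template))
            for name, template in database.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda pair: pair[1], reverse=True)
```

Note that every "match" is a probabilistic judgment against a tunable threshold, not a certainty, which is one reason the errors described below keep happening.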
“I have all of you on camera. I’m going to come to your house”
The Los Angeles Times reported that as a Los Angeles Police Department (LAPD) helicopter hovered over a crowd of protesters on 1st Street, a booming voice from the helicopter's loudspeaker cut through the air: "I have all of you on camera. I'm going to come to your house." [2]
Was this a scare tactic? If so, who authorized this?
Facial recognition technology is not accurate
One reason facial recognition technology is so troubling is that it's often inaccurate.
The Electronic Frontier Foundation (EFF) notes, "Facial recognition software is particularly bad at recognizing Black people and other ethnic minorities, women, young people, and transgender and nonbinary individuals."
Facial recognition technology has demonstrated failures in both its design and its implementation. Flawed facial recognition technology could falsely implicate people in crimes they didn't commit, or target the wrong people. [3]
If facial recognition technology makes more errors in recognizing people with darker skin, women, and young people, then the groups at greatest risk of misidentification add up to well over half the population.
There are reasons to worry about facial recognition technology.
Google Photos once labeled a photo of two Black people as "gorillas." [4]
In a test arranged by the Massachusetts branch of the American Civil Liberties Union (ACLU), Amazon's Rekognition facial recognition technology misidentified three-time Super Bowl champion Duron Harmon of the New England Patriots, Boston Bruins forward Brad Marchand, and 25 other New England professional athletes as criminals by matching their photos against a database of mugshots. [5]
If facial recognition technology misidentifies world-famous celebrities, what's stopping it from tagging you as a criminal?
A couple of months ago, Juan Carlos Lopez-Gomez, a U.S. citizen with a Social Security card, was arrested because U.S. Immigration and Customs Enforcement (ICE) suspected him of being an "unauthorized alien," based on a biometric "confirmation" of his identity. ICE held Lopez-Gomez in a county jail for 30 hours before discovering that the facial recognition technology had made a mistake. [6]
Would you want someone you love to be misidentified and jailed? Are we all only one software mistake away from being falsely accused and arrested?
How can facial recognition get it wrong?
Some of the reasons facial recognition systems can be inaccurate include:
· Low-quality images from poor lighting conditions or from awkward angles.
· People's facial features change due to aging, weight gain or loss, haircuts, facial hair, or makeup.
· Facial recognition works best when a person directly faces the camera. But real-life conditions include profile views, hats, masks, glasses, or scarves that make identification difficult.
· Facial recognition systems are susceptible to "spoofing," where individuals try to deceive the system using photos, videos, or 3D masks. [7]
· When many people are being scanned, like at a sports stadium, there's a greater risk of misidentification, as the short calculation after this list illustrates. [8]
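To see why crowd-scale scanning is so error-prone, here is a back-of-the-envelope calculation in Python. Every number is invented for illustration; real systems and venues vary:

```python
# Back-of-the-envelope look at crowd-scale scanning. Every number
# here is invented for illustration; real systems and venues vary.

crowd_size = 60_000        # people scanned at a hypothetical stadium
false_match_rate = 0.001   # hypothetical: 99.9% of innocents correctly cleared
wanted_in_crowd = 1        # suppose one genuine watchlist match is present

expected_false_alarms = (crowd_size - wanted_in_crowd) * false_match_rate
print(f"Expected false alarms: {expected_false_alarms:.0f}")  # ~60
```

The point is the base rate: when nearly everyone scanned is innocent, even a system that correctly clears 99.9% of innocent people flags roughly 60 of them for every genuine match, so most "hits" are mistakes.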
Facial recognition technology can create bias
When AI identifies a suspect, it can create a powerful, unconscious bias in favor of that identification and steer the focus of a law enforcement investigation away from other suspects. [9]
Yet government agencies and businesses worldwide use facial recognition technology.
Another frightening fact is that facial recognition technology can be used by autonomous weapons, including drones. [10]
Joy Buolamwini and the Algorithmic Justice League
Joy Buolamwini, PhD, an MIT Media Lab researcher, is featured in the documentary Coded Bias, which sheds light on the threats AI poses to civil rights and democracy. You can watch a trailer here.
“AI systems are increasingly infiltrating our lives, influencing who gets a job, which students get admitted to college, how cars navigate the roads, what medical treatment an individual receives, and even who we date. And while builders of AI systems aim to overcome human limitations, research studies and headlines continue to remind us that these systems come with risks of bias and abuse.”
Buolamwini founded the organization Algorithmic Justice League “to raise public awareness about the impacts of AI…and galvanize researchers, policymakers, and industry practitioners to prevent AI harms.”
Joy Buolamwini also wrote the book “Unmasking AI: My Mission to Protect What Is Human in a World of Machines.”
U.S. government agencies' use of facial recognition
In 2023, the Government Accountability Office (GAO) reported that seven law enforcement agencies in the Department of Homeland Security (DHS) and the Department of Justice (DOJ) used facial recognition services provided by commercial and nonprofit entities. These agencies initially used these services (for approximately 60,000 searches) without requiring their staff to undergo facial recognition training. [11]
Facial recognition technology was already known to make mistakes, and untrained staff were in no position to catch them.
The Transportation Security Administration (TSA), an agency of the Department of Homeland Security, asks people at airports to look into a camera so it can scan their faces (more on this below).
While facial recognition technology is currently used in airports to verify identification, a major concern is that it could easily be repurposed in the future. For example, facial recognition used in real time, with cameras in so many locations, could be used to track anyone, anytime, anywhere.
In other words, you will have no privacy.
This should concern all Americans, people of other countries, and members of all political parties.
The merger of all government databases into one big database
The libertarian Future of Freedom Foundation warned that the Trump administration is quietly collaborating with Palantir Technologies, the world's biggest data-mining and surveillance company, co-founded by billionaire Peter Thiel, "to construct a centralized, government-wide surveillance system that would consolidate biometric, behavioral, and geolocation data into a single, weaponized database of Americans' private information."
“This isn’t about national security. It’s about control.”
TSA facial scans at airports
You probably haven’t given much thought to what happens to your image after you consent to a TSA facial scan. You might want to consider this.
There are major concerns about the safety of stored biometric information, including a lack of transparency about where and how it is stored.
India McKinney, director of federal affairs at the Electronic Frontier Foundation (EFF), stated, "It's not about the integrity of your face or driver's license, it's about the database where you have no control…" There's a risk of misidentification and security breaches, as well as human or technological errors. [12]
Jennifer King, a privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence, said the TSA has been "a little vague about what they were doing with the data." Because of these concerns, King declines to give the TSA permission to scan her face.
You can opt out of a facial scan
You can opt out of a facial scan. If you do opt out, the TSA agent should follow standard procedure and simply look at your ID and your face to verify your identity. You shouldn’t lose your place in line if you decline a facial scan.
On its website, the TSA claims, "There is no issue and no delay with a traveler exercising their rights to not participate in the automated biometrics matching technology."
Although the TSA claims it’s “not currently” using its biometrics technology for surveillance, it has already outlined a plan to expand its biometrics capabilities “to validate and verify an identity and vetting status in real-time.” [13]
Privacy and civil liberties are being trampled
The message from the LAPD’s helicopter loudspeaker didn’t direct the protesters to move to another location, ask them to disperse, or cite any laws that were being broken.
The LAPD literally told protesters it had them on camera and would come to their homes.
Was this an attempt to frighten and intimidate protesters?
Civil liberties and digital privacy groups noted that if this Los Angeles Times quote is accurate, it suggests that the LAPD could be using facial recognition to identify and retaliate against protesters.
Jonathan Markovitz, a staff attorney with the ACLU of Southern California, told Mother Jones, “Even if it were a joke, it was clearly designed to make the public afraid to exercise its First Amendment rights to protest…”
Matthew Guariglia, a senior policy analyst at the Electronic Frontier Foundation, stated, "You have constitutionally protected rights to protest. When you have somebody wielding surveillance in a specific way to try to chill and deter people from protesting, that's a violation of your constitutional rights." [14]
If this happened in Los Angeles, it could happen anywhere.
AI facial recognition technologies may now be used against specific groups, but could you or your “group” be targeted in the future?
Find links to all past AI Endgame newsletters HERE.
What you can do:
1) Let your Congressional representatives know that you want strong AI regulations.
Find out how to contact your Congressional representatives here: https://www.house.gov/representatives/find-your-representative
Find out how to contact your Senators here: https://www.senate.gov/senators/senators-contact.htm?Class=1
2) Support (and, if you can, donate to) organizations fighting for AI safety:
Pause AI https://pauseai.info/
Center for AI Safety https://safe.ai/
Center for Humane Technology https://www.humanetech.com/who-we-are
[1] https://www.geeksforgeeks.org/blogs/problems-in-facial-recognition/
[2] https://www.latimes.com/california/live/national-guard-troops-la-immigration-ice-clashes-paramount#p=i-have-all-of-you-on-camera-im-going-to-come-to-your-house
[3] https://sls.eff.org/technologies/face-recognition
[4] https://www.cnet.com/science/why-facial-recognitions-racial-bias-problem-is-so-hard-to-crack/
[5] https://www.analyticsinsight.net/artificial-intelligence/famous-ai-gone-wrong-examples-in-the-real-world-we-need-to-know
[6] https://slate.com/technology/2025/06/ice-facial-recognition-camera-surveillance-mistake-deported.html
[7] https://www.geeksforgeeks.org/blogs/problems-in-facial-recognition/
[8] https://www.researchgate.net/publication/368361013_When_facial_recognition_does_not_'recognise'_erroneous_identifications_and_resulting_liabilities
[9] https://innocenceproject.org/news/when-artificial-intelligence-gets-it-wrong/
[10] https://interestingengineering.com/innovation/us-facial-recognition-tech-robot-drones
[11] https://www.gao.gov/assets/gao-23-105607.pdf
[12] https://www.usatoday.com/story/travel/news/2025/05/20/tsa-facial-recognition-safety/83726603007/
[13] https://www.huffpost.com/entry/tsa-facial-scan-opt-out_l_67e2f79fe4b075349cd175d6
[14] https://www.motherjones.com/politics/2025/06/los-angeles-ice-protests-helicopter/