Facial recognition technology with global reach puts privacy in peril

Ubiquitous surveillance cameras could be used, possibly by bad actors, to identify people wherever they go.

On Feb. 7, the IRS announced it would transition away from using a third-party service for facial recognition to help authenticate people creating new online accounts. | Photo: Susan Walsh/AP

Most people would hesitate to live in a home constructed entirely of windows that put their entire lives on display.

But that’s where we are headed online. Most recently, Clearview AI told investors it expects to be able to use its facial recognition technology to identify almost everyone in the world within a year. Manhattan-based Clearview AI’s technology goes beyond anything Big Brother dreamed of. It’s time for government, particularly at the federal level, to put its foot down as heavily as it can.


The company has not said it would make the technology available to just anyone who asks for it. But the increased threats of cyberattacks in the wake of Russia’s invasion of Ukraine are a reminder that once our facial images are in an enormous database, they might easily fall into the hands of bad actors. And because the images Clearview AI uses are scraped without permission from numerous websites, other companies might spring up to do the same thing.

Last month, the Washington Post reported Clearview AI is telling investors it is on track to have 100 billion facial photos in its database, which comes to about 14 photos per person on Earth. The photos are scraped from Facebook, YouTube, Venmo, news media and millions of other websites. Governments, police departments and others can use the technology to identify almost anyone who comes within a surveillance camera’s range, which in some areas is pretty much everywhere.

In early 2020, the Chicago Police Department quietly signed a two-year, $49,875 contract with Clearview AI in hopes of identifying more criminals. The department ended the contract in May 2020 in the face of criticism.

Illinois is waging a lonely battle against Clearview AI’s facial recognition abuses. The state filed a lawsuit alleging Clearview did not ask for individuals’ permission or inform them how it would use their biometric information, as required under Illinois’ 2008 Biometric Information Privacy Act. Recently, U.S. District Judge Sharon Johnson Coleman declined to issue a summary judgment requested by Clearview AI and upheld most of Illinois’ arguments. The United Kingdom and Australia have fined Clearview for violating their privacy rules.

At the moment, if people don’t want to be tracked electronically, they can leave their cellphones at home. But once facial recognition is everywhere, even that low-tech option won’t work. Do we want authorities to be able to identify every dissident at a rally? Individuals with the technology, including stalkers, could instantly learn the name, address and the rest of the electronic profile of anyone they happen to see.

And if such people use internet-connected eyewear, something some tech observers believe might go mainstream this year, their targets wouldn’t know their photos had been taken surreptitiously and their identities revealed.


What if Clearview AI or a similar company sells its services abroad? The technology could be used to flag someone working undercover on behalf of the United States. Spies have the same privacy concerns ordinary people do, but with higher stakes. If authorities in a foreign country grow suspicious of someone who, say, regularly visits a certain office, they could easily learn that person’s identity by using Clearview AI’s service to match a current facial image with one the target might have posted on social media as a teenager.

National security experts have a word for that: “terrifying.”

Clearview AI says its patented algorithm has helped find abducted children, identify people with dementia and apprehend drug traffickers, sex offenders and other criminals. There can be a place for facial recognition if it is used to solve crimes within a legal framework that protects the privacy of the innocent.

But we don’t want facial recognition to lead us into a dystopian world where our identities are constantly laid bare. The time to act is now.

Send letters to letters@suntimes.com.
