Clearview AI sued in California over ‘most dangerous’ facial recognition database
Separately, the Chicago Police Department stopped using the New York company’s software last year after Clearview AI was sued in Cook County by the ACLU.
ALAMEDA, California — Civil liberties activists are suing a company that provides facial recognition services to law enforcement agencies and private companies around the world, contending that Clearview AI illegally stockpiled data on 3 billion people without their knowledge or permission.
The lawsuit, filed in Alameda County Superior Court in the San Francisco Bay Area, says the New York company violates California’s constitution and seeks a court order barring it from collecting biometric information in California and requiring it to delete data on Californians.
The lawsuit says the company has built “the most dangerous” facial recognition database in the nation, has fielded requests from more than 2,000 law enforcement agencies and private companies, and has amassed a database nearly seven times larger than the FBI’s.
The California lawsuit was filed by four activists and the groups Mijente and Norcal Resist. They have supported causes such as Black Lives Matter and been critical of the policies of U.S. Immigration and Customs Enforcement, which has a contract with Clearview AI.
“Clearview has provided thousands of governments, government agencies and private entities access to its database, which they can use to identify people with dissident views, monitor their associations, and track their speech,” the lawsuit says.
It says Clearview AI scrapes dozens of internet sites, such as Facebook, Twitter, Google and Venmo, to gather facial photos. Scraping involves the use of computer programs to automatically scan and copy data. The lawsuit says Clearview AI analyzes that data to identify individual biometric features, such as eye shape and size, which are then put into a “faceprint” database that clients can use to identify people.
The images that are scraped include not only those posted by individuals and their families and friends but also images of people who are inadvertently captured in the background of strangers’ photos, according to the lawsuit.
The company also offers its services to law enforcement even in cities that ban the use of facial recognition, the lawsuit says.
Several cities around the country, including the Bay Area cities of Alameda, San Francisco, Oakland and Berkeley, have limited or banned the use of facial recognition technology by local law enforcement.
“Clearview AI complies with all applicable law and its conduct is fully protected by the First Amendment,” attorney Floyd Abrams, who represents the company, said in a written statement.
The company has said it saw law enforcement use of its technology grow by 26% following January’s deadly riot at the U.S. Capitol.
Facial recognition systems have faced criticism because of their mass surveillance capabilities, which raise privacy concerns. Some studies have also shown that the technology is far more likely to misidentify Black people and other people of color than white people, which has resulted in mistaken arrests.
Clearview AI’s chief executive officer, Hoan Ton-That, said in a written statement that “an independent study has indicated that Clearview AI has no racial bias.
“As a person of mixed race, having non-biased technology is important to me,” he said.
He said the use of accurate facial recognition technology can reduce the chance of wrongful arrests.
The lawsuit says Facebook, Twitter, Google and other social media companies have asked Clearview AI to stop scraping images because doing so violates their terms of service.
Clearview AI also is facing other challenges. The ACLU lawsuit in Cook County accuses the company of violating the Illinois Biometric Information Privacy Act, a law that protects current and former residents’ facial and fingerprint identifiers from being used without consent.
Privacy watchdogs in Canada and the European Union have also expressed concerns about Clearview. The company halted operations in Canada last year, but privacy commissioners this year asked it to remove data on Canadian citizens, with one commissioner saying the system puts all Canadians “continually in a police lineup.”