Lightfoot’s office was blindsided by CPD’s use of controversial facial recognition software — then raised serious concerns
Hacked emails show the city didn’t learn police were using technology developed by Clearview AI — which faces multiple suits claiming it violated the state’s biometric privacy act — until after inquiries last year by the Sun-Times.
Early last year, the Chicago Police Department quietly started a contract with the developer of a controversial facial recognition tool in an apparent bid to solve more crimes in the city.
Leaked emails now show the arrangement blindsided Mayor Lori Lightfoot’s office, which apparently didn’t learn of the two-year, $49,875 contract with Manhattan-based Clearview AI until weeks after it took effect and days after a New York Times exposé laid bare the app’s alarming capabilities. Clearview AI’s software allows users to identify unknown people by searching their images against a database of billions of photos lifted from popular websites and social media platforms — often without the knowledge of users or even the platforms.
By Jan. 24, 2020, when Lightfoot’s office was apparently informed of the contract, the startup was already facing a lawsuit filed in federal court in Chicago alleging its software violates Illinois’ stringent biometric protection law. Although a police spokesman initially said facial recognition tools like Clearview AI add “jet fuel” to the department’s investigative abilities, emails show that mayoral staffers later raised serious concerns about the potentially illegal new technology.
“CPD should further evaluate whether Clearview’s business practices are consistent with state law and whether, in the face of substantial litigation, this service can reliably serve as part of the City’s law enforcement efforts,” Dan Lurie, Lightfoot’s policy chief, wrote in an email last March to Susan Lee, the deputy mayor of public safety at the time.
Amid mounting legal pressure and scrutiny from activists and lawmakers, Clearview AI ultimately ended its contract with the police department last May, records show.
But privacy advocates have continued to sound alarms about the police department’s use of facial recognition software and the apparent lack of oversight. Police investigators still rely on DataWorks Plus, another facial recognition program the department has used since 2013.
“It’s clear that they’re doing this in secret [and] it’s clear that no one seems to have a handle on this,” said Freddy Martinez, the executive director of the Chicago-based transparency group Lucy Parsons Labs. “So where should the buck stop?”
The mayor’s office declined to answer specific questions but sent a statement saying its concerns about the program “were promptly addressed and the Department no longer utilizes Clearview AI technology.
“It is important to note that the Department does use other facial-matching software to match faces of potential suspects with pre-existing suspects,” the mayor’s office said. “This means that the software is only used once a potential suspect has been identified in a crime, and not as a proactive tool in seeking out potential suspects.”
Emails exposed after hack
Martinez also sits on the board of Distributed Denial of Secrets, a whistleblower group that last month published a trove of hacked city emails, including messages showing that City Hall officials flagged urgent issues with Clearview AI after initially being left in the dark about the contract.
While the Clearview AI contract didn’t take effect until Jan. 1, 2020, records reviewed by the Sun-Times show the procurement process stretched back months.
A document dated Sept. 18, 2019, shows that Anthony Riccio, the police department’s second-in-command at the time, approved using federal anti-terror funding to foot the bill for the new software. On Nov. 14, 2019, the contract earned its final approval.
Days later, emails show that city officials lacked a basic understanding of the police department’s facial recognition capabilities as they worked to respond to a Sun-Times inquiry about the DataWorks contract.
“Do you have any insight here on what type of facial rec software the City uses, what departments use it, how it’s used, who can access it, etc.?” Patrick Mullane, a mayoral spokesman at the time, asked members of Lightfoot’s policy team in a Nov. 20, 2019, email.
“I have no insights, which I think is part of the problem,” Lurie responded.
Ultimately, though, the mayor’s office wasn’t informed of the contract with Clearview AI until after the Sun-Times asked whether the police department was using the technology.
“We were under the impression that this contract agreement with Dataworks was the only tech software the Department had,” Mullane wrote to a group of city employees on Jan. 24, 2020. “Earlier today, CPD informed me that they started a contract on Jan. 1, 2020 ... between the Chicago Police Department and Clearview AI Technology, which uses face-matching technology to sort through public photos from social media sites — including Facebook, YouTube, Twitter and Venmo — and elsewhere on the internet.
“This technology has gotten a lot of coverage recently due to its privacy concerns,” Mullane acknowledged, linking to the New York Times story published five days earlier.
Despite the revelation, Lightfoot later offered a defiant response to a series of questions from the Sun-Times.
“CPD does not use facial recognition software,” Lightfoot wrote in an email to Lee and other staffers the following day.
Instead, her staff prepared a statement claiming the department utilized “facial matching technology.” The distinction was dubious, given that the police department had a designated facial recognition unit and that both DataWorks and Clearview AI market their products as facial recognition tools.
Illinois’ privacy law among the toughest
On Jan. 29, 2020, the Sun-Times first reported on the police department’s deal with Clearview AI — just a week after the tech firm was sued in federal court for allegedly violating Illinois’ Biometric Information Privacy Act. The legislation, which is considered one of the country’s toughest privacy laws, protects current and former residents’ facial and fingerprint identifiers from being used without consent.
Clearview AI now faces multiple legal challenges in Illinois that could prove catastrophic for the company’s business, including a suit filed in Cook County last May by the American Civil Liberties Union that accuses the company of violating the state’s biometric law.
Ed Yohnka, a spokesman for the Illinois chapter of the ACLU, criticized the police department for secretly adding technologies like Clearview AI to its crime-fighting arsenal.
“I think it’s a matter of concern that the most powerful tools in terms of invading our personal privacy go largely unregulated and undebated because it’s done in private,” said Yohnka.
On Feb. 4, 2020, Lucy Parsons Labs, the local ACLU and more than 70 other groups urged Lightfoot to ban the use of facial recognition outright in a letter the groups delivered to City Hall. In an apparent response, Lightfoot’s office committed to conducting a review of the police department’s use of Clearview AI.
Then on March 10, 2020, Lurie, Lightfoot’s policy chief, sent a memo outlining the policy team’s concerns with the Clearview AI app, some of which were previously expressed in the letter delivered to the mayor.
In the memo, Lurie pointed to the potential violations of the state’s biometric protection law and the company’s legal exposure and noted that “error rates” in facial recognition searches “are far higher for non-white, non-male faces.” He also complained that the city’s facial recognition policy “lacks sufficient detail.”
“The Face Comparison Policy states that CPD ‘will adopt and follow procedures and practices by which it can ensure and evaluate the compliance of users with the face comparison system requirements,’” Lurie wrote to Lee, copying other staffers. “But those ‘procedures and practices’ are not defined [and] are at the core of any meaningful accountability system.”
The same language isn’t reflected in the police department’s only publicly available policy on facial recognition, which was issued in 2013. A police spokesman didn’t respond to a request for comment on the department’s current policy.
Despite the litany of red flags, the city didn’t actually pull the plug on its contract with Clearview AI. Instead, the company sent a letter to the police department on April 30, 2020, ending the arrangement with no explanation and including a refund check for $33,184.93.
Days later, before the news was made public, police officials painted a different picture after Buzzfeed News reported that Clearview AI was canceling its contracts with Illinois-based entities, which for a time included the secretary of state’s office and the Chicago Cubs. Although the police department didn’t comment in that story, spokesman Howard Ludwig wrote to Mullane that Lt. Patrick Quinn “recommended saying that CPD tried Clearview’s product as part of a [trial] run and decided to go in another direction.
“He wanted to emphasize that the decision was mutually agreeable,” Ludwig wrote in the May 8, 2020, email.