
The U.S. Department of Homeland Security (DHS) has been using Clearview AI’s facial recognition tool to try to solve thousands of child exploitation cold cases

The U.S. Department of Homeland Security (DHS) has been using Clearview AI’s facial recognition tool to try to solve thousands of child exploitation cold cases, according to a report by Forbes.

In what insiders called “an unprecedented three-week operation to solve years-old crimes that has led to hundreds of identifications of children and abusers,” DHS’s Homeland Security Investigations (HSI) unit used Clearview AI and other algorithmic tools to scan millions of images and videos stored in HSI’s and Interpol’s databases of child exploitation material. According to Clearview, these uploaded images are cropped to show only faces and tagged with a searchable signature.
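Clearview has not published how its pipeline works internally, but the description above (crop to faces, attach a searchable signature) matches a common pattern in face-search systems. The sketch below is purely illustrative and uses the open-source face_recognition library as a stand-in; it is not Clearview's code.

```python
# Illustrative sketch only: one generic way to crop faces and compute a
# "searchable signature" (embedding) per face. Uses the open-source
# face_recognition library as a stand-in for any proprietary pipeline.
import face_recognition
import numpy as np

def index_image(path):
    """Detect faces in an image and return (face_crop, signature) pairs."""
    image = face_recognition.load_image_file(path)       # RGB numpy array
    locations = face_recognition.face_locations(image)    # [(top, right, bottom, left), ...]
    signatures = face_recognition.face_encodings(image, known_face_locations=locations)

    results = []
    for (top, right, bottom, left), signature in zip(locations, signatures):
        crop = image[top:bottom, left:right]               # face-only crop
        results.append((crop, np.asarray(signature)))      # 128-d embedding, usable as a search key
    return results
```

Stored embeddings like these can then be compared by distance, so a query face is matched against the database without re-examining the original images.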

Clearview AI is already a familiar partner for American security institutions. HSI has signed contracts with the company worth up to $2 million, following a $120,000 deal with the FBI announced at the beginning of 2023. In March, the founder and CEO of Clearview, Hoan Ton-That, told the BBC that the company had run close to a million biometric searches for U.S. police forces.

As Clearview AI grows, it has also ramped up its innovation efforts. In June, it announced that it had filed a patent application for a new vector indexing process intended to let its search capability scale with the size of its databases. It also made a series of new appointments that suggested an increased focus on military and intelligence markets. And its website invites visitors to read “How Clearview AI Helped Shape the War in Ukraine.”
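The details of that patent filing are not public, so the following is only a generic example of the problem it addresses: keeping nearest-neighbour search over face signatures fast as the database grows. It uses the FAISS library's inverted-file index, a standard approach, not Clearview's method.

```python
# Generic example of scalable vector indexing with FAISS (not Clearview's
# patented process). An IVF index clusters the database so queries only
# scan a fraction of it, keeping search fast as the collection grows.
import numpy as np
import faiss

d = 128                                                   # dimensionality of each face signature
db = np.random.random((100_000, d)).astype("float32")     # stand-in database of signatures

nlist = 256                                               # number of coarse clusters (assumed value)
quantizer = faiss.IndexFlatL2(d)                          # exact index used for cluster assignment
index = faiss.IndexIVFFlat(quantizer, d, nlist)
index.train(db)                                           # learn the clustering from the data
index.add(db)                                             # add all signatures to the index

query = np.random.random((1, d)).astype("float32")
distances, ids = index.search(query, 5)                   # 5 closest signatures to the query face
```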

The firm has received more widespread attention in mainstream U.S. media outlets than your typical biometrics platform. However, coverage typically includes the observation that Clearview’s purported library of more than 30 billion images was acquired by scraping websites such as LinkedIn and Facebook without consent.

Meanwhile, AI watchdogs continue to warn that once governments start using facial recognition for justice operations, it is a slippery slope toward human rights abuses, including crackdowns on political protesters and law-enforcement overreach that disproportionately affects people of colour.

Forbes references the case of Porcha Woodruff, a Detroit woman who was eight months pregnant when she was wrongly arrested for carjacking after facial recognition software supplied by DataWorks Plus falsely matched her to an eight-year-old mugshot. Woodruff’s allegation, now the subject of legal action against the police, is the third such claim of facial recognition leading to the wrongful arrest of a Black person in Detroit.
