Find a photo of a dead Russian soldier on social media. Upload it to facial recognition software. Get an identity match from a database of billions of social media images. Identify the deceased’s family and friends. Show them what happened to the victim of Putin’s war and Ukraine’s defense.
This is one of Ukraine’s strategies for informing Russians, who have limited access to media and information not controlled by the state, about the deaths being wrought by their president’s invasion. On Wednesday, Mykhailo Fedorov, Ukraine’s deputy prime minister and head of its Ministry of Digital Transformation, confirmed on his Telegram profile that surveillance technology was being used in this way, a matter of weeks after Clearview AI, the New York-based facial recognition provider, began offering its services to Ukraine for those same purposes.

Fedorov didn’t say which artificial intelligence tool was being used, but his department later confirmed to Forbes that it was Clearview AI, which is providing its software for free. Ukraine stands a good chance of getting matches: In an interview with Reuters earlier this month, Clearview CEO Hoan Ton-That said the company had a store of 10 billion users’ faces scraped from social media, including 2 billion from the Russian Facebook alternative Vkontakte. Fedorov wrote in a Telegram post that the ultimate aim was to “dispel the myth of a ‘special operation’ in which there are ‘no conscripts’ and ‘no one dies.’”
Just a month ago, Clearview AI and facial recognition were the subject of strong criticism. U.S. lawmakers decried the federal government’s use of the technology, saying it disproportionately targets Black, Brown and Asian people and falsely matches them more often than white individuals. They also made clear the existential threat the software poses to privacy. Civil rights organizations like the American Civil Liberties Union argue the technology shouldn’t be used in any setting and have called for outright bans.