Ukraine Starts Using Facial Recognition To Identify Dead Russians And Tell Their Relatives


Ukraine’s deputy prime minister says the tech will help provide transparency about how many Russian soldiers are dying in the war. Critics say the use of facial recognition in war zones is a disaster in the making.

Find a photo of a dead Russian soldier on social media. Upload it to facial recognition software. Get an identity match from a database of billions of social media images.

Identify the deceased’s family and friends. Show them what happened to the victim of Putin’s war and Ukraine’s defense.

This is one of Ukraine’s strategies in trying to inform Russians, who have limited access to non-state-controlled media and information, about the death being wrought by their president’s invasion.
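For illustration only, the matching step in that workflow can be sketched with the open-source face_recognition Python library. This is a stand-in, not Clearview’s proprietary system (which is not publicly documented); the file names, the tiny “gallery,” the profile URLs and the distance threshold below are all hypothetical assumptions.

```python
# Illustrative sketch of a basic face-matching step using the open-source
# `face_recognition` library. This is NOT Clearview's system; file names,
# gallery contents and the 0.6 threshold are assumptions for demonstration.
import face_recognition

# A photo pulled from social media (hypothetical path).
query_image = face_recognition.load_image_file("unknown_soldier.jpg")
query_encodings = face_recognition.face_encodings(query_image)
if not query_encodings:
    raise SystemExit("No face found in the query photo.")
query_encoding = query_encodings[0]

# A small local "gallery" standing in for a scraped database of known profiles.
gallery = {
    "profile_001.jpg": "https://social.example/profile_001",  # hypothetical
    "profile_002.jpg": "https://social.example/profile_002",  # hypothetical
}
known_encodings, profile_urls = [], []
for filename, url in gallery.items():
    image = face_recognition.load_image_file(filename)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        known_encodings.append(encodings[0])
        profile_urls.append(url)

# Lower distance means more similar; 0.6 is the library's conventional cutoff.
distances = face_recognition.face_distance(known_encodings, query_encoding)
best = distances.argmin()
if distances[best] < 0.6:
    print(f"Closest match: {profile_urls[best]} (distance {distances[best]:.2f})")
else:
    print("No match below the threshold.")
```

A real deployment would search billions of pre-computed encodings with an approximate nearest-neighbor index rather than looping over files, but the principle of comparing a query encoding against a gallery is the same.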

On Wednesday, Mykhailo Fedorov, Ukraine’s deputy prime minister and head of its Ministry of Digital Transformation, confirmed on his Telegram profile that surveillance technology was being used in this way, a matter of weeks after Clearview AI, the New York-based facial recognition provider, started offering its services to Ukraine for those same purposes.

Fedorov didn’t say what brand of artificial intelligence was being used in this way, but his department later confirmed to Forbes that it was Clearview AI, which is providing its software for free.

They’ll have a good chance of getting some matches: In an interview with Reuters earlier this month, Clearview CEO Hoan Ton-That said the company had a store of 10 billion users’ faces scraped from social media, including 2 billion from Russian Facebook alternative Vkontakte.

Fedorov wrote in a Telegram post that the ultimate aim was to “dispel the myth of a ‘special operation’ in which there are ‘no conscripts’ and ‘no one dies.’”

Just a month ago, Clearview AI and facial recognition were the subject of strong criticism. U.S. lawmakers decried its use by the federal government, saying the technology disproportionately targeted Black, Brown and Asian people and falsely matched them more often than white individuals.

They also made clear the existential threat to privacy the software posed. Civil rights organizations like the American Civil Liberties Union don’t believe the technology should be used in any setting, calling for outright bans.

The use case in Ukraine, of course, is vastly different from the ones typically seen in the U.S., which try to identify criminal suspects.Identifying dead Russian soldiers might be more acceptable, if the ultimate aim is to let people know their loved ones have died as a result of their leader’s warmongering.

Not to mention that the dead don’t have a right to privacy, at least not under U.S. law. It’s one reason why police are allowed to unlock iPhones and other smart devices of the deceased by holding them up to the owner’s face (even if they may not have much success, due to liveness detection).

But should privacy advocates worry about the use of facial recognition in wartime, when it might legitimize the tech for use in other scenarios where the living’s privacy is under threat?

Ukraine, for its part, believes there is a need to identify dead Russian soldiers, as there is much contention over the number of deceased military personnel. Last week, a Russian newspaper published, and subsequently deleted, a report claiming nearly 10,000 Russian soldiers had died since the invasion began, far more than had been previously reported.

Later the tabloid claimed it had been hacked and the figures were not correct.Ukraine believes Russia is lying to its citizens about the number of the dead.

But Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, said the introduction of facial recognition into the war could be disastrous, even if Ukraine is using it to tell the truth to Russian citizens. “This is a human rights catastrophe in the making. When facial recognition makes mistakes in peacetime, people are wrongly arrested. When facial recognition makes mistakes in a war zone, innocent people get shot,” he told Forbes.

“I’m terrified to think how many refugees will be wrongly stopped and shot at checkpoints because of facial recognition error. We should be supporting the Ukrainian people with the air defenses and military equipment they ask for, not by turning this heartbreaking war into a place for product promotion.”

Facial recognition has also been shown to be fallible, falsely matching images of people’s faces to the wrong identity. In the U.S., this has happened at least three times to Black individuals, who were wrongly arrested because their faces were erroneously matched with footage from surveillance cameras.

As Cahn noted, “When facial recognition inevitably misidentifies the dead, it will mean heartbreak for the living.”

When asked about those concerns over the use of its technology, Hoan Ton-That, CEO of Clearview AI, said, “War zones can be dangerous when there is no way to tell enemy combatants apart from civilians. Facial recognition technology can help reduce uncertainty and increase safety in these situations.”

He said that U.S. government-funded tests had shown that Clearview “can pick the correct face out of a lineup of over 12 million photos at an accuracy rate of 99.85%.” That accuracy, he said, “will prevent misidentifications from happening in the field.”
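Even taken at face value, that figure implies a non-zero error rate. The back-of-the-envelope sketch below treats 99.85% as a per-search success rate, which is a simplification of the lineup test Ton-That describes, and the search volume is a purely hypothetical assumption for illustration.

```python
# Rough illustration of what a 99.85% accuracy figure implies at scale.
# The accuracy number is from Ton-That's quote above; the search volume is a
# hypothetical assumption, not a reported figure, and treating (1 - accuracy)
# as a per-search misidentification rate is a deliberate simplification.
accuracy = 0.9985
searches = 10_000  # hypothetical number of identification attempts

expected_errors = searches * (1 - accuracy)
print(f"~{expected_errors:.0f} misidentifications per {searches:,} searches")
# -> ~15 misidentifications per 10,000 searches
```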

“The Ukrainian officials who have received access to Clearview AI have expressed their enthusiasm, and we await to hear more from them. We are ensuring each person with access to the tool is trained on how to use it safely and responsibly,” he added.

Whatever the morals at play, the use of facial recognition in this war is remarkable as a tool in the propaganda war. Or, as Ukraine would put it, the war for truth.

Even Fedorov didn’t think he’d be using the technology for this before the invasion, writing in his Telegram post, “We have all changed. We started doing things we couldn’t even imagine a month ago.”

Thomas Brewster.
