The world needs deepfake experts to prevent this chaos

Recently, Myanmar’s military coup government added serious corruption allegations to the existing spurious cases against Burmese leader Aung San Suu Kyi. The new charges rest on the statements of a prominent detained politician, released in a March video that many in Myanmar suspected was a deepfake.

In the video, the political prisoner’s voice and face appear distorted and unnatural as he makes a detailed claim that he provided Aung San Suu Kyi with gold and cash. Social media users and journalists in Myanmar immediately questioned whether the statement was real. The case shows how much worse the problem is about to get: as genuine deepfakes improve, so does people’s willingness to dismiss real footage as a deepfake. What tools and skills will be available to investigate both kinds of claims, and who will use them?

In the video, Phyo Min Thein, the former chief minister of Myanmar’s largest city, Yangon, sits in a bare room, apparently reading from a statement. His speech sounds odd and unlike his normal voice, his face barely moves, and in the poor-quality version that first circulated, his lips seem out of sync with his words. Seemingly everyone wanted to believe it was fake. Screenshots from an online deepfake detector spread rapidly, showing a red box around the politician’s face and a verdict, at better than 90% confidence, that the confession was a deepfake. Burmese journalists lacked the forensic capacity to judge the claim for themselves. Past and present actions of the military reinforced the reasons for suspicion: government spokespeople have previously shared staged images targeting the Rohingya ethnic group, while the organizers of the military coup have denied that social media evidence of their killings could be real.

Was the prisoner’s “confession” really a deepfake? Together with deepfake researcher Henry Ajder, I consulted deepfake creators and media forensics specialists. Some pointed out that the video was of low enough quality that the mouth glitches people saw were as likely to be compression artifacts as evidence of deepfakery. Detection algorithms are also unreliable on low-quality compressed video. His unnatural-sounding voice could be the result of reading a script under extreme pressure. If it is a fake, it is a very good one, because his throat and chest move at key moments in sync with his words. On balance, the researchers and experts were skeptical that the video was a deepfake, though not certain. At this point it is more likely to be what human rights activists like me are familiar with: a coerced or forced confession on camera. Moreover, the substance of the allegations cannot be trusted absent a credible legal process, given the circumstances of the military coup.

Why does this matter? Whether the video is a forced confession or a deepfake, the result is most likely the same: words digitally or physically put into a prisoner’s mouth by a coup government. And although the dominant use of deepfakes today, creating non-consensual sexual images, still far outnumbers political cases, deepfake and synthetic-media technology is rapidly improving, proliferating, and being commercialized, expanding its potential for harmful uses. The Myanmar case demonstrates the growing gap between the ability to make deepfakes, the opportunities to claim a real video is a deepfake, and our capacity to challenge either claim.

It also illustrates the risk of the public relying on free online detectors without understanding the strengths and limitations of detection, or how to second-guess a misleading result. Deepfake detection is still an emerging technology, and a tool that works on one creation approach often fails on another. We must also be wary of counter-forensics, where someone deliberately takes steps to confuse a detection approach. And it is not always possible to know which detection tools to trust.
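One reason detectors fail on videos like the one from Myanmar is that heavy compression destroys the fine-grained pixel detail many of them rely on. The minimal Python sketch below illustrates that effect with a simple high-frequency-energy proxy; it is not any real detector’s method, and the stand-in frame and the high_frequency_energy helper are illustrative assumptions only.

```python
# A minimal, hypothetical sketch of why compression confounds detection.
# High-frequency residual energy stands in for the fine pixel detail that
# many detectors depend on; it is NOT an actual deepfake detector.
import cv2
import numpy as np

def recompress(frame: np.ndarray, jpeg_quality: int) -> np.ndarray:
    """Re-encode a frame as JPEG at the given quality, then decode it back."""
    ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, jpeg_quality])
    assert ok, "JPEG encoding failed"
    return cv2.imdecode(buf, cv2.IMREAD_COLOR)

def high_frequency_energy(frame: np.ndarray) -> float:
    """Illustrative proxy: energy left after subtracting a blurred copy."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    residual = gray - cv2.GaussianBlur(gray, (9, 9), 0)
    return float(np.mean(residual ** 2))

# Stand-in frame (random noise); with real footage you would read frames
# via cv2.VideoCapture instead.
frame = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
for quality in (95, 50, 10):
    energy = high_frequency_energy(recompress(frame, quality))
    print(f"JPEG quality {quality:3d}: high-frequency energy {energy:.1f}")
```

On a typical frame, the residual energy drops sharply as JPEG quality falls, which is one reason a confident verdict from an off-the-shelf detector on a heavily recompressed video deserves skepticism.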

How can we keep conflicts and crises around the world from being blindsided by deepfakes and supposed deepfakes?

We should not be expecting ordinary people to turn themselves into deepfake spotters, analyzing pixels to discern truth from falsehood. Most people will do better relying on simpler media-literacy approaches, such as the SIFT method (stop; investigate the source; find better coverage; trace claims, quotes, and media to their original context), which emphasizes checking other sources and establishing the original context of a video. In fact, encouraging people to become amateur forensics experts can send them down a conspiracy rabbit hole of distrust in images.


