Sudan’s Civil War Infiltrated by AI Voice Cloning Tech

A campaign using artificial intelligence (AI) to impersonate the former leader of Sudan, Omar al-Bashir, has received hundreds of thousands of views on TikTok, the BBC reports.

The impersonation on the social media platform has added online confusion to a country torn apart by civil war.

AI Wreaks Havoc in a Country in Crisis

An anonymous account has been posting dozens of clips it calls “leaked recordings” of the ex-president since late August, but the voice is fake.

Omar al-Bashir hasn’t been seen in public for a year and is currently believed to be seriously ill. He was ousted by the military in 2019 and has been accused of organising war crimes, accusations he denies.

Since fighting broke out in April 2023 between the military, currently in charge, and the rival Rapid Support Forces militia, Bashir’s whereabouts have remained a mystery. This has left a country already in crisis with an added layer of uncertainty.

That makes campaigns such as the one on this TikTok channel especially damaging to Sudan, as fake news can spread like wildfire through social media.

Hany Farid, a researcher in digital forensics at the University of California, Berkeley, explains: “It is the democratisation of access to sophisticated audio and video manipulation technology that has me most worried.

“Sophisticated actors have been able to distort reality for decades, but now the average person with little to no technical expertise can quickly and easily create fake content.”

“The Voice of Sudan”, Or Is It?

The fake AI recordings are being posted on a channel called The Voice of Sudan.

They appear to be a mixture of old clips from press conferences during coup attempts, news reports and several “leaked recordings” attributed to Bashir. The posts often pretend to be taken from a meeting or phone conversation, even sounding grainy as you might expect from a bad phone line.

To check the authenticity of the voice recordings, BBC Monitoring consulted a team of Sudan experts.

One of them, Ibrahim Haithar, said he believed the clips were unlikely to be recent.

“The voice sounds like Bashir but he has been very ill for the past few years and doubt he would be able to speak so clearly”, he said.

Of course, this does not conclusively rule out the possibility that the voice is Bashir’s.

So far, the most conclusive piece of evidence came from a user on X, formerly Twitter, who recognised the very first of the Bashir recordings posted in August 2023.

The user noticed that the clip appears to feature the former leader criticising the commander of the Sudanese army, General Abdel Fattah Burhan.

The recording matched a Facebook Live broadcast aired two days earlier by a popular Sudanese political commentator known as Al Insirafi, who is believed to live in the U.S. but has never shown his face on camera.

While the pair don’t sound particularly alike, the scripts are identical, and when you play the two clips together they run perfectly in sync. It appears Al Insirafi’s audio was used as the source material to imitate Bashir.
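
That side-by-side check can be approximated with basic audio tooling. Below is a minimal sketch, assuming Python with the librosa and SciPy packages, that cross-correlates the energy envelopes of two clips to test whether they share the same timing; the file names are hypothetical placeholders, not the actual recordings.

```python
# Minimal sketch: test whether two clips are time-aligned by cross-correlating
# their energy envelopes (file names are hypothetical placeholders).
import librosa
import numpy as np
from scipy.signal import correlate

def envelope(path, sr=16000, hop=512):
    y, _ = librosa.load(path, sr=sr, mono=True)
    # Frame-level RMS energy captures the rhythm of speech regardless of timbre.
    return librosa.feature.rms(y=y, hop_length=hop)[0]

a = envelope("leaked_bashir_clip.wav")
b = envelope("facebook_live_broadcast.wav")

n = min(len(a), len(b))
xcorr = correlate(a[:n] - a[:n].mean(), b[:n] - b[:n].mean(), mode="full")
lag = int(xcorr.argmax()) - (n - 1)  # best-matching offset, in frames
print(f"Best alignment offset: {lag} frames")
# A near-zero lag with a strong correlation peak suggests the two clips share
# the same underlying timing, as with the Bashir/Al Insirafi pair.
```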

The evidence points to voice-conversion software having been used to mimic Bashir’s speech. After more digging, four more of the “Bashir” recordings appear to have been taken from the same blogger’s live broadcasts.

There is no evidence that Bashir is involved, but this doesn’t mean The Voice of Sudan isn’t a serious threat.

The TikTok account is exclusively political, and whoever runs it shows deep knowledge of what’s going on in Sudan, but who benefits from the campaign is up for debate. One consistent narrative is criticism of the head of the army, Gen Burhan.

The motivation might be to trick audiences into believing that Bashir has emerged to play a role in the war. Or the channel could be trying to legitimise a particular political viewpoint by using the former leader’s voice. What that angle might be is unclear.

The Voice of Sudan denies misleading the public and says it is not affiliated with any group. When we contacted the account, we received a text reply saying: “I want to communicate my voice and explain the reality that my country is going through in my style.”

The Dangers of AI Voice Software

AI voice software is a powerful tool: you upload a piece of audio and it can be converted into a different voice. The actions of The Voice of Sudan show how concerned we ought to be about the dangers of fake video and audio.
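
To illustrate how low the barrier has become, here is a minimal sketch of off-the-shelf voice conversion, assuming the open-source Coqui TTS package and its pretrained FreeVC model; this is an illustrative assumption, not a claim about the specific tool behind The Voice of Sudan, and the file names are hypothetical.

```python
# Minimal sketch of off-the-shelf voice conversion (assumes the open-source
# Coqui TTS package; the tool actually used in this campaign is unknown).
from TTS.api import TTS

# Load a pretrained voice-conversion model.
vc = TTS("voice_conversion_models/multilingual/vctk/freevc24")

# Re-voice the source speech so that it sounds like the target speaker.
# All file names below are hypothetical placeholders.
vc.voice_conversion_to_file(
    source_wav="source_speech.wav",    # supplies the words and timing
    target_wav="target_speaker.wav",   # supplies the voice to imitate
    file_path="converted_output.wav",  # the resulting converted clip
)
```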

As demonstrated, misuse of the software can lead to waves of disinformation with the potential to spark unrest and disrupt elections.

“What’s alarming is that these recordings could also create an environment where many disbelieve even real recordings,” says Mohamed Suliman, a researcher at Northeastern University’s Civic AI Lab.

It’s therefore more important now than ever to be careful about believing what we hear, and to question whether a recording feels plausible before sharing it.

To spot audio-based disinformation, it’s vital to check if the recording was released by a trusted source. However, verifying audio is difficult, particularly when content circulates on messaging apps and especially during times of social unrest, such as that currently being experienced in Sudan.

Algorithms trained to spot synthetic audio are still at a very early stage of development, whereas the technology to mimic voices is already quite advanced.
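
For a sense of what such detection involves, here is a minimal sketch, assuming Python with librosa and scikit-learn, that trains a simple classifier on spectral features of labelled clips; the file lists are hypothetical, and real detectors are considerably more sophisticated.

```python
# Minimal sketch of the general idea behind synthetic-audio detection:
# extract spectral features from labelled clips and fit a simple classifier.
# File lists are hypothetical; production detectors are far more elaborate.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def features(path):
    audio, sr = librosa.load(path, sr=16000, mono=True)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    # Summarise each coefficient over time so every clip maps to a fixed-size vector.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

real_clips = ["real_speech_1.wav", "real_speech_2.wav"]        # hypothetical
cloned_clips = ["cloned_speech_1.wav", "cloned_speech_2.wav"]  # hypothetical

X = np.stack([features(p) for p in real_clips + cloned_clips])
labels = np.array([0] * len(real_clips) + [1] * len(cloned_clips))

clf = LogisticRegression(max_iter=1000).fit(X, labels)
suspect = features("suspect_recording.wav")
print("Probability the suspect clip is synthetic:",
      clf.predict_proba([suspect])[0, 1])
```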

So, until such technology is fully formed and out in the world, it is critical to tread carefully when it comes to potentially dangerous audio recordings.

For now, reporting accounts that look suspicious is the most effective step that can be taken. Indeed, after being alerted by the BBC, TikTok banned the account.