ElevenLabs’ AI voice generation ‘very likely’ used in a Russian influence operation

There are numerous well-documented abuses of generative AI, ranging from plagiarism to artistic piracy. Now it appears to be showing up in state influence operations as well.

The Massachusetts-based threat intelligence firm Recorded Future recently reported that commercial AI speech-generation technologies, including tools released by the popular startup ElevenLabs, were “very likely” involved in one recent campaign.

According to the investigation, “Operation Undercut,” a Russian-led effort to weaken European support for Ukraine, made extensive use of AI-generated voiceovers on phony or deceptive “news” videos.

The videos, which were aimed at European viewers, discussed a variety of topics, including the value of military assistance to Ukraine and the alleged corruption of Ukrainian politicians. To argue that supplying high-tech armor to Ukraine is pointless, one video claimed that “even jammers can't save American Abrams tanks,” referring to equipment used by US tanks to divert incoming missiles.

According to the research, the video producers “probably” used AI voice generation, including ElevenLabs technology, to make their work sound more authentic. To confirm this, the researchers at Recorded Future submitted the recordings to ElevenLabs' AI Speech Classifier, which allows anyone to “detect whether an audio clip was created using ElevenLabs,” and it returned a match.

ElevenLabs did not respond to requests for comment. Recorded Future acknowledged that several commercial AI speech tools were probably used, but ElevenLabs was the only one it named.

The influence campaign's own organizers unintentionally demonstrated the value of AI voice generation when they uploaded a few videos with real human voiceovers that had “a discernible Russian accent.” The AI-generated voiceovers, by contrast, had no accent and spoke a variety of European languages, including English, French, German, and Polish.

Recorded Future claims that AI also made it possible for the deceptive videos to be swiftly distributed in a number of European languages, including English, German, French, Polish, and Turkish (all of which ElevenLabs supports, by the way).

The U.S. government sanctioned the Russian group behind the campaign, the Social Design Agency, in March for operating “a network of over 60 websites that impersonated genuine news organizations in Europe, then used bogus social media accounts to amplify the misleading content of the spoofed websites,” according to Recorded Future. At the time, the U.S. State Department said that all of this was carried out “on behalf of the Government of the Russian Federation.”

Recorded Future concluded that the campaign had little overall effect on European public opinion.

ElevenLabs' products have been flagged for suspected abuse before. According to Bloomberg, a voice fraud detection company determined that the company's technology was behind a robocall that imitated President Joe Biden and urged voters not to cast their ballots in a January 2024 primary election. ElevenLabs responded by saying it had introduced new safety features, such as the ability to automatically block the voices of political figures.

ElevenLabs says it uses a variety of measures, including both automatic and human moderation, to enforce its ban on “unauthorized, harmful, or deceptive impersonation.”

Since its founding in 2022, ElevenLabs has grown rapidly. According to a previous TechCrunch story, it may soon be valued at $3 billion after increasing its ARR to $80 million from $25 million less than a year earlier. Nat Friedman, the former CEO of GitHub, and Andreessen Horowitz are among its investors.
