Since spending $44 billion to buy Twitter (now X) last year, Elon Musk has been determined to wipe out bots and spammy accounts. Things haven’t gone smoothly. Amid the chaos, Russian trolls have in recent weeks jumped on one of Musk’s own posts and used it to push pro-Kremlin messaging, a new analysis shows.
At the start of October, Musk used his platform to mock Ukrainian president Volodymyr Zelensky with a nine-year-old meme. “When it's been five minutes and you haven't asked for a billion dollars in aid,” says the text above the “Trying to Hold a Fart Next to a Cute Girl in Class” meme, which the X owner posted to his 160 million followers. The original image shows a teenage schoolboy visibly agitated while sitting in class next to a girl. Musk’s version swapped in Zelensky’s face. Ukrainians hit back at Musk, accusing him of trolling and posting Russian propaganda.
Now analysis by a volunteer group of researchers who track Russian-language information operations on X says Russian trolls flocked to Musk’s post and to news accounts that reported on his meme. “The Zelensky tweet from Musk seems to be the most commented by Kremlin trolls over nearly three months,” says Antibot4Navalny, the anonymous group of volunteers behind the findings.
In the days after the post, around 160 inauthentic accounts pushing pro-Russian messaging sent some 400 posts praising Musk as a “Russian patriot” and photoshopping him into Russian military uniforms, according to the group. They say the main messaging can be summarized, in part, as “Comrade Musk supports Russia.”
Three independent experts, including one former disinformation researcher at Twitter, reviewed the findings from Antibot4Navalny and concluded that the accounts praising Musk and pushing Russian narratives are likely part of a coordinated campaign. They say that while only a small number of accounts pushed the messaging on a post that has racked up 93 million views, the targeting may show the accounts testing their messaging and tactics.
The X press office did not respond to WIRED’s request for comment, which included a sample of the accounts identified by the Antibot4Navalny researchers. “Busy now, please check back later,” its automated response says.
The Antibot4Navalny group monitors inauthentic accounts on X, focusing on the Russian language, and identifies accounts that may not be genuine by analyzing their behavior and their replies to media outlets. Data gathered by the researchers shows the accounts replying both to Musk’s original tweet and to posts from Russian-language news accounts, such as BBC Russian and DW Russian. The posts are mostly in Russian, but a few are in English. “Among troll replies addressing Musk directly, some are using memes or other images which are sometimes a part of the message,” an Antibot4Navalny representative says.
“Russia thanks you for your excellent work, Elon. And answers with memes,” one English language post says. “As usual, Musk is our comrade: like us, he ROFLs at the junkie,” a translated Russian post says. Others praise Musk for telling the “truth” and mocking the Ukrainian president.
One former Twitter disinformation researcher, granted anonymity to allow them to speak freely without fear of retaliation, looked at a sample of the accounts highlighted by Antibot4Navalny. “Most accounts had multiple signs of inauthenticity,” the former staff member says, pointing out that their analysis was done using public-facing data, and only X would be able to make “hard findings” based on technical data available to the company. The former staff member says the accounts had repeated behavior in reposts and “inconsistent or clearly falsified” personal information. “There was a spectrum for how realistically or well chosen an account's profile pic is,” they say, with some profile images being pulled from elsewhere online.
Martin Innes, codirector of the Security, Crime and Intelligence Innovation Institute at Cardiff University, who has led international disinformation research, reviewed a sample of the data with colleagues. He also says there are multiple signs that the accounts may not be genuine. “The accounts examined are newly created, in the main during the period of the Ukraine-Russia war, and exhibit behavior designed to target and polarize opinion, and gain popularity through interaction with larger accounts, many of which represent popular media outlets,” Innes says.
Innes and the Cardiff University researchers say the accounts often have low or zero follower counts, lack identifiable personal details, mostly just reply to other accounts’ posts, and produce anti-Ukraine and anti-Zelensky messaging that mirrors wider Russian narratives. Russia has long used social media to influence politics and divide opinion. In September, an EU report concluded that the “reach and influence” of Kremlin-backed accounts on social media had increased in 2023, particularly highlighting X.
“Since the acquisition of Twitter/X by Musk, there have been a series of policy decisions that any expert in the field of misinformation or disinformation would tell you were bad decisions,” the former Twitter staff member says. They point to X removing labels for accounts funded by governments and firing teams working on disinformation. At the same time, Russian propaganda has also focused on influencing African nations.
Kyle Walter, head of research at misinformation- and disinformation-analysis company Logically, says Musk's account has been targeted in the past, in particular with crypto scams. Walter also reviewed the accounts highlighted by Antibot4Navalny and says they appear to be inauthentic. The claims about Musk supporting Russia have “existed for a while,” Walter says. Throughout Russia’s full-scale invasion of Ukraine, which began in February 2022, Musk’s Starlink satellite internet company has provided crucial internet connections to Ukrainian forces for free and has been praised by Ukrainian politicians. Musk was criticized last month over a decision not to allow Starlink’s use around Crimea, the Ukrainian peninsula annexed by Russia in 2014.
When it comes to the Musk posts, the Antibot4Navalny researcher says the accounts could have been looking to boost Russian propaganda talking points or to “create an illusion” for those browsing replies to Musk’s tweet that people support the message. Replying to large accounts like Musk’s has the potential to reach a wider audience, the researcher says.
Logically’s Walter says that it is a critical time for social media platforms as they head toward a slate of elections in 2024—including those in the US, Canada, UK, India, and the European Union. “Bad actors are testing different tactics, testing different capabilities,” Walter says. “You'll often see a lot of these kinds of unsuccessful campaigns where they're not really getting any engagement, not really getting any views on the content, but it's helping them calibrate their kind of broad efforts for future instances.”