Unmasking Russia’s Troll Farm Empire: Inside the Kremlin’s Global Disinformation Machine

August 17, 2025

What Are Troll Farms? The Engine of Disinformation

Troll farms are organized groups of paid online operatives who use fake identities to flood the internet with propaganda and divisive content. Operating from office-like settings, these teams create false profiles that impersonate real people, posting on social media, in news comments, and on forums. Their goal is to manipulate public opinion: some trolls push messages showing bogus grassroots support for certain ideas, while others spread rumors designed to sow confusion and distrust in institutions (newslit.org). Often working in coordinated shifts, troll farm staff post inflammatory comments, misleading “news,” and conspiracy theories at massive scale, amplifying extreme viewpoints far beyond their organic reach. Many use sockpuppet accounts (multiple online personas), sometimes assisted by social bots (automated accounts), to make it appear as if numerous ordinary people share these fringe opinions (en.wikipedia.org). In reality, a handful of operators might be behind hundreds of accounts – a covert propaganda assembly line churning out disinformation on demand.

While the concept sounds like a sci-fi plot, troll farms are very real and have been weaponized by states. Russia in particular has infamously embraced troll farms as a core tool of its information warfare arsenal. By flooding online spaces with pro-Kremlin narratives and hostile commentary, Russian troll farms aim to skew perceptions and degrade discourse in target countries. Their activities range from election meddling and political agitation abroad to reinforcing propaganda at home. Most of the world first learned of Russian troll farms when news broke of their interference in the 2016 U.S. presidential election (newslit.org). However, the tactic predates 2016 and is not exclusive to Russia – other governments and groups have copied similar methods. Still, it is Russian operations that set the template for the modern troll farm: state-sponsored, centrally organized, and global in reach. Social media companies have struggled to curb these fake accounts, occasionally taking down large networks (for instance, Facebook and Twitter removed a Ghana-based Russian troll network in 2020 that had garnered over 300,000 followers while inflaming U.S. racial tensions) (newslit.org). Yet such takedowns likely represent only “the tip of the iceberg” (newslit.org). Troll farms continue to evolve, finding new ways to evade detection and exploit online platforms.

Inside Russia’s Troll Factories: Organization and Key Players

Russia’s most notorious troll farm is the Internet Research Agency (IRA), a St. Petersburg-based company once led and financed by oligarch Yevgeny Prigozhin – a close Putin ally often dubbed “Putin’s chef.” The IRA (known in Russian slang as the “trolls from Olgino”) was founded around 2013 and grew into a professionalized operation with hundreds of employees (spyscape.com). By 2015, the IRA reportedly had about 400 staff working 12-hour shifts, including an elite group of roughly 80 English-proficient trolls dedicated exclusively to targeting the U.S. political system (spyscape.com). It resembled an online marketing agency – but one devoted to Kremlin objectives. A management team oversaw multiple departments (graphic design, IT, search engine optimization, finance, etc.) and tracked metrics obsessively (spyscape.com). According to a U.S. Senate investigation, managers even monitored employees via CCTV and were “obsessed” with page views, likes, and comment quotas as measures of influence (spyscape.com).

Life as a Russian troll was described by insiders as a high-pressure desk job. Lyudmila Savchuk, a journalist who went undercover at the IRA, revealed that each troll had strict daily quotas – for example, 5 political posts, 10 non-political posts (to appear authentic), and 150-200 comments on other content per shift (spyscape.com). Trolls worked long hours for a modest salary (around 40,000 rubles, roughly $700-800 a month) paid in cash (spyscape.com). They operated in covert “teams” focusing on different target audiences and topics: separate groups were assigned to U.S. politics, European issues, Ukraine, and so on, each crafting messages tailored to those audiences. Former IRA employees recounted being instructed to watch popular American TV shows like House of Cards to learn U.S. cultural references (spyscape.com). They received English-language training and guides to American slang to better pose as genuine U.S. commentators (spyscape.com). To hide their Russian origins, they used VPNs and proxy servers to mask their location and carefully curated fake personas, complete with stolen or AI-generated profile photos, “hometown” details, and realistic names (spyscape.com). Over weeks and months, these fake accounts would gradually build an identity and a following – joining Facebook groups, tweeting about everyday life or sports – before pivoting to political propaganda once credible enough. “Within time, those accounts gain followers and become more influential,” notes one report (spyscape.com).

Prigozhin’s IRA did not operate in isolation but as part of a larger Kremlin-linked influence ecosystem. Russian journalists and U.S. indictments have exposed an umbrella effort called “Project Lakhta,” under which the IRA and related entities worked to “disrupt the U.S. democratic process, spread distrust, incite civil unrest, and polarize Americans” – especially by aggravating racial and social divisions (spyscape.com). To enable deniability, the project used a web of shell companies and media fronts. For example, Prigozhin’s holding company “Patriot Media Group” owned the Federal News Agency (RIA FAN) and other pseudo-news sites that pumped out propaganda while also acting as cover for covert troll operations (cloud.google.com). Project Lakhta entities included the IRA itself and nominal “news” sites like Nevskiy News, Economy Today, and International News Agency, all later named in U.S. sanctions for hosting disinformation activities (spyscape.com). This blurred mix of overt propaganda outlets and hidden troll teams allowed Russian influence operations to “launder” disinformation: a troll would seed a false story via a fake persona, Patriot Media’s sites would pick it up as “news,” and other trolls and bots would then amplify it.

Notably, Russia’s trolling operations extended beyond keyboards and screens. The IRA and its associates sometimes hired unwitting locals in target countries to stage real-world events that matched their online agitation. U.S. investigators found that in 2016–2017, IRA operatives posing as American activists managed to organize actual political rallies in the United States – even arranging one protest in New York City supporting Donald Trump and another opposing him on the same day, to maximize division (spyscape.com). The Russian trolls impersonated grassroots organizers, recruiting real Americans via Facebook groups and paying them to carry banners or build props, while those citizens had no idea they were responding to Russian directives (spyscape.com). This tactic – using trolls to create “astroturf” fake grassroots movements – shows how far these influence operations will go. By combining online disinformation with offline action, they sought to turn online anger into real-life chaos.

The Internet Research Agency became notorious worldwide after its meddling in U.S. politics was laid bare. In 2018, the U.S. Justice Department indicted IRA operatives for criminal interference in the 2016 election, detailing how they created thousands of social media accounts (posing as Americans of all stripes), reached millions of people with memes and fake stories, and even bankrolled political advertisements and rallies (businessinsider.com). (The IRA’s use of stolen U.S. identities and financial fraud to fund operations led to additional charges (businessinsider.com).) Though Russia denied the accusations, Prigozhin eventually admitted his role: “I’ve never been just a financier of the IRA. I invented it, I created it, I managed it for a long time,” he boasted in early 2023, claiming the agency was founded to “protect the Russian information space from… aggressive anti-Russian propaganda from the West” (spyscape.com). This candid confession from the mastermind underscores the IRA’s dual mission: to attack Russia’s adversaries online while shielding the Kremlin at home.

Aside from the IRA, other troll operations have emerged or evolved in Russia. In 2022, after Russia launched its full-scale invasion of Ukraine, a new St. Petersburg-based outfit calling itself “Cyber Front Z” began openly recruiting “patriotic” contributors to flood the internet with pro-war comments (theguardian.com). Based in rented space inside an old arms factory, this “troll factory” (as UK intelligence described it) was linked to Prigozhin’s network and coordinated via a Telegram channel of the same name (theguardian.com). Cyber Front Z’s operatives targeted Western leaders’ social media pages and news-site comment sections, hijacking discussions to praise Putin and demonize Ukraine (theguardian.com). According to research cited by the UK government, they even paid certain TikTok influencers to amplify Kremlin talking points (theguardian.com). This suggests Russia’s troll operations adapted to new platforms and younger audiences – expanding from Facebook and Twitter onto Instagram, YouTube, and TikTok, where researchers found high concentrations of activity in 2022 (theguardian.com). Cyber Front Z members engaged in coordinated “brigading,” mass-commenting to steer online conversations toward the Kremlin’s line and even manipulating online polls (for example, skewing votes in Western media surveys about support for sanctions on Russia) (theguardian.com). Unlike the secretive IRA, Cyber Front Z was brazenly public, justifying online trolling as a patriotic civic duty during wartime (theguardian.com).

Tools and Tactics: How Russian Trolls Spread Propaganda

Russian troll farms deploy a wide arsenal of techniques to inject and amplify disinformation online. Some of the key tools and tactics include:

  • Fake Personas and Sockpuppet Accounts: Trolls create hundreds of fictitious online identities, complete with stolen or AI-generated profile photos, and pretend to be ordinary citizens (en.wikipedia.org). They often mimic local demographics – posing as Americans of diverse political backgrounds, say, or as Europeans from specific countries – to blend in. These sockpuppet accounts populate every major platform: Facebook, Twitter (now X), Instagram, YouTube, TikTok, Reddit, and Russian networks like VKontakte (spyscape.com). To avoid obvious signs of fakery, trolls post regular non-political content under these personas (sports banter, food photos, etc.) and carefully copy real social media behaviors. Over time they build credibility, then drop propaganda or misleading content into communities from within.
  • Social Media Amplification & “Brigading”: Troll farms aim to make a few voices sound like a crowd. They coordinate waves of posts and comments so that a chosen narrative dominates conversations or trends artificially. For instance, a team of trolls may simultaneously reply to a politician’s tweet with the same talking points, giving the false impression that public sentiment is overwhelmingly one-sided. In the comment sections of news articles, they upvote each other and pile on replies to drown out genuine debate (theguardian.com). On platforms like Facebook, trolls have run large themed pages (e.g. “Secured Borders” or fake activist groups) that attracted real followers, then subtly injected polarizing propaganda into those feeds. They have also hijacked hashtags or launched coordinated hashtag campaigns on Twitter to push them onto trending lists. By choreographing posts across dozens of accounts, troll farms can brigade online polls, forums, and social feeds to make fringe ideas seem popular (theguardian.com) – a pattern of near-identical, near-simultaneous posting that researchers exploit to detect them (see the sketch after this list).
  • Bots and Automated Networks: In addition to human-run personas, Russian operations use bot networks – automated or semi-automated accounts – to boost their signal. These bots can retweet or share content thousands of times within minutes, distort trending algorithms, and harass targeted users. During various elections and events, investigators have found swarms of tweets and posts coming from suspected bot accounts linked to Russia (washingtonpost.com). For example, Spain reported an “avalanche of bots” spreading fake news during Catalonia’s 2017 independence referendum, with over half of the bot accounts traced back to Russian origins (washingtonpost.com). Bots often amplify the trolls’ messages by liking and reposting them en masse. The blend of bots and human trolls makes a campaign appear larger and more organic than it is – a force multiplier for disinformation.
  • Disinformation Content & Media Manipulation: Troll farms specialize in producing shareable disinformation – from fake news articles and doctored images to misleading memes and videos. Often they operate pseudo-news sites that churn out propaganda pieces, which trolls then circulate in groups and on comment threads. Russian operatives have forged screenshots of tweets, falsified government documents, and spread conspiracy memes to support their narratives. In recent years they have also begun using deepfakes and other AI-generated media. A notorious example occurred in March 2022, when a deepfake video of Ukrainian President Volodymyr Zelensky circulated, falsely showing him telling Ukrainian troops to surrender (reuters.com). The video was poorly made and quickly debunked, but experts warned it “could be a harbinger of more sophisticated deceptions to come” (reuters.com). Troll networks latch onto such fake content – or create their own – and aggressively push it to unsuspecting audiences. By seeding viral hoaxes (from false claims of election fraud to phony COVID-19 cures), they aim to mislead and inflame the public.
  • Co-opting Influencers and Local Voices: Another tactic is to launder propaganda through legitimate voices. Russian influence campaigns have been caught recruiting or duping unwitting third parties – for instance, contacting freelance journalists abroad to write articles echoing Kremlin talking points, or paying social media influencers to promote certain narratives (theguardian.com). In 2020, an operation dubbed “Peace Data” saw Russian trolls create a fake left-wing news outlet and hire real freelance writers (many in the West) to contribute articles, unaware of the platform’s true affiliation. Similarly, UK researchers in 2022 found that some TikTok and Instagram influencers were paid to spread pro-Russia messages about Ukraine (theguardian.com). By piggybacking on the credibility of local voices, trolls can disseminate propaganda in a more palatable form, hidden within content that appears independent. This adds a layer of deniability and makes detection harder, since the messaging comes from seemingly authentic individuals rather than obvious troll accounts.
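The brigading pattern described above leaves a statistical trace: many ostensibly unrelated accounts posting near-identical text within minutes of each other. Below is a minimal illustrative sketch in Python of how an analyst might screen for that signature; the sample data, field layout, and thresholds are all invented assumptions for demonstration, not code from any investigation cited here.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical sample data: (account, UTC timestamp, post text).
posts = [
    ("acct_01", datetime(2022, 3, 1, 9, 0), "Sanctions only hurt ordinary Europeans!"),
    ("acct_02", datetime(2022, 3, 1, 9, 2), "Sanctions only hurt ordinary Europeans!"),
    ("acct_03", datetime(2022, 3, 1, 9, 4), "Sanctions only hurt ordinary Europeans!"),
    ("acct_04", datetime(2022, 3, 5, 14, 0), "Lovely weather in Lisbon today."),
]

WINDOW = timedelta(minutes=10)  # assumed coordination window
MIN_ACCOUNTS = 3                # assumed minimum cluster size

def find_copypasta_clusters(posts):
    """Flag texts posted by several distinct accounts within a short window --
    a rough proxy for the brigading signature, not proof by itself."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[text.strip().lower()].append((account, ts))

    clusters = []
    for text, entries in by_text.items():
        entries.sort(key=lambda e: e[1])           # order by timestamp
        accounts = {a for a, _ in entries}         # distinct posters
        span = entries[-1][1] - entries[0][1]      # time from first to last post
        if len(accounts) >= MIN_ACCOUNTS and span <= WINDOW:
            clusters.append((text, sorted(accounts), span))
    return clusters

for text, accounts, span in find_copypasta_clusters(posts):
    print(f"Possible coordination: {len(accounts)} accounts posted {text!r} within {span}")
```

Real investigations combine many such weak signals – shared infrastructure, account creation dates, follower graphs – before attributing anything; identical text alone also catches innocent behavior such as ordinary retweet campaigns.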

In combination, these tactics allow Russian troll farms to conduct “information warfare” on multiple fronts – injecting false or biased content, amplifying it artificially, and distorting the online information environment. The ultimate goal is not simply to persuade one person or another of a single lie; it is to erode trust overall – trust in the media, in electoral processes, in societal norms – by polluting discourse with so much disinformation and vitriol that objective truth and healthy debate become hard to find. As one expert put it, these troll operations are a “tool for Putin to control narratives” and to drown out dissenting voices (propublica.org). Whether by flooding Russian-language forums with a manufactured pro-Kremlin consensus or inundating Western social media with conspiracy theories, troll farms seek to overwhelm the truth with noise.

Objectives: Why the Kremlin Runs Troll Farms

Russian troll farm campaigns serve strategic objectives set by the Kremlin, broadly falling into the categories of political influence, destabilization of adversaries, and propaganda warfare. Key goals include:

  • Undermining Elections and Political Processes: A primary mission of Russian trolls has been to interfere in democratic elections abroad. Their now-infamous role in the 2016 U.S. election was aimed at polarizing American voters, suppressing some votes, and boosting preferred outcomes. The IRA’s own internal documents described an effort “to interfere with elections and political processes” in the U.S. (propublica.org). Similarly, in European votes, Russian troll activity has sought to promote pro-Kremlin or far-right candidates and sow doubt about pro-EU centrists. British Prime Minister Theresa May warned in late 2017 that Russia was attempting to “undermine free societies and sow discord in the West” by deploying its trolls and fake news outlets (washingtonpost.com). Spanish authorities accused Russian-linked networks of meddling in Spain’s Catalonia referendum and noted overlapping disinformation during Brexit and the French elections, all aimed at fracturing European unity (washingtonpost.com). In essence, by manipulating public opinion during foreign elections or referenda, the Kremlin hopes to install more favorable leaders, weaken international opponents, or simply create chaos (chaos in rival democracies being itself a win for Putin). U.S. officials have said Russia’s goal is often to “fan divisive narratives” around elections – exploiting sensitive issues like race, immigration, or crime – so that societies turn against themselves (reuters.com). Even when specific candidates do not win, the trolls succeed if they convince enough citizens to doubt an election’s integrity or resent their fellow voters.
  • Sowing Division and Distrust: Beyond any single election, Russian troll farms work to inflame existing divisions in target countries over the long term. By constantly pushing polarizing content on social flashpoints (left vs. right, urban vs. rural, ethnic and religious tensions, etc.), they try to rip open a society’s fault lines. A U.S. Senate inquiry noted that one IRA project focused heavily on “promoting socially divisive issues with an emphasis on racial divisions and inequality” in America (spyscape.com). The objective is to weaken Russia’s adversaries from within: a nation mired in internal conflict is less able to present a united front internationally. This strategy has been observed across the West – from promoting secessionist sentiment in Spain’s Catalonia (washingtonpost.com), to stoking anti-immigrant and anti-EU outrage in Europe, to amplifying racial discord and vaccine controversies in the U.S. By spreading conspiracy theories and extreme rhetoric, trolls aim to make people lose trust in mainstream information sources and even turn on their own governments. As the Secretary of State of the U.S. state of Georgia warned in 2024, false stories about voter fraud that suddenly appear on social media are likely “foreign interference attempting to sow discord and chaos” – often traced back to Russian troll farms (reuters.com).
  • Supporting Russian Foreign Policy and Military Goals: Troll farms also serve as digital foot soldiers for the Kremlin’s geopolitical agenda. When Russia is engaged in conflicts or crises, the trolls go into overdrive to shape the narrative online. Since Russia’s initial aggression in Ukraine in 2014, and especially after the full-scale invasion in 2022, Russian trolls have vigorously spread pro-war propaganda. They amplify claims that justify Russia’s actions (such as false accusations of NATO aggression or “Nazis” in Ukraine) while dismissing or denying atrocities committed by Russian forces (propublica.org). UK research in 2022 revealed a concerted troll operation to boost support for Putin’s invasion of Ukraine, targeting not just Russian audiences but also Western social media with messages against sanctions and Ukraine’s leadership (theguardian.com). The trolls even targeted specific high-profile figures – flooding the comments of the UK Prime Minister and other officials with pro-Kremlin talking points (theguardian.com) – in hopes of swaying public sentiment. In essence, whenever Russia wants to project power or mask its aggression, it unleashes the trolls on the information landscape: to confuse people about facts on the ground, promote conspiratorial alternatives, and rally international skeptics to Russia’s side. As Clemson University researcher Darren Linvill observed, Putin doesn’t necessarily need to “win” the information war outright; he just needs to muddy the waters enough to hold his ground, and troll accounts are a tool for doing that (propublica.org). On the home front, this objective extends to reinforcing the Kremlin’s version of events so that Russian citizens remain supportive, or at least skeptical of the West.
  • Propping up the Regime Domestically: While much attention focuses on Russian trolls abroad, a significant portion of their effort is aimed at domestic Russian audiences. The IRA was originally formed to manage Putin’s image and agenda within Runet (the Russian internet) (propublica.org). By flooding Russian social media and comment forums with patriotic messages and attacks on the Kremlin’s critics, troll farms help drown out opposition voices and fabricate a sense of widespread public support for Putin. For example, during Russia’s COVID-19 outbreaks and economic troubles, domestic trolls aggressively promoted the narrative that any failures were due to Western sanctions or sabotage, not government mismanagement. After the Ukraine invasion, Russian-language troll accounts worked to convince fellow citizens that the war was justified and going well, repeating lines from state TV and smearing those who voiced doubt (propublica.org). “It’s a way for [Putin] to lie to his own people and control the conversation,” Linvill explains (propublica.org). By manipulating public opinion at home, troll farms help maintain social stability and loyalty, shoring up Putin’s regime. In this sense, they are an internal propaganda arm used to counteract any factual information that might undermine the Kremlin’s narrative (such as news of Russian military setbacks or corruption scandals). The objective here is classic propaganda: to “hold his ground” in the information war domestically, ensuring Russians either believe in Putin’s policies or at least doubt any alternative viewpoint (propublica.org).

In summary, Russia’s troll farms operate with clear goals: to weaken the Kremlin’s opponents, both externally and internally, by manipulating information. Whether by skewing election debates, splintering societies, boosting Russia’s image, or suppressing dissent, the common theme is information as a weapon. Or as Theresa May put it in a public warning to Moscow: “We know what you are doing. And you will not succeed.” (washingtonpost.com)

Notable Operations: From US Elections to Ukraine War

Over the past decade, Russian troll farms have been linked to major influence operations around the world, often in tandem with other cyber and intelligence activities. Some of the most significant cases include:

  • 2016 U.S. Election Interference: The IRA’s campaign to meddle in the 2016 American presidential race stands as the quintessential example of troll farm interference. Beginning as early as 2014, the St. Petersburg trolls created a sprawling network of fake personas on Facebook, Twitter, Instagram, YouTube, and Reddit – impersonating liberal activists, conservative voters, Black Lives Matter supporters, southern separatists, Muslim groups, and more (spyscape.com, en.wikipedia.org). These accounts pumped out millions of posts aimed at inflaming both sides of divisive issues (race relations, gun rights, immigration) and disparaging Hillary Clinton while boosting Donald Trump. They even set up Facebook pages for fictitious organizations and bought ads targeting key swing-state demographics (businessinsider.com). By Election Day, content from IRA-controlled pages had reached over 126 million Americans on Facebook, according to later disclosures, and troll tweets were cited by mainstream media. The goal, as described in U.S. indictments, was to “sow discord in the U.S. political system” and ultimately help elect a more Kremlin-friendly candidate (washingtonpost.com). U.S. intelligence agencies concluded that this influence operation, directed by the Russian government, marked a new era of information warfare. In 2018, the U.S. Treasury sanctioned the IRA and Prigozhin for the interference (businessinsider.com), and Special Counsel Robert Mueller indicted 13 IRA operatives. Although none stood trial (they remained in Russia), the indictment exposed in detail how the troll farm orchestrated rallies, stole American identities, and exploited social media algorithms on an unprecedented scale (businessinsider.com, spyscape.com).
  • 2018–2020: Continued U.S. and Africa Operations: Despite the backlash from 2016, Russia’s trolls did not cease activity. In the run-up to the 2020 U.S. election, the IRA experimented with outsourcing its English-language trolling to cut-outs in Africa. In one notable case uncovered in 2020, a troll farm in Ghana and Nigeria (allegedly run by Russian handlers) operated dozens of social media accounts masquerading as Black American activists, posting on racial issues to stoke division in the U.S. (newslit.org). Facebook and Twitter, tipped off by researchers and CNN, eventually disabled the network (newslit.org). Meanwhile, U.S. intelligence warned that Russia continued to spread misleading narratives during the 2020 campaign – though tighter platform monitoring and greater public awareness made it harder for trolls to replicate their 2016 impact. Notably, the IRA also adapted by creating “franchises”: instead of running all accounts out of St. Petersburg, it provided content and direction to sympathetic or hired individuals abroad who were harder to trace. Separate U.S. indictments in 2020, for example, revealed a Russian effort to recruit Americans (including a political activist in Florida) to publish articles written by fake personas and to organize protests. The troll farm playbook was evolving, but the objectives remained consistent. U.S. Cyber Command even hacked the IRA’s servers in late 2018 to disrupt its activities during the midterm elections, briefly knocking the troll factory offline (spyscape.com). The reprieve was temporary – reports indicated the operation regrouped – but it showed that Western governments were increasingly willing to take offensive action against the trolls.
  • 2016–2017 Brexit and European Campaigns: In Europe, Russian trolls targeted pivotal events like the UK’s Brexit referendum in 2016 and the French presidential election in 2017. In Britain, investigations found that Kremlin-linked Twitter accounts (including known IRA handles identified by Twitter) had promoted pro-Brexit messaging and inflammatory immigration stories in the run-up to the vote (washingtonpost.com). The scale was smaller than in the U.S., leading analysts to conclude that Russian trolls did not decisively swing the Brexit result, but the intent to meddle was evident (washingtonpost.com). During France’s 2017 election, Russian operatives (via trolls and hackers) spread and amplified the so-called “Macron leaks” – a dump of hacked emails from candidate Emmanuel Macron’s campaign – alongside disinformation about Macron, in an effort to aid his far-right opponent Marine Le Pen. French cyber-monitoring found that many of the social media accounts pushing #MacronLeaks on Twitter were newly created or traced to Russian influence networks. In Germany, officials braced for similar interference around the 2017 federal election. While a major troll campaign there did not materialize (possibly deterred by the German government’s warnings), Russia’s hand was suspected in other incidents – such as a 2016 propaganda story falsely claiming that a Russian-German girl, “Lisa,” had been raped by migrants in Berlin, which was heavily promoted on Russian social media and sparked protests. European leaders took notice: by late 2017, Prime Minister May publicly accused Russia of meddling across Europe, and Spain’s Prime Minister Mariano Rajoy revealed data showing that during the 2017 Catalonia independence crisis, a large portion of fake online activity originated from Russia or Venezuela (washingtonpost.com). These efforts all served Moscow’s interest in weakening the EU and NATO by empowering nationalist and separatist movements.
  • Propaganda in Ukraine and Neighboring States: Russia’s use of troll farms first gained traction during interventions in its near-abroad. When Russia annexed Crimea in 2014 and fomented war in eastern Ukraine, an army of trolls went to work online to justify Moscow’s actions and vilify the Ukrainian government. Russian-language social networks, as well as Western platforms, were barraged with repetitive messages parroting Kremlin narratives: that the Kyiv government was a “fascist junta,” that atrocities (often fabricated) were being committed against Russian speakers, and that Western sanctions would backfire. These trolls sought to “flood the zone” with the Kremlin’s version of events, making it hard for casual observers to discern the truth. Fast-forward to 2022 and the all-out invasion of Ukraine: Russian trolls again activated on a global scale. Within days of the invasion, analysts identified networks of inauthentic accounts across Twitter, TikTok, and Instagram spreading disinformation – such as claims that footage of bombed Ukrainian cities was fake (propublica.org). ProPublica and Clemson University researchers tracked dozens of Russian-language accounts that in March 2022 simultaneously posted the same misleading video (falsely “exposing” a staged civilian casualty scene in Ukraine) – a hallmark of a coordinated troll operation (propublica.org). These accounts showed clear IRA fingerprints, including working hours aligned with the Moscow time zone and breaks on weekends and Russian holidays (propublica.org). As the war progressed, troll content adapted: initially confused, it soon echoed official propaganda, blaming NATO for the war and denying Russian military failures (propublica.org). Trolls also targeted international audiences to weaken support for Ukraine – arguing on Western comment threads that sanctions on Russia hurt Europe more, or amplifying far-right European voices opposed to arming Ukraine (theguardian.com). In one UK-funded study, analysts noted that the St. Petersburg operation (Cyber Front Z) took cues from conspiracy movements like QAnon in how it spread its messages and rallied sympathizers online (theguardian.com). The Ukraine war has thus been accompanied by a parallel information war, with Russian troll farms playing a central role in spreading “pro-war lies” globally (theguardian.com). Their impact is evident in the persistence of Kremlin-favorable narratives in some segments of public opinion, despite factual reporting to the contrary.
  • COVID-19 Pandemic and Other Global Issues: Russian trolls have opportunistically seized on global crises and controversies, from the COVID-19 pandemic to social justice protests, as fertile ground for disinformation. During the pandemic, state-aligned troll accounts pushed myriad conspiracy theories – claiming the coronavirus was a U.S. bioweapon, promoting anti-vaccine misinformation, and exacerbating distrust in public health institutions (thebulletin.org). The goal was not to present a coherent alternative but to “amplify doubt” and dysfunction in Western societies during a moment of uncertainty (thebulletin.org). Similarly, in 2020, as the U.S. saw historic racial justice protests, Russian troll farms redoubled efforts to inflame racial divides – masquerading as far-right voices condemning the protests on one hand, and posing as left-wing activists to encourage more extreme sentiment on the other, playing both sides to fuel chaos. These instances show that beyond politics and war, any divisive issue can become a battlefield for troll-driven influence operations. From climate change debates to vaccine mandates, Russian trolls have injected false claims to turn discourse toxic. By worsening polarization on every front, they further Moscow’s aim of a weakened, quarrelsome West.

Expert Insights and Exposés

Numerous investigations by journalists, academics, and intelligence agencies have pulled back the curtain on Russia’s troll farms. Experts emphasize that these operations are a new kind of threat that democracies must grapple with. “2016 was only the beginning,” notes one PBS report – since then, the Kremlin’s online operatives have grown more sophisticated and better at mimicking real people (pbs.org).

Researchers who study troll farm tactics have provided striking observations. Darren Linvill, a professor at Clemson University, has spent years analyzing IRA accounts. He notes telltale patterns: “During Russian holidays and on weekends, the activity [of certain troll accounts] dropped off,” indicating the posters were on a salaried work schedule rather than genuine volunteers (spyscape.com). His team’s analysis with ProPublica confirmed that posts from suspected IRA accounts appeared at “defined times consistent with the IRA workday” (spyscape.com). In other words, real grassroots activists don’t take weekends off in unison – but troll factory employees do. Linvill concluded, “These accounts express every indicator that we have to suggest they originate with the Internet Research Agency” (spyscape.com). If by some chance they weren’t IRA, he quipped, “that’s worse, because I don’t know who’s doing it” (propublica.org). His analysis underscored how consistent and professionalized the Russian disinformation machine had become.
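The temporal fingerprint Linvill describes is easy to make concrete in code. The sketch below is my own illustrative approximation, not the Clemson/ProPublica methodology: it scores an account by the fraction of its posts that fall inside a notional Moscow business day, excluding weekends and a deliberately incomplete, hypothetical set of Russian public holidays.

```python
from datetime import datetime, date, timezone
from zoneinfo import ZoneInfo

MSK = ZoneInfo("Europe/Moscow")
# Hypothetical, non-exhaustive sample of Russian public holidays.
RU_HOLIDAYS = {date(2022, 3, 8), date(2022, 5, 9)}

def workday_score(timestamps_utc):
    """Fraction of posts made 09:00-18:00 Moscow time on a non-holiday
    weekday. A score near 1.0 over a large sample suggests a salaried
    posting schedule rather than organic activity -- a heuristic, not proof."""
    if not timestamps_utc:
        return 0.0
    hits = 0
    for ts in timestamps_utc:
        local = ts.astimezone(MSK)
        on_shift = 9 <= local.hour < 18
        is_weekday = local.weekday() < 5  # Monday=0 ... Friday=4
        if on_shift and is_weekday and local.date() not in RU_HOLIDAYS:
            hits += 1
    return hits / len(timestamps_utc)

# Hypothetical usage with two UTC-stamped posts (Moscow is UTC+3).
sample = [
    datetime(2022, 3, 14, 7, 30, tzinfo=timezone.utc),  # 10:30 MSK, Monday
    datetime(2022, 3, 19, 12, 0, tzinfo=timezone.utc),  # 15:00 MSK, Saturday
]
print(f"workday score: {workday_score(sample):.2f}")  # prints 0.50
```

On a genuine dataset this score would be computed over months of posts per account and compared against a baseline of known-organic accounts, since plenty of legitimate users also post during Moscow office hours.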

Insider accounts have also shed light on operations. Journalist Lyudmila Savchuk’s whistleblowing in 2015 provided a first look inside the IRA’s “troll factory” offices. She described an almost surreal environment: young employees in cubicles relentlessly posting under fake personas, with propaganda directives handed down like editorial assignments each morning. “At first I couldn’t believe it was real… it was a kind of culture shock,” another undercover operative told Radio Free Europe (rferl.org). They spoke of an assembly-line atmosphere where creativity was valued less than obedience to the daily narrative themes set by supervisors.

Western governments have become more vocal in calling out this activity. In April 2022, Britain’s Foreign Secretary Liz Truss condemned Russia’s new troll campaign around the Ukraine war, stating: “We cannot allow the Kremlin and its shady troll farms to invade our online spaces with their lies about Putin’s illegal war” (theguardian.com). The UK government went as far as publicly funding research into the St. Petersburg operation and sharing the findings with social media platforms to facilitate crackdowns (theguardian.com). The same research revealed the trolls’ willingness to innovate – for example, amplifying legitimate posts from real users that happened to align with Kremlin views, thereby avoiding easy detection since the content wasn’t fabricated (theguardian.com). This cat-and-mouse game between troll farms and platform moderators is something intelligence agencies are keenly aware of. In the U.S., the FBI and the Department of Homeland Security have repeatedly warned that Russia’s agents adapt quickly to platform bans, reappearing with new accounts and new tactics.

An FBI official noted in 2020 that Russia’s operatives were “persistent and creative,” using cut-outs and proxies to obscure their involvement after social media companies began banning accounts en masse. After Facebook and Twitter purged thousands of IRA accounts post-2016, for instance, the trolls resurfaced via third countries (as seen with the Ghana operation) or shifted to alternative platforms with looser moderation. Their resilience prompted the U.S. government to sanction more individuals and even troll-farm-linked sites (like SouthFront and NewsFront) to cut off their funding and hosting.

Think tanks and academics have also highlighted the evolving technological tools at the trolls’ disposal. A 2018 report by the Institute for the Future warned that large-scale trolling should be treated as a human rights abuse because of the harm it does to societies, yet lamented the lack of mechanisms to punish perpetrators (newslit.org). Fast forward to 2024, and analysts observe that AI-generated content is the new frontier. In an EU study of disinformation, experts pointed out that earlier Russian tactics “relied on lower-tech strategies such as troll farms and bot networks,” whereas by 2024 campaigns increasingly leveraged generative AI to create “hyper-targeted content” that is harder to spot (carleton.ca). Russian influence operations have begun using AI to generate realistic deepfake images, videos, and text at scale, enabling even more potent false narratives. The European External Action Service noted in 2023 that many recent Kremlin disinformation campaigns, including the so-called “Doppelgänger” operations (which clone real news sites to spread fake stories), can be traced back to entities funded by Russian state agencies and are now augmented with AI capabilities (carleton.ca). This underscores that the troll farm model is not static – it continuously upgrades its techniques, from simple copy-paste memes in 2016 to AI-forged content in 2024 and beyond.
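As a small illustration of the “Doppelgänger” mechanic: cloned sites typically sit on lookalike domains of real outlets. A crude first-pass screen can compare suspect domains against legitimate ones by string similarity, as sketched below. The domain lists and threshold are made-up assumptions for demonstration; actual FIMI analysts rely on far richer signals such as registration records and hosting overlaps.

```python
import difflib

# Hypothetical lists for illustration only.
LEGIT_DOMAINS = ["spiegel.de", "theguardian.com", "lemonde.fr"]
SUSPECT_DOMAINS = ["spiegel.ltd", "theguardian.co.com", "unrelated-blog.net"]

def closest_legit(domain, threshold=0.75):
    """Return the most similar legitimate domain and its similarity ratio,
    or None if nothing clears the (arbitrary) threshold."""
    best = max(
        LEGIT_DOMAINS,
        key=lambda d: difflib.SequenceMatcher(None, domain, d).ratio(),
    )
    ratio = difflib.SequenceMatcher(None, domain, best).ratio()
    return (best, ratio) if ratio >= threshold else None

for suspect in SUSPECT_DOMAINS:
    match = closest_legit(suspect)
    if match:
        print(f"{suspect} resembles {match[0]} (similarity {match[1]:.2f})")
```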

2024–2025: Latest Developments and Outlook

As of 2024 and 2025, Russia’s troll farm operations remain a moving target, reacting to geopolitical events and internal power shifts. One dramatic development was the fate of the Internet Research Agency itself. In June 2023, Yevgeny Prigozhin – the IRA’s founder – staged a brief mutiny against Russia’s military leadership with his Wagner mercenary forces. The rebellion’s failure, and Prigozhin’s subsequent death in a suspicious plane crash in August 2023, led the Kremlin to rein in his sprawling enterprises. Reports emerged that by July 2023 the IRA’s operations in St. Petersburg had been officially shut down in the aftermath of Prigozhin’s fall from grace (businessinsider.com). Indeed, Prigozhin’s own media outfit, Patriot Media, announced it was “leaving the country’s information space” as the government moved to dismantle his influence organs (businessinsider.com). Prigozhin had even confirmed in an interview shortly before that he created and ran the IRA, apparently seeking credit for its “patriotic” mission (spyscape.com). By early July 2023, Russian state media reported that the infamous troll farm had been disbanded – a development also noted by Western outlets (businessinsider.com).

However, this “end” of the IRA did not mean the end of Russian trolling operations. Analysts believe the Kremlin simply absorbed or restructured these capabilities into other hands. A report by Google’s Threat Analysis Group in March 2024 observed that while direct IRA activity on Google platforms dropped after Prigozhin’s demise, components of Prigozhin-linked influence campaigns “have remained viable” and likely continue under different management (cloud.google.com). Mandiant researchers noted that several long-running Russian information operations showed “uneven degrees of change” post-Prigozhin, with some disruption but many assets still active – suggesting the Kremlin redistributed control rather than shutting everything down (cloud.google.com). Notably, throughout 2023, Google and Meta continued to take down large numbers of fake accounts tied to Russian disinformation networks; Google reported over 400 enforcement actions in 2023 against IRA-linked influence operations (cloud.google.com). And in late 2024, ahead of critical elections, U.S. officials were still sounding the alarm: the FBI and other agencies warned that Russia fully intended to interfere in the 2024 U.S. presidential election using social media manipulation, even if the playbook had to be adjusted without the IRA’s original hub (reuters.com).

In fact, election interference attempts were already being observed going into 2024. One stark example came in October 2024, when officials in the U.S. state of Georgia flagged a viral disinformation video falsely purporting to show an “illegal immigrant” bragging about casting multiple votes. The Georgia Secretary of State’s office stated directly: “This is false… It is likely foreign interference attempting to sow discord and chaos on the eve of the election” (reuters.com). They urged social media companies to remove the video, noting, “Likely it is a production of Russian troll farms” (reuters.com). This incident – coming just before a contentious U.S. election – shows that Russian trolls were still in the game in 2024, using deceptive viral videos on platforms like X (formerly Twitter) to undermine confidence in voting (reuters.com). Federal agencies such as CISA investigated the incident, a reminder that even without the IRA’s original leadership, the apparatus of Russian online influence can quickly spin up new ad hoc campaigns when needed.

Looking ahead into 2025, experts assess that Russia’s troll farm operations will continue to adapt rather than disappear. With Moscow entrenched in a prolonged war in Ukraine and facing internal pressures, the Kremlin has a strong incentive to keep wielding the cheap but effective weapon of online manipulation. Russian operators can be expected to further embrace emerging technologies – especially AI. European analysts noted that by late 2024, the sheer scale and speed of Russian disinformation had increased thanks to generative AI tools, which make fake content easier to produce en masse (carleton.ca). Future troll farm tactics may involve AI-generated “people” (with perfectly realistic profile photos and even deepfake video commentary), as well as algorithmically tailored propaganda targeting specific communities with precision. The arms race between platforms and trolls will likely intensify: as companies get better at banning known troll signatures, the trolls will leverage AI to create ever more convincing personas and content.

On the geopolitical stage, Russia’s troll farms (under whatever new guise) will likely target any major events relevant to Kremlin interests. Upcoming elections in Western countries, debates over support for Ukraine, and even issues like energy crises or international conflicts could all become theaters for renewed disinformation offensives. The European Union’s security agencies have beefed up their monitoring of “Foreign Information Manipulation and Interference” (FIMI), releasing periodic threat reports that almost invariably highlight Russian operations. NATO and EU officials warn that Russia will try to fracture Western unity on Ukraine by fueling extremist and isolationist narratives via social media (for example, promoting voices that oppose aiding Ukraine or that support Russia-friendly policies). Indeed, the 2024 European Parliament elections saw a surge of messaging – including cloned fake news sites in a “Doppelgänger” campaign – pushing anti-Ukraine and anti-EU themes, which analysts tied back to Russian disinformation outfits (carleton.ca). The influence of such efforts is hard to quantify, but they coincided with gains for far-right, pro-Russian political blocs, indicating some effect (carleton.ca).

In Russia’s domestic sphere, the Kremlin will likely maintain a tight grip on the narrative through any means necessary – including troll farms or their successors. After Prigozhin’s mutiny, Putin learned the risk of letting a private actor control too much of the propaganda machine. It would not be surprising if Russia’s security services (such as the FSB or military intelligence) have taken over parts of the trolling operations to ensure loyalty and direct oversight. The tone of Russian domestic trolling since late 2023 has shifted to erase Prigozhin’s memory (given his betrayal) and reinforce Putin’s stature amid the war’s challenges (tandfonline.com). Any Russian voicing war-weariness or criticizing the government online can expect to be swarmed by “patriotic” commenters – in many cases, trolls – attacking them as traitors, thus chilling dissent. This indicates the troll farm tactic remains a pillar of internal propaganda, just more state-centralized.

In conclusion, Russia’s troll farms have evolved from a fringe experiment to a full-blown instrument of state power in the information age. They operate as 21st-century propaganda factories, pumping out lies, fear, and discord at the click of a mouse. Over the past decade, they’ve interfered in elections, fueled conflicts, and polluted countless online discussions. And despite increased exposure and sanctions, they show no signs of stopping – only changing form. As long as open societies rely on online platforms for news and debate, the Kremlin’s trolls will seek to exploit that openness. Combating this threat will require ongoing vigilance, public awareness, and international cooperation. The world has awakened to Russia’s troll farm playbook; the challenge now is to prevent these “digital troopers” from hijacking the global conversation with manipulation and malice.

Video: Former Russian trolls expose misinformation operations
