Photo: Midjourney, prompt Josef Šlerka 2025-04-03
Russia’s disinformation machine didn’t just meddle in Germany’s 2025 elections — it built an entire fake media ecosystem to do it. Through a network of over 100 bogus news sites and psychological operations like Storm-1516, the Kremlin sowed chaos, amplified far-right narratives, and targeted key political figures with fabricated scandals. This article explains the different online campaigns and their methods.
Just a few days before the extraordinary German elections in February of this year, a two-minute video appeared on social network X showing the destruction of ballots cast for the far-right Alternative for Germany (AfD) party. The video claimed to expose electoral fraud designed to suppress Germany’s far-right. But the video itself was a fake. According to German security services, it was part of a Russian disinformation campaign orchestrated by the group Storm-1516.
The German investigative center Correctiv.org, the U.S. organization Newsguard, and the Gnida Project concluded that during 2024 the group identified by security analysts as Storm-1516 created more than 100 websites posing as news outlets and published numerous pieces of disinformation about German politicians and German politics.
On these sites, readers could learn, for example, that Green Party candidate Robert Habeck had been accused of abusing a young woman years ago. Other unsubstantiated claims included an article alleging that Foreign Minister Annalena Baerbock met with a male escort during her trips to Africa, and another alleging that Marcus Faber, the head of the German parliament’s defense committee, is a Russian agent. Still other fake news articles claimed that the German army was planning to mobilize 500,000 men for a military operation in Eastern Europe, and that Germany had signed a migration agreement that would allow 1.9 million Kenyans to come to the country.
If we look back not only at the election campaign in Germany but at the entire past year, we will see that the country is yet another target of an extensive cyber offensive that Russia has waged for many years, one that also concerns other European states. This offensive consists of a series of often independent but interconnected operations known mostly by code designations such as Storm-1516 (and another one using the same modus operandi, named Operation CopyCop), Doppelgänger (and its variants such as Operation Overload), UnderCut, and Matryoshka.
Some of them have been active on the global cyber scene for years, while others are relatively new.
Storm-1516 and its CopyCop and other operations
The designation Storm-1516 refers to a network of various actors connected to Russian state bodies, proxy organizations, and accounts that distribute content from various influencers. The group first attracted attention in 2023, when it attempted to influence the American primary campaigns. Since then, it has been mentioned in connection with other influence operations, including activities in Germany. Its actions are often referred to in the media as CopyCop or under other names.
At the core of Storm-1516’s operation in Germany was content production backed by John Mark Dougan, a former Florida police officer who fled to Russia in 2016 to avoid criminal prosecution. According to analysis by Correctiv, he is behind a network of 102 websites with professional layouts and names like “Berliner Wochenzeitung,” “Hamburger Post,” “Echo der Zeit,” and “Widerhall,” which appear to be classic news websites. According to analysis by Newsguard, however, they primarily published content generated by artificial intelligence (AI), inspired by or paraphrased from right-wing media such as Compact or the pro-Russian blog Nachdenkseiten. Disinformation and fake news were then mixed into this content. Most of these pages were created at the end of 2024 and are now either no longer publishing or have been shut down.
According to leaked documents, Dougan is connected to the Russian military intelligence service GRU. According to the Washington Post, he collaborates with Unit 29155, a GRU unit linked to a series of sabotage operations, murders, and hacking attacks in Europe, including not only the shooting down of Malaysia Airlines civilian flight MH17 over Ukraine in July 2014 but also the 2014 ammunition depot explosion in Vrbětice.
In the German campaign led by Storm-1516, Dougan was supported by another Russian organization, the so-called Fund to Fight Repression (Фонд борьбы с репрессиями). The fund was founded in 2021 by Yevgeny Prigozhin, the oligarch who also co-founded the Internet Research Agency (Agentstvo internet-issledovanij) troll farm. On the web, the fund presents its mission as follows: “To fight for human rights worldwide, support civil activists, provide legal assistance and financial support to victims of judicial and police arbitrariness and political persecution. The foundation’s activities are aimed at providing information, legal and other assistance to all who have encountered lawlessness from state representatives.” Before the German elections, the fund launched several of its own influence operations in coordination with Dougan’s network. Researchers from the organization CeMAS cite examples in their report such as child abuse accusations against Friedrich Merz and Ursula von der Leyen and an article claiming that the Green Party and Ukraine are recruiting young people and immigrants to commit crimes in Germany.
While Dougan’s network and the Fund to Fight Repression were responsible for producing disinformation, researchers say its distribution was ensured by a network of both anonymous and authentic accounts on X and Telegram. Experts from CeMAS note, for example, that the spread of fake news in Germany was aided by the Telegram account Neues aus Russland 🪆📢 Alina Lipp, which shared false information about a bilateral migration agreement between Germany and Kenya under which up to 1.9 million Kenyan workers would supposedly be able to come to Germany the following year, with 750,000 Kenyans eligible for expedited naturalization. The post linked directly to one of Dougan’s websites. In fact, the bilateral agreement signed on September 13, 2024, by Kenyan Foreign Minister Musalia Mudavadi and German Interior Minister Nancy Faeser states that Germany will legally accept several thousand qualified Kenyan workers, while Kenya will take back Kenyans who migrated illegally. Lipp’s Telegram account has more than 180,000 followers, and this news was seen by over 50,000 users.
Storm-1099 and Doppelgänger
Storm-1516 was not the only Russian group that directed its operations against Germany and used disinformation websites to do so. Germany also became the target of other campaigns. Among them was the so-called Doppelgänger operation, run by the group designated by Microsoft researchers as Storm-1099. The Doppelgänger operation has been ongoing across Europe since at least May 2022. Its name means “double” and captures the essence of its activity: as part of the operation, attackers create numerous fake pages that look like regular mainstream websites, with URLs that resemble those websites, too. In Germany, doubles appeared of Der Spiegel, Bild, and T-Mobile.
These lookalikes of legitimate websites then share disinformation. Networks of equally fake accounts on X or Facebook are then used to spread content from these fake websites, which, again, looks like it’s coming from the real thing. According to a report by the German Foreign Ministry from January 2024, 50,000 accounts on the social network X participated in one part of the operation.
The Bavarian Office for the Protection of the Constitution (BayLfV) published a detailed analysis in the middle of last year of infrastructure used for 14 months in part of this operation. The document shows that, through this infrastructure, nearly 8,000 individual campaigns were distributed across a broad network of more than 700 websites. These included fake news sites, cloned versions of legitimate media outlets, and even real news articles taken out of context to mislead readers. In just the eight-month period analyzed by BayLfV, more than three-quarters of a million users were reached in this way.
Investigative journalists from the Correctiv organization, in cooperation with the Swedish non-profit Qurium, revealed that the campaign partially relies on European and German companies to spread its propaganda. Our outlet also participated in the investigation, which showed that companies with Czech connections were part of this operation as well. The investigation uncovered a chain running from Russian PR agencies that create the content, through a network of hosting companies in Russia, Great Britain, Germany, and Finland, to end outlets hosted in Malaysia and Singapore.
The Russian Social Design Agency (SDA) plays a key role. It is a Moscow-based company led by Ilya Gambashidze, who reportedly works for Sergei Kiriyenko, the first deputy head of the Russian President’s office. SDA focuses on content strategy and narrative development. Leaked documents and video recordings show Gambashidze’s offer of services to influence political moods in individual countries.
The Russian IT company Struktura, founded by Nikolai Tupikin, was supposed to take care of providing technological infrastructure for these campaigns, including domain registration, website hosting, and a complex of bots on social networks. Both companies are now on the EU and US sanctions list.
Matryoshka
The third operation that we observed during the past year in Germany and throughout Europe was the so-called Matryoshka operation, first documented in September 2023 by the group Bot Blocker (operating on the X network under the account @Antibot4Navalny). The principles of Matryoshka were then laid out by the French agency VIGINUM in a detailed report.
Unlike Storm-1516 and Doppelgänger, the Matryoshka operation does not focus only on creating and spreading disinformation but also on overwhelming the targets of disinformation campaigns, such as politicians, journalists, and fact-checking organizations. The aim is to limit their ability to respond. The campaign itself takes place in two phases. “The first group of accounts, known as ‘seeders,’ publishes false content on the platform. The second group of accounts, called ‘quoters,’ then shares the seeder’s post and a reaction to it. Quoters contact target individuals or organizations and ask them to verify the authenticity or truthfulness of the content published by seeders,” states the VIGINUM report.
According to a June report by the Finnish company Check First, Operation Overload, using Matryoshka tactics, managed to target 800 organizations in more than 75 countries, among which France and Germany played a key role. Check First cites the example of a video made to look as if it had come from the German outlet BR24. The fictional news clip ridicules a Ukrainian refugee who allegedly worked in a Berlin aquarium, claiming that he stole tropical fish there, cooked and ate them, and was subsequently hospitalized with poisoning. The footage includes a photograph of a Ukrainian man identified as Oleg Panasjuk. A reverse image search revealed that the photograph was lifted from a Russian dating site and belongs to the profile of a person living in Russia.
The video then spread on X until an account belonging to the Matryoshka network asked a BBC fact-checker to verify it. BR24 responded to the campaign with a long explanatory text. However, as Janina Lückoff, an editor and head of the fact-checking team at BR24, told The New Arab, “dealing with fake media content, of course, limits our capacities.”
As The Insider noted, in 2024 a network of accounts linked to Matryoshka launched a campaign promoting videos of anti-Ukrainian statements supposedly made by academics around the world. All these statements, however, were created using AI-generated images and voices. According to the Russian outlet Agentstvo, the Matryoshka network was activated again at the end of January 2025, when it published 15 fake videos over three days.
Attacks on cyber infrastructure
While Storm-1516, Doppelgänger, and Matryoshka focused primarily on spreading disinformation in Germany, other attacks were oriented toward its cyber infrastructure.
In 2023, the Social Democratic Party of Germany (SPD) became the victim of a cyber attack in which email accounts at the party headquarters were hacked. The German government subsequently attributed the attack to a GRU unit, specifically the APT28 group (also known as Fancy Bear). According to official findings, the incident was part of a broader cyber campaign targeting several European countries; the attack on the SPD was enabled by a previously unknown security vulnerability in Microsoft Outlook.
At the end of February 2024, researchers from the security company Mandiant identified a new wave of attacks, this time targeting the German Christian Democratic Union (CDU) and potentially other German political parties. The attack was carried out by the APT29 group (also known as Cozy Bear), associated with the Russian Foreign Intelligence Service (SVR). The attackers used phishing emails with a fake invitation to a CDU dinner containing a link to a compromised website. After a victim clicked the link, malware known as ROOTSAW was downloaded onto their device, which then installed a second piece of malicious code, WINELOADER, giving the attackers long-term access to the compromised systems.
In response to these incidents, in May 2024 the German government recalled its ambassador in Moscow, Alexander Graf Lambsdorff, for consultations in Berlin. The Office for the Protection of the Constitution and other German security agencies then issued warnings to all parliamentary parties and strengthened protective measures against digital and hybrid threats. Experts agree that these attacks were part of a broader Russian strategy aimed at obtaining political intelligence and potentially undermining European support for Ukraine, with the level of threat being, according to German officials, “higher than ever before.”
This was not the first time Germany was the target of a cyber attack. In 2021, the country was targeted by the Ghostwriter campaign, also known as Storm-0257. The most famous attack, however, was the so-called Bundestag Hack, which began in early 2015, when members of parliament received fraudulent emails that appeared to come from the United Nations. Clicking a link in the email installed malware, which gradually spread through the parliament’s entire network. The attackers thus gained access to parliamentary members’ confidential communications, their schedules, and other sensitive data, including from a computer in the Chancellor’s office. Over 16 GB of data were stolen. According to German authorities, the APT28 group under the GRU was responsible for the attack.
From noise to signal
According to James Pamment from the Psychological Defence Research Institute at Lund University, the groups and agencies involved in disinformation campaigns and cyber attacks aim to penetrate the national information environment. “At the first level, it looks like spamming. They use paid advertising, automatic comments, and other low-quality methods of spamming digital media with links. It’s like advertising, and they expect 95% of it to be ignored, so it turns into something like background noise. Constant drumming that’s always there,” he said of the psychology of these extensive operations in an interview with Investigace.cz.
But according to the expert, this stage is followed by two more steps: “They assume that at some point, the user will be provoked to click on one of these links to learn more. The network of web platforms offers something for every target group – if you need to hear their narratives from the Guardian, they have lookalike websites available. If you need to hear them from unconventional media, they have created these brands. They have created reliable intermediaries for almost any target audience,” said Pamment, adding that the real goal is to get the disinformation repeated by politicians, celebrities, influencers, and even voters’ neighbors. If that succeeds, the disinformation becomes part of normal discourse, and those responsible can say they have successfully penetrated the information environment.
The Czech version of this story was published on Investigace.cz.
Josef Šlerka has worked as a data analyst and reporter at Czech Centre for Investigative Journalism since 2021. He used to head the Czech Fund for Independent Journalism (NFNZ). He is also the head of the Department of New Media Studies at Charles University in Prague.