The Russian online propaganda offensive was launched after Russia's annexation of Crimea and the outbreak of war in Ukraine. Its aftermath still echoes around the world today.
Organized groups of pro-Kremlin online commentators have been active for at least 17 years, since the beginning of Vladimir Putin's presidency. The name "Kremlin's web brigades" was first used in 2003 by the Russian researchers Anna Polyanskaya, Andrey Krivov and Ivan Lomako. Back then, the main purpose of the "brigades" was to praise Vladimir Putin's presidency.
Over the years, the brigades took it upon themselves to do more than just write comments: they started running blogs filled with pro-Kremlin propaganda and artificially boosting the ratings of pro-Kremlin media materials. That is when they began to operate under the supervision of Russian special services and political administrators from the Kremlin-subordinate Presidential Executive Office.
At the same time, a new type of company started to spring up in Russia – partially or wholly dependent on the government, offering services to businesses, central and regional institutions, and local governments. Their services consisted of maintaining their clients' positive public image through artificial online activity – often including slandering the clients' competitors and critics.
Despite the continuously expanding activities of the "Kremlin's web brigades", to this day no credible data is available on their tasks or numbers during their first decade of existence. What is known is that during that time the brigades focused on the Russian-speaking parts of the internet.
How much for a tweet
It was the Russian division of the Anonymous hacker collective that first made it possible to look inside the Kremlin's online propaganda machine. At the turn of 2011 and 2012, they broke into the mailboxes of people connected to the Federal Youth Agency (FYA), including Vasily Yakemenko – the FYA chief and the first leader of Nashi, a pro-Kremlin youth organization.
The released emails confirm that, as early as the beginning of the 2010s, a powerful army of hired internet users was already operating in Russia, tasked with increasing Vladimir Putin's approval ratings. Its objective was to carry out propaganda actions discrediting the opposition and independent media. It was paid for directly by the Kremlin from funds passed on to the Federal Youth Agency, and orchestrated by FYA chief Vasily Yakemenko under the watchful eye of Vladislav Surkov, the Kremlin's propaganda chief.
In September 2011 alone, FYA's actual spending amounted to 23.6 million RUB (0.8 million USD) – twelve times more than the publicly known propaganda expenditures of FYA's press office.
No less than 5.9 million RUB was spent on traditional propaganda: paid articles in printed media (0.38 to 1.2 million RUB per article, or 13,000 to 40,000 USD) and on the most popular websites (12,000 to 300,000 RUB per article – 400 to 10,000 USD). The rest, ca. 8.61 million RUB (290,000 USD), was spent on more innovative forms of propaganda.
The money covered projects such as running pro-Putin groups on VKontakte (the Russian equivalent of Facebook – at least 2 million RUB, or 70,000 USD), pro-Kremlin bloggers (at least 0.7 million RUB – 25,000 USD), Twitter users (at least 0.5 million RUB – 17,000 USD), and active commentators on news websites and social media (at least 1.1 million RUB – 37,000 USD).
Pro-Kremlin bloggers received monthly payments ranging from 4,000 to 30,000 RUB (130 to 1,000 USD), and the more industrious ones also enjoyed free MacBooks and iPads. To receive payments and gifts, they were required to produce regular reports on their work.
In exceptional circumstances – whenever it was necessary to bribe a blogger popular in independent media or opposition circles – FYA was ready to pay up to 200,000 RUB (7,000 USD) for a single pro-Kremlin post. Thanks to government funding, the Kremlin managed to take control of content published on hundreds of the most popular blogs across Russia.
The task-based scheme applied to internet commentators as well. They too had to submit reports on their work to FYA. They were required to comment on articles published on news websites and blogs and on social media posts, as well as to engage in online discussions. They received daily instructions from their supervisors on how to praise Putin and slander the opposition.
To qualify for payment, a commentator had to produce 60 comments and take part in 6 discussions a week. Short comments were worth ca. 85 RUB (3 USD), longer comments 125 RUB (4 USD), and discussions were paid 200 RUB (7 USD) each.
The best performers could make as much as 50,000 RUB (1,700 USD) a month – plus an iPad on top of that. Sometimes, however, FYA failed to meet its financial obligations, which led to conflicts. One displeased recruit was a blogger and moderator of a VKontakte group called "I really like Putin". Instead of a bank transfer, he received only a MacBook – so he threatened to scale back his pro-Kremlin propaganda.
Emails obtained by Anonymous also reveal the price list of popular Russian Twitter users.
One tweet cost between 500 and 12,000 RUB (12 to 400 USD).
Eventually, FYA started hiring full-time employees to work as bloggers, commentators, and Twitter and Facebook users. They were required to spread information through multiple channels – for example, by running 5 blogs and 5 Twitter and Facebook accounts each. According to the leaked emails, most of the employees were young women.
Plans for early 2012 included expanding the online propaganda staff to 100 full-time users, with the expectation that this number would grow further over time. To lower the risk of anyone discovering that all those people were posting from the same place, all employees were required to use anonymizing proxy servers.
Apart from articles, blogs, posts, tweets and comments, FYA also financed the manipulation of online ratings through illicit purchases of likes and fake traffic generators for websites it wanted to promote. This strategy helped boost the apparent popularity of its YouTube videos, pro-Putin blog posts, social media comments, and other propaganda content.
The emails also reveal the strategies behind Twitter bots (which automatically replied to tweets on certain topics) and bots that automatically commented on articles posted online. FYA admitted to controlling 20,000 Twitter accounts and another 20,000 blogs it could use for various purposes.
The obtained correspondence shows that FYA employees were convinced the U.S. and China resort to similar methods in their communication strategies.
In December 2011, mass social protests erupted in Russia in response to rigged parliamentary elections. Meanwhile, the opposition managed to gain the advantage on the Russian-speaking internet. Despite its innovative methods of manipulation and relatively large resources, the Federal Youth Agency's results were not very impressive.
As a consequence of the upheaval, on 27 December 2011 Vladislav Surkov, the man responsible for supervising Russian internet propaganda, was dismissed from his post of first deputy chief of staff in the Presidential Executive Office.
Troll factories gain speed
Surkov was replaced by Vyacheslav Volodin, whose main previous achievement had been the pacification of the largest independent Russian news websites. Volodin did not forget about Surkov's online projects, but the internet propaganda center was moved from Moscow to the Republic of Adygea (the Yablonovsky and Perekatny urban settlements on the outskirts of Krasnodar), Nizhny Novgorod Oblast (the suburban settlement of Zelyony Gorod), and St. Petersburg (the suburban settlement of Olgino).
The last of the three locations became the most famous Russian "troll factory" in the summer of 2013. It operates under the name Internet Research Agency (Agentstvo Internet Issledovaniya – AII). Soon after it was established, the Agency was dubbed the "Ministry of Truth". After just one year, AII moved its headquarters to an office building at 55 Savushkina Street, where it remains to this day. We do not know much about the two other factories: neither hackers nor independent journalists have managed to infiltrate their servers so far.
The Internet Research Agency was established and has been funded by Yevgeny Prigozhin, whose career started in the catering industry in the 1990s. Prigozhin opened an elite restaurant, New Island, on a riverboat in St. Petersburg, where President Putin celebrated his 50th birthday in 2002. That is when Prigozhin met Putin – the acquaintance sparked a friendship between the two and accelerated Prigozhin's career.
Prigozhin became Putin's personal chef, and his companies took over catering services in the Kremlin; they also started providing food to Russian schools and military bases. The total value of the contracts Prigozhin signed with the state exceeded 140 billion RUB (ca. 4-5 billion USD).
In late summer 2013, the Internet Research Agency was infiltrated by independent Russian journalists. They were drawn in by a job advertisement seeking "internet operators" responsible for writing comments and social media posts and running blogs for 1,200 RUB (30 USD) per day, or 26,000 RUB (650 USD) per month.
In 2014 and 2015, new secrets leaked from the company, released by AII's former employees – some of whom had applied for the job in order to uncover and publish sensitive information, while others were simply dissatisfied with their work at AII.
A significant contribution to our knowledge of the mechanisms behind the Kremlin's propaganda was also made by Russian hackers from the Shaltai Boltai group, who stole and published the mailboxes of people from AII and connected companies. The main sources of information were the mailboxes of Olga Dzalba, AII's financial director, and of Aleman Consulting – a company that analyzed a number of local, national and international media outlets connected to AII. All those projects were financed from the budget of the Concord holding company, owned by Yevgeny Prigozhin. Reports on AII's activities within the respective projects were sent to Vyacheslav Volodin, the new first deputy chief of staff of the Presidential Administration of Russia, responsible for propaganda. This shows that, just as Surkov had before him, Volodin exercised direct control over the project.
The way AII functioned was not much different from the Federal Youth Agency described above. Both entities had the same objective: to manipulate the online image of Russia and the world so that internet users would see them the way the Kremlin wanted. Like FYA before it, AII also budgeted for cooperation with well-known Russian bloggers and journalists. What separates them is the scale of their actions: AII's structures are significantly larger and better organized, and apart from Russian speakers it also employs people specializing in other languages.
According to available data, Yevgeny Prigozhin spent 40 to 60 million RUB (roughly 1 million USD) a month on AII operations, almost half of which went on salaries. Initially the company employed 100 people, but the number grew each year; by 2014 AII already had 600 employees.
Most of the employees are students, working in round-the-clock shifts in the specific sections into which the editorial team is divided. To keep their work secret, employees from different sections have no contact with one another.
The "Comments" team consists of people whose job is to run a dozen or so Facebook, Twitter and VKontakte accounts each, as well as to maintain activity on various forums. They are required to write 100-125 pro-Kremlin comments a day. The profiles vary – they can "belong" to students, blue-collar workers or housewives, from large cities and the Russian provinces alike.
The "Bloggers" editorial team is made up of people writing at least three blogs each, with a required daily output of 10-12 posts. The secret "Fake News" editors rewrite and edit information to fit the pro-Kremlin narrative before it is spread further on AII-controlled websites and social media pages. There is also a "Visual" editorial team that produces propaganda YouTube videos and pro-Kremlin graphics. Over time, the editorial teams expanded and started hiring online employees to carry out similar tasks in English, German, French and Ukrainian. And just as at FYA, Internet Research Agency employees have to write detailed reports on their work in order to receive payment.
The released documents include sample reports. One writer listed all 182 comments he had made on a given day, split by the topics he commented on: the opposition, the Ministry of Defense, the USA, the EU, the Middle East, Vladimir Putin, and Ukraine.
A workday at AII begins only after employees receive their daily "technical tasks". These tasks are based on articles and news from Russia Today and refer to events that took place over the past few days or are expected to happen very soon. They usually cover 3 to 7 topics, though in exceptional cases there could be more: after the assassination of opposition activist Boris Nemtsov, trolls received as many as 26 tasks a day.
The first element of a task is the main idea about the event in question. On 28 February 2015 the idea was: "Nemtsov's assassination was not convenient for the Russian authorities, therefore it was an anti-Russian provocation". From that, the thesis was coined: it is possible that Ukrainian politicians were involved in Boris Nemtsov's death. The next step was to include in the technical task the key information AII writers should use to make the Kremlin's claim more credible – for example, that Nemtsov had friends in Ukraine, which is suspicious enough in itself, and so on.
The remaining parts of the task include links and articles carrying this information (including foreign materials, if their perspective is in line with the Kremlin's propaganda), as well as keywords that might be used to find the topic on Google or Yandex.
Trolls head West
Vyacheslav Volodin's projects proved so effective that the Kremlin decided to expand its propaganda activities to more languages and countries. According to Ilya Klishin, a Russian journalist specializing in the Kremlin's diversion operations, preparations for informational diversion directed at Russia's neighbors and Western countries began several months before Maidan and Russia's annexation of Crimea. The strategy was once again developed by Volodin, and foreign markets were targeted by "troll factories" such as the Internet Research Agency in St. Petersburg.
The first coordinated AII actions were directed at the Ukrainian internet. What made the task easier was that it could be carried out in Russian. AII created the Kharkov News Agency, which got its own budget and tasks. In the beginning, the Kharkov News Agency employed 21 people.
At the same time, in October 2013, AII created its first division outside Russia – in Simferopol, the capital of Crimea – hiring a group of ten-odd local subcontractors. It is not impossible that the Kharkov News Agency was one of the initiatives that later facilitated the annexation of Crimea. However, Ukrainian internet users quickly noticed the pro-Russian commentators online and figured out their connection to the troll factories.
Thanks to the Shaltai Boltai hackers, we also know about another AII-supervised international project, launched in April 2014. To disseminate Russian propaganda across the American internet, AII hired 25 employees. At first, they were required to characterize the American internet and develop an operations strategy. Their reports included such information as the demographic structure of major American social media platforms (e.g. Facebook and Twitter), the characteristics of groups supporting Barack Obama on Facebook, the most typical behaviors of American internet users, and the most popular Twitter hashtags. AII employees also described some of the key American news websites (The Blaze, The Huffington Post, Fox News, Politico, WorldNetDaily), detailing their owners, target groups and editorial policies, as well as their attitudes towards Barack Obama and towards Russia.
Politico was described as a news platform that allows sarcastic comments about America, while wnd.com was characterized as an outlet that accepts neither offending the USA and Americans nor overly strong criticism of Barack Obama. The Huffington Post was listed as difficult to post comments on, since comments had to be upvoted with likes – which led to the creation of 100 AII accounts just to operate on that platform.
The narrow scope of the analysis – only five news websites – was due to the experimental nature of the project, although AII expected it to develop over time. The initial budget was 75,000 USD per month.
Another AII project aimed to spread Russian propaganda on American Twitter, Facebook and blogs. Russian trolls would impersonate American citizens, such as David, a music-loving student and athlete from New York; Allan, a professional photographer and amateur cartographer from Chicago; or Arnelia, a Boston-born artist and dog lover.
The troll factory created hundreds of fake identities. Each had a hobby, posted photos online, and gradually expanded its network of contacts.
The plan for the first month on American Facebook was to buy 20 existing profiles on the black market – accounts with their own authentic history – and to create 50 brand-new ones. Trolls considered profiles with an elaborate backstory their basic work tool, used to spread Russian propaganda; the new profiles were supposed to support that group. Each participant in the American project controlled 6 Facebook profiles, as well as 10 Twitter accounts that would post up to 50 tweets a day.
The troll factory in St. Petersburg is still in operation, spreading Kremlin propaganda in Russian as well as in other languages. The information leaks of 2013-15 forced AII to improve its security measures, and today it is more difficult to find reliable information on its activities than it used to be.
Russian propaganda subcontractors
Apart from the Internet Research Agency, Vyacheslav Volodin developed other projects as well. For example, the Kremlin started seeking subcontractors all over the world in order to implement its propaganda messages more effectively. According to Russian journalists, subcontractors were found in countries such as Germany, India and Thailand – all with sizable Russian minorities.
This activity, however, was quickly picked up on in the West. The first to notice were the online moderators of The Guardian in 2014. According to Guardian journalists, Russian trolls concentrated their efforts on supporting the Kremlin's policies and on attacking Ukraine and the West. Their comments were remarkably formulaic, and it was clear they were not written by British readers. What is more, most of the accounts were created in the same timeframe – between February and April 2014. Pro-Russian comments would receive disproportionately large numbers of likes, which increased their visibility. German journalists also complained about waves of organized Kremlin support on local news websites.
Troll campaigns in Poland
The Kremlin's diversion strategy was also aimed at the Polish internet. The offensive was launched at the turn of 2013 and 2014.
Pro-Kremlin commenters on Polish news websites were the first to open the offensive. Their messages started to dominate the comment sections of the largest Polish news platforms, such as Onet, WP and Gazeta.pl. The objective was to impose the Russian version of the Ukrainian Euromaidan on Polish public opinion. The next step was an attempt to defend the Russian authorities and their policy towards Ukraine. In the following months, the trolls started to insist there were no Russian troops in Crimea or Donbas at all. Finally, they began to write about the legitimacy of Crimea's annexation and of the aggression in Ukraine's Donbas.
Again, those comments had one more common denominator besides their pro-Russian character: they were generic enough to suggest they were written in accordance with technical task outlines like those described above. They would also receive an overwhelming number of likes and positive responses. Over time, comments maintaining the pro-Kremlin narrative seeped through to other news platforms, repeating the same pattern. In March 2014, Newsweek Polska examined this phenomenon thoroughly, discovering that in 80% of cases the IP addresses and geolocation of pro-Kremlin commentators pointed to users writing from Germany, the USA, Switzerland or Greece. Troll factories have also been known for using anonymizing proxy servers that make it impossible to track a person's real location – one of their most typical strategies, in use since 2010.

Over time, the number of anti-EU, anti-NATO and anti-American comments started to grow, while the openly pro-Kremlin narrative, rejected by internet users in Poland, was gradually discontinued. Russian troll factories also modified their profile-naming strategies: in order to increase their influence over right-wing circles, many trolls started to hide behind conspicuously "patriotic" names.
On social media, Russian propaganda has been active and steadily intensifying since late 2013. The people controlling the Facebook profiles make a lot of mistakes, which makes it clear that the profile owners are not who they claim to be. The first conspicuous thing, when pro-Kremlin users initiated discussions about Russia or Ukraine, was their mediocre Polish and their use of Russianisms (such as frequently spelling words with "v" instead of "w" – the letter "v" is not used in the Polish alphabet). Their friends lists were also suspicious: they either had no friends at all or gathered friends from all over the world (India, Pakistan, Turkey, Brazil). There were no pictures of the supposed profile owner, and their walls were filled with clickbait links, music, and random photos and videos, sometimes shared from Russian websites.
Most of the ten-odd pro-Russian Facebook profiles slinging mud at Ukraine were created between 22 and 27 November 2013 – during the first days of Euromaidan. Over time, pro-Russian profiles evolved and developed their own strategies to make themselves harder to tell apart from real Polish users. The profiles were also taken over by people with a good command of Polish, to avoid language mistakes.
Profiles with long background stories were also put into action – most likely bought by AII, just as during the American project. Troll factories also started creating new profiles, but this time it was done systematically, and their creators made sure no connection between them could be traced. They started uploading photos of real people, even though some of the identities were stolen from the internet. Indian and Pakistani friends began to disappear from friends lists. Nowadays it is very difficult, if not impossible, to tell real and fake profiles apart.
Pro-Kremlin profiles were now tailored to specific socio-political groups, so as to be useful in any particular environment. Initially, most profiles were designed as people from small and medium-sized towns and cities; trolls profiled themselves as middle-aged people from provincial areas. The rise of radical-right tendencies in Poland, however, encouraged entire legions of commenters using intense patriotic symbolism while presenting pro-Russian opinions and information. Since 2016, another type of profile has gained popularity among pro-Russian commentators: elderly people who have seen a lot in life, which supposedly gives them the right to judge Putin's Russia (positively) and the EU, NATO or Ukraine (negatively).
Pro-Russian profiles have also taken to threatening those who actively hunt them. Troll hunters receive emails meant to discredit them, as well as threats of physical harm or even death.
Another segment of the Polish internet that has witnessed a sudden increase in pro-Kremlin narrative since the turn of 2013 and 2014 is neither social media nor the comment sections of websites, but websites themselves and the blogosphere. The most famous example is the unexpected pro-Russian about-face of the Eastern Europe news website Kresy.pl: during Euromaidan, the platform turned from anti-Kremlin to anti-Ukrainian and pro-Russian. A similar narrative was intensified on the far-right website xportal.pl. The largest group of websites and blogs incentivized to intensify their pro-Kremlin narrative were those long committed to conspiracy theories of all kinds, both Polish and global. Examples include: marucha.wordpress.com, zezorro.blogspot.com, mufti.polacy.eu.org, monitor-polski.pl, dakowski.pl, piotrbein.wordpress.com, eugeniuszsendecki.neon24.pl, rafzen.wordpress.com, alexjones.pl, wiernipolsce.wordpress.com, wirtualnapolonia.com, zapalowski.eu, wicipolskie.org, prawy.pl, and racjapolskiejlewicy.pl.
Another group useful to Russia consists of websites that used to present politically neutral information but suddenly became megaphones for Kremlin propaganda (for example: zmianynaziemi.pl, adziekan69.blogspot.com, kochanezdrowie.blogspot.com). Some blogs were started specifically to propagate pro-Russian narratives (such as zygmuntbialas.blog.pl, anty-news.waw.pl, tragediadonbasu.pl). Most of these blogs and websites disclose neither their sources of financing nor the authorship of their articles.
This rapid and unnatural increase in the activity of pro-Kremlin internet users was noticed by the Polish special services in 2014. The problem is mentioned in various documents, including the Report on the State of Cybersecurity in the Republic of Poland in 2014 by the Governmental Computer Security Incident Response Team, and the 2014 activity report of the Internal Security Agency.
Government analysts noted the increased activity of internet users in online debates on Russia and Ukraine, as well as the emergence of a new phenomenon: trolling. They recommended simply paying pro-Russian users no attention and avoiding responding to or engaging with them.
What is the purpose of all this effort? Why does Russia hire armies of online trolls? Their main objective might not necessarily be to convince us of the Russian vision of the world. It might simply be to flood the Polish internet with pro-Kremlin content in order to make us believe that this is the narrative that has won.
Central Europe’s leading English language investigative platform.