Stanford Internet Observatory

Today the Stanford Internet Observatory published a white paper on GRU online influence operations from 2014 to 2019. The authors conducted this research at the request of the United States Senate Select Committee on Intelligence (SSCI), beginning with a data set of social media posts provided to the Committee by Facebook. Facebook attributed the Pages and posts in this data set to the Main Directorate of the General Staff of the Armed Forces of the Russian Federation (Главное управление Генерального штаба Вооружённых сил Российской Федерации), known as the GU or by its prior acronym, GRU, and removed the content in or before 2018. The data Facebook provided to SSCI consisted of 28 folders, each corresponding to at least one unique Facebook Page; these Pages were in turn tied to discrete GRU-attributed operations. Some of these Pages and operations were significant; others were so minor they had scarcely any data associated with them at all.

While investigative journalists have unearthed some content related to these operations, a substantial amount has not been seen by the public in the context of GRU attribution. The SIO white paper is intended to provide an overview of the GRU tactics used in these operations and to offer key takeaways about the distinct operational clusters observed in the data. Although the initial leads came from the Facebook data set, many of these Pages have ties to material that remains accessible on the broader internet, and we have attempted to aggregate and archive that broader expanse of data for public viewing and in service of further academic research.

Several key takeaways appear in the analysis:

  • Traditional narrative laundering operations updated for the internet age. Narrative laundering – the technique of moving a narrative from its state-run origins into the wider media ecosystem through aligned publications, "useful idiots," and perhaps witting participants – is an "active measures" tactic with a long history. In this white paper we show how narrative laundering has been updated for the social-media era. The GRU created think tanks and media outlets to serve as initial content drops, and fabricated personas (fake online identities) to serve as authors. A network of accounts additionally served as distributors, posting the content to platforms such as Twitter and Reddit. In this way, GRU-created content could make its way from a GRU media property to an ideologically aligned, genuinely independent media website to Facebook to Reddit – a process designed to reduce skepticism about the original, unknown source.


The website for NBene Group, a GRU-attributed think tank. In one striking example of how this content can spread, an NBene Group piece about the annexation of Crimea was cited in an American military law journal article. 

  • The emergence of a two-pronged approach: narrative and memetic propaganda conducted by different entities belonging to a single state actor. The GRU aimed to achieve influence by feeding its narratives into the wider mass-media ecosystem with the help of think tanks, affiliated websites, and fake personas. This strategy is distinct from that of the Internet Research Agency, which invested primarily in a social-first, memetic (i.e., meme-based) approach to achieve influence, including ad purchases, direct engagement with users on social media, and content crafted specifically with virality in mind. Although the GRU conducted operations on Facebook, it either did not view maximizing social audience engagement as a priority or did not have the wherewithal to do so. On the contrary, it appears to have designed its operations to achieve influence in other ways.

  • A deeper understanding of hack-and-leak operations. GRU hack-and-leak operations are well known. This tactic, described in detail in the Mueller Report, had a particularly remarkable impact on the 2016 U.S. election, but the GRU conducted other hack-and-leak operations between 2014 and 2019 as well. One salient characteristic of this tactic is the need for a second party (such as WikiLeaks) to spread the results of a hack-and-leak operation, since leaking hacked documents is not effective without an audience. In this white paper we analyze the GRU's methods for disseminating the results of its hack-and-leak operations. While its attempts to do so through its own social media accounts were generally ineffective, it did have success in generating media attention (including on RT), which led in turn to wider coverage of the results of these operations. Fancy Bear's own Facebook posts about its hack-and-leak attack on the World Anti-Doping Agency (WADA), for example, received relatively little engagement, but write-ups in Wired and The Guardian ensured that its operations got wider attention.

Some of the most noteworthy operations we analyze in this white paper include:

  • Inside Syria Media Center (ISMC), a media entity created as part of the Russian government's multifarious influence operation in support of Syrian President Bashar al-Assad. Although ISMC claimed to be "[c]ollecting information about the Syrian conflict from ground-level sources," its actual function was to boost Assad and discredit Western forces and allies, including the White Helmets. Our analysis of the ISMC Facebook Page shows exceptionally low engagement – across 5,367 posts the average was 0.1 Likes per post – but ISMC articles achieved wider attention when its six author personas reposted them on other sites. We counted 142 unique domains that reposted ISMC articles, and the process happened quickly: a single article could be reposted on many alternative media sites within days of initial publication on the ISMC website. We observe that, while both Internet Research Agency (IRA) and GRU operations covered Syria, the IRA only rarely linked to the ISMC website.


The Quora profile for Sophie Mangal, one of the personas that authored and distributed ISMC content.


  • APT28, also known as Fancy Bear, a cyber-espionage group identified by the Special Counsel investigation as GRU Units 26165 and 74455. This entity has conducted cyber attacks in connection with a number of Russian strategic objectives, including, most famously, the DNC hack of 2016. The Facebook data set provided to SSCI included multiple Pages related to hacking operations, including DCLeaks and Fancy Bears Hack Team, a sports-related Page. This activity included a hack-and-leak attack on WADA, almost certainly in retaliation for WADA's recommendation that the International Olympic Committee ban the Russian team from the 2016 Olympics in Rio de Janeiro. The documents leaked (and, according to WADA, altered) by Fancy Bears purported to show that athletes from EU countries and the US were cheating by receiving spurious therapeutic use exemptions. Our analysis of these Pages examines their sparse engagement on social platforms, in stark contrast to the substantial coverage they received in the mainstream press, and notes the boosting of such operations by Russian state-linked Twitter accounts, RT, and Sputnik.

  • CyberBerkut, Committee of Soldiers’ Mothers of Ukraine, and “For an Exit from Ukraine,” a network of Pages targeting Ukraine, which has been subjected to an aggressive disinformation campaign by the Russian government since the Euromaidan revolution in 2014. Our investigation of these Pages highlights the degree to which apparently conflicting messages can be harnessed together in support of a single overarching objective. (This also suggests a parallel with the tactics of the IRA, which frequently boosted groups on opposite sides of contentious issues.) Among the multiple, diverging operational vectors we analyzed were attempts to sow disinformation intended to delegitimize the government in Kyiv; to leverage a Ukrainian civil-society group to undermine public confidence in the army; and to convince Ukrainians that their country was “without a future” and that they were better off emigrating to Poland. While the Pages we analyzed worked with disparate themes, their content was consistently aimed at undermining the government in Kyiv and aggravating tensions between Eastern and Western Ukraine.

Considered as a whole, the data provided by Facebook – along with the larger online network of websites and accounts to which these Pages are connected – reveal a large, multifaceted operation set up with the aim of artificially boosting narratives favorable to the Russian state and disparaging Russia’s rivals. Over a period when Russia was engaged in a wide range of geopolitical and cultural conflicts – including Ukraine, MH17, Syria, the Skripal affair, the Olympics ban, and NATO expansion – the GRU turned to active measures to try to make the narrative playing field more favorable. These active measures included social-media tactics that the GRU deployed repeatedly but that seldom succeeded. When the tactics were successful, it was typically because they exploited mainstream media outlets; leveraged purportedly independent alternative media that act, at best, as uncritical recipients of contributed pieces; and used fake authors and fake grassroots amplifiers to articulate and distribute the state’s point of view. Given that many of these tactics are analogs of those used in Cold War influence operations, it seems certain that they will continue to be refined and updated for the internet era, and they are likely to be used to greater effect.


The linked white paper and its conclusions are in part based on the analysis of social-media content that was provided to the authors by the Senate Select Committee on Intelligence under the auspices of the Committee’s Technical Advisory Group, whose Members serve to provide substantive technical and expert advice on topics of importance to ongoing Committee activity and oversight. The findings, interpretations, and conclusions presented herein are those of the authors, and do not necessarily represent the views of the Senate Select Committee on Intelligence or its Membership.


The Freeman Spogli Institute for International Studies, the Center for International Security and Cooperation, and the Hoover Institution are honored to co-sponsor the 2015 Drell Lecture with The Honorable Ashton B. Carter, 25th U.S. Secretary of Defense, who will speak on "Rewiring the Pentagon: Charting a New Path on Innovation and Cybersecurity." The event will include welcoming remarks by Stanford University President John Hennessy. The talk will be followed by a Q&A session with Carter moderated by Amy Zegart, co-director of CISAC and senior fellow at the Hoover Institution. Questions will be collected from the audience as well as from Twitter, using the hashtag #SecDefAtStanford.




Speaker's Biography: Secretary Carter was the 2014-2015 Payne Distinguished Visitor at the Freeman Spogli Institute for International Studies until he left upon his nomination by the White House. Ash Carter served in numerous roles in the Department of Defense before becoming the twenty-fifth Secretary of Defense under President Obama.



Cemex Auditorium

655 Knight Way

Stanford University

Speaker: Ashton Carter, 25th United States Secretary of Defense, United States Department of Defense
Former SK Center Fellow at the Freeman Spogli Institute for International Studies

Yong Suk Lee was the SK Center Fellow at the Freeman Spogli Institute for International Studies and Deputy Director of the Korea Program at the Walter H. Shorenstein Asia-Pacific Research Center at Stanford University. He served in these roles until June 2021.

Lee’s main fields of research are labor economics, technology and entrepreneurship, and urban economics. Some of the issues he has studied include technology and labor markets, entrepreneurship and economic growth, entrepreneurship education, and education and inequality. He is also interested in both the North and South Korean economies and has examined how economic sanctions affect economic activity in North Korea, and how management practices and education policy affect inequality in South Korea. His current research focuses on how the new wave of digital technologies, such as robotics and artificial intelligence, affects labor, education, entrepreneurship, and productivity.

His research has been published in both economics and management journals including the Journal of Urban Economics, Journal of Economic Geography, Journal of Business Venturing, Journal of Health Economics, and Labour Economics. Lee also regularly contributes to policy reports and opinion pieces on contemporary issues surrounding both North and South Korea.

Prior to joining Stanford, Lee was an assistant professor of economics at Williams College in Massachusetts. He received his Ph.D. in Economics from Brown University, a Master of Public Policy from Duke University, and bachelor's and master's degrees in architecture from Seoul National University. Lee also worked as a real estate development consultant and architectural designer as he transitioned from architecture to economics.

While at APARC, Dr. Lee led and participated in several research projects, including Stanford-Asia Pacific Innovation; Digital Technologies and the Labor Market; Entrepreneurship, Technology, and Economic Development; The Impact of Robotics on Nursing Home Care in Japan; Education and Development in the Digital Economy; and New Media and Political Economy.

Former Deputy Director of the Korea Program at Shorenstein APARC

Stanford University
Encina Hall, C236
Stanford, CA 94305-6165

Senior Research Scholar at the Center for International Security and Cooperation
Hank J. Holland Fellow in Cyber Policy and Security, Hoover Institution

Dr. Herb Lin is senior research scholar for cyber policy and security at the Center for International Security and Cooperation and Hank J. Holland Fellow in Cyber Policy and Security at the Hoover Institution, both at Stanford University. His research interests relate broadly to the policy dimensions of cybersecurity and cyberspace; he is particularly interested in the use of offensive operations in cyberspace as instruments of national policy and in the national security dimensions of information warfare and influence operations. In addition to his positions at Stanford University, he is Chief Scientist, Emeritus, for the Computer Science and Telecommunications Board of the National Research Council (NRC) of the National Academies, where he served from 1990 through 2014 as study director of major projects on public policy and information technology; Adjunct Senior Research Scholar and Senior Fellow in Cybersecurity (not in residence) at the Saltzman Institute for War and Peace Studies in the School for International and Public Affairs at Columbia University; and a member of the Science and Security Board of the Bulletin of the Atomic Scientists. In 2016, he served on President Obama's Commission on Enhancing National Cybersecurity. Prior to his NRC service, he was a professional staff member and staff scientist for the House Armed Services Committee (1986-1990), where his portfolio included defense policy and arms control issues. He received his doctorate in physics from MIT.

Avocationally, he is a longtime folk and swing dancer and a lousy magician. Apart from his work on cyberspace and cybersecurity, he has published in cognitive science, science education, biophysics, and arms control and defense policy. He also consults on K-12 math and science education.
