Intelligence
Research Scholar, Stanford Internet Observatory

Riana Pfefferkorn is a Research Scholar at the Stanford Internet Observatory. She investigates the U.S. and other governments' policies and practices for forcing decryption and/or influencing the security design of online platforms and services, devices, and products, both via technical means and through the courts and legislatures. Riana also studies novel forms of electronic surveillance and data access by U.S. law enforcement and their impact on civil liberties. 

Previously, Riana was the Associate Director of Surveillance and Cybersecurity at the Stanford Center for Internet and Society, where she remains an affiliate. Prior to joining Stanford, she was an associate in the Internet Strategy & Litigation group at the law firm of Wilson Sonsini Goodrich & Rosati, and a law clerk to the Honorable Bruce J. McGiverin of the U.S. District Court for the District of Puerto Rico. During law school, she interned for the Honorable Stephen Reinhardt of the U.S. Court of Appeals for the Ninth Circuit.

Riana has spoken at various legal and security conferences, including Black Hat and DEF CON's Crypto & Privacy Village. She is frequently quoted in the press, including the New York Times, the Washington Post, and NPR. Riana is a graduate of the University of Washington School of Law and Whitman College.

A complete list of publications and recent blog posts is available here.


Today the Stanford Internet Observatory published a white paper on GRU online influence operations from 2014 to 2019. The authors conducted this research at the request of the United States Senate Select Committee on Intelligence (SSCI) and began with a data set consisting of social media posts provided to the Committee by Facebook. Facebook attributed the Pages and posts in this data set to the Main Directorate of the General Staff of the Armed Forces of the Russian Federation (Главное управление Генерального штаба Вооружённых сил Российской Федерации), known as the GU or by its prior acronym, GRU, and removed the content in or before 2018. The data provided by Facebook to SSCI consisted of 28 folders, each corresponding to at least one unique Facebook Page. These Pages were in turn tied to discrete GRU-attributed operations. Some of these Pages and operations were significant; others were so minor that they scarcely had any data associated with them at all.

While some content related to these operations has been unearthed by investigative journalists, a substantial amount has not been seen by the public in the context of GRU attribution. The SIO white paper is intended to provide an overview of the GRU tactics used in these operations and to offer key takeaways about the distinct operational clusters observed in the data. Although the initial leads were provided by the Facebook data set, many of these Pages have ties to material that remains accessible on the broader internet, and we have attempted to aggregate and archive that broader expanse of data for public viewing and in service of further academic research.

Several key takeaways appear in the analysis:

  • Traditional narrative laundering operations updated for the internet age. Narrative laundering, the technique of moving a narrative from its state-run origins into the wider media ecosystem through aligned publications, “useful idiots,” and, perhaps, witting participants, is an active-measures tactic with a long history. In this white paper we show how narrative laundering has been updated for the social-media era. The GRU created think tanks and media outlets to serve as initial content drops, and fabricated personas (fake online identities) to serve as authors. A network of accounts additionally served as distributors, posting the content to platforms such as Twitter and Reddit. In this way, GRU-created content could travel from a GRU media property to an ideologically aligned, genuinely independent media website, to Facebook, to Reddit; each step distanced the content from its origins and reduced the skepticism a reader would otherwise attach to an unknown blog.

Image

The website for NBene Group, a GRU-attributed think tank. In one striking example of how this content can spread, an NBene Group piece about the annexation of Crimea was cited in an American military law journal article. 

  • The emergence of a two-pronged approach: narrative and memetic propaganda by different entities belonging to a single state actor. The GRU aimed to achieve influence by feeding its narratives into the wider mass-media ecosystem with the help of think tanks, affiliated websites, and fake personas. This strategy is distinct from that of the Internet Research Agency, which invested primarily in a social-first memetic (i.e., meme-based) approach to achieve influence, including ad purchases, direct engagement with users on social media, and content crafted specifically with virality in mind. Although the GRU conducted operations on Facebook, it either did not view maximizing social audience engagement as a priority or did not have the wherewithal to do so. Instead, it appears to have designed its operations to achieve influence in other ways.

  • A deeper understanding of hack-and-leak operations. GRU hack-and-leak operations are well known. This tactic, described in detail in the Mueller Report, had a particularly remarkable impact on the 2016 U.S. election, but the GRU conducted other hack-and-leak operations between 2014 and 2019 as well. One salient characteristic of this tactic is the need for a second party (such as WikiLeaks) to spread the results of a hack-and-leak operation, since leaking hacked documents is not effective without an audience. In this white paper we analyze the GRU’s methods for disseminating the results of its hack-and-leak operations. While its attempts to do so through its own social media accounts were generally ineffective, it did have success in generating media attention (including on RT), which led in turn to wider coverage of the results of these operations. Fancy Bear’s own Facebook posts about its hack-and-leak attack on the World Anti-Doping Agency (WADA), for example, received relatively little engagement, but write-ups in Wired and The Guardian ensured that its operations got wider attention.

Some of the most noteworthy operations we analyze in this white paper include:

  • Inside Syria Media Center (ISMC), a media entity created as part of the Russian government’s multifarious influence operation in support of Syrian President Bashar al-Assad. Although ISMC claimed to be “[c]ollecting information about the Syrian conflict from ground-level sources,” its actual function was to boost Assad and discredit Western forces and allies, including the White Helmets. Our analysis of the ISMC Facebook Page shows exceptionally low engagement (across 5,367 posts, the average engagement was 0.1 Likes per post), but ISMC articles achieved wider attention when its six author personas reposted them on other sites. We counted 142 unique domains that reposted ISMC articles; a sketch of this kind of aggregation appears after this list. This process happened quickly: a single article could be reposted on many alternative media sites within days of initial publication on the ISMC website. We observe that, while both Internet Research Agency (IRA) and GRU operations covered Syria, the IRA only rarely linked to the ISMC website.

Image

The Quora profile for Sophie Mangal, one of the personas that authored and distributed ISMC content.

 

  • APT28, also known as Fancy Bear, a cyber-espionage group identified by the Special Counsel investigation as GRU Units 26165 and 74455. This entity has conducted cyber attacks in connection with a number of Russian strategic objectives, including, most famously, the DNC hack of 2016. The Facebook data set provided to SSCI included multiple Pages related to hacking operations, including DCLeaks and Fancy Bears Hack Team, a sports-related Page. This activity included a hack-and-leak attack on WADA, almost certainly in retaliation for WADA’s recommendation that the International Olympic Committee ban the Russian team from the 2016 Olympics in Rio de Janeiro. The documents leaked (and, according to WADA, altered) by Fancy Bears purported to show that athletes from EU countries and the US were cheating by receiving spurious therapeutic use exemptions. Our analysis of these Pages looks at their sparse engagement on social platforms and the stark contrast with the substantial coverage in the mainstream press. It also notes the boosting of such operations by Russian state-linked Twitter accounts, RT, and Sputnik.

  • CyberBerkut, Committee of Soldiers’ Mothers of Ukraine, and “For an Exit from Ukraine,” a network of Pages targeting Ukraine, a country that has been subject to an aggressive Russian-government disinformation campaign since the Euromaidan revolution in 2014. Our investigation of these Pages highlights the degree to which apparently conflicting messages can be harnessed together in support of a single overarching objective. (This also suggests a parallel with the tactics of the IRA, which frequently boosted groups on opposite sides of contentious issues.) Among the multiple, diverging operational vectors we analyzed were attempts to sow disinformation intended to delegitimize the government in Kyiv; to leverage a Ukrainian civil-society group to undermine public confidence in the army; and to convince Ukrainians that their country was “without a future” and that they were better off emigrating to Poland. While the Pages we analyzed worked with disparate themes, their content was consistently aimed at undermining the government in Kyiv and aggravating tensions between Eastern and Western Ukraine.
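As a brief methodological aside, the engagement and reposting figures cited in the ISMC analysis above reduce to simple aggregation over post records. The following Python sketch illustrates that kind of computation; the records, field names (page, likes, link), and domains shown are hypothetical and do not reflect the actual structure of the Facebook data set.

```python
from urllib.parse import urlparse

# Hypothetical post records; the actual Facebook data set provided to
# SSCI is structured differently. Each record carries an engagement
# count and any outbound link found in the post.
posts = [
    {"page": "Inside Syria Media Center", "likes": 0, "link": "http://altmedia-example-1.com/article"},
    {"page": "Inside Syria Media Center", "likes": 1, "link": None},
    {"page": "Inside Syria Media Center", "likes": 0, "link": "http://altmedia-example-2.net/repost"},
]

# Average engagement per post (the white paper reports 0.1 Likes per
# post across the 5,367 ISMC posts).
avg_likes = sum(p["likes"] for p in posts) / len(posts)

# Unique domains that reposted the content (the white paper counts 142).
reposting_domains = {urlparse(p["link"]).netloc for p in posts if p["link"]}

print(f"average Likes per post: {avg_likes:.2f}")
print(f"unique reposting domains: {len(reposting_domains)}")
```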

Considered as a whole, the data provided by Facebook, along with the larger online network of websites and accounts to which these Pages are connected, reveal a large, multifaceted operation set up with the aim of artificially boosting narratives favorable to the Russian state and disparaging Russia’s rivals. Over a period when Russia was engaged in a wide range of geopolitical and cultural conflicts, including Ukraine, MH17, Syria, the Skripal affair, the Olympics ban, and NATO expansion, the GRU turned to active measures to try to make the narrative playing field more favorable. These active measures included social-media tactics that were repeatedly deployed but seldom successful when executed by the GRU. When the tactics were successful, it was typically because they exploited mainstream media outlets; leveraged purportedly independent alternative media that act, at best, as uncritical recipients of contributed pieces; and used fake authors and fake grassroots amplifiers to articulate and distribute the state’s point of view. Given that many of these tactics are analogs of those used in Cold War influence operations, it seems certain that they will continue to be refined and updated for the internet era, and they are likely to be used to greater effect.


The linked white paper and its conclusions are in part based on the analysis of social-media content that was provided to the authors by the Senate Select Committee on Intelligence under the auspices of the Committee’s Technical Advisory Group, whose Members serve to provide substantive technical and expert advice on topics of importance to ongoing Committee activity and oversight. The findings, interpretations, and conclusions presented herein are those of the authors, and do not necessarily represent the views of the Senate Select Committee on Intelligence or its Membership.

 

Renée DiResta is the Research Manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social networks, and assists policymakers in understanding and responding to the problem. She has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience conspiracies, terrorism, and state-sponsored information warfare.

You can see a full list of Renée's writing and speeches on her website, www.reneediresta.com, or follow her on Twitter @noupside.

 

Research Manager, Stanford Internet Observatory

The House Permanent Select Committee on Intelligence held a public hearing on Thursday, March 28, 2019, as part of its investigation into Russian influence during and after the 2016 election campaign.

The hearing, “Putin’s Playbook: The Kremlin’s Use of Oligarchs, Money and Intelligence in 2016 and Beyond,” included testimony by Michael McFaul, former U.S. Ambassador to Russia and Director of the Freeman Spogli Institute at Stanford University.


Download Complete Testimony (PDF 263 KB)

EXCERPT

To contain and thwart the malicious effects of “Putinism,” the United States government and the American people must first understand the nature of the threat. This testimony focuses on the nexus of political and economic power within Russia under Putin’s leadership, and how these domestic practices can be used abroad to advance Putin’s foreign policy agenda. Moreover, it is important to underscore that crony capitalism, property rights provided by the state, bribery, and corruption constitute only a few of many different mechanisms used by Putin in his domestic authority and foreign policy abroad.

This testimony proceeds in three parts. Section I describes the evolution of Putin’s system of government at home, focusing in particular on the relationship between the state and big business. Section II illustrates how Putin seeks to export his ideas and practices abroad. Section III focuses on Putin’s specific foreign policy objective of lifting sanctions on Russian individuals and companies.

Watch the C-SPAN recording of the testimony


Media Contact: Ari Chasnoff, Assistant Director for Communications, 650-725-2371, chasnoff@stanford.edu


Alex Stamos is a cybersecurity expert, business leader, and entrepreneur working to improve the security and safety of the Internet through his teaching and research at Stanford University. Stamos is the director of the Stanford Internet Observatory at the Cyber Policy Center, a part of the Freeman Spogli Institute for International Studies, where he is also a research scholar.

Prior to joining Stanford, Alex served as the Chief Security Officer of Facebook. In this role, Stamos led a team of engineers, researchers, investigators and analysts charged with understanding and mitigating information security risks to the company and safety risks to the 2.5 billion people on Facebook, Instagram and WhatsApp. During his time at Facebook, he led the company’s investigation into manipulation of the 2016 US election and helped pioneer several successful protections against these new classes of abuse. As a senior executive, Alex represented Facebook and Silicon Valley to regulators, lawmakers and civil society on six continents, and has served as a bridge between the interests of the Internet policy community and the complicated reality of platforms operating at billion-user scale. In April 2017, he co-authored “Information Operations and Facebook”, a highly cited examination of the influence campaign against the US election, which still stands as the most thorough description of the issue by a major technology company.

Before joining Facebook, Alex was the Chief Information Security Officer at Yahoo, rebuilding a storied security team while dealing with multiple assaults by nation-state actors. While at Yahoo, he led the company’s response to the Snowden disclosures by implementing massive cryptographic improvements in his first months. He also represented the company in an open hearing of the US Senate’s Permanent Subcommittee on Investigations.

In 2004, Alex co-founded iSEC Partners, an elite security consultancy known for groundbreaking work in secure software development and embedded and mobile security. As a trusted partner to the world’s largest technology firms, Alex coordinated the response to the “Aurora” attacks by the People’s Liberation Army at multiple Silicon Valley firms and led pioneering work securing the world’s largest desktop and mobile platforms. During this time, he also served as an expert witness in several notable civil and criminal cases, such as the Google Street View incident, and did pro bono work for the defendants in Sony v. George Hotz and US v. Aaron Swartz. After the 2010 acquisition of iSEC Partners by NCC Group, Alex formed an experimental R&D division at the combined company, producing five patents.

A noted speaker and writer, he has appeared at the Munich Security Conference, NATO CyCon, Web Summit, DEF CON, CanSecWest and numerous other events. His 2017 keynote at Black Hat was noted for its call for a security industry more representative of the diverse people it serves and the actual risks they face. Throughout his career, Alex has worked toward making security a more representative field and has highlighted the work of diverse technologists as an organizer of the Trustworthy Technology Conference and OURSA.

Alex has been involved with securing the US election system as a contributor to Harvard’s Defending Digital Democracy Project, and is active in the academic community as an advisor to Stanford’s Cybersecurity Policy Program and UC Berkeley’s Center for Long-Term Cybersecurity. He is a member of the Aspen Institute’s Cyber Security Task Force, the Bay Area CSO Council, and the Council on Foreign Relations. Alex also serves on the advisory board of the NATO Cooperative Cyber Defence Centre of Excellence in Tallinn, Estonia.

Stamos worked under Prof. David Patterson while earning a BS in Electrical Engineering and Computer Science at UC Berkeley. He lives in the Bay Area with his wife and three children.

Director, Stanford Internet Observatory

Through the Hack the Pentagon program, the Department of Defense (DoD) asked Synack to look for vulnerabilities left undetected by traditional security solutions in one of its highly complex and sensitive systems. The DoD pushed the limits of security testing beyond those of most enterprises, and the results were surprising. Hear from Synack CEO Jay Kaplan how the government can benefit from bug bounty programs, what Hack the Pentagon revealed about DoD security, and why more and more organizations are employing red team penetration testing.

Jay Kaplan co-founded Synack after serving in several security-related capacities at the Department of Defense, including the DoD’s Incident Response and Red Team. Prior to founding Synack, Jay was a Senior Cyber Analyst at the National Security Agency (NSA), where his focus was supporting counterterrorism-related intelligence operations. Jay received a BS in Computer Science with a focus in Information Assurance and an MS in Engineering Management from George Washington University, studying under a DoD/NSA-sponsored fellowship. Jay holds a number of security certifications from ISC(2) and GIAC.

Encina Hall, E008 (garden level)

Jay Kaplan, CEO, Synack

Are you interested in cybersecurity? Have you wanted to learn offensive cyber techniques but don't know where to get started? The Applied Cybersecurity team is hosting an introductory workshop to help people start practicing exploitation and offensive cyber techniques in an ethical setting. In particular, we will focus on gaining familiarity with techniques used for competing in Capture the Flag (CTF) competitions. We'll be hosting the first workshop this Friday, in preparation for the HITCON CTF next week. Bring a laptop! This workshop assumes no prior experience with hacking or cybersecurity, so please attend regardless of how unfamiliar you are with the topic. For this workshop, we will focus on web vulnerabilities, binary reversing, and some basic cryptography challenges. Note that experience equivalent to CS107 will be useful. Food will be provided! RSVP here: https://goo.gl/forms/M5yzuQasIZpL4Ovy1
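To give a flavor of the kind of beginner cryptography challenge CTFs often include, here is a minimal single-byte XOR exercise in Python. This sketch is illustrative only and is not taken from the workshop materials; the plaintext, key, and crude letter-frequency scoring are our own choices.

```python
# A classic CTF warm-up: single-byte XOR. We encrypt a known plaintext,
# then "solve" the challenge by trying all 256 keys and scoring how
# English-like each candidate decryption looks.
KEY = 0x42  # hypothetical key, unknown to the solver
ciphertext = bytes(b ^ KEY for b in b"attack at dawn: the flag is hidden in plain sight")

def score(candidate: bytes) -> int:
    # Crude heuristic: count bytes that are common English letters or spaces.
    return sum(1 for c in candidate if c in b"etaoin shrdluETAOINSHRDLU")

# Try every possible single-byte key and keep the best-scoring one.
best_key = max(range(256), key=lambda k: score(bytes(b ^ k for b in ciphertext)))
recovered = bytes(b ^ best_key for b in ciphertext)
print(f"recovered key {best_key:#04x}: {recovered!r}")
```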

Shriram 366
