Information Technology

Today the Stanford Internet Observatory published a white paper on GRU online influence operations from 2014 to 2019. The authors conducted this research at the request of the United States Senate Select Committee on Intelligence (SSCI) and began with a data set of social media posts provided to the Committee by Facebook. Facebook attributed the Pages and posts in this data set to the Main Directorate of the General Staff of the Armed Forces of the Russian Federation (Главное управление Генерального штаба Вооружённых сил Российской Федерации), known as the GU or by its prior acronym, GRU, and removed the content in or before 2018. The data provided to SSCI consisted of 28 folders, each corresponding to at least one unique Facebook Page. These Pages were in turn tied to discrete GRU-attributed operations. Some of these Pages and operations were significant; others were so minor that they scarcely had any data associated with them at all.

While some content related to these operations has been unearthed by investigative journalists, a substantial amount has not been seen by the public in the context of GRU attribution. The SIO white paper is intended to provide an overview of the GRU tactics used in these operations and to offer key takeaways about the distinct operational clusters observed in the data. Although the initial leads came from the Facebook data set, many of these Pages have ties to material that remains accessible on the broader internet, and we have attempted to aggregate and archive that broader expanse of data for public viewing and in the service of further academic research.

Several key takeaways appear in the analysis:

  • Traditional narrative laundering operations updated for the internet age. Narrative laundering – the technique of moving a certain narrative from its state-run origins to the wider media ecosystem through the use of aligned publications, “useful idiots,” and, perhaps, witting participants – is an “active measures” tactic with a long history. In this white paper we show how narrative laundering has been updated for the social-media era. The GRU created think tanks and media outlets to serve as initial content drops, and fabricated personas — fake online identities — to serve as authors. A network of accounts additionally served as distributors, posting the content to platforms such as Twitter and Reddit. In this way, GRU-created content could make its way from a GRU media property to an ideologically aligned, genuinely independent media website to Facebook to Reddit — a process designed to reduce skepticism toward the original, unknown blog.

Image


The website for NBene Group, a GRU-attributed think tank. In one striking example of how this content can spread, an NBene Group piece about the annexation of Crimea was cited in an American military law journal article. 

  • The emergence of a two-pronged approach: narrative and memetic propaganda by different entities belonging to a single state actor. The GRU aimed to achieve influence by feeding its narratives into the wider mass-media ecosystem with the help of think tanks, affiliated websites, and fake personas. This strategy is distinct from that of the Internet Research Agency, which invested primarily in a social-first memetic (i.e., meme-based) approach to achieve influence, including ad purchases, direct engagement with users on social media, and content crafted specifically with virality in mind. Although the GRU conducted operations on Facebook, it either did not view maximizing social audience engagement as a priority or did not have the wherewithal to do so. Rather, it appears to have designed its operations to achieve influence in other ways.

  • A deeper understanding of hack-and-leak operations. GRU hack-and-leak operations are well known. This tactic — described in detail in the Mueller Report — had a particularly remarkable impact on the 2016 U.S. election, but the GRU conducted other hack-and-leak operations between 2014 and 2019 as well. One of the salient characteristics of this tactic is the need for a second party (such as WikiLeaks) to spread the results of a hack, since leaking hacked documents is not effective without an audience. In this white paper we analyze the GRU’s methods for disseminating the results of its hack-and-leak operations. While its attempts to do so through its own social media accounts were generally ineffective, it did have success in generating media attention (including on RT), which led in turn to wider coverage of the results of these operations. Fancy Bear’s own Facebook posts about its hack-and-leak attack on the World Anti-Doping Agency (WADA), for example, received relatively little engagement, but write-ups in Wired and The Guardian ensured that its operations got wider attention.

Some of the most noteworthy operations we analyze in this white paper include:

  • Inside Syria Media Center (ISMC), a media entity created as part of the Russian government’s multifarious influence operation in support of Syrian President Bashar al-Assad. Although ISMC claimed to be “[c]ollecting information about the Syrian conflict from ground-level sources,” its actual function was to boost Assad and discredit Western forces and allies, including the White Helmets. Our analysis of the ISMC Facebook Page shows exceptionally low engagement — across 5,367 posts the average engagement was 0.1 Likes per post — but ISMC articles achieved wider attention when its six author personas reposted them on other sites. We counted 142 unique domains that reposted ISMC articles; a brief sketch of how such engagement figures can be tallied appears after this list. This process happened quickly: a single article could be reposted on many alternative media sites within days of its initial publication on the ISMC website. We observe that, while both Internet Research Agency (IRA) and GRU operations covered Syria, the IRA only rarely linked to the ISMC website.

Image


The Quora profile for Sophie Mangal, one of the personas that authored and distributed ISMC content.

 

  • APT-28, also known as Fancy Bear, is a cyber-espionage group identified by the Special Counsel Investigation as GRU Units 26165 and 74455. This entity has conducted cyber attacks in connection with a number of Russian strategic objectives, including, most famously, the DNC hack of 2016. The Facebook data set provided to SSCI included multiple Pages related to hacking operations, including DCLeaks and Fancy Bears Hack Team, a sports-related Page. This activity included a hack-and-leak attack on WADA, almost certainly in retaliation for WADA’s recommendation that the International Olympic Committee ban the Russian team from the 2016 Olympics in Rio de Janeiro. The documents leaked (and, according to WADA, altered) by Fancy Bears purported to show that athletes from EU countries and the US were cheating by receiving spurious therapeutic use exemptions. Our analysis of these Pages examines their sparse engagement on social platforms and the stark contrast with the substantial coverage they received in the mainstream press. It also notes the boosting of these operations by Russian state-linked Twitter accounts, RT, and Sputnik.

  • CyberBerkut, Committee of Soldiers’ Mothers of Ukraine, and “For an Exit from Ukraine,” a network of Pages targeting Ukraine, which has been subjected to an aggressive disinformation campaign by the Russian government since the Euromaidan revolution in 2014. Our investigation of these Pages highlights the degree to which apparently conflicting messages can be harnessed together in support of a single overarching objective. (This also suggests a parallel with the tactics of the IRA, which frequently boosted groups on opposite sides of contentious issues.) Among the multiple, diverging operational vectors we analyzed were attempts to sow disinformation intended to delegitimize the government in Kyiv; to leverage a Ukrainian civil-society group to undermine public confidence in the army; and to convince Ukrainians that their country was “without a future” and that they were better off emigrating to Poland. While the Pages we analyzed worked with disparate themes, their content was consistently aimed at undermining the government in Kyiv and aggravating tensions between Eastern and Western Ukraine.
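
To make the ISMC engagement figures above concrete, the following is a minimal sketch, in Python, of how average Likes per post and the count of unique reposting domains might be tallied. The file names, the CSV schema, and the list of repost URLs are illustrative assumptions made for this sketch; they are not artifacts from the data set described in the white paper.

    import csv
    from urllib.parse import urlparse

    def average_likes(posts_csv):
        """Mean Likes per post for a CSV export with a 'likes' column (hypothetical schema)."""
        with open(posts_csv, newline="", encoding="utf-8") as f:
            likes = [int(row["likes"]) for row in csv.DictReader(f)]
        return sum(likes) / len(likes) if likes else 0.0

    def unique_repost_domains(urls_file):
        """Distinct domains among repost URLs, one URL per line (hypothetical input file)."""
        with open(urls_file, encoding="utf-8") as f:
            return {urlparse(line.strip()).netloc.lower() for line in f if line.strip()}

    if __name__ == "__main__":
        # File names below are placeholders for illustration only.
        print(f"Average Likes per post: {average_likes('ismc_facebook_posts.csv'):.1f}")
        print(f"Unique reposting domains: {len(unique_repost_domains('ismc_repost_urls.txt'))}")

A tally of this kind, applied to the Page data and open-web reposts described above, is what yields figures such as 0.1 Likes per post across 5,367 posts and 142 unique reposting domains.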

Considered as a whole, the data provided by Facebook — along with the larger online network of websites and accounts to which these Pages are connected — reveal a large, multifaceted operation set up with the aim of artificially boosting narratives favorable to the Russian state and disparaging Russia’s rivals. Over a period when Russia was engaged in a wide range of geopolitical and cultural conflicts, including Ukraine, MH17, Syria, the Skripal Affair, the Olympics ban, and NATO expansion, the GRU turned to active measures to try to make the narrative playing field more favorable. These active measures included social-media tactics that the GRU deployed repeatedly but that seldom succeeded. When the tactics were successful, it was typically because they exploited mainstream media outlets; leveraged purportedly independent alternative media that act, at best, as uncritical recipients of contributed pieces; and used fake authors and fake grassroots amplifiers to articulate and distribute the state’s point of view. Given that many of these tactics are analogs of those used in Cold War influence operations, it seems certain that they will continue to be refined and updated for the internet era, and they are likely to be used to greater effect.


The linked white paper and its conclusions are in part based on the analysis of social-media content that was provided to the authors by the Senate Select Committee on Intelligence under the auspices of the Committee’s Technical Advisory Group, whose Members serve to provide substantive technical and expert advice on topics of importance to ongoing Committee activity and oversight. The findings, interpretations, and conclusions presented herein are those of the authors, and do not necessarily represent the views of the Senate Select Committee on Intelligence or its Membership.

 

Renée DiResta is the former Research Manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social networks, and assists policymakers in understanding and responding to the problem. She has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience conspiracies, terrorism, and state-sponsored information warfare.

You can see a full list of Renée's writing and speeches on her website, www.reneediresta.com, or follow her @noupside.

 

Former Research Manager, Stanford Internet Observatory
Eloise Duvillier

Eloise Duvillier is the Program Manager of the Program on Democracy and the Internet at the Cyber Policy Center. She was previously an HR Program Manager and acting HR Business Partner at Bytedance Inc, a rapidly growing Chinese technology startup. At Bytedance, she supported the globalization of the company by driving US acquisition integrations in Los Angeles and building new R&D teams in Seattle and Silicon Valley. Prior to Bytedance, she led talent acquisition for Baidu USA LLC’s artificial intelligence division. She began her career in the nonprofit sector, where she worked in foster care, HIV education, and emergency response during humanitarian crises, and helped war-torn communities rebuild. She graduated from the University of California, Berkeley with a bachelor’s degree in Development Studies, focusing on political economics in unindustrialized societies.

Program Manager, Program on Democracy and the Internet

Midterm elections pose an opportunity for hackers interested in disrupting the democratic process

Voter registration systems provide an additional target for hackers intending to disrupt the US midterm elections; if voting machines themselves are too dispersed or too obvious a target, removing voters from the rolls could have a similar effect. In Esquire, Jack Holmes explains that election security experts consider this one of many nightmare scenarios facing the American voting public—and thus, American democracy itself—on the eve of the 2018 midterm elections. (Allison Berke, Executive Director of the Stanford Cyber Initiative, is quoted.)


Stamos joins the Hoover Institution and the Center for International Security and Cooperation at the Freeman Spogli Institute for International Studies. Former Facebook chief security officer Alex Stamos to bring a rich real-world perspective on cybersecurity and technology policy.

 

Stanford University’s Freeman Spogli Institute for International Studies and the Hoover Institution announced today the appointment of Alex Stamos as a William J. Perry Fellow at the Center for International Security and Cooperation (CISAC), Cyber Initiative fellow, and Hoover visiting scholar.

Stamos, a computer security expert and the outgoing chief security officer at Facebook, will pursue teaching, research, and policy engagement through CISAC and the Hoover Institution's Cyber Policy Program as well as the Stanford Cyber Initiative. Drawing on his considerable experience in the private sector, he will teach a graduate-level course on the basics of cyber offense and defense to students without technical backgrounds as part of the Ford Dorsey Master’s in International Policy program at the Freeman Spogli Institute, which houses CISAC.

"With our country facing unprecedented challenges in digital interference with the democratic process and numerous other cybersecurity issues, Alex’s experience and perspective are a welcome addition to our group of fellows,” said Freeman Spogli Institute Director Michael McFaul.

In his role, Stamos will also engage in research projects aimed at public policy initiatives as a member of the Faculty Working Group on Information Warfare. The working group will develop, discuss and test concepts and theories about information warfare, as well as conduct applied research on countermeasures to identify and combat information warfare. The working group will also develop policy outreach in briefings to government officials, public seminars and workshops, Congressional testimony, online and traditional media appearances, op-eds and other forms of educating the public on combatting information warfare.

“We are thrilled that Alex is devoting even more energy to our cyber efforts,” said CISAC Co-Director Amy Zegart. “He’s been a vital partner to the Stanford cyber policy program for several years, and his Stanford ‘hack lab,’ which he piloted in Spring 2018, is a cutting-edge class to train students in our new master’s cyber policy track. He brings extraordinary skills and a unique perspective that will enrich our classes, research, and policy programs.”

Over the past three years, the Hoover Institution and CISAC have jointly developed the Stanford Cyber Policy Program. Its mission is to solve the most important international cyber policy challenges by conducting policy-driven research across disciplines, serving as a trusted convener across sectors, and teaching the next generation. The program is led by Dr. Amy Zegart and Dr. Herbert Lin. Stamos has served on the program’s advisory board since its inception.

“We look forward to working with Alex on some of the key cyber issues facing our world today,” said Tom Gilligan, director of the Hoover Institution. “He brings tremendous experience and perspective that will contribute to Hoover’s important research addressing our nation’s cyber security issues.”

“I am excited to join Stanford and for the opportunity to share my knowledge and expertise with a new generation of students, and for the opportunity to learn from colleagues and students across many disciplines at the university,” said Stamos.

A graduate of the University of California, Berkeley, Stamos studied electrical engineering and computer science. He later co-founded a successful security consultancy, iSEC Partners, and in 2014 he joined Yahoo as its chief information security officer. Stamos joined Facebook as chief security officer in June 2015, where he led Facebook’s internal investigation into targeted election-related influence campaigns via the social media platform.

###

About CISAC: Founded in 1983, CISAC has built on its research strengths to better understand an increasingly complex international environment. It is part of Stanford's Freeman Spogli Institute for International Studies (FSI). CISAC’s mission is to generate knowledge to build a safer world through teaching and inspiring the next generation of security specialists, conducting innovative research on security issues across the social and natural sciences, and communicating our findings and recommendations to policymakers and the broader public. 

About the Hoover Institution: The Hoover Institution, Stanford University, is a public policy research center devoted to the advanced study of economics, politics, history, and political economy—both domestic and foreign—as well as international affairs. With its eminent scholars and world-renowned Library & Archives, the Hoover Institution seeks to improve the human condition by advancing ideas that promote economic opportunity and prosperity and secure and safeguard peace for America and all mankind.

About the Stanford Cyber Initiative:  Working across disciplines, the Stanford Cyber Initiative aims to understand how technology affects security, governance, and the future of work.

Media contact: Katy Gabel, Center for International Security and Cooperation: 650-725-6488, kgabel@stanford.edu

 


Daphne Keller's work focuses on platform regulation and Internet users' rights. She has testified before legislatures, courts, and regulatory bodies around the world, and published both academically and in popular press on topics including platform content moderation practices, constitutional and human rights law, copyright, data protection, and national courts' global takedown orders. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2020, Daphne was the Director of Intermediary Liability at Stanford's Center for Internet and Society. She also served until 2015 as Associate General Counsel for Google, where she had primary responsibility for the company’s search products. Daphne has taught Internet law at Stanford, Berkeley, and Duke law schools. She is a graduate of Yale Law School, Brown University, and Head Start.


Director of Program on Platform Regulation, Cyber Policy Center
Lecturer, Stanford Law School

Through the Hack the Pentagon program, the Department of Defense (DoD) asked Synack to look for vulnerabilities left undetected by traditional security solutions in one of its highly complex and sensitive systems. The DoD set out to push the limits of security beyond those of most enterprises, and the results were surprising. Hear from Synack CEO Jay Kaplan how the government can benefit from bug bounty programs, what Hack the Pentagon revealed about DoD security, and why more and more organizations are employing red team penetration testing.

Jay Kaplan co-founded Synack after serving in several security-related capacities at the Department of Defense, including the DoD’s Incident Response and Red Team. Prior to founding Synack, Jay was a Senior Cyber Analyst at the National Security Agency (NSA), where his focus was supporting counterterrorism-related intelligence operations. Jay received a BS in Computer Science with a focus in Information Assurance and an MS in Engineering Management from George Washington University, studying under a DoD/NSA-sponsored fellowship. Jay holds a number of security certifications from ISC(2) and GIAC.

Encina Hall, E008 (garden level)

Jay Kaplan, CEO, Synack
Allison Berke

Can Bitcoin thrive without China? 

Bitcoin started the month of September trading at an all-time high of $4,950. By implementing Segregated Witness, or SegWit, the network allowed more transactions to fit into each block and signaled confidence that Bitcoin would scale. On September 4, the Chinese central bank banned initial coin offerings (ICOs), leading to rumors that China was considering banning Bitcoin trading altogether. Those rumors were confirmed on September 14, and Bitcoin exchanges operating in China were told to cease trading for now. This article explores what happened next, and what the future of Bitcoin is without its largest mining pools...

 


Are you interested in cybersecurity? Have you wanted to learn offensive cyber techniques but don't know where to get started? The Applied Cybersecurity team is hosting an introductory workshop to get people started practicing exploitation and offensive cyber techniques in an ethical setting. In particular, we will focus on gaining familiarity with techniques used for competing in Capture the Flag (CTF) competitions. We'll be hosting the first workshop this Friday, in preparation for the HITCON CTF next week. Bring a laptop! This workshop assumes no prerequisite experience with hacking or cybersecurity, so please attend regardless of how unfamiliar you are with the topic. For this workshop, we will focus on web vulnerabilities, binary reversing, and some basic cryptography challenges. Note that experience equivalent to CS107 will be useful. Food will be provided! RSVP here: https://goo.gl/forms/M5yzuQasIZpL4Ovy1

Shriram 366


The Stanford Applied Cyber Team took 1st place in the Collegiate Penetration Testing Competition (CPTC) Western Regionals.

After 8 hours of intense penetration testing on Saturday, October 7th, at Uber HQ in San Francisco, the Stanford team returned to campus and authored a 52-page findings and remediation report, finishing up at 3AM and then returning to the CPTC competition venue to deliver their recommendations by 8AM Sunday.

Demonstrating moxie and professionalism under fire, the team of Paul Crews, Albert Liang, Kate Stowell, Travis Lanham, Wilson Nguyen, and Colleen Dai, together with coach Alex Keller, has qualified for the CPTC Nationals, November 3-5 in Rochester, NY.

 
