-

Abstract: A Supply and Demand Framework for YouTube Politics (with Joseph Phillips)

YouTube is the most used social network in the United States. However, for a combination of sociological and technical reasons, there exists little quantitative social science research on political content on YouTube, in spite of widespread concern about the growth of extremist YouTube content. An emerging journalistic consensus theorizes a central role for the video "recommendation engine," but we believe this conclusion is premature. Instead, we propose the "Supply and Demand" framework for analyzing politics on YouTube. We discuss a number of novel technological affordances of YouTube as a platform and as a collection of videos, and how each might drive the supply of or demand for extreme content. We then provide large-scale longitudinal descriptive information about the supply of and demand for alternative political content on YouTube. We demonstrate that viewership of far-right videos peaked in 2017.

Kevin Munger
Kevin Munger is Assistant Professor of Political Science and Social Data Analytics, Penn State University. Ph.D., New York University, 2018. His research looks at how social media and other contemporary internet technologies have changed political communication. He has published research on the subject using a variety of methodologies, including textual analysis, field experiments, longitudinal surveys, and qualitative theory. His research has appeared in leading journals such as the American Journal of Political Science, Political Behavior, Political Communication, and Political Science Research & Methods. His present interests include cohort conflict in American politics and developing new methods for social science in a rapidly changing world.

-

Renée DiResta
Abstract: Disinformation campaigns and black propaganda are not new, but they are evolving. Media coverage of disinformation and propaganda has focused primarily on the social-first memetic operations of the Internet Research Agency and its targeting of the 2016 U.S. presidential election. This talk examines a broader collection of influence operations, all affiliated with one state adversary – Russia – but leveraging distinctly different tactics. It investigates a 'playbook' that is far more expansive (and evolving) than previously understood, and assesses disinformation campaigns along several axes. We explore narrative vs. memetic pathways, long-term vs. discrete actions, and a collection of goals ranging from persuasion to distraction. This talk also discusses how online influence operations are deployed in conjunction with hack-and-leak campaigns and community infiltration.

Renée DiResta Bio

 

E207, Encina Hall 


Renée DiResta is the former Research Manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social networks, and assists policymakers in understanding and responding to the problem. She has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience conspiracies, terrorism, and state-sponsored information warfare.

You can see a full list of Renée's writing and speeches on her website: www.reneediresta.com or follow her @noupside.

 

Former Research Manager, Stanford Internet Observatory

The science of cyber risk looks at a broad spectrum of risks across a variety of digital platforms. Often, though, work within the field is limited by a failure to draw on the knowledge of other fields, such as behavioral science, economics, law, management science, and political science. In a new Science Magazine article, “Cyber Risk Research Impeded by Disciplinary Barriers,” cyber risk experts and researchers at Stanford University make a compelling case for the importance of a cross-disciplinary approach. Gregory Falco, security researcher at the Program on Geopolitics, Technology, and Governance and lead author of the paper, talked recently with the Cyber Policy Center about the need for a holistic approach, both within the study of cyber risk and at a company level when an attack occurs.

CPC: Your recent perspective paper in Science Magazine highlights the issue of terminology when it comes to how organizations and institutions define a cyber attack. Why is it so important to have consistent naming when we are talking about cyber risk?

Falco: With any scientific discipline or field, there is a language for engaging with other experts. If there’s no consistent language, or at least dialect, for communication around cyber risk, it’s difficult to engage with scholars from different disciplines. For example, the phrase “cyber event” is contested, and the threshold for what an organization considers a cyber event varies substantially. Some organizations consider someone pinging their network a cyber event; others only consider something a cyber event once an intrusion has been publicly disclosed. So there’s a disparity when comparing metrics of cyber events from organization to organization because of the different thresholds of what’s considered an event.

CPC: We’ve all been sent one of those emails letting us know our data may have been compromised, and your paper points out it’s nearly impossible to put foolproof protections into place; attacks are inevitable. Given that, how should companies weigh the various ways they can protect themselves?

Falco: The first exercise each organization should go through when it decides to be serious about cyber risk is to prioritize its assets. What is business critical? What is safety critical? Then, like all other risks, a cost-benefit analysis must be done for each asset based on its priority. If the asset is safety-critical, then resources should be allocated to help protect that asset or at least ensure its resilience. Trade-offs are inevitable; no company has unlimited resources. But starting with an understanding of where the priorities are is critical.

CPC: In companies, cyber security often falls entirely to the Chief Information Security Officer (CISO). Your paper argues that’s shortsighted. What is gained when a company takes a more holistic approach?

Falco: Distributing responsibility across the organization catalyzes a security culture. A security culture is one where there is a constant vigilance or at least broad awareness of cybersecurity concerns throughout the organization. Fostering a security culture is often suggested as a mechanism to help reduce cyber risk in organizations. The problem with not distributing responsibility is that when something happens, it’s too easy to resort to finger-pointing at the CISO, and that’s counterproductive. Efforts after an attack should be on responding and being resilient, not finding the scapegoat.

CPC: Cyber risk largely focuses on prevention, but your paper argues that it’s what happens after an attack that needs greater attention. Why is that?

Falco: Every organization will be attacked. However, organizations can differentiate themselves from a cyber risk standpoint by appropriately managing the situation after an attack. Some of the most significant damages to organizations can be reputational if communication after an attack is unclear or botched. Poor communication after an attack can result in major regulatory fines or valuation adjustments, as seen in cases like Yahoo, and that can have major business implications. Communications aren’t the only important element of post-attack response. A thorough post-mortem of the organization’s response to the attack can be an important learning experience and a way to plan for future attacks.

CPC: Protecting against cyber attacks and the losses that go with them can obviously be costly for companies. You make a case for collaboration among different fields, say among data scientists and economists. How can that be encouraged?

Falco: We argue that cross-disciplinary collaboration rarely happens organically. Therefore, we call on funding agencies like the NSF or DARPA to specify a preference for cross-disciplinary research when funding cyber risk projects. Typically, this isn’t a feature of current calls for proposals, but for cyber risk programs it should be. We encourage researchers to explore cyber risk questions at the margins of their discipline. Those questions may lend themselves to potential overlap with other disciplines and foster a starting point for cross-disciplinary collaboration.

For more on these topics, see a full list of recent publications from the Cyber Policy Center and the Program on Geopolitics, Technology, and Governance.

Gregory Falco (photo: Rod Searcey)
-

Ashish Goel
Abstract:

While the Internet has revolutionized many aspects of our lives, there are still no online alternatives for making democratic decisions at large scale as a society. In this talk, we will describe algorithmic and market-inspired approaches towards large scale decision making that our research group is exploring. We will start with a model of opinion dynamics that can potentially lead to polarization, and relate that to commonly used recommendation algorithms. We will then describe the algorithms behind Stanford's participatory budgeting platform, and the lessons that we learnt from deploying this platform in over 70 civic elections. We will use this to motivate the need for a modern theory of social choice that goes beyond voting on candidates. We will then describe ongoing practical work on an automated moderator bot for civic deliberation (in collaboration with Jim Fishkin's group), and ongoing theoretical work on deliberative approaches to decision making. We will conclude with a summary of open directions, focusing in particular on fair advertising. 

Ashish Goel Bio

Lunch Seminar Series Flyer
  • E207, Encina Hall
  • 616 Jane Stanford Way, Stanford, CA 94305
 
Ashish Goel Professor of Management Science and Engineering

The Program on Democracy and the Internet runs the work of the Kofi Annan Commission on Elections and Democracy in the Digital Age, which will produce guidelines to support democracies, particularly those of the Global South.

In the span of just two years, the widely shared utopian vision of the internet’s impact on governance has turned decidedly pessimistic.  The original promise of digital technologies was unapologetically democratic: empowering the voiceless, breaking down borders to build cross-national communities, and eliminating elite referees who restricted political discourse. 

That promise has been undercut by concern that the most democratic features of the internet are, in fact, endangering democracy itself.  Democracies pay a price for internet freedom, under this view, in the form of disinformation, hate speech, incitement, and foreign interference in elections.  They also become captive to the economic power of certain platforms, with all the accompanying challenges to privacy and speech regulation that these new, powerful information monopolies have posed.

As it forges ahead with its mandate, the Kofi Annan Commission on Elections and Democracy in the Digital Age must consider these many challenges, as well as the opportunities they present. Professor Nathaniel Persily, a member of the Kofi Annan Commission, has produced a framing paper for its work, available for download.

-

Kate Starbird
Abstract:

This talk describes the disinformation campaign targeting the Syria Civil Defense (or “White Helmets”), a humanitarian response group that works in rebel-held areas of Syria. The White Helmets provide medical aid and search and rescue to people affected by the civil war in Syria. They also document the impacts of atrocities — including airstrikes and chemical weapons attacks — perpetrated by the Syrian regime and its Russian allies. For several years, the White Helmets have been the target of a campaign to undermine and delegitimize their work. In this talk, I describe a multi-study research effort that reveals how this multi-dimensional, cross-platform campaign “works” — including a look at the media ecosystems that support the campaign, the networks of actors who collaborate to produce and spread its narratives (including government agents and “unwitting crowds” of online activists), and the “work” these actors participate in, using the affordances of social media platforms to connect, recruit, organize, promote their messages, attack opposing messages, and otherwise advance the goals of their campaign.

Kate Starbird Bio

 

 


Did the Russian-affiliated groups that interfered with the 2016 U.S. presidential election want to be caught?

“There’s a reason why they paid for Facebook ads in rubles,” Nathaniel Persily, who is a senior fellow at FSI and co-director of the Cyber Policy Center, told FSI Director Michael McFaul on the World Class podcast. “They wanted to be open and notorious.”

Since the election, Americans have become more suspicious of fake news, but they have also become suspicious of real news and journalists in general. Another problem with the Russians’ success in influencing the 2016 election, said Persily, is that Americans will automatically assume that the Russians will do the same thing during the 2020 race.

“Everyone is going to be looking for nefarious influences and shouting them from the rooftops, and that actually serves the [bad actors’] purposes just as much,” Persily said. “Many of the attempts in 2016 were about fostering division and doubt, and I think there’s a lot of appetite for doubt right now in America.”


Since 2016, Facebook, Twitter and Google have made some important changes to the way they handle advertising, including adding a requirement that all candidate ads and other ads of “national legislative importance” be identified as advertisements on users’ feeds.

But there are no standardized rules or regulations that dictate how tech companies should handle advertisements or posts that contain disinformation, Persily said, and because of this, it is up to those respective companies to make those decisions themselves — and they aren’t always in agreement. For example, when a video of Nancy Pelosi that was slowed down to make her seem drunk was posted in late May on YouTube and Facebook, YouTube took the video down, but Facebook decided to leave it up.

“The standards that are going to be developed in test cases like these — under conditions which are not as politically incendiary as an election — are going to be the ones that will be rolled out and applied in elections in the U.S. and around the world,” Persily said.

When it comes to election security, the 2020 presidential race will be the next big test for the U.S. government and private-sector companies. But other countries should also be on the lookout for activity from foreign agents and actors in their elections.

“The 2016 election was not just an event, it was a playbook that was written by the Russians,” warned Persily. “That playbook is usable for future elections in the United States as well as around the world, whether it’s between India and Pakistan or China and Taiwan.”

 

Eloise Duvillier

Eloise Duvillier is the Program Manager of the Program on Democracy and the Internet at the Cyber Policy Center. She previously was an HR Program Manager and acting HR Business Partner at Bytedance Inc, a rapidly growing Chinese technology startup. At Bytedance, she supported the globalization of the company by driving US acquisition integrations in Los Angeles and building new R&D teams in Seattle and Silicon Valley. Prior to Bytedance, she led talent acquisition for Baidu USA LLC’s artificial intelligence division. She began her career in the nonprofit sector, where she worked in foster care, HIV education, and emergency response during humanitarian crises, and helped war-torn communities rebuild. She graduated from the University of California, Berkeley with a bachelor’s degree in Development Studies, focusing on political economy in unindustrialized societies.

Program Manager, Program on Democracy and the Internet