
Despite pressure from President Donald Trump and Attorney General William Barr, Apple continues to stand its ground and refuses to re-engineer iPhones so law enforcement can unlock the devices. Apple has maintained that it has done everything required by law and that creating a "backdoor" would undermine cybersecurity and privacy for iPhone users everywhere.

Apple is right to stand firm in its position that building a "backdoor" could put user data at risk.

At its most basic, encryption is the act of converting plaintext (like a credit card number) into unintelligible ciphertext using a very large, random number called a key. Anyone with the key can convert the ciphertext back to plaintext. Anyone without the key cannot, meaning that even if they acquire the ciphertext, it should still be impossible for them to recover the underlying plaintext.
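The idea above can be illustrated with a minimal sketch in Python. This is purely illustrative: it uses a one-time-pad XOR scheme with Python's standard `secrets` module (real systems use vetted ciphers such as AES), and the credit card number is a made-up example value.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte.

    With a random key as long as the message (a one-time pad),
    the same operation both encrypts and decrypts.
    """
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"4111 1111 1111 1111"          # e.g. a credit card number
key = secrets.token_bytes(len(plaintext))   # very large, random key

ciphertext = xor_cipher(plaintext, key)     # unintelligible without the key
recovered = xor_cipher(ciphertext, key)     # key holder recovers the plaintext

assert recovered == plaintext
```

Anyone holding only `ciphertext` sees random-looking bytes; only the holder of `key` can invert the transformation, which is what makes a key escrowed "backdoor" a single point of failure.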

Full Text at CNN

Publication Type: Commentary
Abstract:

Considerable scholarship has established that algorithms are an increasingly important part of what information people encounter in everyday life. Much less work has focused on studying users’ experiences with, understandings of, and attitudes about how algorithms may influence what they see and do. The dearth of research on this topic may be due in part to the difficulty of studying a subject about which there is no known ground truth, given that details about algorithms are proprietary and rarely made public. In this talk, I will report on the methodological challenges of studying people’s algorithm skills based on 83 in-person interviews conducted in five countries. I will also discuss the types of algorithm skills identified from our data. The talk will advocate for more such scholarship to accompany existing system-level analyses of algorithms’ social implications and offer a blueprint for how to do this.

Eszter Hargittai
About the Speaker:

Eszter Hargittai is Professor and Chair of Internet Use and Society at the Institute of Communication and Media Research, University of Zurich. Previously, she was the Delaney Family Professor in the Communication Studies Department at Northwestern University. In 2019, she was elected Fellow of the International Communication Association and also received the William F. Ogburn Mid-Career Achievement Award from the American Sociological Association’s section on Communication, Information Technology and Media Sociology. For over two decades, she has been researching people’s Internet uses and skills, and how these relate to questions of social inequality.


Protecting Electoral Integrity in the Digital Age | The Report of the Kofi Annan Commission on Elections and Democracy in the Digital Age

New information and communication technologies (ICTs) pose difficult challenges for electoral integrity. In recent years foreign governments have used social media and the Internet to interfere in elections around the globe. Disinformation has been weaponized to discredit democratic institutions, sow societal distrust, and attack political candidates. Social media has proved a useful tool for extremist groups to send messages of hate and to incite violence. Democratic governments strain to respond to a revolution in political advertising brought about by ICTs. Electoral integrity has been at risk from attacks on the electoral process, and on the quality of democratic deliberation.

The relationship between the Internet, social media, elections, and democracy is complex, systemic, and unfolding. Our ability to assess some of the most important claims about social media is constrained by the unwillingness of the major platforms to share data with researchers. Nonetheless, we are confident about several important findings.

Publication Type: Annual Reports
Author: Alex Stamos

On January 11, 2020, Taiwan held its presidential and legislative elections. Many observers expected the People’s Republic of China (PRC) to run an online disinformation campaign in the lead-up to the election in support of its preferred candidate, Han Kuo-yu, who was challenging incumbent Tsai Ing-wen. Such concerns were heightened by demonstrated PRC online disinformation targeting the Hong Kong protests, and by claims from an alleged PRC spy that he had led disinformation efforts targeting Taiwan during the 2018 elections.

In this talk, we delve into case studies that highlight the role social media plays in disinformation within the Taiwanese information environment. We show that while fears of disinformation were generally not realized, we did find evidence of coordinated inauthentic behavior on Facebook, in particular on fan Pages and Groups for the two candidates. Our findings hold implications for researchers trying to distinguish authentic hyper-partisan domestic activism from coordinated disinformation.

Carly Miller

Carly Miller is a social science researcher at the Stanford Internet Observatory. In addition to covering the Taiwanese election, she assists the team with other digital forensic research and examines how researchers external to social media platforms approach disinformation campaigns and concepts such as attribution. Before coming to Stanford, Carly was a Team Lead at the Human Rights Investigations Lab at Berkeley Law School, where she worked to unearth patterns in various bad actors’ media campaigns. Carly received her BA with honors in political science from the University of California, Berkeley in May 2019.

Vanessa Molter

Vanessa Molter is a Research Assistant at SIO and a Master in International Policy candidate at Stanford University, where she focuses on International Security in East Asia. At SIO, she monitors and writes about the Taiwanese social media environment. Previously, she studied Taiwanese security affairs at the Institute for National Defense and Security Research in Taipei, Taiwan, a government-affiliated defense think tank. Vanessa is fluent in Mandarin and holds a B.S. in International Business and East Asian Studies from Tübingen University, Germany.


Abstract: A Supply and Demand Framework for YouTube Politics (with Joseph Phillips)

YouTube is the most used social network in the United States. However, for a combination of sociological and technical reasons, there exists little quantitative social science research on political content on YouTube, in spite of widespread concern about the growth of extremist YouTube content. An emerging journalistic consensus theorizes a central role for the video "recommendation engine," but we believe this conclusion is premature. Instead, we propose the "Supply and Demand" framework for analyzing politics on YouTube. We discuss a number of novel technological affordances of YouTube as a platform and as a collection of videos, and how each might drive the supply of or demand for extreme content. We then provide large-scale longitudinal descriptive information about the supply of and demand for alternative political content on YouTube. We demonstrate that viewership of far-right videos peaked in 2017.

Kevin Munger
Kevin Munger is Assistant Professor of Political Science and Social Data Analytics at Penn State University (Ph.D., New York University, 2018). His research looks at how social media and other contemporary internet technologies have changed political communication. He has published research on the subject using a variety of methodologies, including textual analysis, field experiments, longitudinal surveys, and qualitative theory. His research has appeared in leading journals such as the American Journal of Political Science, Political Behavior, Political Communication, and Political Science Research & Methods. His present interests include cohort conflict in American politics and developing new methods for social science in a rapidly changing world.


Abstract: The problem of online disinformation is only getting worse. Social media may well play a role in the US 2020 presidential election and other major political events. But that doesn’t even begin to describe what future propaganda will look like. As Samuel Woolley shows, we will soon be navigating new technologies such as human-like automated voice systems, machine learning, ‘deep-fake’ AI-edited videos and images, interactive memes, virtual reality and augmented reality. In stories both deeply researched and compellingly written, Woolley describes this future, and explains how the technology can be manipulated, who might control it and its impact on political strategy. Finally, Woolley proposes strategic responses to this threat with the ultimate goal of empowering activists and pushing technology builders to design for democracy.

Samuel Woolley
Samuel Woolley is a researcher with a focus on emerging media technologies, propaganda and politics. His work looks at how automation, algorithms and AI are leveraged for both democracy and control. His forthcoming book, The Reality Game: How the Next Wave of Technology Will Break the Truth, will be released in January of 2020 by PublicAffairs/Hachette. It explores the future of digital disinformation and provides a pragmatic roadmap for how society can respond.

Woolley is an assistant professor in the School of Journalism at the Moody College of Communication at the University of Texas at Austin, where he is Program Director of disinformation research at the Center for Media Engagement (CME). He holds a PhD from the University of Washington, Seattle. His academic work has appeared in the Journal of Information Technology and Politics, the International Journal of Communication, the Routledge Handbook of Media, Conflict and Security, A Networked Self: Platforms, Stories, Connections, and The Political Economy of Robots. He is one of the founders of the Computational Propaganda Research Project, now based at the Oxford Internet Institute, University of Oxford. Woolley is also the founder of the Digital Intelligence Lab at the Institute for the Future (IFTF), a 50-year-old think tank based in Palo Alto, CA.

Lectures

Robert Bauer
Abstract: Please join the Cyber Policy Center for a conversation on online political advertising, election law, and the 2020 election with Robert Bauer, Professor of Practice and Distinguished Scholar in Residence at NYU Law and Co-Director of NYU’s Legislative and Regulatory Process Clinic, and Professor Nathaniel Persily, Co-Director of the Cyber Policy Center. Bauer served as White House Counsel to President Obama and returned to private practice in June 2011. In 2013, the President named Bauer Co-Chair of the Presidential Commission on Election Administration. Bauer was General Counsel to Obama for America, the President’s campaign organization, in 2008 and 2012. He has also served as co-counsel to the New Hampshire State Senate in the trial of Chief Justice David A. Brock (2000) and counsel to the Democratic Leader in the trial of President William Jefferson Clinton (1999). He is the author of books on campaign finance law and of articles on various topics for law reviews and periodicals.

Robert Bauer bio >

Lectures

Renée DiResta
Abstract: Disinformation campaigns and black propaganda are not new, but they are evolving. Media coverage of disinformation and propaganda has focused primarily on the social-first memetic operations of the Internet Research Agency and its targeting of the United States 2016 presidential election. This talk examines a broader collection of influence operations, all affiliated with one state adversary, Russia, but leveraging distinctly different tactics. It investigates a 'playbook' that is far more expansive (and still evolving) than previously understood, and assesses disinformation campaigns along several axes. We explore narrative vs. memetic pathways, long-term vs. discrete actions, and a collection of goals ranging from persuasion to distraction. This talk also discusses how online influence operations are deployed in conjunction with hack-and-leak campaigns and community infiltration.

Renee DiResta Bio >

E207, Encina Hall 


Renée DiResta is the former Research Manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social networks, and assists policymakers in understanding and responding to the problem. She has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience conspiracies, terrorism, and state-sponsored information warfare.

You can see a full list of Renée's writing and speeches on her website: www.reneediresta.com or follow her @noupside.

News Type: Q&As

The science of cyber risk looks at a broad spectrum of risks across a variety of digital platforms. Often, though, work in the field is limited by a failure to draw on the knowledge of other fields, such as behavioral science, economics, law, management science, and political science. In a new Science Magazine article, “Cyber Risk Research Impeded by Disciplinary Barriers,” cyber risk experts and researchers at Stanford University make a compelling case for the importance of a cross-disciplinary approach. Gregory Falco, security researcher at the Program on Geopolitics, Technology, and Governance and lead author of the paper, talked recently with the Cyber Policy Center about the need for a holistic approach, both within the study of cyber risk and at a company level when an attack occurs.

CPC: Your recent perspective paper in Science Magazine highlights the issue of terminology when it comes to how organizations and institutions define a cyber attack. Why is it so important to have consistent naming when we are talking about cyber risk?

Falco: With any scientific discipline or field, there is a language for engaging with other experts. If there’s no consistent language or at least dialect for communication around cyber risk, it’s difficult to engage with scholars from different disciplines. For example: The phrase “cyber event” is contested and the threshold for what an organization considers to be a cyber event varies substantially. Some organizations consider someone pinging their network as a cyber event, others only consider something a cyber event once an intrusion has been publicly disclosed. So there’s a disparity when comparing metrics of cyber events from organization to organization because of the different thresholds of what’s considered an event.

CPC: We’ve all been sent one of those emails letting us know our data may have been compromised and your paper points out it’s nearly impossible to put foolproof protections into place; attacks are inevitable. Given that, how should companies weigh the various ways they can protect themselves?

Falco: The first exercise each organization should go through when it decides to get serious about cyber risk is to prioritize its assets. What is business critical? What is safety critical? Then, like all other risks, a cost-benefit analysis must be done for each asset based on its priority. If an asset is safety-critical, resources should be allocated to protect it or at least ensure its resilience. Trade-offs are inevitable; no company has unlimited resources. But starting with an understanding of where the priorities lie is critical.

CPC: In companies, cyber security often falls entirely to the Chief Information Security Officer (CISO). Your paper argues that’s shortsighted. What is gained when a company takes a more holistic approach?

Falco: Distributing responsibility across the organization catalyzes a security culture. A security culture is one where there is a constant vigilance or at least broad awareness of cybersecurity concerns throughout the organization. Fostering a security culture is often suggested as a mechanism to help reduce cyber risk in organizations. The problem with not distributing responsibility is that when something happens, it’s too easy to resort to finger-pointing at the CISO, and that’s counterproductive. Efforts after an attack should be on responding and being resilient, not finding the scapegoat.

CPC: Cyber risk largely focuses on prevention, but your paper argues that it’s what happens after an attack that needs greater attention. Why is that?

Falco: Every organization will be attacked. However, organizations can differentiate themselves from a cyber risk standpoint by appropriately managing the situation after an attack. Some of the most significant damages to organizations can be reputational if communication after an attack is unclear or botched. Poor communication after an attack can result in major regulatory fines or valuation adjustments, as seen in cases like Yahoo’s, and that can have major business implications. Communications aren’t the only important element of post-attack response. A thorough post-mortem of the organization’s response can be an important learning experience and a way to plan for future attacks.

CPC: Protecting against cyber attacks and the losses that go with them can obviously be costly for companies. You make a case for collaboration among different fields, say among data scientists and economists. How can that be encouraged?

Falco: We argue that cross-disciplinary collaboration rarely happens organically. Therefore, we call on funding agencies like the NSF or DARPA to specify a preference for cross-disciplinary research when funding cyber risk projects. Typically, this isn’t a feature of calls for proposals, but for cyber risk programs it should be. We encourage researchers to explore cyber risk questions at the margins of their discipline. Those questions may lend themselves to potential overlap with other disciplines and offer a starting point for cross-disciplinary collaboration.

For more on these topics, see a full list of recent publications from the Cyber Policy Center and the Program on Geopolitics, Technology, and Governance.

Gregory Falco (photo: Rod Searcey)