Institutions and Organizations

Christopher Giles is a researcher and open-source investigator focusing on information operations and on monitoring conflict and human rights issues.

Prior to joining Stanford University in 2021, Christopher reported on disinformation for BBC News, covering the COVID-19 pandemic and the 2020 U.S. election; his reporting was “highly commended” at the Royal Statistical Society’s Journalism Awards. Christopher is a recipient of the Knight-Hennessy Scholarship and is pursuing graduate studies in international policy and journalism.

Researcher, Stanford Internet Observatory
Authors
Melissa De Witte, Taylor Kubota, Ker Than

During a speech at Stanford University on Thursday, April 21, 2022, former U.S. President Barack Obama presented his audience with a stark choice: “Do we allow our democracy to wither, or do we make it better?”

Over the course of an hour-long address, Obama outlined the threat that disinformation online, including deepfake technology powered by AI, poses to democracy as well as ways he thought the problems might be addressed in the United States and abroad.

“This is an opportunity, it’s a chance that we should welcome for governments to take on a big important problem and prove that democracy and innovation can coexist,” Obama said.

Obama, who served as the 44th president of the United States from 2009 to 2017, was the keynote speaker at a one-day symposium, titled “Challenges to Democracy in the Digital Information Realm,” co-hosted by the Stanford Cyber Policy Center and the Obama Foundation on the Stanford campus on April 21.

The event brought together people working in technology, policy, and academia for panel discussions on topics including the role of government in establishing online trust, the relationship between democracy and tech companies, and the threat of digital authoritarians.

Obama told a packed audience of more than 600 people in CEMEX auditorium – as well as more than 250,000 viewers tuning in online – that everyone is part of the solution to make democracy stronger in the digital age and that all of us – from technology companies and their employees to students and ordinary citizens – must work together to adapt old institutions and values to a new era of information. “If we do nothing, I’m convinced the trends that we’re seeing will get worse,” he said.

Introducing the former president were Michael McFaul, director of the Freeman Spogli Institute for International Studies and U.S. ambassador to Russia under Obama, and Tiana Epps-Johnson, BA ’08, a Stanford alum and Obama Foundation fellow.

Epps-Johnson, who is the founder and executive director of the Center for Tech and Civic Life, recalled her time answering calls to an election protection hotline during the 2006 midterm election. She said the experience taught her an important lesson, which was that “the overall health of our democracy, whether we have a voting process that is fair and trustworthy, is more important than any one election outcome.”

Stanford freshman Evan Jackson said afterward that Obama’s speech resonated with him. “I use social media a lot, every day, and I’m always seeing all the fake news that can be spread easily. And I do understand that when you have controversy attached to what you’re saying, it can reach larger crowds,” Jackson said. “So if we do find a way to better contain the controversy and the fake news, it can definitely help our democracy stay powerful for our nation.”

The Promise and Perils Technology Poses to Democracy


In his keynote, Obama reflected on how technology has transformed the way people create and consume media. Digital and social media companies have upended traditional media – from local newspapers to broadcast television, as well as the role these outlets played in society at large.

During the 1960s and 1970s, the American public tuned in to one of three major networks, and while media from those earlier eras had their own set of problems – such as excluding women and people of color – they did provide people with a shared culture, Obama said.

Moreover, these media institutions, with established journalistic best practices for accuracy and accountability, also provided people with similar information: “When it came to the news, at least, citizens across the political spectrum tended to operate using a shared set of facts – what they saw or what they heard from Walter Cronkite or David Brinkley.”

Fast forward to today, when everyone has access to individualized news feeds fed by algorithms that reward the loudest and angriest voices (and from which technology companies profit). “You have the sheer proliferation of content, and the splintering of information and audiences,” Obama observed. “That’s made democracy more complicated.”

Facts are competing with opinions, conspiracy theories, and fiction. “For more and more of us, search and social media platforms aren’t just our window into the internet. They serve as our primary source of news and information,” Obama said. “No one tells us that the window is blurred, subject to unseen distortions, and subtle manipulations.”

The splintering of news sources has also made all of us more prone to what psychologists call “confirmation bias,” Obama said. “Inside our personal information bubbles, our assumptions, our blind spots, our prejudices aren’t challenged, they are reinforced and naturally, we’re more likely to react negatively to those consuming different facts and opinions – all of which deepens existing racial and religious and cultural divides.”

But these problems are not just the result of our brains being unable to keep up with the growing amount of information online, Obama argued. “They’re also the result of very specific choices made by the companies that have come to dominate the internet generally, and social media platforms in particular.”

The former president also made clear that he did not think technology alone was to blame for our social ills. Racism, sexism, and misogyny all predate the internet, but technology has helped amplify them.

“Solving the disinformation problem won’t cure all that ails our democracies or tears at the fabric of our world, but it can help tamp down divisions and let us rebuild the trust and solidarity needed to make our democracy stronger,” Obama said.

He gave examples of how social media has fueled violence and extremism around the world: leaders in countries from Russia and China to Hungary, the Philippines, and Brazil have harnessed social media platforms to manipulate their populations. “Autocrats like Putin have used these platforms as a strategic weapon against democratic countries that they consider a threat,” Obama said.

He also called out emerging technologies such as AI for their potential to sow further discord online. “I’ve already seen demonstrations of deepfake technology that show what looks like me on a screen, saying stuff I did not say. It’s a strange experience, people,” Obama said. “Without some standards, implications of this technology – for our elections, for our legal system, for our democracy, for rules of evidence, for our entire social order – are frightening and profound.”

‘Regulation Has to Be Part of the Answer’


In the second half of his talk, Obama discussed potential solutions for addressing some of the problems he viewed as contributing to the backsliding of democracy.

In an apt metaphor for a speech delivered in Silicon Valley, Obama compared the U.S. Constitution to software for running society. It had “a really innovative design,” Obama said, but also significant bugs. “Slavery. You can discriminate against entire classes of people. Women couldn’t vote. Even white men without property couldn’t vote, couldn’t participate, weren’t part of ‘We the People.’”

The amendments to the Constitution were akin to software patches, the former president said, that allowed us to “continue to perfect our union.”

Similarly, governments and technology companies should be willing to introduce changes aimed at improving civil discourse online and reducing the amount of disinformation on the internet, Obama said.

“The internet is a tool. Social media is a tool. At the end of the day, tools don’t control us. We control them. And we can remake them. It’s up to each of us to decide what we value and then use the tools we’ve been given to advance those values,” he said.

The former president put forth various solutions for combating online disinformation, including regulation, which many tech companies fiercely oppose.

“Here in the United States, we have a long history of regulating new technologies in the name of public safety, from cars and airplanes to prescription drugs to appliances,” Obama said. “And while companies initially always complain that the rules are going to stifle innovation and destroy the industry, the truth is that a good regulatory environment usually ends up spurring innovation, because it raises the bar on safety and quality. And it turns out that innovation can meet that higher bar.”

In particular, Obama urged policymakers to rethink Section 230, enacted as part of the United States Communications Decency Act in 1996, which stipulates that, generally, online platforms cannot be held liable for content that other people post on their websites.

But technology has changed dramatically in the more than two decades since Section 230 was enacted, Obama said. “These platforms are not like the old phone company.”

He added: “In some cases, industry standards may replace or substitute for regulation, but regulation has to be part of the answer.”

Obama also urged technology companies to be more transparent about how they operate and, “at minimum,” to share with researchers and regulators how some of their products and services are designed, so there is some accountability.

The responsibility also lies with ordinary citizens, the former president said. “We have to take it upon ourselves to become better consumers of news – looking at sources, thinking before we share, and teaching our kids to become critical thinkers who know how to evaluate sources and separate opinion from fact.”

Obama warned that if the U.S. does not act on these issues, it risks being eclipsed in this arena by other countries. “As the world’s leading democracy, we have to set a better example. We should be able to lead on these discussions internationally, not [be] in the rear. Right now, Europe is forging ahead with some of the most sweeping legislation in years to regulate the abuses that are seen in big tech companies,” Obama said. “Their approach may not be exactly right for the United States, but it points to the need for us to coordinate with other democracies. We need to find our voice in this global conversation.”

 

Transcript of President Obama's Keynote

Subtitle

At a conference hosted by the Cyber Policy Center and Obama Foundation, former U.S. President Barack Obama delivered the keynote address about how information is created and consumed, and the threat that disinformation poses to democracy.


Dan Bateyko is the Special Projects Manager at the Stanford Internet Observatory.

Dan previously worked as a Research Coordinator for the Center on Privacy & Technology at Georgetown Law, where he investigated Immigration and Customs Enforcement surveillance practices, co-authoring American Dragnet: Data-Driven Deportation in the 21st Century.

He has worked at the Berkman Klein Center for Internet & Society, the Dangerous Speech Project, and as a research assistant for Amanda Levendowski, whom he assisted with legal scholarship on facial surveillance.

In 2016, he received a Thomas J. Watson Fellowship. He spent his fellowship year talking with people about digital surveillance and Internet infrastructure in South Korea, China, Malaysia, Germany, Ghana, Russia, and Iceland.

His writing has appeared in Georgetown Tech Law Review, Columbia Journalism Review, Dazed Magazine, The Internet Health Report, Council on Foreign Relations' Net Politics, and Global Voices. He is a 2022 Internet Law & Policy Foundry Fellow.

Dan received his master’s degree in Law & Technology from Georgetown University Law Center (where he received the IAPP Westin Scholar Book Award for excellence in privacy law) and his B.A. from Middlebury College.

Special Projects Manager at the Stanford Internet Observatory

Shorenstein APARC, Encina Hall, Stanford University

APARC Predoctoral Fellow, 2021-2022
Stanford Internet Observatory Postdoctoral Fellow, 2022-2023

Tongtong Zhang joins the Walter H. Shorenstein Asia-Pacific Research Center (APARC) as APARC Predoctoral Fellow for the 2021-2022 academic year. She is a Ph.D. candidate in the Department of Political Science at Stanford University. Her research focuses on authoritarian deliberation and responsiveness in China.

Publication Type
Conference Memos
Subtitle
The Project on Middle East Political Science partnered with Stanford University’s Center for Democracy, Development and the Rule of Law and its Global Digital Policy Incubator for an innovative two-week online seminar to explore the issues surrounding digital activism and authoritarianism. This workshop was built upon more than a decade of our collaboration on issues related to the internet and politics in the Middle East, beginning in 2011 with a series of workshops in the “Blogs and Bullets” project supported by the United States Institute of Peace and the PeaceTech Lab. This new collaboration brought together more than a dozen scholars and practitioners with deep experience in digital policy and activism, some focused on the Middle East and others offering a global and comparative perspective. POMEPS STUDIES 43 collects essays from that workshop, shaped by two weeks of public and private discussion.
Authors
Larry Diamond
Eileen Donahoe
Shelby Grossman
Renée DiResta
Josh A. Goldstein

The House Permanent Select Committee on Intelligence held a public hearing on Thursday, March 28, 2019, as part of its investigation into Russian influence during and after the 2016 election campaign.

The hearing, “Putin’s Playbook: The Kremlin’s Use of Oligarchs, Money and Intelligence in 2016 and Beyond,” included testimony by Michael McFaul, former U.S. Ambassador to Russia and Director of the Freeman Spogli Institute at Stanford University.


Download Complete Testimony (PDF 263 KB)

EXCERPT

To contain and thwart the malicious effects of “Putinism,” the United States government and the American people must first understand the nature of the threat. This testimony focuses on the nexus of political and economic power within Russia under Putin’s leadership, and how these domestic practices can be used abroad to advance Putin’s foreign policy agenda. Moreover, it is important to underscore that crony capitalism, property rights provided by the state, bribery, and corruption constitute only a few of many different mechanisms used by Putin in his domestic authority and foreign policy abroad.

This testimony proceeds in three parts. Section I describes the evolution of Putin’s system of government at home, focusing in particular on the relationship between the state and big business. Section II illustrates how Putin seeks to export his ideas and practices abroad. Section III focuses on Putin’s specific foreign policy objective of lifting sanctions on Russian individuals and companies.

Watch the C-SPAN recording of the testimony


Media Contact: Ari Chasnoff, Assistant Director for Communications, 650-725-2371, chasnoff@stanford.edu

Eloise Duvillier

Eloise Duvillier is the Program Manager of the Program on Democracy and the Internet at the Cyber Policy Center. She previously was an HR Program Manager and acting HR Business Partner at Bytedance Inc, a rapidly growing Chinese technology startup, where she supported the company’s globalization by driving U.S. acquisition integrations in Los Angeles and building new R&D teams in Seattle and Silicon Valley. Prior to Bytedance, she led talent acquisition for Baidu USA LLC’s artificial intelligence division. She began her career in the nonprofit sector, where she worked in foster care, HIV education, and emergency response during humanitarian crises, and helped war-torn communities rebuild. She graduated from the University of California, Berkeley with a bachelor’s degree in Development Studies, focusing on political economics in unindustrialized societies.

Program Manager, Program on Democracy and the Internet

Daphne Keller's work focuses on platform regulation and Internet users' rights. She has testified before legislatures, courts, and regulatory bodies around the world, and published both academically and in popular press on topics including platform content moderation practices, constitutional and human rights law, copyright, data protection, and national courts' global takedown orders. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2020, Daphne was the Director of Intermediary Liability at Stanford's Center for Internet and Society. She also served until 2015 as Associate General Counsel for Google, where she had primary responsibility for the company’s search products. Daphne has taught Internet law at Stanford, Berkeley, and Duke law schools. She is a graduate of Yale Law School, Brown University, and Head Start.


Director of Program on Platform Regulation, Cyber Policy Center
Lecturer, Stanford Law School
Stanford Law School, Neukom Building, Room N230, Stanford, CA 94305
650-725-9875
James B. McClatchy Professor of Law at Stanford Law School
Senior Fellow, Freeman Spogli Institute
Professor, by courtesy, Political Science
Professor, by courtesy, Communication

Nathaniel Persily is the James B. McClatchy Professor of Law at Stanford Law School, with appointments in the Departments of Political Science and Communication and at FSI. Prior to joining Stanford, Professor Persily taught at Columbia and the University of Pennsylvania Law School, and as a visiting professor at Harvard, NYU, Princeton, the University of Amsterdam, and the University of Melbourne. Professor Persily’s scholarship and legal practice focus on American election law, or what is sometimes called the “law of democracy,” which addresses issues such as voting rights, political parties, campaign finance, redistricting, and election administration. He has served as a special master or court-appointed expert to craft congressional or legislative districting plans for Georgia, Maryland, Connecticut, New York, North Carolina, and Pennsylvania. He also served as the Senior Research Director for the Presidential Commission on Election Administration. In addition to dozens of articles (many of which have been cited by the Supreme Court) on the legal regulation of political parties, issues surrounding the census and redistricting process, voting rights, and campaign finance reform, Professor Persily is coauthor of the leading election law casebook, The Law of Democracy (Foundation Press, 5th ed., 2016), with Samuel Issacharoff, Pamela Karlan, and Richard Pildes.

His current work, for which he has been honored as a Guggenheim Fellow, an Andrew Carnegie Fellow, and a Fellow at the Center for Advanced Study in the Behavioral Sciences, examines the impact of changing technology on political communication, campaigns, and election administration. He is codirector of the Stanford Cyber Policy Center, the Stanford Program on Democracy and the Internet, and Social Science One, a project to make privacy-protected Facebook data available to the world’s research community to study the impact of social media on democracy. He is also a member of the American Academy of Arts and Sciences and a commissioner on the Kofi Annan Commission on Elections and Democracy in the Digital Age. Along with Professor Charles Stewart III, he recently founded HealthyElections.Org (the Stanford-MIT Healthy Elections Project), which aims to support local election officials in taking the necessary steps during the COVID-19 pandemic to provide safe voting options for the 2020 election. He received a B.A. and M.A. in political science from Yale (1992), a J.D. from Stanford (1998), where he was President of the Stanford Law Review, and a Ph.D. in political science from U.C. Berkeley (2002).

Co-director, Cyber Policy Center
CV
Encina Hall, C427, 616 Jane Stanford Way, Stanford, CA 94305-6055
(650) 721-5345, (650) 724-2996

Eileen Donahoe is the Executive Director of the Global Digital Policy Incubator (GDPI) at Stanford University, FSI/Cyber Policy Center. GDPI is a global multi-stakeholder collaboration hub for development of policies that reinforce human rights and democratic values in digitized society. Areas of current research: AI & human rights; combatting digital disinformation; governance of digital platforms. She served in the Obama administration as the first US Ambassador to the UN Human Rights Council in Geneva, at a time of significant institutional reform and innovation. After leaving government, she joined Human Rights Watch as Director of Global Affairs where she represented the organization worldwide on human rights foreign policy, with special emphasis on digital rights, cybersecurity and internet governance. Earlier in her career, she was a technology litigator at Fenwick & West in Silicon Valley. Eileen serves on the National Endowment for Democracy Board of Directors; the Transatlantic Commission on Election Integrity; the World Economic Forum Future Council on the Digital Economy; University of Essex Advisory Board on Human Rights, Big Data and Technology; NDI Designing for Democracy Advisory Board; Freedom Online Coalition Advisory Network; and Dartmouth College Board of Trustees. Degrees: B.A., Dartmouth; J.D., Stanford Law School; M.A. in East Asian Studies, Stanford; M.T.S., Harvard; and Ph.D. in Ethics & Social Theory, GTU Cooperative Program with UC Berkeley. She is a member of the Council on Foreign Relations.

 

Executive Director, Global Digital Policy Incubator