News Type: Q&As

Graham Webster leads the DigiChina Project, which translates and explains Chinese technology policy for an English-language audience, so that debates and decisions regarding cyber policy are grounded in facts and primary sources.

Housed within Stanford’s Program on Geopolitics, Technology, and Governance (GTG) and in partnership with New America, DigiChina and its community of experts have already published more than 80 translations and analyses of public policy documents, laws, regulations, and political speeches, and are creating an open-access knowledge base for policymakers, academics, and members of the tech industry who need insight into the choices China makes regarding technology.

Q: Why is this work important?

A lot of tech is produced in China, so it’s important to understand its policies. And in Washington, D.C., you hear a lot of people say, “Well, you can’t know what China’s doing on tech policy. It’s all a secret.” But while China’s political system is often opaque, if you happen to read Chinese, there is a lot of publicly available material that explains what the Chinese government is thinking and planning.

With our network of experts, DigiChina works at the intersection of two policy challenges. One is how we deal with high technology and the questions it raises around economic competitiveness, personal autonomy, and the security risks created by our dependence on tech.

The other is what, from a U.S. government, business, or values perspective, needs to be done about the increased prominence and power of the Chinese government and its economic, technological, and military capabilities.

These questions cut across tech sectors, from IT infrastructure and data-driven automation to cutting-edge developments in quantum technology, biotech, and other fields of research.

Q: How was DigiChina started?

A number of us were working at different organizations, including think tanks, consultancies, and universities, and we all had an interest in explaining the laws and the bureaucratic language to others who aren’t Chinese policy specialists or don’t have the language skills.

We started working informally at first and then reached out to New America, a think tank that combines research and policy thinking on the challenges arising during this time of rapid technological and social change. Under the New America umbrella, and through partnerships with the Leiden Asia Centre, a leading European research center based at the University of Leiden, and the Ethics and Governance of Artificial Intelligence Initiative at Harvard and MIT, we were able to build out the program and increase the number of experts in our network.

Q: Who is involved in DigiChina and what types of expertise do you and others bring to the project?

More than 40 people have contributed to DigiChina publications so far, and it’s a pretty diverse group. There are professors and think tank scholars, students and early-career professionals, and experienced government and industry analysts. Everyone has a different part of the picture they can contribute, and we reach out to other experts both in China and around the world when we need more context.

As for me, I was working at Yale Law School’s China Center when I was roped into what would become DigiChina and had spent several years in Beijing and New Haven working more generally on US-China relations and Track 2 dialogues, where experts and former officials from the two countries meet to take on tough problems. As a journalist and graduate student, I had long studied technology and politics in China, and I took on a coordinating role with DigiChina as I turned back to that pursuit full time.

Stanford is an ideal home because the university is a powerhouse in Chinese studies and an epicenter of global digital development.
Graham Webster
Editor in Chief, DigiChina Project

Q: Are there other organizations involved as well?

We have a strong tie to the Leiden Asia Centre at the University of Leiden in the Netherlands, where one of DigiChina’s cofounders, Rogier Creemers, is a professor, and where staff and student researchers have contributed to existing and forthcoming work. We coordinate with a number of other groups on translations, and the project benefits greatly from the time and knowledge contributed by employees of various institutions. I hope that network will increasingly be a resource for contributors and their colleagues.

The project is currently supported by the Ford Foundation, which works to strengthen democratic values, promote international cooperation and advance human achievement around the world. A generous grant from Ford will keep the lights on for two years, giving us the ability to build our open-access resource and, with further fundraising, the potential to bring on more in-house editorial and research staff.

We hope researchers and policy thinkers, regardless of their approaches or ideologies, can use our translations to engage with the real and messy evolution of Chinese tech policy.
Graham Webster
Editor in Chief, DigiChina Project

Q: Do you have plans to grow the project?

We are working to build an accessible online database so researchers and scholars can review primary source documents in both the original Chinese and in English. And we are working toward a knowledge base with background entries on key institutions, legal concepts, and phrases so that a broader audience can situate things like Chinese legal language in their actual context. Providing access to this information is especially important now and in the near future, whether we have a second Trump Administration or a Biden Administration in the United States.

On any number of policy challenges, effective measures are going to depend on going beyond caricatures like an “AI arms race,” “cyber authoritarianism,” or “decoupling,” which provide useful frameworks for debate but tend to prejudge the outcomes of a huge number of developments. We hope researchers and policy thinkers, regardless of their approaches or ideologies, can use this work to engage with the real and messy evolution of Chinese tech policy.


Webster explains how DigiChina makes Chinese tech policy accessible for English speakers


THE EMERGENCE OF A DIGITAL SPHERE where public debate takes place raises profound questions about the connection between online information and polarization, echo chambers, and filter bubbles. Does the information ecosystem created by social media companies support the conditions necessary for a healthy democracy? Is it different from other media? These are particularly urgent questions as the United States approaches a contentious 2020 election during the COVID-19 pandemic.

The influence of technology and AI-curated information on America’s democratic process is being examined in the eight-week Stanford University course, “Technology and the 2020 Election: How Silicon Valley Technologies Affect Elections and Shape Democracy.” This issue brief focuses on the class session on “Echo Chambers, Filter Bubbles, and Polarization,” with guest experts Joan Donovan and Joshua Tucker.

Publication Type: Policy Briefs
Author: Marietje Schaake

THE 2020 ELECTION IN THE UNITED STATES will take place on November 3 in the midst of a global pandemic, economic downturn, social unrest, political polarization, and a sudden shift in the balance of power in the U.S. Supreme Court. On top of these issues, the technological layer impacting the public debate, as well as the electoral process itself, may well determine the election outcome. The eight-week Stanford University course, “Technology and the 2020 Election: How Silicon Valley Technologies Affect Elections and Shape Democracy,” examines the influence of technology on America’s democratic process, revealing how digital technologies are shaping the public debate and the election.

Publication Type: Policy Briefs
Author: Marietje Schaake


The US 2020 elections have been fraught with challenges, including the rise of “fake news” and threats of foreign intervention emerging after 2016, ongoing concerns about racially targeted disinformation, and new threats related to the COVID-19 pandemic. Digital technologies will have played a more important role in the 2020 elections than ever before.

On November 4th at 10am PST, join the team at the Stanford Cyber Policy Center, in collaboration with the Freeman Spogli Institute, Stanford’s Institute for Human-Centered Artificial Intelligence, and the Stanford Center on Philanthropy and Civil Society, for a day-after discussion of the role of digital technologies in the 2020 elections. Speakers will include Nathaniel Persily, faculty co-director of the Cyber Policy Center and Director of the Program on Democracy and the Internet; Marietje Schaake, the Center’s International Policy Director and International Policy Fellow at Stanford’s Institute for Human-Centered Artificial Intelligence; Alex Stamos, Director of the Cyber Center’s Internet Observatory and former Chief Security Officer at Facebook and Yahoo; Renee DiResta, Research Manager at the Internet Observatory; Andrew Grotto, Director of the Center’s Program on Geopolitics, Technology, and Governance; and Rob Reich, Faculty Director of the Center for Ethics in Society, in conversation with Kelly Born, the Center’s Executive Director.

Please note that we will also have a YouTube livestream available for potential overflow or for anyone having issues connecting via Zoom: https://youtu.be/H2k62-JCAgE

 


Renée DiResta is the former Research Manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social networks, and assists policymakers in understanding and responding to the problem. She has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience conspiracies, terrorism, and state-sponsored information warfare.

You can see a full list of Renée's writing and speeches on her website: www.reneediresta.com or follow her @noupside.

 

Former Research Manager, Stanford Internet Observatory
Rob Reich

Marietje Schaake is a non-resident Fellow at Stanford’s Cyber Policy Center and at the Institute for Human-Centered AI. She is a columnist for the Financial Times and serves on a number of not-for-profit boards as well as the UN’s High-Level Advisory Body on AI. Between 2009 and 2019 she served as a Member of the European Parliament, where she worked on trade, foreign, and tech policy. She is the author of The Tech Coup.


 

Non-Resident Fellow, Cyber Policy Center
Fellow, Institute for Human-Centered Artificial Intelligence


[Image: Reset: Reclaiming the Internet for Civil Society book cover]
Digital technologies are linked to a growing number of social and political maladies, including political repression, disinformation, and polarization. Accountability for these technologies is weak, allowing authoritarian rulers and bad actors to exploit the information landscape for their gain. A largely unregulated surveillance industry, innovations in technologies of remote control, dark PR firms, and “hack-for-hire” services feeding off rivers of poorly secured personal data also muddy the waters. This set of serious, democracy-unfriendly challenges calls for a deeper reexamination of our communications ecosystem. 

On October 28th at 10am Pacific Time, join the team at the Stanford Cyber Policy Center, in collaboration with author Ronald J. Deibert; Eileen Donahoe, Executive Director of the Global Digital Policy Incubator at Stanford University’s Cyber Policy Center; and Larry Diamond, co-lead for the Global Digital Policy Incubator, in conversation with Kelly Born, the Center’s Executive Director.

 

 

CDDRL
Stanford University
Encina Hall, C147
616 Jane Stanford Way
Stanford, CA 94305-6055

(650) 724-6448 / (650) 723-1928
Mosbacher Senior Fellow in Global Democracy at the Freeman Spogli Institute for International Studies
William L. Clayton Senior Fellow at the Hoover Institution
Professor, by courtesy, of Political Science and Sociology
MA, PhD

Larry Diamond is the William L. Clayton Senior Fellow at the Hoover Institution, the Mosbacher Senior Fellow in Global Democracy at the Freeman Spogli Institute for International Studies (FSI), and a Bass University Fellow in Undergraduate Education at Stanford University. He is also professor by courtesy of Political Science and Sociology at Stanford, where he lectures and teaches courses on democracy (including an online course on EdX). At the Hoover Institution, he co-leads the Project on Taiwan in the Indo-Pacific Region and participates in the Project on the U.S., China, and the World. At FSI, he is among the core faculty of the Center on Democracy, Development and the Rule of Law, which he directed for six and a half years. He leads FSI’s Israel Studies Program and is a member of the Program on Arab Reform and Development. He also co-leads the Global Digital Policy Incubator, based at FSI’s Cyber Policy Center. He served for 32 years as founding co-editor of the Journal of Democracy.

Diamond’s research focuses on global trends affecting freedom and democracy and on U.S. and international policies to defend and advance democracy. His book, Ill Winds: Saving Democracy from Russian Rage, Chinese Ambition, and American Complacency, analyzes the challenges confronting liberal democracy in the United States and around the world at this potential “hinge in history,” and offers an agenda for strengthening and defending democracy at home and abroad. A paperback edition with a new preface was released by Penguin in April 2020. His other books include In Search of Democracy (2016), The Spirit of Democracy (2008), Developing Democracy: Toward Consolidation (1999), Promoting Democracy in the 1990s (1995), and Class, Ethnicity, and Democracy in Nigeria (1989). He has edited or coedited more than fifty books, including China’s Influence and American Interests (2019, with Orville Schell), Silicon Triangle: The United States, Taiwan, and Global Semiconductor Security (2023, with James O. Ellis Jr. and Orville Schell), and The Troubling State of India’s Democracy (2024, with Sumit Ganguly and Dinsha Mistree).

During 2002–03, Diamond served as a consultant to the US Agency for International Development (USAID) and was a contributing author of its report, Foreign Aid in the National Interest. He has advised and lectured to universities and think tanks around the world, and to the World Bank, the United Nations, the State Department, and other organizations dealing with governance and development. During the first three months of 2004, Diamond served as a senior adviser on governance to the Coalition Provisional Authority in Baghdad. His 2005 book, Squandered Victory: The American Occupation and the Bungled Effort to Bring Democracy to Iraq, was one of the first books to critically analyze America's postwar engagement in Iraq.

Among Diamond’s other edited books are Democracy in Decline?; Democratization and Authoritarianism in the Arab World; Will China Democratize?; and Liberation Technology: Social Media and the Struggle for Democracy, all edited with Marc F. Plattner; and Politics and Culture in Contemporary Iran, with Abbas Milani. With Juan J. Linz and Seymour Martin Lipset, he edited the series Democracy in Developing Countries, which helped to shape a new generation of comparative study of democratic development.


Former Director of the Center on Democracy, Development and the Rule of Law
Faculty Chair, Jan Koum Israel Studies Program
Ron Deibert

[Image: Digital Trade Wars]

Please join the Cyber Policy Center on Wednesday, October 21, from 10 a.m. to 11 a.m. Pacific Time, with host Marietje Schaake, International Policy Director of the Cyber Policy Center, in conversation with Dmitry Grozoubinski, founder of ExplainTrade.com and visiting professor at the University of Strathclyde, along with Anu Bradford, Henry L. Moses Professor of Law and International Organizations at Columbia Law School and author of The Brussels Effect: How the European Union Rules the World, for a discussion and exploration of the digital trade war.

This event is free and open to the public, but registration is required.

 

Marietje Schaake
Anu Bradford
Dmitry Grozoubinski
Panel Discussions

Social media sites have now surpassed cable, network, and local TV as the primary source of political news for one in five Americans. Yet the speed and volume of online information, the difficulty of discerning the credibility of online sources, and concerns about viral online disinformation place a significant burden on users. What new user-facing digital literacy initiatives are underway, what does the research say about the impact and effectiveness of media literacy interventions, and what are the implications for both corporate and government policy?

On Wednesday, October 14th, from 10 a.m. to 11 a.m. Pacific Time, please join Kelly Born, Executive Director of the Stanford Cyber Policy Center, in conversation with Jennifer Kavanagh of RAND’s Countering Truth Decay initiative; Kristin Lord, President and CEO of IREX; and Claire Wardle, co-founder and director of First Draft, for a discussion on the state of media literacy.

Kristin Lord
Jennifer Kavanagh
Claire Wardle
Seminars
Postdoctoral Fellow, Stanford Internet Observatory (2021-2022)
Predoctoral Fellow, Stanford Internet Observatory (2020-2021)

Josh A. Goldstein is a past postdoctoral scholar at the Stanford Internet Observatory. He received his PhD in International Relations from the University of Oxford, where he studied as a Clarendon Scholar. At the Stanford Internet Observatory, Dr. Goldstein investigated covert influence operations on social media platforms, studied the effects of foreign interference on democratic societies, and explored how emerging technologies will impact the future of propaganda campaigns. He has given briefings to the Department of Defense, the State Department, and senior technology journalists based on this work, and published in outlets including Brookings, Lawfare, and Foreign Policy.

Prior to joining SIO, Dr. Goldstein received an MPhil in International Relations at Oxford with distinction and a BA in Government from Harvard College, summa cum laude. He also assisted with research and writing related to international security at the Belfer Center, Brookings Institution, House Foreign Affairs Committee, and Department of Defense.

Author: Daphne Keller
News Type: Q&As

Daphne Keller leads the newly launched Program on Platform Regulation, which is designed to offer lawmakers, academics, and civil society groups ground-breaking analysis and research to support wise governance of Internet platforms.

Q: Facebook, YouTube and Twitter rely on algorithms and artificial intelligence to provide services for their users. Could AI also help in protecting free speech and policing hate speech and disinformation?   

DK: Platforms increasingly rely on artificial intelligence and other algorithmic means to automate the process of assessing – and sometimes deleting – online speech. But tools like AI can’t really “understand” what we are saying, and automated tools for content moderation make mistakes all the time. We should worry about platforms’ reliance on automation, and worry even more about legal proposals that would make such automated filters mandatory. Constitutional and human rights law give us a legal framework to push back on such proposals, and to craft smarter rules about the use of AI. I wrote about these issues in a New York Times op-ed and in some very wonky academic analysis in the Journal of European and International IP Law.

Q: Can you explain the potential impacts on citizens’ rights when the platforms have global reach but governments do not?

DK: On one hand, people worry about platforms displacing the legitimate power of democratic governments. On the other hand, platforms can actually expand state power in troubling ways. One way they do that is by enforcing a particular country’s speech rules everywhere else in the world. Historically that meant a net export of U.S. speech law and values, as American companies applied those rules to their global platforms. More recently, we’ve seen that trend reversed, with things like European and Indian courts requiring Facebook to take user posts down globally – even if the users’ speech would be legally protected in other countries. Governments can also use soft power, or economic leverage based on their control of access to lucrative markets, to convince platforms to “voluntarily” globally enforce that country’s preferred speech rules. That’s particularly troubling, since the state influence may be invisible to any given user whose rights are affected.

There is such a pressing need for thoughtful work on the laws that govern Internet platforms right now, and this is the place to do it... We have access to the people who are making these decisions and who have the greatest expertise in the operational realities of the tech platforms.
Daphne Keller
Director, Program on Platform Regulation, Cyber Policy Center
Lecturer, Stanford Law School

Q: Are there other ways that platforms can expand state power? 

DK: Yes, platforms can let states bypass democratic accountability and constitutional limits by using private platforms as proxies for their own agenda. States that want to engage in surveillance or censorship are constrained by the U.S. Constitution, and by human rights laws around the world. But platforms aren’t. If you’re a state and you want to do something that would violate the law if you did it yourself, it’s awfully tempting to coerce or persuade a platform to do it for you. This issue of platforms being proxies for other actors isn’t limited to governments – anyone with leverage over a platform, including business partners, can potentially play a hidden role like this.

I wrote about this complicated nexus of state and private power in Who Do You Sue? for the Hoover Institution.    

Q: What inspired you to create the Program on Platform Regulation at the Cyber Policy Center right now?

DK: There is such a pressing need for thoughtful work on the laws that govern Internet platforms right now, and this is the place to do it. At the Cyber Policy Center, there’s an amazing group of experts, like Marietje Schaake, Eileen Donahoe, Alex Stamos and Nate Persily, who are working on overlapping issues. We can address different aspects of the same issues and build on each other’s work to do much more together than we could individually.

The program really benefits from being at Stanford and in Silicon Valley because we have access to the people who are making these decisions and who have the greatest expertise in the operational realities of the tech platforms. 

The Cyber Policy Center is part of the Freeman Spogli Institute for International Studies at Stanford University.


Keller explains some of the issues currently surrounding platform regulation


This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms. The essay contends that questions of state and private power are deeply intertwined. To understand and protect internet users’ rights, we must understand and engage with both.

 

Publication Type: Commentary