Picture this: you are sitting in the kitchen of your home enjoying a drink. As you sip, you scroll through your phone, looking at the news of the day. You text a link to a news article critiquing your government’s stance on the press to a friend who works in media. Your sibling sends you a message on an encrypted service updating you on the details of their upcoming travel plans. You set a reminder on your calendar about a doctor’s appointment, then open your banking app to make sure the payment for this month’s rent was processed.

Everything about this scene is personal. Nothing about it is private.

Without your knowledge or consent, your phone has been infected with spyware. This technology makes it possible for someone to silently watch you and take careful notes about who you are, who you know, and what you’re doing. They see your files, have your contacts, and know the exact route you took home from work on any given day. They can even turn on your phone’s microphone and listen to the conversations you’re having in the room.

This is not some hypothetical, Orwellian drama, but a reality for thousands of people around the world. This kind of technology — once a capability of only the most technologically advanced governments — is now commercially available from numerous private companies known to sell it to state agencies and private actors alike. This total loss of privacy should worry everyone, but for human rights activists and journalists challenging authoritarian powers, it has become a matter of life and death.

The companies that develop and sell this technology are, at best, only passively accountable to governments, and at worst operate with their tacit support. And it is this lack of regulation that Marietje Schaake, the International Policy Director at the Cyber Policy Center and International Policy Fellow at Stanford HAI, is trying to change.

Amsterdam and Tehran: A Tale of Two Elections

Schaake did not begin her professional career with the intention of becoming Europe’s “most wired politician,” as she has frequently been dubbed by the press. In many ways, her step into politics came as something of a surprise, albeit a pleasant one.
“I've always been very interested in public service and trying to improve society and the lives of others, but I ran not expecting at all that I would actually get elected,” Schaake confesses.

As a candidate on the 2008 ticket for the Democrats 66 (D66) political party of the Netherlands, Schaake saw herself as someone who could help move the party’s campaign forward, but not as a serious contender in the open party election system. But her party performed exceptionally well, and at the age of 30, Schaake landed in third position on a 30-person list vying to fill the 25 seats available to representatives from all political parties in the Netherlands. Having taken a top spot among a field of hundreds of candidates, she found herself on her way to becoming a Member of the European Parliament (MEP).

Marietje Schaake participates in a panel on human rights and communication technologies as a member of the European Parliament in April 2012. Alberto Novi, Flickr

In 2009, world events collided with Schaake’s position as a newly seated MEP. While the democratic elections in the EU were unfolding without incident, 3,000 miles away in Iran, a very different story was playing out. Following the re-election of Mahmoud Ahmadinejad to a second term as Iran’s president, allegations of fraud and vote tampering were immediately voiced by supporters of former prime minister Mir-Hossein Mousavi, the leading candidate opposing Ahmadinejad. The protests that followed quickly morphed into the Green Movement, one of the largest sustained protest movements in Iran’s history between the Iranian Revolution of 1979 and the protests over the death of Mahsa Amini that began in September 2022.
With the protests came an increased wave of state violence against the demonstrators. While repression and intimidation are nothing new to autocratic regimes, in 2009 the proliferation of cell phones in the hands of an increasingly digitally connected population allowed citizens to document human rights abuses firsthand and beam the evidence directly from the streets of Tehran to the rest of the world in real time.
As more and more footage poured in from the situation on the ground, Schaake, with a pre-politics background in human rights and a specific interest in civil rights, took up the case of the Green Movement as one of her first major issues in the European Parliament. She was appointed spokesperson on Iran for her political group. 

Marietje Schaake [second from left] alongside her colleagues from the European Parliament during a press conference on universal human rights in 2010. Alberto Novi, Flickr

The Best of Tech and the Worst of Tech

But the more Schaake learned, the clearer it became that the Iranian protesters were not the only ones using technology to stay informed about the protests. Meeting with human rights defenders who had escaped from Iran into eastern Turkey, Schaake heard anecdote after anecdote about how the Islamic Republic’s authorities were using tech to surveil, track, and censor dissenting opinions.
Investigations indicated that the authorities were utilizing a technique known as “deep packet inspection,” a system that allows the controller of a communications network to read and block information in transit, alter communications, and collect data about specific individuals. What was more, journalists revealed that many of the systems such regimes were using to perform this type of surveillance had been bought from, and were serviced by, Western companies.
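The core idea can be illustrated with a toy sketch. This is a hypothetical example, not any vendor’s actual product: the function name, keyword list, and addresses below are invented, and real deep packet inspection systems operate on live traffic inside carrier networks at vastly greater scale. What the sketch shows is simply that inspecting a message’s contents, rather than just its destination, lets a network operator block, log, and attribute communications in one step:

```python
# Toy sketch of DPI-style content filtering -- hypothetical, for illustration only.
BLOCKED_KEYWORDS = {"protest", "rally"}  # an invented censor list


def inspect_packet(payload: str, sender: str) -> dict:
    """Decide what a DPI middlebox might do with one message."""
    hits = [kw for kw in BLOCKED_KEYWORDS if kw in payload.lower()]
    return {
        "sender": sender,                          # data collected about the individual
        "action": "block" if hits else "forward",  # suppress or pass the message through
        "matched": hits,                           # which content triggered the filter
    }


print(inspect_packet("Join the protest at noon", sender="10.0.0.7")["action"])  # block
```

Because the same inspection step that enables censorship also produces a record of who sent what, control of the network layer grants surveillance capability for free.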
For Schaake, this revelation was a turning point in her focus as a politician and the beginning of her journey into the realm of cyber policy and tech regulation.
“On the one hand, we were sharing statements urging to respect the human rights of the demonstrators. And then it turned out that European companies were the ones selling this monitoring equipment to the Iranian regime. It became immediately clear to me that if technology was to play a role in enhancing human rights and democracy, we couldn’t simply trust the market to make it so; we needed to have rules,” Schaake explained.

We have to have a line in the sand and a limit to the use of this technology. It’s extremely important, because this is proliferating not only to governments, but also to non-state actors.
Marietje Schaake
International Policy Director at the Cyber Policy Center

The Transatlantic Divide

But who writes the rules? When it comes to tech regulation, there is longstanding unease between the private and public sectors, and a divergence in approach between the eastern and western shores of the Atlantic. In general, EU member countries favor oversight of the technology sector and have supported legislation like the General Data Protection Regulation (GDPR) and the Digital Services Act to protect user privacy and digital human rights. On the other hand, major tech companies — many of them based in North America — favor the doctrine of self-regulation and frequently cite claims to intellectual property or broadly defined protections such as Section 230 as justification for keeping government oversight at arm’s length. Efforts by governing bodies like the European Union to legislate privacy and transparency requirements are often met with raised hackles.
It’s a feeling Schaake has encountered many times in her work. “When you talk to companies in Silicon Valley, they make it sound as if Europeans are after them and that these regulations are weapons meant to punish them,” she says.
But the need to place checks on those with power is rooted in history, not histrionics, says Schaake. Memories of living under the eye of surveillance states such as the Soviet Union and East Germany are still fresh in many Europeans’ minds. The drive to protect privacy is as much about keeping the government in check as it is about reining in the outsized influence and power of private technology companies, Schaake asserts.

Big Brother Is Watching

In the last few years, the momentum has begun to shift. 
In 2021, a joint reporting effort by The Guardian, The Washington Post, Le Monde, Proceso, and over 80 journalists at a dozen additional news outlets, working in partnership with Amnesty International and Forbidden Stories, published the Pegasus Project, a detailed report showing that spyware from the private company NSO Group was used to target, track, and retaliate against tens of thousands of journalists, activists, civil rights leaders, and even prominent politicians around the world.
This type of surveillance has advanced rapidly beyond the network monitoring undertaken by regimes like Iran’s in the 2000s, and now taps into the most personal details of an individual’s device, data, and communications. In the absence of widespread regulation, companies like NSO Group have been able to develop commercial products with capabilities as sophisticated as those of state intelligence agencies. In many cases, “zero-click” infections are now possible, meaning spyware can be installed on a targeted device without the user ever knowing, or even suspecting, that they have become a victim of covert surveillance.

Marietje Schaake at the 2023 Summit for Democracy with Neal Mohan, CEO of YouTube; John Scott-Railton, Senior Researcher at Citizen Lab; Avril Haines, U.S. Director of National Intelligence; and Alejandro Mayorkas, U.S. Secretary of Homeland Security. U.S. Department of State

“If we were to create a spectrum of harmful technologies, spyware could easily take the top position,” said Schaake, speaking as the moderator of a panel on “Countering the Misuse of Technology and the Rise of Digital Authoritarianism” at the 2023 Summit for Democracy co-hosted by U.S. President Joe Biden alongside the governments of Costa Rica, the Netherlands, Republic of Korea, and Republic of Zambia.
Revelations like those of the Pegasus Project have helped spur what Schaake believes is long-overdue action from the United States on regulating this sector of the tech world. On March 27, 2023, President Biden signed an executive order prohibiting the operational use of commercial spyware products by the United States government. It is the first time such an action has been formally taken in Washington.
For Schaake, the order is a “fantastic first step,” but she cautions that there is still much more to be done. Biden’s executive order does not limit spyware developed by governments themselves, nor its use by private individuals who can get their hands on these tools.

Human Rights vs. National Security

One of Schaake’s main concerns is the potential for governmental overreach in the pursuit of curtailing the influence of private companies.
Schaake explains, “What's interesting is that while the motivation in Europe for this kind of regulation is very much anchored in fundamental rights, in the U.S., what typically moves the needle is a call to national security, or concern for China.”
It is important to stay vigilant about how national security can become a justification for curtailing civil liberties. Writing for the Financial Times, Schaake elaborated on the potential conflict of interest the government has in regulating tech more rigorously:
“The U.S. government is right to regulate technology companies. But the proposed measures, devised through the prism of national security policy, must also pass the democracy test. After 9/11, the obsession with national security led to warrantless wiretapping and mass data collection. I back moves to curb the outsized power of technology firms large and small. But government power must not be abused.”
While Schaake hopes well-established democracies will do more to lead by example, she also acknowledges that the political will to actually step up to do so is often lacking. In principle, countries rooted in the rule of law and the principles of human rights decry the use of surveillance technology beyond their own borders. But in practice, these same governments are also sometimes customers of the surveillance industrial complex. 

It’s up to us to guarantee the upsides of technology and limit its downsides. That’s how we are going to best serve our democracy in this moment.
Marietje Schaake
International Policy Director at the Cyber Policy Center

Schaake has been trying to make that disparity an impossible needle for nations to keep threading. For over a decade, she has called for an end to the surveillance industry and has worked on developing export control rules for the sale of surveillance technology from Europe to other parts of the world. But while these measures make it harder for non-democratic regimes to purchase these products from the West, the legislation is still limited in its ability to keep European and Western nations from importing spyware systems like Pegasus back into the West. And for as long as that reality remains, it undermines the credibility of the EU and the West as a whole, says Schaake.
Speaking at the 2023 Summit for Democracy, Schaake urged policymakers to keep the bigger picture in mind when it comes to the risks of unaccountable, ungoverned spyware industries. “We have to have a line in the sand and a limit to the use of this technology. It’s extremely important, because this is proliferating not only to governments, but also to non-state actors. This is not the world we want to live in.”


Building Momentum for the Future

Drawing those lines in the sand is crucial not just for the immediate safety and protection of individuals who have been targeted with spyware, but also for addressing technology’s other harms to the long-term health of democracy.

“The narrative that technology is helping people's democratic rights, or access to information, or free speech has been oversold, whereas the need to actually ensure that democratic principles govern technology companies has been underdeveloped,” Schaake argues.

While no longer an active politician, Schaake has not slowed her pace in raising awareness and contributing her expertise to policymakers trying to find ways of threading the digital needle on tech regulation. Working at the Cyber Policy Center at the Freeman Spogli Institute for International Studies (FSI), Schaake has been able to combine her experiences in European politics with her academic work in the United States against the backdrop of Silicon Valley, the home base for many of the world’s leading technology companies and executives.
Though now half a globe away from the European Parliament, Schaake’s original motivations to improve society and people’s lives have not dimmed.

Though no longer working in government, Schaake, seen here at a conference on regulating Big Tech hosted by the Stanford Institute for Human-Centered Artificial Intelligence (HAI), continues to research and advocate for better regulation of technology industries. Midori Yoshimura

“It’s up to us to guarantee the upsides of technology and limit its downsides. That’s how we are going to best serve our democracy in this moment,” she says.
Schaake is clear-eyed about the hurdles still ahead on the road to meaningful legislation about tech transparency and human rights in digital spaces. With a highly partisan Congress in the United States and other issues like the war in Ukraine and concerns over China taking center stage, it will take time and effort to build a critical mass of political will to tackle these issues. But Biden’s executive order and the discussion of issues like digital authoritarianism at the Summit for Democracy also give Schaake hope that progress can be made.
“The bad news is we're not there yet. The good news is there's a lot of momentum for positive change and improvement, and I feel like people are beginning to understand how much it is needed.”
And for anyone ready to jump into the fray and make an impact, Schaake adds a standing invitation: “I’m always happy to grab a coffee and chat. Let’s talk!”

The complete recording of "Countering the Misuse of Technology and the Rise of Digital Authoritarianism," the panel Marietje Schaake moderated at the 2023 Summit for Democracy, is available below.


A transatlantic background and a decade of experience as a lawmaker in the European Parliament have given Marietje Schaake a unique perspective as a researcher investigating the harms technology is causing to democracy and human rights.

The Law of Democracy, Legal Structure of the Political Process book cover

This book created the field of the law of democracy, offering a systematic account of the legal construction of American democracy. This edition represents a significant revision that reflects the embattled state of democracy in the U.S. and abroad. With the addition of Franita Tolson as well as Nathaniel Persily to the prior edition, the book now turns to a changed legal environment following the radical reconfiguration of the Voting Rights Act, the rise of social media and circumvention of the formal channels of campaign finance, and the increased fragmentation of political parties. Strikingly, in the current political environment the right to register and vote passes from being a largely historical inquiry to a source of front-burner legal challenge. This edition further streamlines the coverage of the Voting Rights Act, expands the scope of coverage of campaign finance and political corruption issues, and turns to the new dispute over voter access to the ballot. The section on election litigation and remedies has been expanded to address the expanded range of legal challenges to election results. For the first time, this book isolates the distinct problems of presidential elections, ranging from the conflict over federal and state law in Bush v. Gore, to the distinct challenges to the 2020 presidential elections, to the renewed focus on the Electoral Count Act.

The basic structure of the book continues to follow the historical development of the individual right to vote; current struggles over gerrymandering; the relationship of the state to political parties; the constitutional and policy issues surrounding campaign-finance reform; and the tension between majority rule and fair representation of minorities in democratic bodies.

For more information and additional teaching materials, visit the companion site.

Samuel Issacharoff
Pamela S. Karlan
Nathaniel Persily
Franita Tolson
Foundation Press
6th Edition

Come join The Journal of Online Trust & Safety, an open access journal for cutting-edge trust and safety scholarship, as we bring together authors published in our special issue, Uncommon yet Consequential Online Harms, for a webinar, hosted on September 1, 9:30-10:30am PT. 

The Journal of Online Trust & Safety publishes research from computer science, sociology, political science, law, and more. Journal articles have been covered in The Guardian, The Washington Post, and Platformer and cited in Senate testimony and a platform policy announcement.

Articles in this special issue will include: 

Election Fraud, YouTube, and Public Perception of the Legitimacy of President Biden by James Bisbee, Megan A. Brown, Angela Lai, Richard Bonneau, Joshua A. Tucker, and Jonathan Nagler

Predictors of Radical Intentions among Incels: A Survey of 54 Self-identified Incels by Sophia Moskalenko, Naama Kates, Juncal Fernández-Garayzábal González, and Mia Bloom

Procedural Justice and Self Governance on Twitter: Unpacking the Experience of Rule Breaking on Twitter by Matthew Katsaros, Tom Tyler, Jisu Kim, and Tracey Meares

Twitter’s Disputed Tags May Be Ineffective at Reducing Belief in Fake News and Only Reduce Intentions to Share Fake News Among Democrats and Independents by Jeffrey Lees, Abigail McCarter, and Dawn M. Sarno

To hear from the authors about their new research, please register for the webinar. To be notified about journal updates, please sign up for Stanford Internet Observatory announcements and follow @journalsafetech. Questions about the journal can be sent to



Panel Discussions

Christopher Giles is a researcher and open-source investigator focusing on information operations, and monitoring conflict and human rights issues.

Prior to joining Stanford University in 2021, Christopher reported on disinformation for BBC News, covering the COVID-19 pandemic and the 2020 U.S. election, where his reporting was “highly commended” by the Royal Statistical Society’s Journalism Awards. Christopher is a recipient of the Knight-Hennessy Scholarship and is pursuing graduate studies in international policy and journalism.

Researcher, Stanford Internet Observatory

Join us September 29-30 for two days of cross-professional presentations and conversations designed to push forward research on trust and safety.

Hosted at Stanford University’s Frances C. Arrillaga Alumni Center, the Trust and Safety Research Conference will convene trust and safety practitioners, people in government and civil society, and academics in fields like computer science, sociology, law, and political science to think deeply about trust and safety issues.

Your ticket gives you access to:

  • Two days of talks, panels, workshops, and breakouts
  • Networking opportunities, including happy hours on September 28, 29, and 30
  • Breakfast and lunch on September 29 and 30

Early bird tickets are $100 for attendees from academia and civil society and $500 for attendees from industry. Ticket prices go up August 1, 2022. Full refunds or substitutions will be honored until August 15, 2022. After August 15, 2022 no refunds will be allowed.

More information is available at:

For questions, please contact us through

Frances C. Arrillaga Alumni Center
326 Galvez Street
Stanford, CA 94305

Melissa De Witte, Taylor Kubota, Ker Than

During a speech at Stanford University on Thursday, April 21, 2022, former U.S. President Barack Obama presented his audience with a stark choice: “Do we allow our democracy to wither, or do we make it better?”

Over the course of an hour-long address, Obama outlined the threat that disinformation online, including deepfake technology powered by AI, poses to democracy as well as ways he thought the problems might be addressed in the United States and abroad.

“This is an opportunity, it’s a chance that we should welcome for governments to take on a big important problem and prove that democracy and innovation can coexist,” Obama said.

Obama, who served as the 44th president of the United States from 2009 to 2017, was the keynote speaker at a one-day symposium, titled “Challenges to Democracy in the Digital Information Realm,” co-hosted by the Stanford Cyber Policy Center and the Obama Foundation on the Stanford campus on April 21.

The event brought together people working in technology, policy, and academia for panel discussions on topics ranging from the role of government in establishing online trust, the relationship between democracy and tech companies, and the threat of digital authoritarians.

Obama told a packed audience of more than 600 people in CEMEX auditorium – as well as more than 250,000 viewers tuning in online – that everyone is part of the solution to make democracy stronger in the digital age and that all of us – from technology companies and their employees to students and ordinary citizens – must work together to adapt old institutions and values to a new era of information. “If we do nothing, I’m convinced the trends that we’re seeing will get worse,” he said.

Introducing the former president were Michael McFaul, director of the Freeman Spogli Institute for International Studies and U.S. ambassador to Russia under Obama, and Stanford alum and Obama Foundation fellow Tiana Epps-Johnson, BA ’08.

Epps-Johnson, who is the founder and executive director of the Center for Tech and Civic Life, recalled her time answering calls to an election protection hotline during the 2006 midterm election. She said the experience taught her an important lesson, which was that “the overall health of our democracy, whether we have a voting process that is fair and trustworthy, is more important than any one election outcome.”

Stanford freshman Evan Jackson said afterward that Obama’s speech resonated with him. “I use social media a lot, every day, and I’m always seeing all the fake news that can be spread easily. And I do understand that when you have controversy attached to what you’re saying, it can reach larger crowds,” Jackson said. “So if we do find a way to better contain the controversy and the fake news, it can definitely help our democracy stay powerful for our nation.”

The Promise and Perils Technology Poses to Democracy

In his keynote, Obama reflected on how technology has transformed the way people create and consume media. Digital and social media companies have upended traditional media – from local newspapers to broadcast television – as well as the role these outlets played in society at large.

During the 1960s and 1970s, the American public tuned in to one of three major networks, and while media from those earlier eras had their own set of problems – such as excluding women and people of color – they did provide people with a shared culture, Obama said.

Moreover, these media institutions, with established journalistic best practices for accuracy and accountability, also provided people with similar information: “When it came to the news, at least, citizens across the political spectrum tended to operate using a shared set of facts – what they saw or what they heard from Walter Cronkite or David Brinkley.”

Fast forward to today, where everyone has access to individualized news feeds that are fed by algorithms that reward the loudest and angriest voices (and which technology companies profit from). “You have the sheer proliferation of content, and the splintering of information and audiences,” Obama observed. “That’s made democracy more complicated.”

Facts are competing with opinions, conspiracy theories, and fiction. “For more and more of us, search and social media platforms aren’t just our window into the internet. They serve as our primary source of news and information,” Obama said. “No one tells us that the window is blurred, subject to unseen distortions, and subtle manipulations.”

The splintering of news sources has also made all of us more prone to what psychologists call “confirmation bias,” Obama said. “Inside our personal information bubbles, our assumptions, our blind spots, our prejudices aren’t challenged, they are reinforced and naturally, we’re more likely to react negatively to those consuming different facts and opinions – all of which deepens existing racial and religious and cultural divides.”

But these problems are not simply the result of our brains failing to keep up with the growing amount of information online, Obama argued. “They’re also the result of very specific choices made by the companies that have come to dominate the internet generally, and social media platforms in particular.”

The former president also made clear that he did not think technology was to blame for many of our social ills. Racism, sexism, and misogyny all predate the internet, but technology has helped amplify them.

“Solving the disinformation problem won’t cure all that ails our democracies or tears at the fabric of our world, but it can help tamp down divisions and let us rebuild the trust and solidarity needed to make our democracy stronger,” Obama said.

He gave examples of how social media has fueled violence and extremism around the world, noting that leaders of countries from Russia and China to Hungary, the Philippines, and Brazil have harnessed social media platforms to manipulate their populations. “Autocrats like Putin have used these platforms as a strategic weapon against democratic countries that they consider a threat,” Obama said.

He also called out emerging technologies such as AI for their potential to sow further discord online. “I’ve already seen demonstrations of deepfake technology that show what looks like me on a screen, saying stuff I did not say. It’s a strange experience, people,” Obama said. “Without some standards, implications of this technology – for our elections, for our legal system, for our democracy, for rules of evidence, for our entire social order – are frightening and profound.”

‘Regulation Has to Be Part of the Answer’

Obama discussed potential solutions for addressing some of the problems he viewed as contributing to a backsliding of democracy in the second half of his talk.

In an apt metaphor for a speech delivered in Silicon Valley, Obama compared the U.S. Constitution to software for running society. It had “a really innovative design,” Obama said, but also significant bugs. “Slavery. You can discriminate against entire classes of people. Women couldn’t vote. Even white men without property couldn’t vote, couldn’t participate, weren’t part of ‘We the People.’”

The amendments to the Constitution were akin to software patches, the former president said, that allowed us to “continue to perfect our union.”

Similarly, governments and technology companies should be willing to introduce changes aimed at improving civil discourse online and reducing the amount of disinformation on the internet, Obama said.

“The internet is a tool. Social media is a tool. At the end of the day, tools don’t control us. We control them. And we can remake them. It’s up to each of us to decide what we value and then use the tools we’ve been given to advance those values,” he said.

The former president put forth various solutions for combating online disinformation, including regulation, which many tech companies fiercely oppose.

“Here in the United States, we have a long history of regulating new technologies in the name of public safety, from cars and airplanes to prescription drugs to appliances,” Obama said. “And while companies initially always complain that the rules are going to stifle innovation and destroy the industry, the truth is that a good regulatory environment usually ends up spurring innovation, because it raises the bar on safety and quality. And it turns out that innovation can meet that higher bar.”

In particular, Obama urged policymakers to rethink Section 230, enacted as part of the United States Communications Decency Act in 1996, which stipulates that, generally, online platforms cannot be held liable for content that other people post on their websites.

But technology has changed dramatically over the past two decades since Section 230 was enacted, Obama said. “These platforms are not like the old phone company.”

He added: “In some cases, industry standards may replace or substitute for regulation, but regulation has to be part of the answer.”

Obama also urged technology companies to be more transparent in how they operate; "at minimum," he said, they should share with researchers and regulators how some of their products and services are designed so there is some accountability.

The responsibility also lies with ordinary citizens, the former president said. “We have to take it upon ourselves to become better consumers of news – looking at sources, thinking before we share, and teaching our kids to become critical thinkers who know how to evaluate sources and separate opinion from fact.”

Obama warned that if the U.S. does not act on these issues, it risks being eclipsed in this arena by other countries. “As the world’s leading democracy, we have to set a better example. We should be able to lead on these discussions internationally, not [be] in the rear. Right now, Europe is forging ahead with some of the most sweeping legislation in years to regulate the abuses that are seen in big tech companies,” Obama said. “Their approach may not be exactly right for the United States, but it points to the need for us to coordinate with other democracies. We need to find our voice in this global conversation.”


Transcript of President Obama's Keynote



At a conference hosted by the Cyber Policy Center and Obama Foundation, former U.S. President Barack Obama delivered the keynote address about how information is created and consumed, and the threat that disinformation poses to democracy.

Visiting Scholar, Global Digital Policy Incubator

Charles served as an elected member of the Legislative Council of the Hong Kong Special Administrative Region, representing the Information Technology functional constituency, for two terms from 2012 to 2020. He served alternately as chair and vice chair of the Information Technology and Broadcasting Panel from 2016 to 2020. As a lawmaker, Charles championed policies and legislation on privacy, open data, freedom of expression and information, cybersecurity, innovation, fintech, electronic health records, and human rights and democracy. After leaving the legislature, he founded Tech for Good Asia, a regional initiative bringing together businesses and civil society to harness the positive power of digital technologies. Before entering the legislature, he co-founded HKNet in 1994, one of the earliest Internet service providers in Hong Kong and Asia. He was the founding chair of the Internet Society Hong Kong, honorary president and former president of the Hong Kong Information Technology Federation, former chair of the Hong Kong Internet Service Providers Association, former chair of the Asian, Australasian and Pacific Islands Regional At-Large Organization (APRALO) of ICANN, and a founding member of the Hong Kong Human Rights Monitor. Charles began his career in technology with Digital Equipment Corporation and then Sun Microsystems, Inc. in the U.S. He holds a BS in Computer and Electrical Engineering and an MS in Electrical Engineering from Purdue University.


APARC Predoctoral Fellow, 2021-2022
Stanford Internet Observatory Postdoctoral Fellow, 2022-2023

Tongtong Zhang joins the Walter H. Shorenstein Asia-Pacific Research Center (APARC) as APARC Predoctoral Fellow for the 2021-2022 academic year. She is a Ph.D. candidate in the Department of Political Science at Stanford University. Her research focuses on authoritarian deliberation and responsiveness in China.

Digital Activism and Authoritarian Adaptation in the Middle East: Agenda

Panel 1: Digital Activism

Tuesday, May 25, 2021 | 9-10:30 am PT

Opening Remarks: Marc Lynch, Eileen Donahoe, and Larry Diamond

Moderator: Hesham Sallam

  • Wafa Ben-Hassine: “The Hyper-Aware and Not-So-Aware: What's Next for the MENA Region's Activists and Society at Large Vis-a-Vis the Internet?”
  • Adel Iskander: “Re(Membering) Culture and Heritage: Egypt's Latest Political Turf War”
  • Zachary Steinert-Threlkeld: “Civilian Behavior on Social Media During Civil War”
  • Joshua Tucker: “Beyond Liberation Technology? The Recent Uses of Social Media by Pro-Democracy Activists”


Panel 2: Authoritarian Abuses of Internet Technologies

Thursday, May 27, 2021 | 9-10:30 am PT

Moderator: Marc Lynch

  • Marwa Fatafta: “Transnational or Cross-Border Digital Repression in the MENA Region”
  • Andrew Leber: “Social Media Manipulation in the MENA: Inauthenticity, Inequality, and Insecurity” (Co-authored paper with Alexei Abrahams)
  • Marc Owen Jones: “Tracking Adversaries: The Evolution of Manipulation Tactics on Gulf Twitter”
  • Xiao Qiang: “Chinese Digital Authoritarianism and Its Global Impact”


Panel 3: Government Reshaping of Norms and Practices to Constrain Online Activity

Tuesday, June 1, 2021 | 9-10:30 am PT

Moderator: Eileen Donahoe

  • Ahmed Shaheed: “Binary Threat: How State Cyber Policy and Practice Undermines Human Rights in the Middle East and North Africa Region”
  • Mona Elswah and Mahsa Alimardani: “The Hurdles Involved in Content Moderation in the MENA Region”
  • Mohamed Najem: “The Role of the Gulf in Governing Digital Space in the Arab Region”
  • James Shires: “The Techno-Regulation of Critical Communications Infrastructures and Their Political Potential in the Gulf”
  • Alexei Abrahams: “The Web (In)Security of Middle Eastern Civil Society and Media”


Panel 4: Cross-Border Information Operations

Thursday, June 3, 2021 | 9-10:30 am PT

Moderator: Larry Diamond

  • Alexandra Siegel: “Official Foreign Influence Operations: Transnational State Media in the Arab Online Sphere”
  • Hamit Akin Unver: “Russian Disinformation Operations in Turkey: 2015-2020”
  • Shelby Grossman and Renee DiResta: “In-House vs. Outsourced Trolls: How Digital Mercenaries Shape State Influence Strategies”
  • Nathaniel Gleicher: “Covert Manipulation, Overt Influence, Direct Exploit: Understanding and Countering Influence Operations in the Middle East and Beyond”