Platforms v. Supreme Court with Daphne Keller (part 1)
Daphne Keller spoke with the Initiative for Digital Public Infrastructure at the University of Massachusetts Amherst about two potentially major cases currently before the Supreme Court.


Picture this: you are sitting in the kitchen of your home enjoying a drink. As you sip, you scroll through your phone, looking at the news of the day. You text a link to a news article critiquing your government’s stance on the press to a friend who works in media. Your sibling sends you a message on an encrypted service updating you on the details of their upcoming travel plans. You set a reminder on your calendar about a doctor’s appointment, then open your banking app to make sure the payment for this month’s rent was processed.

Everything about this scene is personal. Nothing about it is private.

Without your knowledge or consent, your phone has been infected with spyware. This technology makes it possible for someone to silently watch you and take careful notes on who you are, who you know, and what you're doing. They see your files, have your contacts, and know the exact route you took home from work on any given day. They can even turn on your phone's microphone and listen to the conversations you're having in the room.

This is not some hypothetical, Orwellian drama, but a reality for thousands of people around the world. This kind of technology — once a capability only of the most technologically advanced governments — is now commercially available and for sale from numerous private companies who are known to sell it to state agencies and private actors alike. This total loss of privacy should worry everyone, but for human rights activists and journalists challenging authoritarian powers, it has become a matter of life and death. 

The companies that develop and sell this technology are, at best, only passively accountable to governments, and at worst have their tacit support. And it is this lack of regulation that Marietje Schaake, the International Policy Director at the Cyber Policy Center and International Policy Fellow at Stanford HAI, is trying to change.
 

Amsterdam and Tehran: A Tale of Two Elections


Schaake did not begin her professional career with the intention of becoming Europe’s “most wired politician,” as she has frequently been dubbed by the press. In many ways, her step into politics came as something of a surprise, albeit a pleasant one.
 
“I've always been very interested in public service and trying to improve society and the lives of others, but I ran not expecting at all that I would actually get elected,” Schaake confesses.

As a candidate on the 2008 ticket for the Democrats 66 (D66) political party of the Netherlands, Schaake saw herself as someone who could help move the party's campaign forward, but not as a serious contender in the open party election system. But her party performed exceptionally well, and at the age of 30, Schaake landed in the third position of a 30-person list vying to fill the 25 seats available for representatives from all political parties in the Netherlands. Having taken a top spot among a field of hundreds of candidates, she found herself on her way to becoming a Member of the European Parliament (MEP).

Marietje Schaake participates in a panel on human rights and communication technologies as a member of the European Parliament in April 2012. | Alberto Novi, Flickr

In 2009, world events collided with Schaake's position as a newly seated MEP. While the democratic elections in the EU proceeded without incident, 3,000 miles away in Iran a very different story was unfolding. Following the re-election of Mahmoud Ahmadinejad to a second term as Iran's president, supporters of former prime minister Mir-Hossein Mousavi, the leading candidate opposing Ahmadinejad, immediately raised allegations of fraud and vote tampering. The protests that followed quickly morphed into the Green Movement, one of the largest sustained protest movements in Iran since the 1979 Revolution, unmatched until the protests over the death of Mahsa Amini began in September 2022.
 
With the protests came an increased wave of state violence against the demonstrators. While repression and intimidation are nothing new to autocratic regimes, in 2009 the proliferation of cell phones in the hands of an increasingly digitally connected population allowed citizens to document human rights abuses firsthand and beam the evidence directly from the streets of Tehran to the rest of the world in real-time.
 
As more and more footage poured in from the situation on the ground, Schaake, with a pre-politics background in human rights and a specific interest in civil rights, took up the case of the Green Movement as one of her first major issues in the European Parliament. She was appointed spokesperson on Iran for her political group. 

Marietje Schaake [second from left] alongside her colleagues from the European Parliament during a press conference on universal human rights in 2010. | Alberto Novi, Flickr

The Best of Tech and the Worst of Tech


But the more Schaake learned, the clearer it became that Iranian citizens were not the only ones using technology to stay informed about the protests. Meeting with human rights defenders who had escaped from Iran to eastern Turkey, Schaake heard anecdote after anecdote about how the Islamic Republic's authorities were using tech to surveil, track, and censor dissenting opinions.
 
Investigations indicated that they were utilizing a technique referred to then as “deep packet inspection,” a system which allows the controller of a communications network to read and block information from going through, alter communications, and collect data about specific individuals. What was more, journalists revealed that many of the systems such regimes were using to perform this type of surveillance had been bought from, and were serviced by, Western companies.
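The mechanics can be sketched in a few lines. This is a deliberately simplified toy, not a reconstruction of any vendor's product: real DPI appliances parse binary protocols at line speed, and the keyword list and addresses here are invented for illustration.

```python
# Toy illustration of deep packet inspection (DPI). Ordinary routing looks
# only at address headers; DPI reads the packet *payload* and can block
# traffic and log who sent it, based on its content.

BLOCKED_TERMS = {b"protest", b"mousavi"}  # hypothetical censored keywords

def inspect_packet(payload: bytes, src_ip: str):
    """Return (verdict, log_entry) for a single packet."""
    lowered = payload.lower()
    for term in BLOCKED_TERMS:
        if term in lowered:
            # Block the packet AND record the sender -- the surveillance step.
            return "BLOCK", {"src": src_ip, "matched": term.decode()}
    return "ALLOW", None

verdict, log = inspect_packet(b"Join the protest at noon", "10.0.0.7")
```

The same payload access that enables blocking is what enables the data collection and alteration described above; the network operator sits between both endpoints and sees everything unencrypted traffic contains.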
 
For Schaake, this revelation was a turning point of her focus as a politician and the beginning of her journey into the realm of cyber policy and tech regulation.
 
“On the one hand, we were sharing statements urging to respect the human rights of the demonstrators. And then it turned out that European companies were the ones selling this monitoring equipment to the Iranian regime. It became immediately clear to me that if technology was to play a role in enhancing human rights and democracy, we couldn’t simply trust the market to make it so; we needed to have rules,” Schaake explained.

We have to have a line in the sand and a limit to the use of this technology. It’s extremely important, because this is proliferating not only to governments, but also to non-state actors.
Marietje Schaake
International Policy Director at the Cyber Policy Center

The Transatlantic Divide


But who writes the rules? When it comes to tech regulation, there is longstanding unease between the private and public sectors, and a divergence in approach between the eastern and western shores of the Atlantic. In general, EU member countries favor oversight of the technology sector and have supported legislation like the General Data Protection Regulation (GDPR) and the Digital Services Act to protect user privacy and digital human rights. On the other hand, major tech companies, many of them based in North America, favor the doctrine of self-regulation and frequently cite claims to intellectual property or broadly defined protections such as Section 230 as justification for keeping government oversight at arm's length. Efforts by governing bodies like the European Union to legislate privacy and transparency requirements are often met with raised hackles.
 
It’s a feeling Schaake has encountered many times in her work. “When you talk to companies in Silicon Valley, they make it sound as if Europeans are after them and that these regulations are weapons meant to punish them,” she says.
 
But the need to place checks on those with power is rooted in history, not histrionics, says Schaake. Memories of living under the eye of surveillance states such as the Soviet Union and East Germany are still fresh in many Europeans' minds. The drive to protect privacy is as much about keeping the government in check as it is about reining in the outsized influence and power of private technology companies, Schaake asserts.
 

Big Brother Is Watching


In the last few years, the momentum has begun to shift. 
 
In 2021, a joint reporting effort by The Guardian, The Washington Post, Le Monde, Proceso, and over 80 journalists at a dozen additional news outlets worked in partnership with Amnesty International and Forbidden Stories to publish the Pegasus Project, a detailed report showing that spyware from the private company NSO Group had been used to target, track, and retaliate against tens of thousands of journalists, activists, civil rights leaders, and even prominent politicians around the world.
 
This type of surveillance has evolved quickly beyond the network monitoring undertaken by regimes like Iran's in the 2000s, and taps into the most personal details of an individual's device, data, and communications. In the absence of widespread regulation, companies like NSO Group have been able to develop commercial products with capabilities as sophisticated as those of state intelligence agencies. In many cases, "zero-click" infections are now possible, meaning spyware can be installed on a targeted device without the user ever knowing, or even suspecting, that they have become a victim of covert surveillance.

Marietje Schaake at the 2023 Summit for Democracy with Neal Mohan, CEO of YouTube; John Scott-Railton, Senior Researcher at Citizen Lab; Avril Haines, U.S. Director of National Intelligence; and Alejandro Mayorkas, U.S. Secretary of Homeland Security. | U.S. Department of State

“If we were to create a spectrum of harmful technologies, spyware could easily take the top position,” said Schaake, speaking as the moderator of a panel on “Countering the Misuse of Technology and the Rise of Digital Authoritarianism” at the 2023 Summit for Democracy co-hosted by U.S. President Joe Biden alongside the governments of Costa Rica, the Netherlands, Republic of Korea, and Republic of Zambia.
 
Revelations like those of the Pegasus Project have helped spur what Schaake believes is long-overdue action from the United States on regulating this sector of the tech world. On March 27, 2023, President Biden signed an executive order prohibiting the operational use of commercial spyware products by the United States government. It is the first time such an action has been formally taken in Washington.
 
For Schaake, the order is a "fantastic first step," but she also cautions that there is still much more to be done. Biden's executive order does not limit the use of spyware developed by governments themselves, nor its use by private individuals who can get their hands on these tools.

Human Rights vs. National Security


One of Schaake’s main concerns is the potential for governmental overreach in the pursuit of curtailing the influence of private companies.
 
Schaake explains, “What's interesting is that while the motivation in Europe for this kind of regulation is very much anchored in fundamental rights, in the U.S., what typically moves the needle is a call to national security, or concern for China.”
 
It is important to stay vigilant about how national security can become a justification for curtailing civil liberties. Writing for the Financial Times, Schaake elaborated on the potential conflict of interest the government has in regulating tech more rigorously:
 
“The U.S. government is right to regulate technology companies. But the proposed measures, devised through the prism of national security policy, must also pass the democracy test. After 9/11, the obsession with national security led to warrantless wiretapping and mass data collection. I back moves to curb the outsized power of technology firms large and small. But government power must not be abused.”
 
While Schaake hopes well-established democracies will do more to lead by example, she also acknowledges that the political will to actually step up to do so is often lacking. In principle, countries rooted in the rule of law and the principles of human rights decry the use of surveillance technology beyond their own borders. But in practice, these same governments are also sometimes customers of the surveillance industrial complex. 

It’s up to us to guarantee the upsides of technology and limit its downsides. That’s how we are going to best serve our democracy in this moment.
Marietje Schaake
International Policy Director at the Cyber Policy Center

Schaake has been trying to make that disparity an impossible needle for nations to thread. For over a decade, she has called for an end to the surveillance industry and has worked on developing export control rules for the sale of surveillance technology from Europe to other parts of the world. But while these measures make it harder for non-democratic regimes to purchase these products from the West, the legislation is still limited in its ability to keep European and Western nations from importing spyware systems like Pegasus back into the West. And for as long as that reality remains, it undermines the credibility of the EU and the West as a whole, says Schaake.
 
Speaking at the 2023 Summit for Democracy, Schaake urged policymakers to keep the bigger picture in mind when it comes to the risks of unaccountable, ungoverned spyware industries. “We have to have a line in the sand and a limit to the use of this technology. It’s extremely important, because this is proliferating not only to governments, but also to non-state actors. This is not the world we want to live in.”

 

Building Momentum for the Future


Drawing those lines in the sand is crucial not just for the immediate safety and protection of individuals who have been targeted with spyware, but also for addressing technology's other harms to the long-term health of democracy.

“The narrative that technology is helping people's democratic rights, or access to information, or free speech has been oversold, whereas the need to actually ensure that democratic principles govern technology companies has been underdeveloped,” Schaake argues.

While no longer an active politician, Schaake has not slowed her pace in raising awareness and lending her expertise to policymakers trying to thread the digital needle on tech regulation. Working at the Cyber Policy Center at the Freeman Spogli Institute for International Studies (FSI), Schaake has been able to combine her experience in European politics with her academic work in the United States against the backdrop of Silicon Valley, the home base for many of the world's leading technology companies and executives.
 
Though now half a globe away from the European Parliament, Schaake’s original motivations to improve society and people’s lives have not dimmed.

Though no longer working in government, Schaake, seen here at a conference on regulating Big Tech hosted by Stanford's Institute for Human-Centered Artificial Intelligence (HAI), continues to research and advocate for better regulation of technology industries. | Midori Yoshimura

“It’s up to us to guarantee the upsides of technology and limit its downsides. That’s how we are going to best serve our democracy in this moment,” she says.
 
Schaake is clear-eyed about the hurdles still ahead on the road to meaningful legislation about tech transparency and human rights in digital spaces. With a highly partisan Congress in the United States and other issues like the war in Ukraine and concerns over China taking center stage, it will take time and effort to build a critical mass of political will to tackle these issues. But Biden’s executive order and the discussion of issues like digital authoritarianism at the Summit for Democracy also give Schaake hope that progress can be made.
 
“The bad news is we're not there yet. The good news is there's a lot of momentum for positive change and improvement, and I feel like people are beginning to understand how much it is needed.”
 
And for anyone ready to jump into the fray and make an impact, Schaake adds a standing invitation: “I’m always happy to grab a coffee and chat. Let’s talk!”



The complete recording of "Countering the Misuse of Technology and the Rise of Digital Authoritarianism," the panel Marietje Schaake moderated at the 2023 Summit for Democracy, is available below.


Marietje Schaake discusses the misuse of technology and the rise of digital authoritarianism with YouTube CEO Neal Mohan at the 2023 Summit for Democracy co-hosted by President Biden and the governments of Costa Rica, the Netherlands, Republic of Korea, and Republic of Zambia.
U.S. Department of State

A transatlantic background and a decade of experience as a lawmaker in the European Parliament have given Marietje Schaake a unique perspective as a researcher investigating the harms technology is causing to democracy and human rights.

-

Join the Cyber Policy Center and moderator Daniel Bateyko in conversation with Karen Nershi for How Strong Are International Standards in Practice? Evidence from Cryptocurrency Transactions.

The rise of cryptocurrency (decentralized digital currency) presents challenges for state regulators given its connection to illegal activity and pseudonymous nature, which has allowed both individuals and businesses to circumvent national laws through regulatory arbitrage. Karen Nershi assesses the degree to which states have managed to regulate cryptocurrency exchanges, providing a detailed study of international efforts to impose common regulatory standards on a new technology. To do so, she introduces a dataset of cryptocurrency transactions collected during a two-month period in 2020 from exchanges in countries around the world and employs bunching estimation to compare levels of unusual activity below a threshold at which exchanges must screen customers for money laundering risk. She finds that exchanges in some, but not all, countries show substantial unusual activity below the threshold; these findings suggest that while countries have made progress toward regulating cryptocurrency exchanges, gaps in enforcement across countries allow for regulatory arbitrage.
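The intuition behind bunching estimation can be sketched with toy data. The threshold, bin width, and transaction amounts below are hypothetical, not figures from Nershi's dataset: the idea is simply that if screening kicks in at a known threshold, evasion shows up as excess transactions in the bin just beneath it, relative to a counterfactual built from neighboring bins.

```python
# Sketch of bunching estimation: compare the count of transactions in the
# histogram bin just below a screening threshold against a counterfactual
# count estimated from surrounding bins.
from collections import Counter

THRESHOLD = 1000  # hypothetical screening threshold, in dollars
BIN = 50          # histogram bin width

def excess_mass(amounts, threshold=THRESHOLD, bin_width=BIN, k=4):
    """Count in the bin just below the threshold, minus a counterfactual
    averaged from the k bins on either side of that bin."""
    bins = Counter((a // bin_width) * bin_width for a in amounts)
    just_below = bins[threshold - bin_width]
    neighbors = [bins[threshold - bin_width - i * bin_width] for i in range(2, k + 2)]
    neighbors += [bins[threshold + i * bin_width] for i in range(1, k + 1)]
    counterfactual = sum(neighbors) / len(neighbors)
    return just_below - counterfactual  # positive => unusual bunching

# Synthetic example: a flat background of 10 transactions per bin, plus 30
# extra transactions parked at $960 to dodge screening at $1,000.
amounts = [b for b in range(700, 1301, 50) for _ in range(10)] + [960] * 30
```

On this synthetic sample the function returns 30.0: the 30 evasive transactions stand out against the flat background, which is the signature the paper's estimator looks for in real exchange data.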

This session is part of the Fall Seminar Series, a months-long series designed to bring researchers, policy makers, scholars and industry professionals together to share research, findings and trends in the cyber policy space. Both in-person (Stanford-affiliation required) and virtual attendance (open to the public) is available; registration is required.

Karen Nershi is a Postdoctoral Fellow at Stanford University's Stanford Internet Observatory and the Center for International Security and Cooperation (CISAC). In the summer of 2021, she completed her Ph.D. in political science at the University of Pennsylvania specializing in the fields of international relations and comparative politics. Through an empirical lens, her research examines questions of international cooperation and regulation within international political economy, including challenges emerging from the adoption of decentralized digital currency and other new technologies. 

Specific topics Dr. Nershi explores in her research include ransomware, cross-national regulation of the cryptocurrency sector, and international cooperation around anti-money laundering enforcement. Her research has been supported by the University of Pennsylvania GAPSA Provost Fellowship for Innovation and the Christopher H. Browne Center for International Politics. 

Before beginning her doctorate, Karen Nershi earned a B.A. in International Studies with honors at the University of Alabama. She lived and studied Arabic in Amman, Jordan and Meknes, Morocco as a Foreign Language and Area Studies Fellow and a Critical Language Scholarship recipient. She also lived and studied in Mannheim, Germany, in addition to interning at the U.S. Consulate General Frankfurt (Frankfurt, Germany).

Dan Bateyko is the Special Projects Manager at the Stanford Internet Observatory.

Dan worked previously as a Research Coordinator for The Center on Privacy & Technology at Georgetown Law, where he investigated Immigration and Customs Enforcement surveillance practices, co-authoring American Dragnet: Data-Driven Deportation in the 21st Century. He has worked at the Berkman Klein Center for Internet & Society, the Dangerous Speech Project, and as a research assistant for Amanda Levendowski, whom he assisted with legal scholarship on facial surveillance.

In 2016, he received a Thomas J. Watson Fellowship. He spent his fellowship year talking with people about digital surveillance and Internet infrastructure in South Korea, China, Malaysia, Germany, Ghana, Russia, and Iceland. His writing has appeared in Georgetown Tech Law Review, Columbia Journalism Review, Dazed Magazine, The Internet Health Report, Council on Foreign Relations' Net Politics, and Global Voices. He is a 2022 Internet Law & Policy Foundry Fellow.

Dan received his Master's in Law & Technology from Georgetown University Law Center (where he received the IAPP Westin Scholar Book Award for excellence in Privacy Law), and his B.A. from Middlebury College.

Karen Nershi
Seminars
-

Join the Program on Democracy and the Internet (PDI) and moderator Alex Stamos in conversation with Ronald E. Robertson for Engagement Outweighs Exposure to Partisan and Unreliable News within Google Search.

This session is part of the Fall Seminar Series, a months-long series designed to bring researchers, policy makers, scholars and industry professionals together to share research, findings and trends in the cyber policy space. Both in-person (Stanford-affiliation required) and virtual attendance (open to the public) is available; registration is required.

If popular online platforms systematically expose their users to partisan and unreliable news, they could potentially contribute to societal issues like rising political polarization. This concern is central to the echo chamber and filter bubble debates, which critique the roles that user choice and algorithmic curation play in guiding users to different online information sources. These roles can be measured in terms of exposure, the URLs seen while using an online platform, and engagement, the URLs selected while on that platform or browsing the web more generally. However, due to the challenges of obtaining ecologically valid exposure data--what real users saw during their regular platform use--studies in this vein often only examine engagement data, or estimate exposure via simulated behavior or inference. Despite their centrality to the contemporary information ecosystem, few such studies have focused on web search, and even fewer have examined both exposure and engagement on any platform. To address these gaps, we conducted a two-wave study pairing surveys with ecologically valid measures of exposure and engagement on Google Search during the 2018 and 2020 US elections. We found that participants' partisan identification had a small and inconsistent relationship with the amount of partisan and unreliable news they were exposed to on Google Search, a more consistent relationship with the search results they chose to follow, and the most consistent relationship with their overall engagement. That is, compared to the news sources our participants were exposed to on Google Search, we found more identity-congruent and unreliable news sources in their engagement choices, both within Google Search and overall. These results suggest that exposure and engagement with partisan or unreliable news on Google Search are not primarily driven by algorithmic curation, but by users' own choices.

Dr. Ronald E Robertson received his Ph.D. in Network Science from Northeastern University in 2021. He was advised by Christo Wilson, a computer scientist, and David Lazer, a political scientist. For his research, Dr. Robertson uses computational tools, behavioral experiments, and qualitative user studies to measure user activity, algorithmic personalization, and choice architecture in online platforms. By rooting his questions in findings and frameworks from the social, behavioral, and network sciences, his goal is to foster a deeper and more widespread understanding of how humans and algorithms interact in digital spaces. Prior to Northeastern, Dr. Robertson obtained a BA in Psychology from the University of California San Diego and worked with research psychologist Robert Epstein at the American Institute for Behavioral Research and Technology.

Alex Stamos
Seminars
-

Join the Program on Democracy and the Internet (PDI) and moderator Andrew Grotto in conversation with L. Jean Camp for Create a Market for Safe, Secure Software.

This session is part of the Fall Seminar Series, a months-long series designed to bring researchers, policy makers, scholars and industry professionals together to share research, findings and trends in the cyber policy space. Both in-person (Stanford-affiliation required) and virtual attendance (open to the public) is available; registration is required.

Today the security market, particularly in embedded software and Internet of Things (IoT) devices, is a lemons market: buyers simply cannot distinguish between secure and insecure products. To enable the market for secure, high-quality products to thrive, buyers need to have some knowledge of the contents of these digital products. Once purchased, ensuring a product or software package remains safe requires knowing whether it includes publicly disclosed vulnerabilities. Again, this requires knowledge of the contents. When consumers do not know the contents of their digital products, they cannot know if they are at risk and need to take action.

The Software Bill of Materials (SBOM) is a proposal that was identified as a critical instrument for meeting these challenges and securing software supply chains in the Biden Administration's Executive Order on Improving the Nation's Cybersecurity (EO 14028). In this presentation Camp will introduce SBOMs, provide examples, and explain the components that are needed in the marketplace for this initiative to meet its potential.
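A rough sketch of the check an SBOM enables: a machine-readable inventory of a product's components can be matched against a feed of publicly disclosed vulnerabilities. The structure below is loosely modeled on the CycloneDX format, and the component versions and advisory feed are entirely hypothetical.

```python
# An SBOM is a machine-readable parts list for software. Given one, a buyer
# can check each component against published vulnerability advisories.

sbom = {
    "bomFormat": "CycloneDX",  # structure loosely modeled on CycloneDX
    "components": [
        {"name": "openssl", "version": "1.0.2k"},
        {"name": "zlib", "version": "1.2.13"},
    ],
}

# Hypothetical advisory feed: (package, affected version) -> advisory id
known_vulns = {("openssl", "1.0.2k"): "CVE-0000-0001"}

def audit(sbom, feed):
    """Return (component, advisory) pairs a buyer should act on."""
    return [
        (c["name"], feed[(c["name"], c["version"])])
        for c in sbom["components"]
        if (c["name"], c["version"]) in feed
    ]

findings = audit(sbom, known_vulns)
```

Without the parts list, this check is impossible, which is the information asymmetry that makes the current market a lemons market.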

Jean Camp is a Professor at Indiana University with appointments in Informatics and Computer Science.  She is a Fellow of the AAAS (2017), the IEEE (2018), and the ACM (2021).  She joined Indiana after eight years at Harvard’s Kennedy School. A year after earning her doctorate from Carnegie Mellon she served as a Senior Member of the Technical Staff at Sandia National Laboratories. She began her career as an engineer at Catawba Nuclear Station after a double major in electrical engineering and mathematics, followed by a MSEE in optoelectronics at University of North Carolina at Charlotte.

L. Jean Camp Professor at Indiana University
Seminars
-

Join the Program on Democracy and the Internet (PDI) and moderator Daphne Keller in conversation with Aleksandra Kuczerawy for European Developments in Internet Regulation.

This session is part of the Fall Seminar Series, a months-long series designed to bring researchers, policy makers, scholars and industry professionals together to share research, findings and trends in the cyber policy space. Both in-person (Stanford-affiliation required) and virtual attendance (open to the public) is available; registration is required.

The Digital Services Act is a new landmark European Union law addressing illegal and harmful content online. Its main goals are to create a safer digital space and to enhance the protection of fundamental rights online. In this talk, Aleksandra Kuczerawy will discuss the core elements of the DSA, such as its layered system of due diligence obligations, content moderation rules, and enforcement framework, while providing the underlying policy context for a US audience.

Aleksandra Kuczerawy is a postdoctoral scholar at the Program on Platform Regulation and has been a postdoctoral researcher at KU Leuven's Centre for IT & IP Law and is assistant editor of the International Encyclopedia of Law (IEL) – Cyber Law. She has worked on the topics of privacy and data protection, media law, and the liability of Internet intermediaries since 2010 (projects PrimeLife, Experimedia, REVEAL). In 2017 she participated in the works of the Committee of experts on Internet Intermediaries (MSI-NET) at the Council of Europe, responsible for drafting a recommendation by the Committee of Ministers on the roles and responsibilities of internet intermediaries and a study on Algorithms and Human Rights.

Daphne Keller
Aleksandra Kuczerawy Postdoctoral Scholar at the Program on Platform Regulation (PPR)
Seminars
-

Please note: the event is now sold out, though a waitlist is available through the registration link above.

The Transatlantic Summit is where the worlds of cutting-edge research, industry, and policy come together to find answers on geopolitics, digital platforms and emerging tech as well as digital sovereignty. Whether you're an industry leader, policy maker, or student - join the start of a new Transatlantic movement seeking synergies between technology and society and become part of the international conversation going forward.

About:

  • Creates a vibrant forum for dialogue between the US and Europe in Silicon Valley about the impact of digital technologies on business and society
  • Builds a strong network for German-American collaboration in digital innovation, business, and geopolitics
  • Excites, connects, and inspires: participants meet the movers and shakers of the digital future from business, academia, and politics


Topics:

  1. Digital Sovereignty
  2. Geopolitics of Emerging Technologies
  3. Digital Platforms and Misinformation


The conference, jointly organized by the German Federal Foreign Office, the Representatives of German Business (GAAC West), the German Consulate General of San Francisco, the Stanford German Student Association, and the Program on Geopolitics, Technology, and Governance at the Stanford Cyber Policy Center, addresses current discussions about digital technologies, business, and society. Join us and be inspired by our series of speakers and networking sessions, which bring together leaders, politicians, students, and changemakers.

Digital Sovereignty and Multilateral Collaboration

Digital sovereignty vs. cooperation: What should the future of the transatlantic partnership on digital policies look like, and how do we reach it?

Technology increasingly sits at the epicenter of geopolitics. In recent years, the notion of technological or digital sovereignty has emerged in Europe as a means of promoting European leadership and strategic autonomy in the digital field. On the other side of the Atlantic, the United States finds itself in an increasingly fierce race with China for global technology dominance. Against this backdrop, cooperation between the European Union and the United States may be more critical than ever. This raises important questions: What does Europe's move toward digital sovereignty and self-determination mean for the transatlantic partnership? And how should the US and EU balance sovereignty and cooperation in digital and technology policy? Our panel will explore the tensions between sovereignty and cooperation, and what the future of transatlantic policy may look like on issues from data protection to semiconductors, in light of the rising technological influence and ambitions of China.

John Zysman, Professor Emeritus, UC Berkeley
Maryam Cope, Head of Government Affairs, ASML U.S.
Hannah Bracken, Policy Advisor, Privacy Shield, U.S. Department of Commerce
Adriana Groh, Co-Founder, Sovereign Tech Fund

Agenda & Speakers

Transatlantic Summit: Sovereignty vs. Cooperation in the Digital Era
Thursday, Nov. 17th, 2022, 9:00am – 6:00pm PT
Vidalakis Dining Hall, Schwab Residential Center Stanford, CA 94305

FULL AGENDA
Download pdf
SPEAKER BIOS
Download pdf
Conferences
-

Join the Program on Democracy and the Internet (PDI) and moderator Nate Persily, in conversation with Aleksandra Kuczerawy for European Developments in Internet Regulation.

This session is part of the Fall Seminar Series, a months-long series designed to bring researchers, policy makers, scholars, and industry professionals together to share research, findings, and trends in the cyber policy space. Both in-person (Stanford affiliation only) and virtual attendance (open to the public) are available; registration is required.

Aleksandra Kuczerawy is a postdoctoral scholar at the Program on Platform Regulation. She has been a postdoctoral researcher at KU Leuven’s Centre for IT & IP Law and is assistant editor of the International Encyclopedia of Law (IEL) – Cyber Law. She has worked on privacy and data protection, media law, and the liability of Internet intermediaries since 2010 (projects PrimeLife, Experimedia, REVEAL). In 2017 she participated in the work of the Committee of Experts on Internet Intermediaries (MSI-NET) at the Council of Europe, which was responsible for drafting a recommendation by the Committee of Ministers on the roles and responsibilities of internet intermediaries and a study on Algorithms and Human Rights.

Aleksandra Kuczerawy Postdoctoral Scholar at the Program on Platform Regulation (PPR)
Seminars

Join the Program on Democracy and the Internet (PDI) and moderator Nate Persily, in conversation with Chenyan Jia for The Evolving Role of AI In Political News Consumption: The Effects of Algorithmic vs. Community Label on Perceived Accuracy of Hyper-partisan Misinformation.

This session is part of the Fall Seminar Series, a months-long series designed to bring researchers, policy makers, scholars, and industry professionals together to share research, findings, and trends in the cyber policy space. Both in-person (Stanford affiliation only) and virtual attendance (open to the public) are available; registration is required.

Chenyan Jia (Ph.D., The University of Texas at Austin) is a postdoctoral scholar in the Program on Democracy and the Internet (PDI) at Stanford University. In fall 2023, she will join Northeastern University as an Assistant Professor in the School of Journalism in the College of Arts, Media, and Design, with a joint appointment in the Khoury College of Computer Sciences. She has worked as a research assistant for UT's Human–AI Interaction Lab.

Her research interests lie at the intersection of communication and human-computer interaction. Her work has examined (a) the influence of emerging media technologies, such as automated journalism and misinformation detection algorithms, on people’s political attitudes and news consumption behaviors; (b) political bias in news coverage, studied through NLP techniques; and (c) how to leverage AI technologies to reduce bias and promote democracy.

Her research has appeared in mass communication journals and top-tier AI and HCI venues, including the Human-Computer Interaction Journal (CSCW), Journal of Artificial Intelligence, International Journal of Communication, Media and Communication, ICLR, ICWSM, EMNLP, ACL, and AAAI. Her research received the Best Paper Award at AAAI 2021. She was the recipient of the Harrington Dissertation Fellowship and the Dallas Morning News Graduate Fellowship for Journalism Innovation.

YOUTUBE RECORDING

Chenyan Jia Postdoctoral Scholar at the Program on Democracy and the Internet (PDI) 
Seminars

Join the Program on Democracy and the Internet (PDI) and moderator Nate Persily, in conversation with Meicen Sun for Internet Control as A Winning Strategy: How the Duality of Information Consolidates Autocratic Rule in the Digital Age.

This paper advances a new theory of how the Internet, as a digital technology, helps consolidate autocratic rule. Exploiting a major Internet control shock in China in 2014, the paper finds that Chinese data-intensive firms gained from Internet control a 10% increase in revenue over other Chinese firms, and about 1–2% over their U.S. competitors. Meanwhile, the same Internet control incurred a reduction of up to 25% in research quality for Chinese scholars, conditional on the knowledge intensity of their discipline. This occurred specifically via reduced access to cutting-edge knowledge from the outside world. These findings suggest that while politically motivated restrictions on information flows do take a toll on a country’s long-term capacity for innovation, they lend a short-term benefit to its data-intensive sectors. Conventional wisdom on the inherent limits of information control by autocracies overlooks this crucial protectionist benefit, which aids autocratic power consolidation in the digital age.

This session is part of the Fall Seminar Series, a months-long series designed to bring researchers, policy makers, scholars and industry professionals together to share research, findings and trends in the cyber policy space. Both in-person and virtual attendance is available; registration is required.

Meicen Sun is a postdoctoral scholar with the Program on Democracy and the Internet at Stanford University. Her research examines the political economy of information and the effect of information policy on the future of innovation and state power. Her writings have appeared in academic and policy outlets including Foreign Policy Analysis, Harvard Business Review, the World Economic Forum, the Asian Development Bank Institute, and The Diplomat, among others. She previously conducted research at the Center for Strategic and International Studies and at Georgetown University in Washington, DC, and at the UN Regional Centre for Peace and Disarmament in Africa. Bilingual in English and Chinese, she has also written stories, plays, and music, and has staged many of her works, in both languages, in China, Singapore, and the U.S. Sun has served as a Fellow on the World Economic Forum's Global Future Council on China and as a Research Affiliate with the MIT Initiative on the Digital Economy. She holds an A.B. with Honors from Princeton University, an A.M. with a Certificate in Law from the University of Pennsylvania, and a Ph.D. from the Massachusetts Institute of Technology.

Meicen Sun Postdoctoral scholar with the Program on Democracy and the Internet
Seminars