Governance

FSI's research on the origins, character and consequences of government institutions spans continents and academic disciplines. The institute’s senior fellows and their colleagues across Stanford examine the principles of public administration and implementation. Their work focuses on how maternal health care is delivered in rural China, how public action can create wealth and eliminate poverty, and why U.S. immigration reform keeps stalling. 

FSI’s work includes comparative studies of how institutions help resolve policy and societal issues. Scholars aim to clearly define and make sense of the rule of law, examining how it is invoked and applied around the world. 

FSI researchers also investigate government services – trying to understand and measure how they work, whom they serve and how good they are. They assess energy services aimed at helping the poorest people around the world and explore public opinion on torture policies. The Children in Crisis project addresses how child health interventions interact with political reform. Specific research on governance, organizations and security capitalizes on FSI's longstanding interests and looks at how governance and organizational issues affect a nation’s ability to address security and international cooperation.


Join us on Tuesday, June 7th from 12 PM - 1 PM PT for “Enhancing the FTC's Consumer Protection Authority to Regulate Social Media Companies” featuring Paul Barrett of the NYU Stern Center for Business and Human Rights, and Susan Ness of the Annenberg Public Policy Center in conversation with Julie Owono of the Content Policy & Society Lab. This weekly seminar series is jointly organized by the Cyber Policy Center’s Program on Democracy and the Internet and the Hewlett Foundation’s Cyber Initiative.

About the Seminar: 

The social media industry’s self-regulation has proven inadequate. It is time for Congress and the Federal Trade Commission to step in. The report “Enhancing the FTC's Consumer Protection Authority to Regulate Social Media Companies” offers principles and policy goals to help lawmakers and regulators sort through the dozens of bills pending before Congress and shape an agenda for the FTC to use its consumer protection authority to incentivize better corporate conduct.

About the Speakers:

Paul Barrett is the deputy director and senior research scholar at the Center for Business and Human Rights at New York University’s Stern School of Business. He joined the Center in September 2017 after working for more than three decades as a journalist focusing on the intersection of business, law, and society. At Bloomberg Businessweek magazine, he wrote cover stories on topics such as energy and the environment, military procurement, and the civilian firearm industry. From 1986 to 2005, he wrote for The Wall Street Journal, serving for part of that time as the newspaper’s Supreme Court correspondent. Paul is the author of four nonfiction books, including GLOCK: The Rise of America’s Gun, a New York Times Bestseller.
 
At the Center for Business and Human Rights, Paul has written a series of reports on the role of the social media industry in a democracy. Topics have included the problems of foreign and domestic disinformation, the consequences of outsourced content moderation, the debate over Section 230, the role of social media in intensifying political polarization in the U.S., and how Congress could enhance the Federal Trade Commission’s consumer protection authority to regulate the major platforms. Since 2008, Paul has served as an adjunct professor at the NYU School of Law, where he co-teaches a seminar called “Law, Economics, and Journalism.” He holds undergraduate and law degrees from Harvard.

Susan Ness is a distinguished fellow at the Annenberg Public Policy Center, where she leads a project to encourage transatlantic governments and stakeholders to forge common ‘modular’ solutions that are accepted under different tech regulatory frameworks. Previously, she convened the Transatlantic High-Level Working Group on Content Moderation and Freedom of Expression, which published a report and 14 briefing papers. She also is a distinguished fellow at the German Marshall Fund, working on transatlantic digital policy. She is a former member of the Federal Communications Commission, where she focused on digital transformation of communications. She is a board member of both media company TEGNA Inc, and Vital Voices Global Partnership, an NGO that supports women leaders who are improving the world. She holds a J.D. from Boston College Law School and an M.B.A. from The Wharton Graduate School (University of Pennsylvania).

Julie Owono is the Executive Director of the Content Policy & Society Lab (CPSL) and a fellow of the Program on Democracy and the Internet (PDI) at Stanford University. She is also the Executive Director of the digital rights organization Internet Sans Frontières, one of the inaugural members of the Facebook Oversight Board, and an affiliate at the Berkman Klein Center at Harvard University. She holds a master’s degree in international law from the Sorbonne in Paris and practiced as a lawyer at the Paris Bar.

Fluent in five languages, with a childhood spent in several countries and an education at the Lycée Français Alexandre Dumas in Moscow, Julie brings a unique perspective to the challenges and opportunities of a global Internet. This background has shaped her belief that global, multistakeholder collaborations can be instrumental in the emergence of rights-based content policies and regulations.


Join us on Tuesday, May 31st from 12 PM - 1 PM PT for "A Former South African Politician’s Effort to Combat Misinformation in Elections" featuring Phumzile Van Damme, former Member of Parliament in South Africa, in conversation with Julie Owono of the Content Policy & Society Lab (CPSL). This weekly seminar series is jointly organized by the Cyber Policy Center’s Program on Democracy and the Internet and the Hewlett Foundation’s Cyber Initiative.

About the Seminar: 

Misinformation during elections is a serious concern for democratic systems around the world. This is particularly true in various African countries, where cases of electoral violence have been linked to disruptions in the informational realm. Yet technology companies continue to underinvest in initiatives to limit the spread and impact of disinformation in Africa.

Local initiatives have attempted to mitigate this inequality. This week’s webinar will focus on the work of former South African MP Phumzile Van Damme, who launched a project to tackle the spread of misinformation on social media platforms before and during the local government elections in November 2021. She will share the methodology used and the results observed. The webinar will also discuss the challenges of ensuring that South African users and citizens have access to reliable information.
 

About the Speakers:

Phumzile Van Damme is an independent consultant on disinformation and digital rights. She is a member of the Real Facebook Oversight Board, the International Grand Committee on Disinformation, and an advisory council member of #ShePersisted. Van Damme’s work on misinformation was the subject of “Influence,” a documentary that premiered at the Sundance Film Festival in 2020.
 
A former Member of Parliament in South Africa, Van Damme served on the Communications and Digital Technologies committee as Shadow Minister. She played a pivotal role in holding social media platforms accountable for misinformation on their platforms and spearheaded the summoning of Facebook and other tech giants to Parliament. 
 
In September 2021, she helped found and coordinate South Africa’s first electoral disinformation monitoring project, the ‘Local Government Anti-Disinformation Project’. She has spoken on various platforms on the subject of disinformation including at the UNDP and the UN Commission on the Status of Women.



Join us on Tuesday, May 17th from 12 PM - 1 PM PT for “Exposure to Untrustworthy Websites in the 2020 US Election” featuring Jeff Hancock, Ross Dahlke & Ryan Moore of the Social Media Lab. This weekly seminar series is jointly organized by the Cyber Policy Center’s Program on Democracy and the Internet and the Hewlett Foundation’s Cyber Initiative.

About The Seminar: 

Prior research has documented exposure to fake news and online misinformation using large-scale data on individuals’ media use, which has provided important information about the scope and nature of people’s exposure to misinformation online. However, most of this work has made use of data collected during the 2016 US election, and far fewer studies have examined how exposure to misinformation online has changed since 2016. In this paper, we examine exposure to untrustworthy websites in the lead-up to the 2020 US election using a dataset of over 7.5 million passively tracked website visits from a nationally representative sample of American adults (N = 1,151). We find that a significantly smaller percentage of Americans were exposed to untrustworthy websites in 2020 than in 2016 (as calculated by Guess et al. [2020]). While exposure was concentrated among similar groups of people as it was in 2016, levels of exposure appear to be lower across the board. There were also differences in the role online platforms played in directing people to untrustworthy websites in 2020 compared to 2016. Our findings have implications for future research and practice around online misinformation.

About The Speakers:

Jeff Hancock is the founding director of the Stanford Social Media Lab and is Harry and Norman Chandler Professor of Communication at Stanford University. Professor Hancock and his group work on understanding psychological and interpersonal processes in social media. The team specializes in using computational linguistics and experiments to understand how the words we use can reveal psychological and social dynamics, such as deception and trust, emotional dynamics, intimacy and relationships, and social support. Recently Professor Hancock has begun work on understanding the mental models people have about algorithms in social media, as well as working on the ethical issues associated with computational social science.

Ross Dahlke, from Westfield, Wisconsin, is pursuing a PhD in theory and research in the Stanford Social Media Lab at the Stanford School of Humanities and Sciences. He graduated from the University of Wisconsin-Madison with bachelor’s degrees in journalism and political science. Ross’s research focuses on applying AI and computational techniques to understand how people interact with complex systems. Before graduate school, he was a data scientist at a marketing technology firm where he developed machine learning platforms that helped Fortune 500 companies optimize their digital marketing spend in order to drive sales. He has also consulted on dozens of state-wide and local political campaigns. In high school, Ross started a cheese distribution business which has sold more than $3 million in cheese.

Ryan Moore studies how features of new media platforms and technologies affect the consumption, processing, and sharing of information, especially information about politics and news. In addition, he is interested in the role that age plays in internet and technology use, particularly as it relates to encountering deceptive or misleading content.


Geopolitics of Technology in East Asia

 

WHEN: May 17 & May 18 
WHERE: Frances C. Arrillaga Alumni Center (IN PERSON) or Live Webcast


AGENDA: 

Day 1 of the workshop will focus on the strategic dimensions of industrial policy relating to digital goods and services. Key topics include national security reviews of inbound and outbound investments, export controls, and supply chain risks, with a view towards identifying areas that are ripe for multilateral alignment as well as points of friction and options for managing those points of friction. Elaborating the respective roles and responsibilities of government and private sector actors will be an important theme.

Day 2 of the workshop will focus on regulatory policy and workforce challenges and opportunities, especially AI and its ecosystem of supporting technologies. 
 



Join us on Tuesday, May 10 from 12 PM - 1 PM PT for “Big Speech” featuring Kate Klonick of St. John’s University Law School, in conversation with Nate Persily of the Cyber Policy Center. This weekly seminar series is jointly organized by the Cyber Policy Center’s Program on Democracy and the Internet and the Hewlett Foundation’s Cyber Initiative.

About the Seminar:

Technology companies seem omnipotent, omnipresent, and without accountability for their harms to society. Nowhere is this truer than in the realm of Big Speech—the firms who control and profit from large scale user-generated content platforms. With reform through direct regulation likely foreclosed by the First Amendment, recent intervention has focused instead on breaking up these platforms under antitrust law. These proposals tap into both the pragmatic and emotional frustration around the power of private firms over freedom of expression and the public sphere. But while break up might be valuable in other areas of big tech, its effect on Big Speech is less certain.

Will breaking up Big Speech make individual user experience and the digital public sphere better or worse? Join us for insight on the nature of Big Speech and the challenges of reform.

About the Speakers:

Kate Klonick is an Associate Professor at St. John's University Law School and a fellow at the Brookings Institution and Yale Law School’s Information Society Project. Her writing on online speech, freedom of expression, and private governance has appeared in the Harvard Law Review, Yale Law Journal, The New Yorker, the New York Times, The Atlantic, the Washington Post and numerous other publications.

Nathaniel Persily is the James B. McClatchy Professor of Law at Stanford Law School, with appointments in the departments of Political Science, Communication, and FSI. Prior to joining Stanford, Professor Persily taught at Columbia and the University of Pennsylvania Law School, and as a visiting professor at Harvard, NYU, Princeton, the University of Amsterdam, and the University of Melbourne. Professor Persily’s scholarship and legal practice focus on American election law, or what is sometimes called the “law of democracy,” which addresses issues such as voting rights, political parties, campaign finance, redistricting, and election administration. He has served as a special master or court-appointed expert to craft congressional or legislative districting plans for Georgia, Maryland, Connecticut, New York, North Carolina, and Pennsylvania. He also served as the Senior Research Director for the Presidential Commission on Election Administration. His current work, for which he has been honored as a Guggenheim Fellow, an Andrew Carnegie Fellow, and a Fellow at the Center for Advanced Study in the Behavioral Sciences, examines the impact of changing technology on political communication, campaigns, and election administration. He is co-director of the Stanford Cyber Policy Center, the Stanford Program on Democracy and the Internet, and the Stanford-MIT Healthy Elections Project, which supported local election officials in taking the necessary steps during the COVID-19 pandemic to provide safe voting options for the 2020 election. He is also a member of the American Academy of Arts and Sciences and a commissioner on the Kofi Annan Commission on Elections and Democracy in the Digital Age.

 

By Melissa De Witte, Taylor Kubota, and Ker Than

During a speech at Stanford University on Thursday, April 21, 2022, former U.S. President Barack Obama presented his audience with a stark choice: “Do we allow our democracy to wither, or do we make it better?”

Over the course of an hour-long address, Obama outlined the threat that disinformation online, including deepfake technology powered by AI, poses to democracy as well as ways he thought the problems might be addressed in the United States and abroad.

“This is an opportunity, it’s a chance that we should welcome for governments to take on a big important problem and prove that democracy and innovation can coexist,” Obama said.

Obama, who served as the 44th president of the United States from 2009 to 2017, was the keynote speaker at a one-day symposium, titled “Challenges to Democracy in the Digital Information Realm,” co-hosted by the Stanford Cyber Policy Center and the Obama Foundation on the Stanford campus on April 21.

The event brought together people working in technology, policy, and academia for panel discussions on topics ranging from the role of government in establishing online trust to the relationship between democracy and tech companies to the threat of digital authoritarians.

Obama told a packed audience of more than 600 people in CEMEX auditorium – as well as more than 250,000 viewers tuning in online – that everyone is part of the solution to make democracy stronger in the digital age and that all of us – from technology companies and their employees to students and ordinary citizens – must work together to adapt old institutions and values to a new era of information. “If we do nothing, I’m convinced the trends that we’re seeing will get worse,” he said.

Introducing the former president were Michael McFaul, director of the Freeman Spogli Institute for International Studies and U.S. ambassador to Russia under Obama, and Tiana Epps-Johnson, BA ’08, a Stanford alum and Obama Foundation fellow.

Epps-Johnson, who is the founder and executive director of the Center for Tech and Civic Life, recalled her time answering calls to an election protection hotline during the 2006 midterm election. She said the experience taught her an important lesson, which was that “the overall health of our democracy, whether we have a voting process that is fair and trustworthy, is more important than any one election outcome.”

Stanford freshman Evan Jackson said afterward that Obama’s speech resonated with him. “I use social media a lot, every day, and I’m always seeing all the fake news that can be spread easily. And I do understand that when you have controversy attached to what you’re saying, it can reach larger crowds,” Jackson said. “So if we do find a way to better contain the controversy and the fake news, it can definitely help our democracy stay powerful for our nation.”

The Promise and Perils Technology Poses to Democracy


In his keynote, Obama reflected on how technology has transformed the way people create and consume media. Digital and social media companies have upended traditional media – from local newspapers to broadcast television, as well as the role these outlets played in society at large.

During the 1960s and 1970s, the American public tuned in to one of three major networks, and while media from those earlier eras had their own set of problems – such as excluding women and people of color – they did provide people with a shared culture, Obama said.

Moreover, these media institutions, with established journalistic best practices for accuracy and accountability, also provided people with similar information: “When it came to the news, at least, citizens across the political spectrum tended to operate using a shared set of facts – what they saw or what they heard from Walter Cronkite or David Brinkley.”

Fast forward to today, where everyone has access to individualized news feeds that are fed by algorithms that reward the loudest and angriest voices (and which technology companies profit from). “You have the sheer proliferation of content, and the splintering of information and audiences,” Obama observed. “That’s made democracy more complicated.”

Facts are competing with opinions, conspiracy theories, and fiction. “For more and more of us, search and social media platforms aren’t just our window into the internet. They serve as our primary source of news and information,” Obama said. “No one tells us that the window is blurred, subject to unseen distortions, and subtle manipulations.”

The splintering of news sources has also made all of us more prone to what psychologists call “confirmation bias,” Obama said. “Inside our personal information bubbles, our assumptions, our blind spots, our prejudices aren’t challenged, they are reinforced and naturally, we’re more likely to react negatively to those consuming different facts and opinions – all of which deepens existing racial and religious and cultural divides.”

But these problems are not just the result of our brains failing to keep up with the growing amount of information online, Obama argued. “They’re also the result of very specific choices made by the companies that have come to dominate the internet generally, and social media platforms in particular.”

The former president also made clear that he did not think technology was to blame for many of our social ills. Racism, sexism, and misogyny all predate the internet, but technology has helped amplify them.

“Solving the disinformation problem won’t cure all that ails our democracies or tears at the fabric of our world, but it can help tamp down divisions and let us rebuild the trust and solidarity needed to make our democracy stronger,” Obama said.

He gave examples of how social media has fueled violence and extremism around the world. Leaders of countries from Russia and China to Hungary, the Philippines, and Brazil have harnessed social media platforms to manipulate their populations. “Autocrats like Putin have used these platforms as a strategic weapon against democratic countries that they consider a threat,” Obama said.

He also called out emerging technologies such as AI for their potential to sow further discord online. “I’ve already seen demonstrations of deepfake technology that show what looks like me on a screen, saying stuff I did not say. It’s a strange experience, people,” Obama said. “Without some standards, implications of this technology – for our elections, for our legal system, for our democracy, for rules of evidence, for our entire social order – are frightening and profound.”

‘Regulation Has to Be Part of the Answer’


Obama discussed potential solutions for addressing some of the problems he viewed as contributing to a backsliding of democracy in the second half of his talk.

In an apt metaphor for a speech delivered in Silicon Valley, Obama compared the U.S. Constitution to software for running society. It had “a really innovative design,” Obama said, but also significant bugs. “Slavery. You can discriminate against entire classes of people. Women couldn’t vote. Even white men without property couldn’t vote, couldn’t participate, weren’t part of ‘We the People.’”

The amendments to the Constitution were akin to software patches, the former president said, that allowed us to “continue to perfect our union.”

Similarly, governments and technology companies should be willing to introduce changes aimed at improving civil discourse online and reducing the amount of disinformation on the internet, Obama said.

“The internet is a tool. Social media is a tool. At the end of the day, tools don’t control us. We control them. And we can remake them. It’s up to each of us to decide what we value and then use the tools we’ve been given to advance those values,” he said.

The former president put forth various solutions for combating online disinformation, including regulation, which many tech companies fiercely oppose.

“Here in the United States, we have a long history of regulating new technologies in the name of public safety, from cars and airplanes to prescription drugs to appliances,” Obama said. “And while companies initially always complain that the rules are going to stifle innovation and destroy the industry, the truth is that a good regulatory environment usually ends up spurring innovation, because it raises the bar on safety and quality. And it turns out that innovation can meet that higher bar.”

In particular, Obama urged policymakers to rethink Section 230, enacted as part of the United States Communications Decency Act in 1996, which stipulates that, generally, online platforms cannot be held liable for content that other people post on their websites.

But technology has changed dramatically over the past two decades since Section 230 was enacted, Obama said. “These platforms are not like the old phone company.”

He added: “In some cases, industry standards may replace or substitute for regulation, but regulation has to be part of the answer.”

Obama also urged technology companies to be more transparent in how they operate and “at minimum” should share with researchers and regulators how some of their products and services are designed so there is some accountability.

The responsibility also lies with ordinary citizens, the former president said. “We have to take it upon ourselves to become better consumers of news – looking at sources, thinking before we share, and teaching our kids to become critical thinkers who know how to evaluate sources and separate opinion from fact.”

Obama warned that if the U.S. does not act on these issues, it risks being eclipsed in this arena by other countries. “As the world’s leading democracy, we have to set a better example. We should be able to lead on these discussions internationally, not [be] in the rear. Right now, Europe is forging ahead with some of the most sweeping legislation in years to regulate the abuses that are seen in big tech companies,” Obama said. “Their approach may not be exactly right for the United States, but it points to the need for us to coordinate with other democracies. We need to find our voice in this global conversation.”

 

Transcript of President Obama's Keynote


At a conference hosted by the Cyber Policy Center and Obama Foundation, former U.S. President Barack Obama delivered the keynote address about how information is created and consumed, and the threat that disinformation poses to democracy.


Join us on Tuesday, May 24 from 12 PM - 1 PM PT for “Bridging the Cybersecurity Data Gap with Privacy Protected Data Sharing” featuring Taylor Reynolds of MIT’s Internet Policy Research Initiative, Megan Stifel of the Institute for Security and Technology, and Klara Jordan, Chief Public Policy Officer of the Cyber Peace Institute, in conversation with Kelly Born of the Hewlett Foundation. This weekly seminar series is jointly organized by the Cyber Policy Center’s Program on Democracy and the Internet and the Hewlett Foundation’s Cyber Initiative.

About the Seminar:

Cyberattacks are increasing, and useful insights into the causes and impact of successful attacks could help all organizations better understand the harm caused by such incidents and improve their defenses. However, organizations currently have little incentive to report attempted or successful attacks if sharing such sensitive information could invite regulatory scrutiny, create reputational harm for the company, or provide an advantage to their competitors. The result is an environment where attacks happen on a regular basis, but collectively we learn very little from them. Today, neither the public nor policymakers fully understand the impact and risks of cyberattacks – a gap that needs to be addressed to inform policymaking, resiliency measures, and individuals’ ability to seek redress. Join Taylor Reynolds of MIT, Klara Jordan of the Cyber Peace Institute, and Megan Stifel of the Institute for Security and Technology, in conversation with Kelly Born of the Hewlett Foundation, to explore the problems posed by underreporting, the promise of new “privacy enhancing technologies,” and the real-world challenges of deploying these technologies at scale.

About the Speakers:

Taylor Reynolds is the research director of MIT's Internet Policy Research Initiative (IPRI), which collaborates with policymakers and technologists to improve the trustworthiness and effectiveness of interconnected digital systems like the Internet. Taylor's current research focuses on three areas: cyber security, cyber risk, and the future of data. Taylor was previously a senior economist at the OECD, where he led the organization’s Information Economy Unit covering policy issues such as the role of information and communication technologies in the economy, digital content, the economic impacts of the Internet, and green ICTs. His previous work at the OECD concentrated on telecommunication and broadcast markets with a particular focus on broadband. Before joining the OECD, Taylor worked at the International Telecommunication Union, the World Bank, and the National Telecommunications and Information Administration (United States). Taylor has an MBA from MIT and a Ph.D. in Economics from American University in Washington, DC.

Megan Stifel is the Chief Strategy Officer at the Institute for Security and Technology, where she also leads the organization’s cyber-related work. Megan previously served as Global Policy Officer at the Global Cyber Alliance and as the Cybersecurity Policy Director at Public Knowledge. She is a Visiting Fellow at the National Security Institute. Megan previously served as a Director for International Cyber Policy at the National Security Council. Prior to the NSC, Ms. Stifel served in the U.S. Department of Justice as Director for Cyber Policy in the National Security Division and as counsel in the Criminal Division’s Computer Crime and Intellectual Property Section. Before law school, Ms. Stifel worked for the U.S. House of Representatives Permanent Select Committee on Intelligence. She received a Juris Doctor from Indiana University and a Bachelor of Arts, magna cum laude, from the University of Notre Dame.

Klara Jordan is Chief Public Policy Officer of the Cyber Peace Institute. Prior to that, Klara was the Director for Government Affairs and Public Policy for the UK at BlackBerry and the Executive Director for the EU and Africa at the Global Cyber Alliance. She also served as the director of the Cyber Statecraft Initiative at the Atlantic Council think tank, and worked in the policy and privacy division of FireEye. Her background also includes work on international law issues at the American Society of International Law and at NATO’s Allied Command Transformation.

Kelly Born (moderator) is the Director of the Cyber Initiative at the William and Flora Hewlett Foundation. She leads a ten-year, $130 million grantmaking effort that aims to build a more robust cybersecurity field and improve policymaking. Previously, Kelly was executive director of the Stanford Cyber Policy Center. Prior to that, she was a Program Officer for the Madison Initiative at the William and Flora Hewlett Foundation, an 8-year, $150 million portfolio focused on improving U.S. democracy. Kelly oversaw Madison’s grantmaking on campaigns and elections, and digital disinformation.

Seminars

Image
headshots of Annet Aris, Sarah V. Stewart, Eva Maydell and Pierre-Arnaud Proux

Join us Tuesday, May 3rd from 12 PM - 1 PM PT for a webinar on Semiconductors, Supply Chains and Industrial Policy featuring Annet Aris of INSEAD, Sarah V. Stewart of Silverado Policy Accelerator, Eva Maydell of the European Parliament, and Pierre-Arnaud Proux, member of Executive Vice-President Margrethe Vestager’s Cabinet, in conversation with Marietje Schaake of the Cyber Policy Center. This weekly seminar series is jointly organized by the Cyber Policy Center’s Program on Democracy and the Internet and the Hewlett Foundation’s Cyber Initiative.

About The Seminar: 

A conversation exploring the economic and policy challenges resulting from the recent global chip shortage, with a discussion of issues such as protections against technology transfer efforts, the attraction and retention of high-skilled talent, and the strategic significance of the industry in light of accelerating digitization. How should the US and European governments tackle China’s market-distorting subsidies? How can onshore chip factory capacity be strengthened and secured? 

Together, this group will explore the history and future of the semiconductor industry and how policymakers across the Atlantic should respond to both vulnerabilities and opportunities.

About the Speakers:

Annet Aris is Senior Affiliate Professor of Strategy at INSEAD. She joined INSEAD in 2003, and her focus is on digital transformation and disruption and its impact on society, industries and companies. She was nominated in 2010 and 2011 for the best teacher award by the MBA students. Annet also has extensive experience as a non-executive board member of a variety of publicly listed companies across Europe. Currently she serves on the boards of Rabobank Group; Randstad NV, a global leader in HR services; the microchip machine manufacturer ASML NV; the intralogistics and forklift truck manufacturer Jungheinrich AG; and the insurance company A.S.R. Netherlands N.V. Annet ranks among the top 10 most influential corporate directors in the Netherlands.

Sarah V. Stewart is the Executive Director of Silverado Policy Accelerator. Ms. Stewart has nearly two decades of experience as an international trade lawyer, trade policy expert, and trade negotiator. Immediately prior to joining Silverado, Ms. Stewart led Amazon's public policy efforts on U.S. trade policy and export controls matters. From 2013 to 2018, Ms. Stewart worked for the Office of the United States Trade Representative, most recently as the Deputy Assistant USTR for Environment and Natural Resources. During her time at USTR, Ms. Stewart was the lead environment chapter negotiator for the US-Mexico-Canada Agreement and the Transatlantic Trade and Investment Partnership (TTIP) negotiations with the European Union. Prior to joining USTR, Ms. Stewart served in various legal and policy roles at The Humane Society of the United States and Humane Society International, including spearheading a first-ever international legal group.

Eva Maydell is a Bulgarian Member of the European Parliament. In 2017, she was the first woman elected as President of the European Movement International (EMI), the largest pan-European network of civil society organizations. It is present in 34 countries and encompasses 38 International Associations. Maydell was first elected to the European Parliament in 2014 at the age of 28, the youngest member of the European People's Party (EPP) Group at the time. She was re-elected in 2019 and is serving her second term as an MEP.

Pierre-Arnaud Proux is a member of Executive Vice-President Margrethe Vestager’s Cabinet. He leads the Cabinet’s work on industrial policy, the internal market, space policy, and Important Projects of Common European Interest. He previously worked at DG Competition, assessing public support to the financial sector as well as aid to the real economy channelled through financial intermediaries.

Marietje Schaake (Moderator) is international policy director at Stanford University's Cyber Policy Center and international policy fellow at Stanford’s Institute for Human-Centered Artificial Intelligence. Between 2009 and 2019, Marietje served as a Member of the European Parliament for the Dutch liberal democratic party, where she focused on trade, foreign affairs, and technology policies. Marietje is an (advisory) board member of a number of nonprofits, including MERICS, ECFR, ORF and AccessNow. She writes a monthly column for the Financial Times and a bi-weekly column for the Dutch newspaper NRC.

-

Image
headshots of Alex Rice, Camille François and Amit Elazari

Join us on Tuesday, April 26th from 12 PM - 1 PM PT for “Bug Bounties & Bridge-Building: Lessons from Cybersecurity Vulnerability Disclosure for Addressing Socio-Technical Harms” featuring Camille François, Global Director for Trust & Safety at Niantic, Dr. Amit Elazari of Intel, and Alex Rice of HackerOne in conversation with Marietje Schaake of the Stanford Cyber Policy Center. This weekly seminar series is jointly organized by the Cyber Policy Center’s Program on Democracy and the Internet and the Hewlett Foundation’s Cyber Initiative.

About The Seminar: 

Join us for a conversation on the nascent application of ‘bug bounties,’ a popular bug-for-reward audit mechanism from the cybersecurity domain (and related approaches, such as vulnerability disclosure programs (VDPs) and pentesting), to the discovery of various socio-technical harms, including those inflicted through algorithmic (or “AI”) systems.

Following the recent publication by the Algorithmic Justice League (AJL) of a paper on the risks and opportunities presented by this shift, we are joined by one of the paper’s co-authors, Camille François, alongside practitioners with insights into these mechanisms from industry and government perspectives. Together, this group will explore these mechanisms in the context of emerging and historic practices, including as illuminated in AJL’s recent report.

Speakers:

Camille François works on the impacts of technology on society, with an emphasis on cyber conflict and information operations. She currently serves as the global director of trust and safety at Niantic and is a lecturer at Columbia University’s School of International and Public Affairs. She was previously the chief innovation officer at Graphika, where she oversaw its investigation, analysis and R&D teams and led the company’s work to detect and mitigate disinformation, media manipulation and harassment. Before that, François was a principal researcher at Google on the “Jigsaw” team, an innovation unit that builds technology to address global security challenges and protect vulnerable users. François has advised governments and parliamentary committees on both sides of the Atlantic, investigated Russian interference in the 2016 U.S. presidential election on behalf of the U.S. Senate Select Intelligence Committee, and served as a special advisor to the chief technology officer of France. François is an affiliate scholar of the Harvard Berkman Klein Center for Internet and Society, a Fulbright scholar and a Mozilla Fellow. She holds a master's degree in human rights from the French Institute of Political Sciences (Sciences Po) and a master's degree in international security from the School of International and Public Affairs (SIPA) at Columbia University.

Dr. Amit Elazari is Director of Global Cybersecurity Policy at Intel Corporation and a Lecturer in the Master of Information and Cybersecurity program at the University of California (UC), Berkeley School of Information, as well as a member of the External Advisory Committee of the Center for Long-Term Cybersecurity. She holds a doctoral degree in law (J.S.D.) from UC Berkeley School of Law, a leading institution for technology law, and graduated summa cum laude with three prior degrees in law and business. Her research in cybersecurity, privacy and intellectual property has appeared in leading technology law and computer science journals, been presented at conferences such as RSA, Black Hat, USENIX and USENIX Security, and been featured in leading news outlets such as The Wall Street Journal, The Washington Post and The New York Times. She previously practiced law in Israel.

Alex Rice is a founder and chief technology officer at HackerOne, the world's most popular bug bounty platform. Alex is responsible for developing the HackerOne technology vision, driving engineering efforts, and counseling customers as they build world-class security programs. Alex was previously at Facebook, where he founded the product security team, built one of the industry’s most successful security programs, and introduced new transport layer encryption used by more than a billion users. Alex also serves on the board of the Internet Bug Bounty, a nonprofit organization that enables and encourages friendly hackers to help build a more secure Internet.

-

Image
image of Anna-Maria Osula advertising event on April 20, 2022 on a blue background

Please join us on Wednesday, April 20th for a talk with Anna-Maria Osula, visiting scholar from TalTech. At this event co-sponsored by Stanford University Libraries, Anna-Maria will be introducing her research on private sector initiatives to develop and promote cyber norms of behavior.

Research Overview:

Given the multistakeholder nature of running the Internet and governing information and communication technologies, nation-states are not the only entities interested in shaping norms of behavior for cyberspace. Non-state actors are directly affected by any decision on international norms in cyberspace. They are also expected to behave as responsible actors, bound by the agreements negotiated by states at the UN. This means that non-state actors are involved in building and promoting norms and also play a role in their interpretation and implementation. Anna-Maria will discuss her research project, in which she analyzes private sector involvement in advancing cyber norms in international fora such as the United Nations.

Bio:

Anna-Maria Osula, currently a Global Digital Governance Fellow at Stanford University, is a senior researcher at Tallinn University of Technology and a senior policy officer at Guardtime. Her current research focus is cyber diplomacy and international law applicable to cyber operations. She also serves as a research fellow at Masaryk University under the project “Cyber Security, Cyber Crime and Critical Information Infrastructures Center of Excellence.” Previously, she worked as a legal researcher at the NATO CCDCOE, undertaking projects on national cyber security strategies, international organizations, international criminal cooperation, and norms. In addition to a Ph.D. in law from the University of Tartu, she holds an LLM degree in IT law from Stockholm University.

ENCINA HALL, ROOM E008, 616 Jane Stanford Way, Stanford University, Stanford, CA

Anna-Maria Osula, Global Digital Governance Fellow