Program on Platform Regulation

The Program on Platform Regulation focuses on current or emerging law governing Internet platforms, with an emphasis on laws’ consequences for the rights and interests of Internet users and the public.

Amplification and Its Discontents: Daphne Keller on why regulating the reach of online content is hard

A paper exploring the idea that governments can sidestep free expression concerns by regulating the online speech that platforms recommend or rank algorithmically. The analysis primarily addresses limits based in the U.S. First Amendment.



The Future of Platform Power: Making Middleware Work

Unlike many other proposals to curtail platform power, middleware does not violate the First Amendment of the U.S. Constitution. In the United States, that makes middleware a path forward in a neighborhood full of dead ends.

Some Humility About Transparency

In a new blog post, Daphne Keller, Director of the Program on Platform Regulation at the Cyber Policy Center, looks at the need for transparency in content moderation and asks: what kind of transparency do we really want?

The Lawfare Podcast: The Good, the Bad and the Ugly of Section 230 Reform

On this episode of Arbiters of Truth, the Lawfare Podcast’s miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic speak with Daphne Keller, director of the Program on Platform Regulation at Stanford's Cyber Policy Center and an expert on Section 230 of the Communications Decency Act.

Who Do You Sue?

Questions of state and private power are deeply intertwined. To understand and protect internet users’ rights, we must understand and engage with both.

More from PPR


If Lawmakers Don’t Like Platforms’ Speech Rules, Here’s What They Can Do About It. Spoiler: The Options Aren’t Great

What should platforms like Facebook or YouTube do when users post speech that is technically legal, but widely abhorred? In the U.S. that has included things like the horrific video of the 2019 massacre in Christchurch. What about harder calls – like posts that some people see as anti-immigrant hate speech, and others see as important political discourse?

JOURNAL ARTICLE | Facebook Filters, Fundamental Rights, and the CJEU’s Glawischnig-Piesczek Ruling

The Court of Justice of the European Union’s (CJEU) 2019 ruling in Glawischnig-Piesczek v Facebook Ireland addresses courts’ powers to issue injunctions requiring internet hosting platforms to proactively monitor content posted by their users.

GUEST POST | Positive Intent Protections: Incorporating a Good Samaritan principle in the EU Digital Services Act

The “Good Samaritan” principle ensures that online intermediaries are not penalized for taking good faith measures against illegal or otherwise inappropriate content. The rule applies to specific types of intermediaries, particularly those providing hosting services.

WILMap: An ongoing project of the Program on Platform Regulation

The WILMap offers an overview of legislation, court decisions, and legislative proposals around the globe. The database allows users to search for legal developments by topic, country, kind of intermediary, and more.



Daphne Keller

Director of Program on Platform Regulation, Cyber Policy Center

Joan Barata

Intermediary Liability Fellow, Program on Platform Regulation, Cyber Policy Center

Q&A with Daphne Keller of the Program on Platform Regulation

Keller explains some of the issues currently surrounding platform regulation.