Generative Machine Learning and Online Sexual Exploitation
The Stanford Internet Observatory and Thorn find that rapid advances in generative machine learning are making it possible to create realistic imagery that facilitates child sexual exploitation.
The Safeguarding Democracy Project brings together in dialogue scholars, election administrators, legislators, lawyers, voting rights advocates, and concerned citizens to develop practical solutions to urgent problems.
Challenges to Democracy in the Digital Information Realm
Former U.S. President Barack Obama delivered a keynote address about how information is created and consumed, and the threat that disinformation poses to democracy.
Following election day, narrative after bad-faith narrative took aim at election officials, often culminating in months of personal threats against their lives and the lives of their family members.
The Journal of Online Trust and Safety is a no-fee, open-access journal with fast peer review. Authors may submit letters of inquiry to assess whether their manuscript is a good fit.
Moderated Content from Stanford Law School is podcast content about content moderation, moderated by assistant professor Evelyn Douek. The community standards of this podcast prohibit anything except the wonkiest conversations about the regulation—both public and private—of what you see, hear and do online.
The Trust & Safety Teaching Consortium is a coalition of academic, industry, and non-profit experts in online trust and safety problems. Our goal is to create content that can be used to teach a variety of audiences about trust and safety issues in a wide range of settings.
Platformer Highlights Findings from Journal Commentary
A February 2024 Platformer article highlighted a Journal of Online Trust and Safety commentary titled: “Burden of Proof: Lessons Learned for Regulators from the Oversight Board’s Implementation Work.”
Wall Street Journal Highlights Findings from Journal Article
A February 2024 article in the Wall Street Journal on talking to kids about sexting discussed a Journal of Online Trust and Safety article titled "American Parents’ Perceptions of Child Explicit Image Sharing."
A September 2023 article in the New York Times about fact checking discussed a Journal of Online Trust and Safety commentary titled "Future Challenges for Online, Crowdsourced Content Moderation: Evidence from Twitter’s Community Notes."
in Antje von Ungern-Sternberg (ed.), Content Regulation in the European Union – The Digital Services Act, TRIER STUDIES ON DIGITAL LAW, Volume 1, Verein für Recht und Digitalisierung e.V., Institute for Digital Law (IRDT), Trier April 2023
The Biden administration’s new National Cybersecurity Strategy takes on the third rail of cybersecurity policy: software liability. For decades, scholars and litigators have been talking about imposing legal liability on the makers of insecure software. Authored by Jim Dempsey for Lawfare Blog
The European Union’s Digital Services Act (DSA) is a major milestone in the history of platform regulation. Other governments are now asking themselves what the DSA’s passage means for them. This post will briefly discuss that question, with a focus on platforms like Facebook or YouTube and their smaller would-be rivals.
Graham Webster has authored a chapter in the forthcoming book from Harvard University Press, The China Questions 2: Critical Insights into US-China Relations
This book created the field of the law of democracy, offering a systematic account of the legal construction of American democracy. This edition represents a significant revision that reflects the embattled state of democracy in the U.S. and abroad.
Transparency is essential to getting every other part of platform regulation right. But defining sound transparency rules—identifying what information is needed most from platforms like Twitter or YouTube, and how to get it—is quite complicated.
Responding to Elon Musk’s proposed acquisition of Twitter, Daphne Keller suggests that “middleware” models, not common carriage rules, best put control over internet speech regulation in the hands of users.
United States Senate Committee on the Judiciary, Subcommittee on Privacy, Technology and the Law,
May 5, 2022
On May 4th, in front of the Subcommittee on Privacy, Technology, and the Law, Nate Persily, James B. McClatchy Professor of Law and codirector of the Stanford Cyber Policy Center, called upon the Subcommittee to enact legislation to ensure that data relevant to contemporary social problems is unlocked, so researchers can study the scale of these problems and seek to solve them.