Following the election of another Liberal government, free speech and censorship will soon be back on the table. On this week’s No Nonsense, tech law expert Daphne Keller discusses the problems of regulating online content.
On the 20th anniversary of 9/11, four Stanford scholars and leading experts in national security, terrorism and contemporary conflict – Condoleezza Rice, Amy Zegart, Martha Crenshaw and Lisa Blaydes – reflect on how their teaching of the terrorist attacks has evolved.
In a new piece in the Financial Times, Marietje Schaake argues that protection for critical infrastructure is too often awarded using outdated criteria.
A new grant aims to support a collaborative team of Stanford and University of Washington researchers as they explore new areas of study in the field of mis- and disinformation.
James joins as a Senior Advisor and will be partnering with Andrew Grotto, Director of GTG, on a project focused on the concept of "reasonableness" in tort law and regulatory policy for digital risks, especially cybersecurity risks.
POLITICO’s annual ranking of the 28 power players behind Europe’s tech revolution includes the Cyber Policy Center's Marietje Schaake. "As EU and U.S. officials seek common ground in regulating the tech sector, Schaake is the voice to listen to on both sides of the Atlantic."
Christopher Painter explains why the emerging pattern of ransomware attacks needs to be addressed at a political level – both domestically and internationally – and not be treated solely as a criminal issue.
In The Politics of Order in Informal Markets: How the State Shapes Private Governance, Grossman presents findings that challenge the conventional wisdom that private governance in developing countries thrives when the government keeps its hands off private group affairs.
Scholars at the Freeman Spogli Institute for International Studies hope that President Joe Biden’s meeting with Russian President Vladimir Putin will lay the groundwork for negotiations in the near future, particularly around nuclear weapons.
When we’re faced with a video recording of an event—such as an incident of police brutality—we can generally trust that the event happened as shown in the video. But that may soon change, thanks to the advent of so-called “deepfake” videos that use machine learning technology to show a real person saying and doing things they haven’t.