News

In political conspiracy theories, as in television shows, the plot elements are always the same. (From The Atlantic)

Commentary

Renee DiResta of the Stanford Internet Observatory writes about the growing body of research suggesting human behavior on social media is strikingly similar to collective behavior in nature. Published in Noema Magazine.

Blogs

An Analysis of a Pro-Indian Army Covert Influence Operation on Twitter

Blogs

In our latest report, the Stanford Internet Observatory collaborated with Graphika to analyze a large network of accounts removed from Facebook, Instagram, and Twitter. The information operation likely originated in the United States and targeted a range of countries in the Middle East and Central Asia.

Twitter suspended a network of accounts that coordinated to promote narratives around the coronavirus pandemic, and to amplify a pro-Russian news site ahead of the invasion of Ukraine.

In an essay for Lawfare Blog, Samantha Bradshaw, Renee DiResta and Christopher Giles look at how Russian state war propaganda is increasingly prevalent on platforms whose value proposition is minimal-moderation virality.

Riana Pfefferkorn of SIO spoke with Wired about Meta's expansion of end-to-end encryption in Messenger.

In an essay for Lawfare Blog, Samantha Bradshaw of American University and Shelby Grossman of the Stanford Internet Observatory explore whether two key platforms, Facebook and Twitter, were internally consistent in how they applied their labels during the 2020 presidential election.

Blogs

An Investigation into an Inauthentic Facebook and Instagram Network Linked to an Israeli Public Relations Firm

During a hearing titled “A Growing Threat: Foreign and Domestic Sources of Disinformation,” DiResta offered expert testimony on influence operations and the spread of narratives across social and media networks.

A look at how user choice and transparency provide new ways of addressing content moderation and online safety policy.

Gab was founded in 2016 as an uncensored alternative to mainstream social media platforms. Stanford Internet Observatory’s latest report looks at behaviors and dynamics across the platform.

The Stanford Internet Observatory and the Trust and Safety Foundation will host a two-day conference focusing on cutting-edge research in trust and safety for those in academia, industry, civil society, and government.

Shelby Grossman shares what she and her team watch for when analyzing social media posts and other online reports related to the Russian invasion of Ukraine. (Appeared first in Stanford News)

The Journal of Online Trust and Safety published its second issue on Tuesday, March 1.

The Virality Project final report finds recycled anti-vaccine narratives and viral content driven by recurring actors.

Narratives from overt propaganda, unattributed Telegram channels, and inauthentic social media accounts

Research on inauthentic behavior on TikTok, misinformation on Stanford's campus, Telegram activity in Belarus, health insurance scams that run advertisements on Google, and QAnon content on Tumblr.

How well do platform reporting flows and context labels work with screen readers for the visually impaired?

In this post and in the attached reports, we investigate four newly suspended Twitter operations.

The report is the culmination of work by Aspen Digital's Commission on Information Disorder, with guidance from Stanford Cyber's Renee DiResta, Alex Stamos, Daphne Keller, Nate Persily and Herb Lin, and provides a framework for action with 15 recommendations to build trust and reduce harm.