My Heart Loves the Army
An Investigation into a Jordanian Disinformation Campaign on Facebook, TikTok and Twitter
Featured News and Publications
The Stanford Internet Observatory Turns Two
Event: Surgeon General Vivek Murthy
Election Integrity Partnership Releases Final Report on Mis- and Disinformation in 2020 U.S. Election
Plandemic was one step in a larger process to raise the profile of its subject. Beginning on April 16, SIO observed an increasing number of posts about Plandemic’s subject, Judy Mikovits. Over two and a half weeks, we observed a series of cross-platform moments in which Mikovits – a scientist whose work was retracted by journal editors – was recast as an expert whistleblower exposing a vast government cover-up.
Analysis of April 2020 Twitter takedowns linked to Saudi Arabia, the UAE, Egypt, Honduras, Serbia, and Indonesia
On March 11, 2020, Twitter shared with the Stanford Internet Observatory accounts and tweets associated with five distinct takedowns. These include:
As scientists continue to study how the COVID-19 pandemic took hold in Wuhan, China, and around the world, the infection’s early pathways have proven fertile ground for speculation and conspiracy theories. Although COVID-19’s earliest origins may remain uncertain, the story of one volley in the ongoing U.S.-China blame game shows that misinformation about the disease can be traced to specific speculations, distortions, and amplifications.
The Stanford Internet Observatory has been investigating new facets of the manipulation of the local media environment in Libya: Russian actors who are known to have previously created and sponsored online news media fronts and associated Facebook pages now appear to be expanding into similar activities in broadcast media.
The perception of China’s handling of the coronavirus pandemic has been a significant challenge for the Chinese Communist Party (CCP) over the past two months. The CCP has been attempting to control the narrative and deflect blame since the start of the outbreak, both domestically and abroad.
This is the third of a series of pieces the Observatory intends to publish on societies and elections at risk from online disinformation. Our goal is to draw the attention of the media, tech platforms and other academics to these risks and to provide a basic background that could be useful to those who wish to study the information environment in these areas.
On January 11, 2020, Taiwan held its 15th presidential and 10th Legislative Yuan elections. Taiwanese citizens soundly re-elected Democratic Progressive Party (DPP) candidate Tsai Ing-wen, who won 57.1% of the vote over her opponents, Kuomintang (KMT) candidate Han Kuo-yu (38.61%) and People First Party candidate James Soong (4.26%). The DPP also maintained its majority in the Legislative Yuan, though it lost a few seats. Voter turnout was high, with almost 74% of eligible voters casting ballots, up from 66% in 2016.
There is only one day left before Taiwan heads to the polls, and researchers, election integrity teams at tech platforms, and the press are following the dynamics closely. On January 1st, Taiwan entered its ten-day polling blackout period, during which there is a strict ban on agencies and individuals sharing or citing any public survey related to a candidate or the election overall.
On December 20, 2019, Twitter announced the removal of 88,000 accounts managed by Smaat, a digital marketing company based in Saudi Arabia, attributing thousands of them to involvement in “a significant state-backed information operation”. On December 17, Twitter shared with the Stanford Internet Observatory 32,054,257 tweets from 5,929 randomly sampled accounts. In this report we provide a first analysis of the data.
Last Friday, December 13, 2019, Facebook announced it had removed 118 fan pages, 99 groups, and 51 accounts supporting Taiwan’s KMT presidential candidate, Han Kuo-yu. Our team at SIO had been observing several of the Groups removed, including one that was prominently featured in media coverage of the takedown: 2020韓國瑜總統後援會（總會）[“2020 Han Kuo-yu presidential support group (General group)”].
Russia’s global strategy for reasserting itself as a geopolitical superpower has led to an increased presence in Africa, where it has broadened efforts to shape the continent’s politics and pursue new economic opportunities to allay the effects of sanctions.
In the course of assisting reporter Judd Legum of Popular Information with an investigation into a Ukraine-based network of Facebook Pages (recently taken down), SIO researchers uncovered a similar network that appeared to be operating from Kosovo. This network, consisting of approximately 9 Pages with 312,000 followers, focused predominantly on “Blue Lives Matter” content – an American social movement that expresses support for police officers.
Should We Be Worried About Election Interference in 2020? Probably, says Facebook’s Former Chief Security Officer
Alex Stamos is “extremely worried” that the upcoming U.S. presidential election will see some kind of interference from foreign adversaries.
“It’s too late for legislation — we start voting in the primaries in February,” Stamos told Michael McFaul, director of the Freeman Spogli Institute for International Studies, on the World Class podcast. “And it’s really unfortunate that we as a society watched the ball fly over the plate on this one.”
This is the first of a series of pieces we intend to publish on societies and elections at risk from online disinformation. Our goal is to draw the attention of the media, tech platforms and other academics to these risks and to provide a basic background that could be useful to those who wish to study the information environment in these areas.