Information Technology

Hate speech is a contextual phenomenon. What offends or inflames in one context may differ from what incites violence in a different time, place, and cultural landscape. Theories of hate speech, especially Susan Benesch’s concept of “dangerous speech” (hateful speech that incites violence), have focused on the factors that cut across these paradigms. However, the existing scholarship is narrowly focused on situations of mass violence or societal unrest in America or Europe.

Publication Type: Journal Articles
Author: Brittan Heller
Journal Publisher: Michigan Law School Scholarship Repository
News Type: News

During a speech at Stanford University on Thursday, April 21, 2022, former U.S. President Barack Obama presented his audience with a stark choice: “Do we allow our democracy to wither, or do we make it better?”

Over the course of an hour-long address, Obama outlined the threat that disinformation online, including deepfake technology powered by AI, poses to democracy as well as ways he thought the problems might be addressed in the United States and abroad.

“This is an opportunity, it’s a chance that we should welcome for governments to take on a big important problem and prove that democracy and innovation can coexist,” Obama said.

Obama, who served as the 44th president of the United States from 2009 to 2017, was the keynote speaker at a one-day symposium, titled “Challenges to Democracy in the Digital Information Realm,” co-hosted by the Stanford Cyber Policy Center and the Obama Foundation on the Stanford campus on April 21.

The event brought together people working in technology, policy, and academia for panel discussions on topics ranging from the role of government in establishing online trust and the relationship between democracy and tech companies to the threat of digital authoritarians.

Obama told a packed audience of more than 600 people in CEMEX auditorium – as well as more than 250,000 viewers tuning in online – that everyone is part of the solution to make democracy stronger in the digital age and that all of us – from technology companies and their employees to students and ordinary citizens – must work together to adapt old institutions and values to a new era of information. “If we do nothing, I’m convinced the trends that we’re seeing will get worse,” he said.

Introducing the former president were Michael McFaul, director of the Freeman Spogli Institute for International Studies and U.S. ambassador to Russia under Obama, and Tiana Epps-Johnson, BA ’08, a Stanford alum and Obama Foundation fellow.

Epps-Johnson, who is the founder and executive director of the Center for Tech and Civic Life, recalled her time answering calls to an election protection hotline during the 2006 midterm election. The experience, she said, taught her an important lesson: “the overall health of our democracy, whether we have a voting process that is fair and trustworthy, is more important than any one election outcome.”

Stanford freshman Evan Jackson said afterward that Obama’s speech resonated with him. “I use social media a lot, every day, and I’m always seeing all the fake news that can be spread easily. And I do understand that when you have controversy attached to what you’re saying, it can reach larger crowds,” Jackson said. “So if we do find a way to better contain the controversy and the fake news, it can definitely help our democracy stay powerful for our nation.”

The Promise and Perils Technology Poses to Democracy


In his keynote, Obama reflected on how technology has transformed the way people create and consume media. Digital and social media companies have upended traditional media – from local newspapers to broadcast television – as well as the role these outlets played in society at large.

During the 1960s and 1970s, the American public tuned in to one of three major networks, and while media from those earlier eras had their own set of problems – such as excluding women and people of color – they did provide people with a shared culture, Obama said.

Moreover, these media institutions, with established journalistic best practices for accuracy and accountability, also provided people with similar information: “When it came to the news, at least, citizens across the political spectrum tended to operate using a shared set of facts – what they saw or what they heard from Walter Cronkite or David Brinkley.”

Fast forward to today, when everyone has access to individualized news feeds driven by algorithms that reward the loudest and angriest voices (and from which technology companies profit). “You have the sheer proliferation of content, and the splintering of information and audiences,” Obama observed. “That’s made democracy more complicated.”

Facts are competing with opinions, conspiracy theories, and fiction. “For more and more of us, search and social media platforms aren’t just our window into the internet. They serve as our primary source of news and information,” Obama said. “No one tells us that the window is blurred, subject to unseen distortions, and subtle manipulations.”

The splintering of news sources has also made all of us more prone to what psychologists call “confirmation bias,” Obama said. “Inside our personal information bubbles, our assumptions, our blind spots, our prejudices aren’t challenged, they are reinforced and naturally, we’re more likely to react negatively to those consuming different facts and opinions – all of which deepens existing racial and religious and cultural divides.”

But these problems are not just the result of our brains failing to keep up with the growing amount of information online, Obama argued. “They’re also the result of very specific choices made by the companies that have come to dominate the internet generally, and social media platforms in particular.”

The former president also made clear that he did not think technology was to blame for many of our social ills. Racism, sexism, and misogyny all predate the internet, but technology has helped amplify them.

“Solving the disinformation problem won’t cure all that ails our democracies or tears at the fabric of our world, but it can help tamp down divisions and let us rebuild the trust and solidarity needed to make our democracy stronger,” Obama said.

He gave examples of how social media has fueled violence and extremism around the world. Leaders in countries from Russia and China to Hungary, the Philippines, and Brazil have harnessed social media platforms to manipulate their populations. “Autocrats like Putin have used these platforms as a strategic weapon against democratic countries that they consider a threat,” Obama said.

He also called out emerging technologies such as AI for their potential to sow further discord online. “I’ve already seen demonstrations of deepfake technology that show what looks like me on a screen, saying stuff I did not say. It’s a strange experience, people,” Obama said. “Without some standards, implications of this technology – for our elections, for our legal system, for our democracy, for rules of evidence, for our entire social order – are frightening and profound.”

‘Regulation Has to Be Part of the Answer’


In the second half of his talk, Obama discussed potential solutions for addressing some of the problems he viewed as contributing to a backsliding of democracy.

In an apt metaphor for a speech delivered in Silicon Valley, Obama compared the U.S. Constitution to software for running society. It had “a really innovative design,” Obama said, but also significant bugs. “Slavery. You can discriminate against entire classes of people. Women couldn’t vote. Even white men without property couldn’t vote, couldn’t participate, weren’t part of ‘We the People.’”

The amendments to the Constitution were akin to software patches, the former president said, that allowed us to “continue to perfect our union.”

Similarly, governments and technology companies should be willing to introduce changes aimed at improving civil discourse online and reducing the amount of disinformation on the internet, Obama said.

“The internet is a tool. Social media is a tool. At the end of the day, tools don’t control us. We control them. And we can remake them. It’s up to each of us to decide what we value and then use the tools we’ve been given to advance those values,” he said.

The former president put forth various solutions for combating online disinformation, including regulation, which many tech companies fiercely oppose.

“Here in the United States, we have a long history of regulating new technologies in the name of public safety, from cars and airplanes to prescription drugs to appliances,” Obama said. “And while companies initially always complain that the rules are going to stifle innovation and destroy the industry, the truth is that a good regulatory environment usually ends up spurring innovation, because it raises the bar on safety and quality. And it turns out that innovation can meet that higher bar.”

In particular, Obama urged policymakers to rethink Section 230, enacted as part of the United States Communications Decency Act in 1996, which stipulates that, generally, online platforms cannot be held liable for content that other people post on their websites.

But technology has changed dramatically in the more than two decades since Section 230 was enacted, Obama said. “These platforms are not like the old phone company.”

He added: “In some cases, industry standards may replace or substitute for regulation, but regulation has to be part of the answer.”

Obama also urged technology companies to be more transparent in how they operate and, “at minimum,” to share with researchers and regulators how some of their products and services are designed, so there is some accountability.

The responsibility also lies with ordinary citizens, the former president said. “We have to take it upon ourselves to become better consumers of news – looking at sources, thinking before we share, and teaching our kids to become critical thinkers who know how to evaluate sources and separate opinion from fact.”

Obama warned that if the U.S. does not act on these issues, it risks being eclipsed in this arena by other countries. “As the world’s leading democracy, we have to set a better example. We should be able to lead on these discussions internationally, not [be] in the rear. Right now, Europe is forging ahead with some of the most sweeping legislation in years to regulate the abuses that are seen in big tech companies,” Obama said. “Their approach may not be exactly right for the United States, but it points to the need for us to coordinate with other democracies. We need to find our voice in this global conversation.”

 

Transcript of President Obama's Keynote


Hero image: President Barack Obama delivers the keynote address at the “Challenges to Democracy in the Digital Information Realm” conference hosted by the Cyber Policy Center and Obama Foundation. (Photo: Andrew Brodhead, Stanford University)
Subtitle: At a conference hosted by the Cyber Policy Center and Obama Foundation, former U.S. President Barack Obama delivered the keynote address about how information is created and consumed, and the threat that disinformation poses to democracy.

News Type: Q&As

This interview with CISAC affiliate Christopher Painter was originally conducted by Jen Kirby. The complete article is available at Vox.

The frequency, scope, and scale of ransomware attacks against public and private systems are accelerating. In the latest incident, the ransomware group REvil has demanded $70 million to unlock the systems of the software company Kaseya, an attack that affects not only Kaseya but simultaneously compromises many of the company’s clients.

The REvil, JBS meatpacking, and Colonial Pipeline attacks have abruptly raised the profile of ransomware from a malicious strand of criminality to a national security priority. These are issues that Christopher Painter, an affiliate at the Center for International Security and Cooperation (CISAC), has worked on at length during his tenures as a senior official at the Department of Justice, the FBI, and the National Security Council, and as the world's first top cyber diplomat at the State Department.

Jen Kirby, a reporter for Vox, interviewed Painter to discuss how cybercrimes are evolving and what governments should do to keep ransomware attacks from escalating geopolitical tensions online and off.



Jen Kirby:
I think a good place to start would be: What are “ransomware attacks”?

Christopher Painter:
It is largely criminal groups who are getting into computers through any number of potential vulnerabilities, and then they essentially lock the systems — they encrypt the data in a way that makes it impossible for you to see your files. And they demand ransom, they demand payment. In exchange for that payment, they will give you — or they claim, they don’t always do it — they claim they’ll give you the decryption keys, or the codes, that allow you to unlock your own files and have access to them again.

That is what traditionally we say is “ransomware.” That’s been going on for some time, but it’s gotten much more acute recently.

There is another half of that, which is that groups don’t just hold your files for ransom, they either leak or threaten to leak or expose your files and your information — your secrets and your emails, whatever you have — publicly, either in an attempt to embarrass you or to extort more money out of you, because you don’t want those things to happen. So it’s split now into two tracks, but they’re a combined method of getting money.

Jen Kirby:
We’ve recently had some high-profile ransomware attacks, including this recent REvil incident. Is it that we’re seeing a lot more of them, or they’re just bigger and bolder? How do you assess that ransomware attacks are becoming more acute?

Christopher Painter:
We’ve seen this going on for some time. I was one of the co-chairs of this Ransomware Task Force that issued a report recently. One of the reasons we did this report was we’re trying to call greater attention to this issue. Although governments and law enforcement were taking it seriously, it wasn’t being given the kind of national-level priority it deserved.

It was being treated as more of an ordinary cybercrime issue. Most governments’ attention is focused on big nation-state activity — like the SolarWinds hack [where suspected Russian government hackers breached US government departments], which are important, and we need to care about those. But we’re very worried about this, too.

It’s especially become more of an issue during the pandemic, when some of the ransomware actors were going after health care systems and health care providers. That combined with these big infrastructure attacks — the Colonial Pipeline clearly was one of them. Another one was the meat processing plants. Another one was hospital systems in Ireland. You also had the DC Police Department being victimized by ransomware. These things are very high-profile. When you’re lining up for gas because of a ransomware attack, and you can’t get your food because of a ransomware attack, that brings it home as a priority. And then, of course, you have what happened this past weekend. So ransomware has not abated, and it continues to get more serious and hit more organizations.


Christopher Painter

Affiliate at the Center for International Security and Cooperation (CISAC)

Hero image: Ransomware locks up digital data until a fee is paid to the hackers. (Getty Images)
Subtitle: Christopher Painter explains why the emerging pattern of ransomware attacks needs to be addressed at a political level – both domestically and internationally – and not be treated solely as a criminal issue.

Digital Activism and Authoritarian Adaptation in the Middle East – Agenda

Panel 1: Digital Activism

Tuesday, May 25, 2021 | 9-10:30 am PT

Opening Remarks: Marc Lynch, Eileen Donahoe, and Larry Diamond

Moderator: Hesham Sallam

  • Wafa Ben-Hassine: “The Hyper-Aware and Not-So-Aware: What's Next for the MENA Region's Activists and Society at Large Vis-a-Vis the Internet?”
  • Adel Iskander: “Re(Membering) Culture and Heritage: Egypt's Latest Political Turf War”
  • Zachary Steinert-Threlkeld: “Civilian Behavior on Social Media During Civil War”
  • Joshua Tucker: “Beyond Liberation Technology? The Recent Uses of Social Media by Pro-Democracy Activists”

 

Panel 2: Authoritarian Abuses of Internet Technologies

Thursday, May 27, 2021 | 9-10:30 am PT

Moderator: Marc Lynch

  • Marwa Fatafta: “Transnational or Cross-Border Digital Repression in the MENA Region”
  • Andrew Leber: “Social Media Manipulation in the MENA: Inauthenticity, Inequality, and Insecurity” (Co-authored paper with Alexei Abrahams)
  • Marc Owen Jones: “Tracking Adversaries: The Evolution of Manipulation Tactics on Gulf Twitter”
  • Xiao Qiang: “Chinese Digital Authoritarianism and Its Global Impact”

 

Panel 3: Government Reshaping of Norms and Practices to Constrain Online Activity

Tuesday, June 1, 2021 | 9-10:30 am PT

Moderator: Eileen Donahoe

  • Ahmed Shaheed: “Binary Threat: How State Cyber Policy and Practice Undermines Human Rights in the Middle East and North Africa Region”
  • Mona Elswah, Mahsa Alimardani: "The Hurdles Involved in Content Moderation in the MENA Region"
  • Mohamed Najem: “The Role of the Gulf in Governing Digital Space in the Arab Region”
  • James Shires: “The Techno-Regulation of Critical Communications Infrastructures and Their Political Potential in the Gulf”
  • Alexei Abrahams: “The Web (In)Security of Middle Eastern Civil Society and Media”

 

Panel 4: Cross-Border Information Operations

Thursday, June 3, 2021 | 9-10:30 am PT

Moderator: Larry Diamond

  • Alexandra Siegel: “Official Foreign Influence Operations: Transnational State Media in the Arab Online Sphere”
  • Hamit Akin Unver: “Russian Disinformation Operations in Turkey: 2015-2020”
  • Shelby Grossman and Renee DiResta: “In-House vs. Outsourced Trolls: How Digital Mercenaries Shape State Influence Strategies”
  • Nathaniel Gleicher: “Covert Manipulation, Overt Influence, Direct Exploit: Understanding and Countering Influence Operations in the Middle East and Beyond”

This event is co-sponsored with the Cyber Policy Center and the Center for a New American Security.

* Please note that all CISAC events are scheduled in the Pacific Time Zone.

 

Seminar Recording: https://youtu.be/KaydMdIVtGc

 

About the Event: The United States is steadily losing ground in the race against China to pioneer the most important technologies of the 21st century. With technology a critical determinant of future military advantage, a key driver of economic prosperity, and a potent tool for the promotion of different models of governance, the stakes could not be higher. To compete, China is leveraging its formidable scale—whether measured in terms of research and development expenditures, data sets, scientists and engineers, venture capital, or the reach of its leading technology companies. The only way for the United States to tip the scale back in its favor is to deepen cooperation with allies. The global diffusion of innovation also places a premium on aligning U.S. and ally efforts to protect technology. Unless coordinated with allies, tougher U.S. investment screening and export control policies will feature major seams that Beijing can exploit.

In early June, join Stanford's Center for International Security and Cooperation (CISAC) and the Center for a New American Security (CNAS) for a unique virtual event that will feature three policy experts advancing concrete ideas for how the United States can enhance cooperation with allies around technology innovation and protection.

This webinar will be on the record and include time for audience Q&A.

 

About the Speakers: 

Anja Manuel, Stanford Research Affiliate, CNAS Adjunct Senior Fellow, Partner at Rice, Hadley, Gates & Manuel LLC, and author with Pav Singh of Compete, Contest and Collaborate: How to Win the Technology Race with China.

 

Daniel Kliman, Senior Fellow and Director, CNAS Asia-Pacific Security Program, and co-author of a recent report, Forging an Alliance Innovation Base.

 

Martijn Rasser, Senior Fellow, CNAS Technology and National Security Program, and lead researcher on the Technology Alliance Project

Virtual Seminar

Anja Manuel, Daniel Kliman, and Martijn Rasser
Seminars

Renée DiResta is the former Research Manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social networks, and assists policymakers in understanding and responding to the problem. She has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience conspiracies, terrorism, and state-sponsored information warfare.

You can see a full list of Renée’s writing and speeches on her website, www.reneediresta.com, or follow her @noupside.

 

Former Research Manager, Stanford Internet Observatory
Eloise Duvillier

Eloise Duvillier is the Program Manager of the Program on Democracy and the Internet at the Cyber Policy Center. She previously was an HR Program Manager and acting HR Business Partner at Bytedance Inc, a rapidly growing Chinese technology startup. At Bytedance, she supported the globalization of the company by driving US acquisition integrations in Los Angeles and building new R&D teams in Seattle and Silicon Valley. Prior to Bytedance, she led talent acquisition for Baidu USA LLC’s artificial intelligence division. She began her career in the nonprofit sector, where she worked in foster care, HIV education, and emergency response during humanitarian crises, and helped war-torn communities rebuild. She graduated from the University of California, Berkeley with a bachelor’s degree in Development Studies, focusing on political economics in unindustrialized societies.

Program Manager, Program on Democracy and the Internet
News Type: News

Midterm elections present an opportunity for hackers interested in disrupting the democratic process

Voter registration systems provide an additional target for hackers intending to disrupt the US midterm elections; if voting machines themselves are too dispersed or too obvious a target, removing voters from the rolls could have a similar effect. In Esquire, Jack Holmes explains that election security experts consider this one of many nightmare scenarios facing the American voting public—and thus, American democracy itself—on the eve of the 2018 midterm elections. (Allison Berke, Executive Director of the Stanford Cyber Initiative, is quoted.)


Daphne Keller is the Director of Platform Regulation at the Stanford Program in Law, Science, & Technology. Her academic, policy, and popular press writing focuses on platform regulation and Internet users’ rights in the U.S., EU, and around the world. Her recent work has focused on platform transparency, data collection for artificial intelligence, interoperability models, and “must-carry” obligations. She has testified before legislatures, courts, and regulatory bodies around the world on topics ranging from the practical realities of content moderation to copyright and data protection. She was previously Associate General Counsel for Google, where she had responsibility for the company’s web search products. She is a graduate of Yale Law School, Brown University, and Head Start.

FILINGS

  • U.S. Supreme Court amicus brief on behalf of Francis Fukuyama, NetChoice v. Moody (2024)
  • U.S. Supreme Court amicus brief with ACLU, Gonzalez v. Google (2023)
  • Comment to European Commission on data access under EU Digital Services Act
  • U.S. Senate testimony on platform transparency

 


Director of Platform Regulation, Stanford Program in Law, Science & Technology (LST)
Social Science Research Scholar