The Consulate General of France and CPSL Convene for Content Moderation Seminar
In February, the Consulate General of France and the Content Policy & Society Lab convened a seminar to discuss online content moderation and the ways forward in 2022.
On February 8th, coinciding with Safer Internet Day, the Consulate General of France and the Content Policy & Society Lab, a project of the Program on Democracy and the Internet at Stanford University, co-organized a multistakeholder seminar on the theme "Online Content Regulation: The Way(s) Forward in 2022."
The seminar gathered EU diplomats in San Francisco and their tech advisors, representatives of 14 content platforms, and Stanford academics to discuss upcoming EU content regulation, US debates on content regulation, and self-regulation initiatives by companies and platforms.
General Takeaways from the meeting:
- There is a need for cross-industry, multistakeholder conversations in informal settings that allow candid dialogue on complex issues of content governance, content policy, and content regulation
- There is a clear need for international guidelines on content governance, to avoid the conflicting obligations that would stem from divergent national regulations
- A substantial future threat is the expansion of government-compelled content removal without due process or transparency; setting standards in this area requires leadership by governments, as the industry cannot set those standards itself
Specific Takeaways on the Digital Services Act
The scope of the DSA is wide: hosting services, intermediaries, and online platforms such as marketplaces, app stores, and social media platforms. The new EU legislation proposes rules that are proportionate and allow smaller platforms to scale within the European Single Market. The main objectives of the DSA are to:
- Place citizens at the center
- Establish transparency, clear responsibility, and accountability for content platforms
Participants discussed a number of other proposed acts:
- The Platform Accountability and Transparency Act (PATA), introduced in Congress by U.S. Senators Rob Portman (R), Amy Klobuchar (D), and Chris Coons (D), which would increase transparency and give researchers access to critical data gathered by platforms, in order to gain insight into key societal issues
- Section 230 of the Communications Decency Act, which provides immunity to website platforms with respect to third-party content
- The EARN IT Act, which proposes to form a commission that would draft rules to fight Child Sexual Abuse Material (CSAM) online and decide which platform actions are deemed reasonable
Speakers and participants also looked at the Digital Trust and Safety Partnership (DTSP), which aims to help member companies be ready for content moderation crises by focusing on standards and processes, not on content. The organization recently initiated the development of the SAFE framework, the "first ever attempt to articulate current industry efforts to address online Content and Conduct-Related Risks". This framework will serve as the basis of auditing mechanisms conducted by third parties. Participants also noted the various efforts by platforms to increase transparency, including the work of the Facebook Oversight Board and Twitter's publication of an archive of removed content and a paper on its transparency efforts.