A note: This post discusses suicide and other forms of self-harm in the context of online platform policies. In the United States, the National Suicide Prevention Lifeline is available 24/7 in English at 1-800-273-8255 and in Spanish at 1-888-628-9454. It offers Tele-Interpreter services in over 150 additional languages.
Online platforms — from search engines to social media sites to chat apps — can play a role in supporting individuals considering self-harm by providing resources and making space for recovery and support communities. Yet they can also host content that glorifies or incites self-harm.
What are platforms’ public-facing policies on suicide, self-injury and eating disorders? In this report we analyze the published policies for 39 online platforms, including search engines, social media networks, creator platforms, gaming platforms, dating apps and chat apps. Emulating similar Internet Observatory analyses of platform policies on election and vaccine misinformation, the report ranks platforms based on policy comprehensiveness across defined categories.
While the existence of a platform’s policy does not mean the policy cannot be improved, this report takes the first step of documenting whether there is a public-facing policy at all. In ongoing research, we are assessing how effectively platforms implement their stated policies.
Overall, we find that many of the platforms’ policies intended to keep users safe have significant gaps. While policies need not be identical across platforms, each platform should address content about suicide, self-injury and eating disorders. Clear policies let users know what to expect and understand why content may be acted upon, and let self-harm prevention groups assess whether a platform’s policies implement best practices.
If you believe we made an error in describing your platform’s policies, or want to update us on new policies, please email us at email@example.com.