Preventing “Torrents of Hate” or Stifling Free Expression Online?
An Assessment of Social Media Content Removal in France, Germany, and Sweden
This report examines how content moderation occurs on two major online platforms, Facebook and YouTube, analyzing the frequency of comment removals and the nature of the deleted comments. If content moderation worked the way policymakers intend, we would expect most deleted comments to constitute illegal speech.
To understand the nature of deleted comments in this study, the authors gathered comments from 60 of the largest Facebook pages and YouTube channels in France, Germany, and Sweden (20 in each country) and tracked which comments disappeared within a two-week period between June and July 2023. While it is not feasible to ascertain which actor deleted a given comment (the platform itself, the page or channel administrators, or the users themselves), the report can determine the scope and content of the deleted comments on the Facebook pages and YouTube channels examined.
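The removal-tracking step can be illustrated as a simple snapshot comparison: comments collected at an initial pass are re-checked at the end of the observation window, and any comment that no longer appears is flagged as removed. The sketch below is purely illustrative; the Comment fields, the find_deleted function, and the example data are hypothetical, and the report does not describe its actual tooling.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Comment:
    comment_id: str  # hypothetical unique identifier assigned by the platform
    page: str        # Facebook page or YouTube channel the comment appeared on
    text: str        # comment body, retained for later legal and content coding


def find_deleted(initial: list[Comment], followup: list[Comment]) -> list[Comment]:
    """Return comments present in the first snapshot but missing from the second.

    This mirrors the general approach of tracking which comments disappeared
    during the observation window; it cannot tell who removed a comment
    (the platform, a page administrator, or the author).
    """
    still_present = {c.comment_id for c in followup}
    return [c for c in initial if c.comment_id not in still_present]


# Hypothetical usage: snapshots gathered at the start and end of the two-week window.
snapshot_day_1 = [
    Comment("a1", "ExamplePage", "First comment"),
    Comment("a2", "ExamplePage", "Second comment"),
]
snapshot_day_14 = [Comment("a2", "ExamplePage", "Second comment")]

deleted = find_deleted(snapshot_day_1, snapshot_day_14)
print([c.comment_id for c in deleted])  # -> ['a1']
```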
The collected comments were analyzed by legal experts to determine whether they were illegal under the relevant laws in effect in each country. Deleted comments that were not illegal were coded into several categories, including general expressions of opinion, incomprehensible comments, spam, derogatory speech, and legal hate speech. Although the categories overlap to some extent, it is important to note that the legal experts did not find, for instance, that every hate speech comment would be considered illegal in every country. Additionally, the report analyzes the specific content rules, or lack thereof, of each page under examination. These rules apply to content hosted by the pages or channels and complement Facebook’s and YouTube’s general content policies.
The report is based on more than 1,275,000 comments on social media in Sweden, France, and Germany collected over a 14-day period.
Publication date
May 28, 2024