Meta's Shift Toward Free Expression Leads to Significant Reduction in Content Removals
Meta announced plans in January to scale back certain content moderation efforts and relax its rules, favoring free expression over stringent moderation. These changes have led to a decrease in the number of posts removed from Facebook and Instagram, Meta disclosed in its Q1 Community Standards Enforcement Report.
The policy shift resulted in a roughly 33 percent reduction in global content removals on Facebook and Instagram from January to March 2025 compared to the previous quarter: the tech giant removed approximately 1.6 billion items, down from about 2.4 billion, according to an analysis by our website.
Meta's new policies helped cut erroneous content takedowns in the U.S. roughly in half, the company said, without exposing users to a significant increase in offensive content. Across Instagram and Facebook, Meta removed about 50 percent fewer posts for spam, 36 percent fewer for child endangerment, and 29 percent fewer for hateful conduct. The only category with an increase in removals was suicide and self-harm content.
The tech giant acknowledged that changes aimed at minimizing enforcement errors were one reason for the significant drop in content removals. Across various policy areas, the company said, less content was actioned and a smaller percentage of content was taken down before a user reported it. There was a corresponding decrease in content appealed and ultimately restored.
Meta began implementing its new content rules at the beginning of the year, replacing policies that CEO Mark Zuckerberg had described as outdated. The changes allow Instagram and Facebook users to use some language that human rights activists view as discriminatory toward immigrants or transgender people. For instance, Meta now permits "allegations of mental illness or abnormality when based on gender or sexual orientation."
As part of these wide-reaching changes, Meta also reduced its reliance on automated tools to identify and remove posts suspected of less severe rule violations, citing their high error rates, which had frustrated users.
During the first quarter of 2025, Meta's automated systems accounted for 97.4 percent of content removed from Instagram under its hate speech policies, a slight decrease from the end of 2024. By contrast, automated removals for bullying and harassment on Facebook dropped nearly 12 percentage points. In some categories, Meta's systems were slightly more proactive than in the previous quarter.
Recent developments show Meta shifting toward community-driven moderation and away from proactive scanning for rule-breaking content. This approach may lead to variations in post removal rates and could affect the overall effectiveness of moderation on Facebook and Instagram. A move toward community notes for fact-checking, similar to Elon Musk's X (formerly Twitter), could produce inconsistent enforcement because it relies on user reports. The Meta Oversight Board, which published its first six decisions under the new content moderation policies in late April 2025, has likewise reflected a more permissive stance on potentially harmful or discriminatory expression, which may make such content harder to police. Meta's Q1 2025 transparency reports also noted an increase in the prevalence of violent and graphic content on Facebook, suggesting that the removal of violating content may not be keeping pace with the volume of content shared.
- Meta's new content policies, announced in January, have led to a decrease in the number of posts removed for hateful conduct, spam, and child endangerment, as disclosed in its Q1 Community Standards Enforcement Report.
- As part of the policy changes, Meta reduced its reliance on automated tools for identifying and removing posts suspected of less severe violations, due to their high error rates.
- The shift towards community-driven moderation on Facebook and Instagram, as a result of Meta's new policies, may lead to variations in post removal rates and potential inconsistencies in enforcement.
- An analysis by our website shows that Meta removed approximately 800 million fewer items from Facebook and Instagram from January to March 2025 than in the previous quarter, following the implementation of the new content rules.