Meta Hands Over Content Moderation to Users, Inspired by Elon Musk’s X
Meta, the parent company of Facebook, Instagram, and Threads, has made significant changes to its content moderation policies. Drawing inspiration from Elon Musk’s X platform, Meta is shifting away from traditional third-party fact-checking and moving toward a model driven by “community feedback.”
Previously, Meta partnered with independent fact-checking organizations to verify the accuracy of posts, a system often criticized for bias and slow response times. Now, with its new “Community Notes” feature, Meta allows users to add context and corrections directly to posts. The system mirrors X’s feature of the same name, which lets users collectively assess the accuracy of shared content.
Meta aims to increase transparency and reduce bias in content moderation. According to the company, gathering diverse perspectives from users will create a more accurate picture of reality while avoiding unilateral decision-making. The new system will initially launch in the United States in the coming months.
In addition to changes in fact-checking, Meta is revising other policies, including relocating its “Trust and Safety” teams from California to Texas and other U.S. locations. Content restrictions on topics like immigration and gender identity will be eased, and political content will return to users’ feeds with a more personalized approach.
Mark Zuckerberg Announces Shift in Meta’s Content Moderation Approach, Emphasizing User Reports
Mark Zuckerberg, CEO of Meta, has announced that automated content moderation systems will continue to operate but will focus on severe rule violations such as terrorism, child sexual exploitation, drugs, and fraud. Less severe violations will be handled primarily in response to user reports. The changes follow months of criticism that Meta over-enforced its rules against harmless content while responding slowly to user reports.
According to Meta, these changes are part of an effort to recommit to free speech and correct past over-enforcement. It remains unclear, however, how effective the new system will be in practice and whether it can curb the spread of misinformation and misleading content. Even so, the changes mark a significant shift in Meta’s approach to content moderation and in the role users will play in it.