A client recently asked me to assess the potential impact of Meta’s latest changes to its content moderation approach, specifically regarding their effect on Canadian government organizations. I’ve summarized some key findings below, as I believe they may also be valuable to others.
What’s the issue?
Meta, the parent company of global social media platforms such as Facebook, Instagram, and Threads, recently announced that it will remove its third-party fact-checking program and shift away from extensive content moderation. It plans to replace its existing content moderation system with a crowd-sourced “Community Notes” model similar to the one used by X (formerly Twitter). This development raises concerns for Canadian government organizations, many of which use Meta’s platforms for communication, outreach, and engagement with Canadians.
“We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes. Even if they accidentally censor just 1% of posts, that’s millions of people, and we’ve reached a point where it’s just too many mistakes and too much censorship.”
-Mark Zuckerberg, CEO, Meta
Context
- Content moderation on social media involves monitoring and regulating user-generated content to ensure it adheres to the platform’s community guidelines and complies with the legal requirements in the countries where the platform operates.
- On January 7, 2025, Meta CEO Mark Zuckerberg announced a significant shift in the company’s approach to content moderation. Citing concerns about the political bias of its third-party verifiers and a desire to promote free expression, Zuckerberg stated that Meta will discontinue its fact-checking program and reduce reliance on content moderation teams.
- Meta previously employed a combination of artificial intelligence, human moderators, and third-party fact-checkers to identify and address content that violated its policies, including hate speech, misinformation and disinformation, and incitement to violence. This system, while imperfect, aimed to create a safer and more trustworthy online environment.
- Meta plans to replace its existing system with a crowd-sourced “Community Notes” model, similar to the one used by X, where users can comment on the accuracy of posts.
- Meta hasn’t released specifics about how its own version of Community Notes will work; however, it is likely to resemble X’s system, where approved contributors flag content deemed false or misleading by attaching notes that provide additional context.
- This decision follows previous policy changes that reduced the visibility of political content and loosened restrictions on controversial topics like immigration and gender.
- Meta plans to phase in Community Notes in the United States during the first quarter of 2025 and continue to improve it over the year.
- While there’s no confirmed timeline for expanding Community Notes to Canada, it’s highly probable that Meta will eventually implement this feature globally, given its goal of promoting user-driven content moderation.
- Meta plans to relocate its remaining content moderation teams from California to Texas, which some experts suggest could further influence how content is moderated.
“As we make the transition, we will get rid of our fact-checking control, stop demoting fact-checked content, and, instead of overlaying full-screen interstitial warnings you have to click through before you can even see the post, we will use a much less obtrusive label indicating that there is additional information for those who want to see it.”
-Joel Kaplan, Chief Global Affairs Officer, Meta
Key considerations
Pros:
- Greater user engagement and sense of ownership.
- Diverse perspectives reduce perceived bias.
- Facilitates more open dialogue and a wider range of views.
- Less risk of over-censorship.
- Avoids false positives (i.e., blocking content that shouldn’t have been blocked).
- Users can personalize their experience and curate their content.
- Advertisers may benefit from a more lenient content policy, which could allow for a wider range of creative and targeted marketing strategies.
- Users who value free speech and less restrictive moderation may increase engagement, creating more opportunities for advertisers to connect with specific audiences.
- For government organizations, platforms with high user engagement could still serve as valuable channels for outreach and public awareness campaigns.
- Relaxed policies on sensitive topics (e.g., immigration, gender) could foster more open discussions and engage users who felt previously censored, potentially broadening public discourse.
- Greater user engagement could increase participation in conversations on topics critical to policy development or public awareness.
- Meta’s reduced reliance on fact-checkers may appeal to users who distrust institutional interventions in content moderation.
Cons:
- Accuracy and reliability of content may decline.
- Possibility of false balance and bias in Community Notes.
- Incentive for malicious actors to infiltrate Community Notes with AI and/or bots.
- Potential harm to public trust and discourse integrity.
- Higher exposure to harmful content like hate speech.
- Public trust in Canadian government organizations may decline if misinformation about them increases as a result.
- Risk of creating even greater echo chambers and deepening polarization.
- Sharing critical information with diverse audiences and minority communities will likely become more difficult.
- Advertisers may reduce spending or withdraw entirely due to concerns over brand safety if their ads appear alongside harmful or controversial content, potentially decreasing platform revenue and trust.
- A decline in public perception of Meta’s platforms could impact their effectiveness as communication tools for government organizations.
- Government agencies might face reputational risks if their messaging is associated with a platform that is seen as unreliable or polarizing.
- Relaxed policies could lead to the normalization of harmful stereotypes, discriminatory content, or misinformation, creating a more hostile environment for marginalized groups.
- In response to Canada’s Online News Act (Bill C-18), Meta has already stopped displaying news content on its platforms in Canada. This and the shift to Community Notes could further amplify misinformation and disinformation, complicating efforts to share accurate and trusted information.
- Meta’s perceived alignment with specific political figures or demographics could harm the government’s reputation for neutrality, making government engagement efforts on Meta platforms more challenging to justify.
What to do for now
At this time, I think Canadian government organizations should closely monitor how Meta’s content moderation changes unfold. These platforms remain key for reaching many Canadians, especially given the ongoing effects of Bill C-18 on news content. It’s important to keep an eye on how these changes impact public discussions as they roll out in the United States. Once the changes are introduced in Canada and there’s more clarity on how Community Notes will work, organizations should evaluate whether participating as contributors could help build trust. This is also a good opportunity to reassess whether Meta’s platforms align with your organization’s strategic priorities and social media engagement goals.