Meta is hurtling towards total platform decay. Let's be honest about why.

Meta's decision to peel back third-party fact-checking is going to make everything worse. And I don't, for even a second, buy that it's in the name of free speech, expression, or anything other than what's best for Meta's business.
Back in 2018, I was in Sydney at a roundtable discussing how tech could help combat terrorism and what actions could or should be taken.
I learnt two major things that day:
The first was that there was a concerning spate of young women in West Africa who had re-posted violent terrorism videos. Except, instead of an alarming rise in radicalisation, this was just the easiest way to have your entire Facebook account deleted.*
And the second was that when a Facebook representative proudly explained their global moderation efforts, there didn't seem to be any long-term solution.
“But hang on… surely this isn’t sustainable? If you're planning to bring on another billion or so people, you’ll just have to keep hiring endless moderators, right?” I asked.
The answer was, somewhat uncomfortably, yes.
Up until these changes, Meta was working with up to 90 accredited fact-checking organisations.
After all, what’s the alternative?
We found out this week. Meta’s new policies on fact-checking and ‘expression’ on their platform, in short:
- Ending third-party fact-checking: Meta is discontinuing its fact-checking program, which relied on external organisations to assess the accuracy of content.
- Shifting to Community Notes: The company will replace fact-checking with a system called "Community Notes," similar to the one used on X (formerly Twitter). This crowdsourced approach allows users to add context and notes to posts.
- Loosening restrictions on certain topics: Meta plans to allow more speech by lifting restrictions on some topics that are part of mainstream discourse.
- Focusing on illegal and high-severity violations: Enforcement efforts will now concentrate on illegal activities and severe violations, such as incitement to violence.
- Higher threshold for content removal: Meta will now require a higher degree of confidence that its terms of service have been violated before removing content.
In my opinion, this is Meta’s near-complete passing of responsibility onto you, me, and all of their users. By the way, that’s an estimated 3.2 billion people—or about 40% of the world’s population.
What’s most concerning to me is a line from their announcement: “Over time, we ended up with too much content being fact-checked that people would understand to be legitimate political speech or debate.”
(Why, I wonder, are we so afraid of speech and debate adhering to facts?)
Instead of managing widespread fact-checking themselves, Meta is now relying on the community to do it. You might think this is a great idea, to avoid Meta being the "arbiter of truth." But in a world where everyone is already time-poor and polarised, do we even have the time for this?
Sure, this might help promote more critical thinking among everyday people. But I’d suggest that if you're working full time, raising kids, and trying to survive a cost-of-living crisis, dedicating time to critical thinking isn’t exactly at the top of the priority list. Especially on a platform designed for quick responses, not deep thought.
I’m not claiming that social media is the root of all our problems, or that it's the sole cause of the divided and polarised society we live in. But it’s certainly a significant factor. (Along with wealth inequality, urban planning, global stress, and more.)
And it's a big part, especially now that AI is adding jet fuel to a global mis- and disinformation crisis. But also simply in how much of our lives these platforms have hoovered up already.
Yet, it seems we’ve all accepted the adage that if you're not paying for it, you are the product.
But how much of our time, effort, and brain power are we willing to give?
At the end of the day, nothing about framing this announcement as a step towards allowing 'more expression' rings true to me. Instead, it feels as if a major company that interacts with and profits from 40% of the global population's time and attention has found a way to save an enormous amount of money by taking on less responsibility, at a time when that responsibility is more important than ever.
At the very least, we should be honest about that.
Honesty, transparency, and a re-think of how we engage with social media platforms are the only way the next generation won't inherit all the problems we have today.
*I don't endorse this method, but it does go to show how hard it is to disentangle yourself from the platform.