Meta-Free: Three Months Later
Reflections on leaving Meta, facing the deep roots of exploitation, and deepening my commitment to systemic change.
It’s been three months since I left Meta, closing all my personal and professional accounts on Facebook, Instagram, and WhatsApp. I made this decision following Zuckerberg’s announcement in January 2025 about eliminating fact-checkers and modifying Meta’s hateful conduct policy, a move that would only fuel more polarisation and digital (and physical) violence.
Closing my accounts was followed by a month and a half of social media rehab and digital detox.
I took some days off and leaned into the energetics of winter, embracing a much-needed ritual and time to honour the feelings of sadness and low energy that we all experience from time to time.
I re-entered the digital world in mid-March, once I had let go of what no longer served me and processed the grief of loss.
How is it going so far?
I know that closing my Meta accounts was part of that letting go, an analogy for the loss of my hopes for big tech (yes, I still had some). Leaving Meta felt liberating, so much so that it made me see even more clearly the exploitative, abusive, competitive, all-for-profit nature of the systems we live entangled in.
In the grief and sadness that followed this decision, during my last inner winter, I was finally able to surrender to a truth I had been trying to address in a politically correct way for the last couple of years:
The systems’ degenerative mechanics are all-pervasive.
They run through our veins and intoxicate our minds.
We carry them within.
As someone who is dedicating her life to guiding and supporting others in inner transformation, so that it ripples into outer transformation, this realisation reinforced my commitment to systemic change and to raising awareness about its degenerative dynamics.
How is Meta doing these days?
I had not been paying attention to Meta news since Zuckerberg’s announcements in January. Coincidentally, as I was reflecting on my three months of being Meta-free, I stumbled upon these headlines:
‘Meta faces Ghana lawsuits over impact of extreme content on moderators’ (The Guardian).
and
‘Meta's content moderation contractor to cut 2,000 jobs in Barcelona’ (Reuters).
and also
‘Meta’s oversight board rebukes company over policy overhaul’ (Reuters).
What we see here is a very opaque approach to new policies.
Meta’s Oversight Board, which operates independently from Meta to offer nonbinding policy recommendations and is funded by a company grant, cited concerns that Meta had announced the changes “hastily, in a departure from regular procedure, with no public information shared as to what, if any, prior human rights due diligence the company performed.”
If that is opaque, the situation regarding moderation decisions is even worse. Meta had already decided to move its US-based content moderators from California to Texas to address concerns about bias, as announced in Zuckerberg’s statement on January 7.
Outside the US, the Barcelona office in Spain was until very recently one of the major hubs for content moderation in Europe, collaborating with offices run by other outsourced companies in Bulgaria, Portugal, and Colombia. The Philippines (considered by some the ‘call-centre capital’ of the world) also provides reliable and cost-effective content moderation outsourcing, trusted by Meta, YouTube, and TikTok, among others.
The situation becomes horrifying when it comes to extreme content moderation.
In 2024, Meta already faced a lawsuit from former moderators based in Kenya, where more than 140 Facebook moderators were diagnosed with severe PTSD (post-traumatic stress disorder). The company then moved its operations to Ghana, where the situation has worsened, according to Martha Dark, co-executive director of Foxglove, a UK-based non-profit organisation that backed the Kenyan court case:
“After the atrocious treatment of Facebook content moderators we exposed in Kenya, I thought it couldn’t get any worse. I was wrong. These are the worst conditions I have seen in six years of working with social media content moderators around the world.” Moderators in Ghana report suicide attempts, depression, substance abuse, insomnia, surveillance, and threats.
Content moderation operations in Africa, as well as in Spain and the Philippines, are invariably managed by outsourcing firms. This obscures the supply chain, and the NDAs these firms enforce make it even harder to hold the client accountable for the harm inflicted on those exposed to extreme and violent content.
What is happening in Africa is modern slavery, forced exploitation.
It is clear proof of the ‘rich-poor’ abusive dynamics of a globalised world that perpetuates colonial mindsets, today enacted by big tech companies like Meta.
Leaving Meta was a personal decision rooted in ethical motives, as I no longer wanted to participate in yet another system of oppression and social polarisation. Sadly, it is now revealing its true nature even more starkly than I had expected.