
Recent changes in how the big tech platforms X and Meta operate – specifically, stepping back from the responsibility of moderating disinformation online – have already negatively impacted the online environments in some regions across the world. That’s according to APC members we spoke to, asking them how the governance changes at the two behemoth social media companies have affected their work – and, importantly, what they think civil society organisations should do about it.
“The changes have had tangible negative impacts on our work and on the communities we support,” says Catalina Balla from Derechos Digitales, an organisation based in Chile but working across Latin America. “The dismantling of basic accountability mechanisms – such as third-party fact checking – has facilitated the spread of misinformation. This is especially the case during electoral periods or moments of political crises in the region.”
With X shrinking its moderation team and disbanding its Trust and Safety Council, which provided expertise and guidance on online safety, and Meta ending its support for independent fact checkers for Facebook and Instagram, the platforms have placed an even greater burden on activists and journalists to counter hate speech and disinformation. They now offer even less structural support, and few channels are available to engage them. This means already overstretched journalists have less time to investigate real stories in the public interest, and many activists have to take valuable time away from helping the communities they work with in more productive ways.
“The influx of disinformation has already overwhelmed the journalistic media, especially considering most are barely making ends meet,” says Asad Baig from Media Matters for Democracy, an organisation founded by journalists and working on media rights in Pakistan. “The termination of whatever limited efforts Meta was putting into fact checking doesn’t just translate into additional overhead and pressure for journalistic outlets. It also represents a dangerous shift in the balance of credibility online.”
“When Twitter became X, it quietly removed tools we used to track and stop false rumours during elections,” explains Zaituni Njovu from the Zaina Foundation working in Tanzania. “We spent hours chasing down lies about women journalists after they couldn’t be flagged quickly. And when Meta stopped third-party fact checking in January, false posts calling women journalists ‘enemies of the state’ spread unchecked on Facebook and Instagram. We now have to sift through everything ourselves, which takes away time from supporting our communities.”
For Jalal Abukhater from the Palestine-based organisation 7amleh – The Arab Center for the Advancement of Social Media, the changes at X meant that they have lost a vital channel of communication with the platform to raise their concerns at a critical time: “With Twitter becoming X, a lot of the changes during that period meant that we were no longer able to communicate directly with the company about the increasingly hostile rhetoric and violent speech in Hebrew on the platform, especially inciting against Palestinians, and often leading to real-world harm.”
FLAME, a gender and sexual rights organisation based in Taiwan, has a similar experience to share. “After Meta's policy change, it has become much harder to get content fact checked or flagged in a timely manner,” says the organisation’s Mallie Hsieh. She adds that this goes hand in hand with concerted attempts to destabilise the democracy in her country, giving an example of misogynistic and hate-filled viral posts from 2022 claiming Taiwanese female soldiers were captured during a Chinese military drill as the kind of misinformation that activists need to counter online. This, she says, is not only about gendered attacks that intimidate women, but about attacks that “undermine Taiwan’s democratic sovereignty.”
Fact checking “freedom of expression”
While some organisations, such as Media Matters for Democracy and Jinbonet in South Korea, are more sceptical about how much impact the platforms’ moderation efforts had on disinformation in the first place, activists are also bemused by the platforms’ claim that what is at stake is “free speech”. They say this argument is “hypocritical” and, ironically, “misleading”, and misrepresents how freedom of expression works in a democracy.
“What is truly at stake is not free speech, but the business model of these companies – built on mass data extraction and the amplification of polarising content to maximise engagement and profits,” says Balla.
“Accountability mechanisms and freedom of expression are two distinct issues,” says Rosa Kuo from the Open Culture Foundation, also based in Taiwan. “Whether a mechanism is appropriate – and how it balances with freedom of expression – should be debated separately.”
“Platforms claim that any form of accountability threatens free speech, yet they routinely take down content, shadow ban users, and manipulate algorithmic visibility in ways that disproportionately target pro-Palestinian narratives,” adds Baig. “The selective enforcement is not just hypocritical, it’s strategic. What they are actually resisting is not censorship, but oversight.”
Balla agrees: “We must reclaim the original meaning of freedom of expression as a fundamental right that enables democratic participation, not as a corporate shield to avoid accountability.”
But, in the face of the power that big tech companies wield, how can this “original meaning” of free expression be reclaimed? How should activists best respond? And are there any viable alternatives to the big tech platforms that they can turn to?
There’s strength in collective action
APC members say it is vital that organisations work together, whether to advocate for policy change, or on practical interventions that benefit communities. Suggestions ranged from setting up a rapid-response team to call out harmful policy changes as they happen, to running cross-training sessions to share tips on fighting disinformation, and developing shared educational tools to equip grassroots communities with the skills to detect disinformation and advocate for safer digital spaces.
Some see an important function in the APC network documenting platform governance failures across different regions. “We need to use APC’s platform to document, archive and amplify how these converging setbacks, including but not limited to platform deregulation, authoritarian overreach and funding disruptions, are collectively shrinking civic space,” says Baig. “There’s value in building a shared evidentiary record, not just for advocacy, but to shape future standards and push back against narratives that frame these developments as isolated or organic.”
While many feel that as a network, APC can strengthen alliances and raise collective demands in global spaces, this collective action might mean looking outside of the network for collaborators – perhaps in unusual places. Both 7amleh and Pollicy, a feminist collective based in Uganda and working on harnessing data for social good, feel that tech workers in particular are natural allies. “Our advocacy needs to highlight the poor working conditions of the tech workers on whose labour the big tech companies’ profits depend,” says Irene Mwendwa from Pollicy. For Abukhater, tech workers, alongside policy makers, are “crucial to this endgame.”
“We must continue to explore all avenues and not place all our eggs in one basket,” he says. “Yes, our traditional advocacy with companies might not work anymore, but it does not mean we do not have leverage over those multinational companies. I believe the path of working closely with tech workers to help coordinate and organise their ranks is something worth exploring, for they are the most well-positioned to influence change.”
Getting governments to work together
It is also necessary that governments work together in regional blocs to take on the power of big tech effectively. “We are dealing with transnational companies, which are large private corporations, building networks to act and influence both the state and society,” says Ana Claudia Mielke from Intervozes, an organisation based in Brazil, where the Supreme Court took the unprecedented move of blocking X after it refused to comply with a court order. “It is very difficult to act against powers of this level and scope with individual actions by countries.”
“What is missing is intergovernmental coordination on the over 40 artificial intelligence companies in the world, besides social media companies,” Mwendwa says, pointing to the need for cross-border regulatory collaboration in Africa.
Organisations offer the European Union (EU) as an example of how this could be done effectively. “European standards have increasingly become global norms – a phenomenon often referred to as the ‘Brussels effect’,” says Byoung-il Oh, the president of Jinbonet. He offers as examples the EU’s AI Act, its Digital Markets Act, as well as the Digital Services Act, which Abukhater says has been particularly useful to 7amleh in its work in exploring ways to hold the platforms to account. In response to the EU’s regulatory interventions, big tech companies have been pushing back, including by delaying service launches in Europe and, in the case of X, withdrawing from the EU’s voluntary Code of Practice on Disinformation – a sign that the regulation has teeth, not that it has failed.
But while regional collaboration among governments is necessary, for Hsieh it is important to work towards a multistakeholder governance model, creating room for civil society to participate in the oversight of big tech companies. Others agree. “Most importantly, we, as in civil society, despite all our challenges and setbacks, should continue and lead in shaping future governance frameworks, ones grounded in our own lived realities,” says Baig.
Exploring alternative platforms, and the need for policy support
A big problem for journalists – and organisations who work with journalists – is that platforms such as X are central to the work they do. Baig gives the example of the shutdown of X in Pakistan in the aftermath of the country’s elections in February 2024, which were marred by allegations of electoral fraud. This, he said, cut activists and journalists off from a vital channel for real-time news monitoring, audience engagement and public discourse. “For journalists, X served not only as a tool for sourcing and sharing news but also for fact checking, amplifying underreported stories, and directly engaging with readers and peers,” he says. “Its suspension, especially in the aftermath of the 2024 elections, has been widely seen as a tactic to suppress dissent and restrict the visibility of independent voices. It narrowed the information space and disrupted newsroom workflows, particularly in fast-paced reporting environments.” The service was only recently restored after a 15-month ban.
As this experience suggests, one of the problems organisations face is that the mainstream platforms are often exactly where they need to engage, and have sometimes offered the best way to reach a wider audience. “We felt the changes in X most strongly,” says Mielke. “Many people have left the platform, we've lost some followers, and the reach of our posts has decreased. Before, there was constant growth.”
APC members mentioned several platforms, some decentralised and some not, that could be explored as alternatives. These included LinkedIn and the better-known Mastodon – which GreenNet, a United Kingdom-based APC founding member that also runs APC’s servers, finds important but “difficult to use.” Basic tech options also need to be revisited, such as returning to SMS alerts. It is also important to work closely with community radio, which provides grassroots reach for many organisations.
But what exactly works for organisations, and for the kind of work organisations do, is not certain. For Oh, “There is still no clear alternative to big social media.”
“Naver remains an important communication space in Korea, but Naver itself can be considered part of Korea’s big tech ecosystem,” he explains. “There is a community in Korea called Parti that aims to build alternative public forums, but it has not yet become a mainstream platform.”
“Frankly, it is quite difficult,” says Kuo, a sentiment also expressed by Hsieh, who nevertheless remains optimistic that an alternative is possible: “In Taiwan, it’s difficult to fully replace international social media platforms due to their dominance. However, we believe it is possible – with the right resources and policy support.”
“Taiwan has had various local platforms in the past, such as Dcard, Plurk and Discourse, and some using BBS [bulletin board systems]. They have fostered trust and solidarity within smaller, often marginalised, communities,” she says. “While their reach may be limited, they represent early prototypes of relationship-based digital spaces.”
But APC members also say that governments need to support the development of tech alternatives to the mainstream platforms, including through policies and regulations that can sustain them, much like some are doing to support community-centred connectivity networks.
“It requires public policies that make it possible to structure these new spaces, but state public policies that are based on the concept that communication is a human right and that are not changed (or cancelled) at the whim of governments,” says Mielke.
The need to experiment, and rebuild trust
“We must not stop working towards this kind of innovation,” says Abukhater. “There are many who work on developing platforms that centre the user rather than exploit the user. In the meantime, we have to be smart about withdrawing from the already-existing social media spaces as we cannot deny the global reach of those platforms.” Being “smart”, for APC partner The Engine Room, might mean not withdrawing completely from the mainstream platforms. Instead it advocates for working across different platforms and community spaces, depending on what each offers organisations. It refers to this as working in the “pluriverse”, and points out that “to enjoy alternatives, users may need to embrace slower forms of engagement and more ‘low-fi’ experiences with technology.”
“It requires imagination, commitment and collective effort,” says Balla. “Alternatives to big social media platforms not only exist – they are necessary. Federated and decentralised models like Mastodon and the broader fediverse may not yet offer the massive reach of corporate platforms, but they present a real opportunity to reclaim the internet as a commons, not a marketplace.” She adds that in Latin America, from “podcasts and newsletters to in-person events, closed groups, and federated platforms” there is a growing movement to step “outside the algorithm”.
Despite the learning curve, perhaps some doubt and fear, and certainly the challenges in working with the platform alternatives, for GreenNet it is at least important to try them out. “GreenNet has been running Mastodon and we've already said, ‘APC people, you're welcome to use it,’” says the organisation’s Ed Maw, who calls himself the “second least techie person” at GreenNet and admits he finds it a “struggle to get [his] head around” the platform. “But if people want to set themselves up, then it’s there.”
For Baig, it is necessary for organisations to take these sorts of opportunities. “I strongly believe the future of social media is not a singular, infinitely large platform or network controlled by one corporation, but rather a web of decentralised, interconnected communities,” he says.
“The idea that billions of people should gather, speak and be governed under one algorithmic roof has already proven to be not only dangerous but unsustainable.”
“Real connection doesn’t come from scale alone, it comes from trust, relevance and reciprocity, which are things the big platforms were never built to sustain,” he adds.
This article was produced thanks to the contributions of several APC members, particularly from the Global South, who were consulted on the impacts of recent changes in the governance of big platforms and possible alternatives and reactions. For further insight into the topic, you can also refer to this article, which compiles diverse resources and materials recommended by the APC network during the consultation.
The consultations were carried out by Maja Romano in April 2025 and the article written by Alan Finlay.