Information and speech in the new social media landscape
Sweeping changes at Meta and open questions about TikTok’s future have broad implications for information integrity and freedom of speech.
Hey Checklisters,
We knew 2025 would be a year of profound change for social media in the United States, and January has already thrown us a few curveballs. The consequences will reverberate for some time to come.
If you’re running late, here’s your TL;DR Checklist:
✅ Meedan called out Meta’s decision to end third-party fact-checking in the U.S. as politics, plain and simple.
✅ We’re equally concerned about Meta’s rollback of protections for vulnerable users, and we said so in an interview with Axios.
✅ TikTok’s short-lived U.S. ban is novel for Americans but nothing new for users around the world.
Top Comment
Coinciding with President Donald Trump’s return to the White House, a raft of troubling policy decisions has swept across the tech landscape this month. From Meta’s rollback of protections for vulnerable users to Trump’s executive order nullifying previous efforts to bring accountability to the burgeoning AI sector, we are seeing seismic shifts that could rapidly erode the quality of our shared information ecosystems and drive up harassment and real-world harm for vulnerable populations worldwide. These changes mark a reversal of more than a decade of progress that our partners have achieved in the name of promoting safety, equity, and mutual respect in digital spaces.
Meta divests from fact-checking
In our own statement, we made it clear that we believe Meta’s recent decision — to terminate third-party fact-checking in the U.S. and shift content moderation toward user reporting — is a purely political maneuver intended to please the new administration. Even worse, fact-checking organizations, both within and beyond our network, were effectively smeared by Meta CEO Mark Zuckerberg, who compared their work to censorship in his announcement of the changes.
To be clear, fact-checkers are not censors — and they never have been. Our colleagues at the Poynter Institute’s International Fact-Checking Network (IFCN) highlighted this important truth in a collective response to Zuckerberg signed by more than 100 individual organizations.
“People online have often blamed and harassed fact-checkers for Meta’s actions. Your recent comments will no doubt fuel those perceptions,” the IFCN wrote. “But the reality is that Meta staff decided on how content found to be false by fact-checkers should be downranked or labeled. Several fact-checkers over the years have suggested to Meta how it could improve this labeling to be less intrusive and avoid even the appearance of censorship, but Meta never acted on those suggestions.”
From ‘hate speech’ to ‘hateful conduct’
In tandem with these actions, Meta rolled back several prohibitions under its hateful conduct policy, a move we believe will have a profound chilling effect on speech for women, immigrants, and LGBTQ+ people worldwide, as Meedan’s editorial and policy lead Ellery Biddle told Axios.
“Harassment drives people to silence themselves or leave online spaces entirely,” Biddle said.
Our friends at ARTICLE 19 are raising the alarm about what these moves at Meta could mean for human rights around the globe, given the way the company’s announcement seems to mirror popular right-wing talking points while at the same time insisting its policy changes reflect a genuine concern for freedom of expression.
“All in all, while Mark Zuckerberg attempts to frame the policy changes as a defence of free speech, in reality, what they reveal is a troubling willingness to align with political agendas that may undermine the platform’s accountability and user safety,” the group’s statement read.
Now you see me, now you don’t: TikTok’s uncertain fate in the US
After TikTok’s long-anticipated U.S. ban took effect briefly in mid-January, some American users expressed discomfort with the company’s decision to name-check Trump in notifications about the app’s disappearance and its sudden return. The latter notification read, “As a result of President Trump’s efforts, TikTok is back in the U.S.!”
“I have never, ever seen any social media platform call out someone in a political position of power in such a direct way,” one user told The Washington Post.
For many of our partners around the world, the thrust of this story is familiar — from Pakistan to Turkey to Nigeria, there’s no shortage of anecdotes about heads of state growing frustrated with a platform and then threatening to give it the boot. Trump himself brandished the same threat back in 2020, only to reverse course four years later when it became a useful campaign tool. To U.S. users this may feel unprecedented, but it proves a time-tested point: more than anything, authoritarian leaders tend to see social media platforms either as a threat to their political standing or as a way to shore up even more power.
Define: strict scrutiny
“Is the TikTok law a content-neutral regulation of TikTok that just happens to affect speech? Or is it directed at TikTok’s content and therefore a content-based law that requires strict scrutiny, the most demanding standard of review that constitutional law recognizes? And most of the time strict scrutiny is fatal because government has to show, not only that it has a compelling interest, but also that this law is the most narrowly tailored way to further that interest.”
— David Cole, “Farewell TikTok?” from WNYC’s “On the Media”
Townsquare
Jan. 30
The Global Policy Fellowship Program at the Institute for Technology and Society in Rio de Janeiro brings together fellows from around the world with a shared interest in the intersections of technology and law. Fellowship applications will be accepted through Jan. 30.
Feb. 10-11
The Artificial Intelligence Action Summit will be hosted by the French government in Paris, bringing together representatives from every sector to discuss the promise and risks of AI.
Feb. 24-27
Access Now’s RightsCon 2025 will be hosted in Taipei, Taiwan. Register today to explore opportunities to advance human rights in the digital age. Meedan will host sessions on gendered disinformation and impact assessment.
Feb. 28
The Open Technology Fund’s Information Controls Fellowship Program cultivates research, outputs, and creative collaboration on topics related to repressive internet censorship and surveillance. Fellowship applications will be accepted through Feb. 28.
What we’re reading
“Silicon Valley is reaching the outer limits of the current regime’s usefulness; contemporary authoritarianism provides new degrees of individual glorification, collective negation, and attacks on the common good — all of which are conducive to profit. The erosion of liberty and social rights may not be the goal of all the tech libertarians, but it is a practical consequence of their project.”
(Brian Chen, Data & Society)
“‘The South Sudan National Communication Authority should have directed telecommunication companies to block specific social media accounts promoting hostility, disinformation, or inhumane content, rather than shutting down Facebook and TikTok entirely,’ said Edmund Yakani, a local civil society activist.”
(Radio Tamazuj)
“Myanmar Internet Project (MIP) documented the internet shutdowns cases that occurred in 2024, based on news and incidents that were reported by independent media outlets, region-specific Telegram channels and information from trusted sources within our network.”
(Myanmar Internet Project)
Did you miss an issue of the Checklist?
Read through the Checklist archive. We’ve explored a diverse range of subjects, including women’s and gender issues, crisis-response strategies, media literacy, elections, AI, and big data.
If there are updates you would like us to share from your country or region, please reach out to us at checklist@meedan.com.
The Checklist is currently read by more than 2,100 subscribers. Want to share the Checklist? Invite your friends to sign up.