Meta’s Content Moderation Rollback Dangerous for Users, Threat to Civil Rights
FOR IMMEDIATE RELEASE
Contact: Mariah Wildgen, [email protected]
WASHINGTON — Maya Wiley, president and CEO of The Leadership Conference on Civil and Human Rights, issued the following statement in response to Meta’s announcement of changes to its content moderation policies:
“People on Facebook, Instagram, and Threads deserve to know that what they’re reading on these platforms is trustworthy and safe. Hate speech and mis- and disinformation hurt real people, and Meta knows that.
“Rolling back content moderation when mis- and disinformation has flooded the zone online is an egregious mistake. Whether it’s a lie about where and how to vote or false information about getting help after a disaster, everyone can be harmed by mis- and disinformation. And some communities — including children, people of color, people with disabilities, and other marginalized communities — are even more vulnerable. Mis- and disinformation and hateful attacks are tactics to silence and divide us.
“Last year, we watched as dangerous and false claims about immigrant communities took hold, throwing real voters’ eligibility into question and spurring threats of violence in local communities like Springfield, Ohio. Aid workers were threatened because of online lies following Hurricanes Helene and Milton. We have even seen waves of election administrators resigning because of lies and threats they faced online for doing their jobs. In some instances, false claims were spread by powerful people, including Elon Musk, the owner of the company Meta seeks to emulate.
“Meta claims content moderation is hard. What’s hard is being threatened. What’s hard is being attacked for the color of your skin or your immigration status. What’s hard is worrying about your kids’ safety. What’s hard is being doxxed because of who you are or your beliefs. Just because companies believe it is hard to monitor their platforms doesn’t mean we should let them stop protecting their users. In fact, we must demand that if they want to mine our data and serve us up to advertisers to drive their profits, the least they can do is invest in reasonable safety and decency policies for the communities they have created and control. Meta’s bottom line is tied to users’ trust in its products.
“We already know community notes don’t work to combat mis- and disinformation online. X provided a glaring example. According to one report, only 7.4 percent of notes proposed on election-related content in 2024 ended up posted, and in the month leading up to the election that figure dropped to just 5.7 percent. Those are paltry numbers. Misleading posts on X were viewed 13 times more often than the community notes meant to correct them. Crowdsourced fact-checking should only be used to supplement professional fact-checkers, not replace them. Attempting to replicate the failed policy of a much smaller platform like X will have devastating consequences for the billions of users across Meta’s three platforms.
“Content moderation, done smartly and carefully, increases the accurate information people see. No single method will catch every lie, but careful moderation ensures that fewer dangerous ones slip through. As civil rights advocates, we know that rights must be protected to ensure a vibrant and open democracy and diverse online communities. We call on Meta to reverse course. If this misguided policy is implemented, we will hold the company accountable for the harms it will cause.”
The Leadership Conference on Civil and Human Rights is a coalition charged by its diverse membership of more than 240 national organizations to promote and protect the rights of all persons in the United States. The Leadership Conference works toward an America as good as its ideals. For more information on The Leadership Conference and its member organizations, visit www.civilrights.org.
###