Leadership Conference and Democracy Groups Urge Social Media Platforms to Address Voter Disinformation Ahead of Midterms


Mr. Zuckerberg, Mr. Pichai, Ms. Wojcicki, Mr. Agrawal, Mr. Chew, Mr. Spiegel, and Mr. Mosseri:

On behalf of the undersigned civil rights, public interest, voting rights, and other organizations, we urge each of your companies to take immediate steps to curb the spread of voting disinformation in the midterms and future elections and to help prevent the undermining of our democracy. In May of this year, a coalition of more than 120 organizations called on your platforms to take several affirmative steps well in advance of the midterm elections to combat election disinformation. While your companies have recently announced 2022 policy updates to combat election disinformation, we have yet to see meaningful progress on the actions that we requested in our letter.

The 2022 midterm elections are only a few weeks away, yet online disinformation continues to confuse, intimidate, and harass voters, suppress the right to vote, and otherwise disrupt our democracy. Disinformation narratives about voting procedures and policies continue to proliferate, including false information about the use and security of mail-in ballots, drop boxes, and ballot collection. False claims of fraud are now spreading preemptively, before elections even take place, and election workers are increasingly harassed online. We expect disinformation only to increase as the midterms draw closer.

In the last several weeks, many of your companies have announced updates to your voter interference and disinformation policies. We appreciate that you took the time to release the policies publicly; they demonstrate that your companies understand the importance of addressing voting disinformation. But as we have stated repeatedly over the last two years, these policies have little or no effect unless you enforce them continually and consistently.

Further, most of the updates are largely similar to what was announced in 2020. We appreciate the focus on directing users to authoritative sources on voting and elections, which we have asked for in the past. However, there is very little direction or discussion of what will be done specifically to curb or prevent voting disinformation as it arises. Policies that direct users to authoritative sources, while helpful, are not adequate unless they are paired with steps that directly reduce the posting and spread of voting disinformation.

Moreover, there is very little in your 2022 policies that reflects the ongoing and new challenges facing us as the midterms approach. Many of the bad actors and election deniers have continued, and even stepped up, their online spread of false information since 2020, and some are now on the ballot for elected office in 2022. Your policies and actions must do more to specifically combat these trends in 2022 and beyond.

With the November 8 midterm election fast approaching, platforms must take immediate actions on the measures below. This is not an exhaustive list, but it includes pressing needs that require attention before the midterms:

Combatting the ‘Big Lie’: The Big Lie, the false narrative that the 2020 presidential election was stolen from former President Trump, is now embedded in our political discourse. Candidates are continuing to use the Big Lie as a platform plank to preemptively declare voter fraud in order to dispute the results of the 2022 election. In fact, more than 100 GOP primary winners support false voter fraud claims, and 60 percent of Americans will have an election denier on their midterm ballot this November. This is damaging American democracy by undermining faith in the integrity of our elections.

Disinformation around the 2022 midterms is inextricably intertwined with disinformation from the 2020 presidential election, with bad actors recycling many of the false claims made just two years prior. Recent research shows that a small group of users is responsible for a large portion of the disinformation that spreads about voting and elections. Your companies state that they understand the problems and concerns about the Big Lie, but this is not reflected in many of your recent policy announcements. In fact, research and investigative reporting have revealed that Facebook and Twitter are no longer enforcing their policies against content promoting the Big Lie. Platform civic integrity policies that focus only on the current election cycle and fail to combat the Big Lie are ineffective. Moreover, content promoting the Big Lie is based not on opinion but on falsehoods, and it should be actionable under each of your civic integrity policies. Platforms must be much more vigilant and take immediate steps to remove and curb disinformation that spreads and amplifies the Big Lie.

Preventing disinformation targeting non-English-speaking communities: Non-English-language disinformation has continued to spread since the 2020 election. The language gap between content moderators and the content they review has created enforcement disparities, leaving non-English-speaking communities vulnerable to false claims and disinformation. The updated policies from your companies contain little or no information on specifically curbing non-English disinformation, particularly in languages beyond Spanish. For example, Meta’s new election disinformation strategy includes commitments to working with Spanish-language fact-checkers and investing in Spanish-language media literacy resources, but it does not extend similar commitments to languages such as Vietnamese, Chinese, or Korean.

We recognize that this is a very difficult problem to address, but your companies must invest more resources in preventing the spread of non-English disinformation. All users, regardless of the languages they speak, should be able to use your platforms without being inundated by election disinformation. Beyond fact-checking and the use of authoritative sources, non-English content moderation teams must also have the cultural context and competency to accurately and adequately implement and enforce content moderation policies. Further, content moderation policies and processes themselves must be in tune with and reflect the lived realities and experiences of multilingual and multicultural communities. Finally, as we underscore throughout this letter, these improvements must become standard operating procedure, not merely one-off adjustments made just before an election.

More friction to reduce the distribution of content containing electoral disinformation: We know that some of your companies took steps to implement friction and similar features just before and after the 2020 election, but the updated policies say little about taking any of these steps for the midterms. The updated policies do mention the use of labeling to address disinformation, but details on how this would work are scant. While misleading claims should be appropriately labeled to provide context, a growing body of research shows that information-only labels are largely ineffective at halting the spread of disinformation. We have discussed more comprehensive plans with your companies to reduce the distribution of electoral disinformation, including implementing front- and back-end friction in user interfaces, algorithms, and product design to proactively reduce mis- and disinformation. This may include modifications to demote or downrank this content and limit users’ ability to engage with it. For example, viral circuit breakers can be used to limit the spread of potential disinformation. We would like more assertive steps like these to be implemented as soon as possible.

The spread of disinformation on voting and elections on your platforms is undermining our democracy. If you allow disinformation about elections to spread largely unchecked, your platforms will become known as the dominant threat to a thriving democratic process. But with the proper oversight and protections, your platforms can be helpful tools to promote a strong democracy. It is vital that you take immediate actions to curb voting disinformation before the midterms, and we are eager to work with you to ensure that your platforms do just that. Please contact Dave Toomey of The Leadership Conference at [email protected] to discuss or respond to the issues we have raised in this letter and your plans to address them.

Sincerely,

Asian Americans Advancing Justice – AAJC
Center for American Progress
Common Cause
The Leadership Conference on Civil and Human Rights
UnidosUS
Accountable Tech
All Voting is Local
Center for Democracy and Technology
NAACP Legal Defense and Educational Fund, Inc.
National Coalition on Black Civic Participation and the Black Women’s Roundtable
New America’s Open Technology Institute
Sikh American Legal Defense and Education Fund