Civil Society Organizations Demand Big Tech Protect Voters Against Disinformation
June 13, 2024
Sundar Pichai, CEO, Google
1600 Amphitheatre Parkway
Mountain View, CA 94043

Adam Mosseri, Head of Instagram
1 Hacker Way
Menlo Park, CA 94025

Mark Zuckerberg, CEO, Meta
1 Hacker Way
Menlo Park, CA 94025

Steve Huffman, CEO, Reddit
303 Second St
San Francisco, CA 94107

Evan Spiegel, CEO, Snap Inc.
2772 Donald Douglas Loop N
Santa Monica, CA 90405

Shou Zi Chew, CEO, TikTok
5800 Bristol Pkwy, Ste. 100
Culver City, CA 90230

Daniel Clancy, CEO, Twitch
350 Bush St., 2nd Fl
San Francisco, CA 94104

Neal Mohan, CEO, YouTube
901 Cherry Ave
San Bruno, CA 94066

Linda Yaccarino, CEO, X
1355 Market St., #900
San Francisco, CA 94103
Dear Mr. Pichai, Mr. Mosseri, Mr. Zuckerberg, Mr. Huffman, Mr. Spiegel, Mr. Chew, Mr. Clancy, Mr. Mohan, and Ms. Yaccarino:
On behalf of the undersigned organizations, we urge you to take immediate steps to protect democracy and meaningfully address the spread of voting disinformation designed to suppress the vote. For years, we have urged your companies to take responsibility for ensuring that your products and business processes protect civil and human rights and do not result in harm or bias against people, especially historically marginalized groups. Threats to safe and fair elections undermine democracy and can have a corrosive effect on the safety and voting rights of people of color and other targeted communities. Disinformation is often intended to drive wedges between vulnerable communities and communities of color and to suppress their votes.
As we have seen, disinformation propelled the horrific acts of violence at the U.S. Capitol and the attempt to halt the constitutional process for Electoral College ballot certification on January 6, 2021. It didn’t stop there. Both organized extremists and distrustful, dangerous individuals have since been empowered and incited by unfounded election integrity claims that in some cases have been elevated and endorsed by elected leaders, business leaders, and celebrities. These actions have deepened distrust and stoked the spread of hate, harassment, and harm, both during and between election cycles.
Platforms have a significant role to play in stemming the spread of disinformation about elections and voting. Doing so would help protect democracy by limiting the spread of false content about election processes and by curbing the damaging and suppressive effect that disinformation is having on civil rights and racial justice.
Your companies have largely failed to take sufficient action. Disinformation and content intended to suppress or deter people from voting continue to proliferate on your platforms. We reached out to you in 2022 urging you to take steps in advance of the midterm elections. Unfortunately, not only were most of these steps not adequately implemented, but since then a number of your companies have also walked back or even eliminated critical election integrity policies.
With rampant disinformation still spreading about the Big Lie and election processes, increased harassment of election officials, and now the rapid growth of AI, it is imperative that you take immediate steps to address the spread of falsehoods. To be best prepared for the upcoming elections in the United States and around the world, platforms must anticipate and counter democratic destabilization by applying lessons from recent elections in the United States and abroad.
The undersigned organizations have worked together on an updated set of operational demands that we urge you to implement as soon as possible to address the spread of election disinformation, particularly voter-suppressive content that is focused on vulnerable communities and communities of color.
- Establish and Enforce Civic and Elections Policies: Platforms should establish, maintain, and consistently enforce civic and election-related policies across all accounts and content formats, including ads and AI-generated content. Enforcement should include content moderation across the most widely spoken languages in the United States, utilizing a clearly outlined strike system of escalating account-level “soft interventions” to limit the impact of repeat offenders, such as restricting resharing, curtailing algorithmic amplification, and placing posts behind click-through warning labels with context. Particular attention should be focused on enforcing against content that targets, is designed to disincentivize the civic engagement of, or disproportionately impacts communities of color, non-English-speaking communities, the disability community, the LGBTQ+ community, and other populations disproportionately impacted and targeted by disinformation.
- Address Synthetic Content in AI/Manipulated Media and Election Policies: Ensure that AI/manipulated media policies adequately cover synthetic content across audio and visual (image and video) formats, including by prohibiting the use of generative AI, manipulated media, and deepfakes to interfere with elections, such as by spreading falsehoods regarding election administration or attempting to suppress the vote.
- Adequately Resource Election Teams: Adequately resource election and related trust and safety, policy, legal, and operations teams at least six months prior to the U.S. general election and maintain this resourcing through Inauguration Day. Staffing levels should be set at a target ratio based on the total number of users and should be sufficient to support communities that are heavily targeted by disinformation on the respective platforms. These election teams should articulate clear and defensible “break-the-glass” criteria for deploying election risk mitigations in emergency situations and be ready to deploy them through Inauguration Day. The election teams should also pay special attention to disinformation aimed at marginalized communities and to how these efforts can intersect with existing terms of service against discourse that the platforms define as hate speech or extremism.
- Establish Rate Limits and Limits on Rampant Resharing: Establish usage-rate limits for inviting, messaging, sharing, commenting, and forwarding features (particularly their use by accounts and entities that are new, demonstrate suspicious on-platform activity, or relate to voting and elections) at least four months before the general election and maintain them through Inauguration Day. This includes limiting rampant resharing by removing simple share buttons on posts after they have been reshared a set number of times (e.g., once someone clicks “re-share” on a post that has already been re-shared four times, the re-share button does not appear on their post); an illustrative sketch of such a reshare-depth cap follows this list.
- Amplify Authoritative Information: Amplify and feature accurate, authoritative content on the time, place, and manner of voting and on election results for the remainder of the U.S. election season.
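For illustration only, the following is a minimal sketch, in Python, of the kind of reshare-depth cap described in the rate-limits demand above. The post model, field names, and the threshold of four prior reshares are assumptions drawn from the example in that demand, not a description of any platform’s actual systems.

```python
# Illustrative sketch only: a hypothetical reshare-depth cap in which the
# one-click share button is withheld once a post's reshare chain reaches a
# configurable limit (here, four prior reshares, matching the example above).

from dataclasses import dataclass

RESHARE_DEPTH_LIMIT = 4  # assumed threshold, taken from the example in the letter


@dataclass
class Post:
    post_id: str
    reshare_depth: int = 0  # 0 for an original post; +1 for each reshare in the chain


def show_reshare_button(post: Post) -> bool:
    """Hide the one-click re-share button once the chain has hit the cap."""
    return post.reshare_depth < RESHARE_DEPTH_LIMIT


def reshare(post: Post, new_post_id: str) -> Post:
    """Create a reshare of `post`, refusing once the depth cap is reached."""
    if not show_reshare_button(post):
        raise ValueError("Reshare cap reached; one-click sharing is disabled for this chain.")
    return Post(post_id=new_post_id, reshare_depth=post.reshare_depth + 1)
```

Under this sketch, further distribution of a heavily reshared post would require a deliberate step (for example, copying a link) rather than a single click, which is the friction the demand asks platforms to introduce.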
We would welcome a response detailing your plans to implement these recommendations by June 30, 2024, so that we can learn more about the steps you will be taking to protect democracy and civil rights. Please reach out to Dave Toomey at The Leadership Conference ([email protected]) if you have any questions or need additional information.
Sincerely,
The Leadership Conference on Civil and Human Rights
Accountable Tech
All Voting is Local
Asian Americans Advancing Justice – AAJC
The Brennan Center for Justice
Center for American Progress
Common Cause
Human Rights Campaign
Issue One
Kapor Center
National Coalition on Black Civic Participation and the Black Women’s Roundtable
North Carolina For the People Action
Protect Democracy
UnidosUS