National View: Media platforms, start combating election disinformation now

From the column: "We believe platforms have a responsibility to ensure their products are not used by vote suppressors and disinformation purveyors."


It is a haunting video. A man with a megaphone on the steps of the U.S. Capitol reads a tweet from then-President Donald Trump as insurrectionists storm the building on Jan. 6, 2021: “Mike Pence didn’t have the courage to do what should have been done to protect our country and our constitution.”

Trump poured fuel on the fire when he knew the violent and armed mob had already breached the Capitol building.

Social-media platforms have had a starring role as the Jan. 6 committee unwinds the sprawling conspiracy orchestrated by the then-president and his henchmen. The companies were woefully unprepared to tackle disinformation in the last election.

The Big Lie that the 2020 election was stolen has taken root, with 68% to 74% of Republicans believing it. State legislatures have capitalized on it to pass at least 34 laws in 19 states that make it harder to vote or that would sabotage our election process. Many candidates in 2022 make election denial and voter suppression cornerstones of their campaigns. As a result, recent polling shows that 56% of Americans have “little or no confidence” in free and fair elections.

The challenges of election disinformation cannot all be solved for the 2022 elections. But there are steps platforms can take now to help voters fight the spread of disinformation. Common Cause and more than 120 organizations sent a letter to the biggest social-media platforms urging them to combat election disinformation proactively in 2022. We suggested that:


  • Platforms commit to translating voting information into non-English languages, so voters have reliable and accessible information. Non-English disinformation continues to flourish online, and the disparity between resources allocated to English-language content moderation and those allocated to non-English moderation shows how unevenly violative content is treated. Platforms should increase resources devoted to non-English content moderation and make the metrics of that resource allocation transparent.
  • Platforms can and must ensure that content that calls for political violence is addressed and that posts that promote election disinformation aren’t allowed to go viral. Each platform decides not just what content is allowable but how it is promoted to users through algorithmic amplification — that is, what the platform recommends or shows to you. Platforms can apply “friction” to posts containing disinformation, thus reducing the distribution of disinformation.
  • Platforms must be consistent in their civic integrity policies. This must include removing content spreading false claims about the 2020 election and staffing teams that enforce these policies all the time, not just in the weeks immediately preceding elections. There is no off-year for election disinformation, and the weeks after an election can be some of the most critical. False claims about 2020 continue to affect laws being made in 2022 and must be addressed. This is especially critical in the post-election period to ensure peaceful transitions for newly elected officials. Fact-checking political content should be prioritized, and existing loopholes in content enforcement closed.
  • Platforms should provide real-time access to data so that researchers and watchdogs can identify the harms of disinformation. This includes greater transparency about political advertisements, enforcement practices, and algorithmic models. Platforms should also rescind retaliatory actions taken against researchers and allow academic research on platform impact and algorithmic harm to continue. That includes preserving access to tools like CrowdTangle, which are invaluable for measuring Facebook’s reach and effect on users.

It’s vital that all social-media platforms take measures to ensure they don’t play a role in future disinformation about the election. In our coalition letter, we’ve endorsed some simple actions social-media platforms can take to reduce the spread of election disinformation and the potential it contains for offline political violence. We believe platforms have a responsibility to ensure their products are not used by vote suppressors and disinformation purveyors.

The 2020 election and its aftermath revealed the dangers posed to our democracy by disinformation. We saw how President Trump used social-media platforms to assemble and incite the mob on the Ellipse on Jan. 6 and then turn it loose on the Capitol where Congress was at work certifying the election victory of his opponent. The results were shocking, and they were tragic, and they could easily have been far worse. We must ensure they are not repeated in 2022 — or 2024 or ever again.

Emma Steiner is a disinformation analyst at Common Cause, a watchdog group in Washington, D.C., with chapters in 35 states, including a Minnesota chapter in St. Paul.

