Social media platforms are bracing for midterm election chaos.

Misconceptions about voting and elections abound on social media with less than three weeks until the polls close, despite assurances from technology companies that they would address an issue blamed for deepening polarization and distrust.

While platforms such as Twitter, TikTok, Facebook, and YouTube say they have stepped up their efforts to detect and prevent harmful claims that could suppress the vote or even lead to violent confrontations, a review of the sites shows they are still playing catch-up with 2020, when then-President Donald Trump’s lies about the election he lost to Joe Biden helped spark an insurrection at the U.S. Capitol.

“You’d think they’d learned by now,” said Heidi Beirich, founding member of the Global Project Against Hate and Extremism and a member of the Real Facebook Oversight Board, which has criticized Facebook’s efforts. “This isn’t their first election. This should have been dealt with prior to Trump’s defeat in 2020. At this point, the harm is extensive.”

If these U.S.-based tech giants cannot effectively prepare for a U.S. election, Beirich asked, how can they be expected to handle elections abroad?

Mentions of a “rigged election” and “vote rigging” have increased in recent months and have become two of the three most common terms in discussions of this year’s election, according to an analysis of social media, online, and broadcast content conducted for The Associated Press by the media intelligence firm Zignal Labs.

Zignal’s analysis of Twitter found that tweets amplifying conspiracy theories about the upcoming election were reposted thousands of times, along with posts restating disproved claims about the 2020 election.

Several major platforms have announced steps to combat voter and election misinformation, such as labels, warnings, and other measures. Users who repeatedly violate the rules can be suspended.

“Our teams are closely monitoring the midterms and working to delete content that breaches our policies as soon as possible,” YouTube said in a statement.

“We will remain vigilant before, during, and after Election Day.”

This week, Meta, the parent company of Facebook and Instagram, announced it was reopening its election command center, which monitors and responds to election misinformation in real time.

The company dismissed criticism that it isn’t doing enough and denied reports that it has cut the number of employees working on elections.

“We are going to invest substantial resources, with work encompassing more than 40 teams as well as hundreds of people,” Meta said in an email to the Associated Press.

Beginning this week, anyone who searches Facebook for election-related keywords, such as “election fraud,” will automatically see a pop-up window linking to reliable voting resources.

TikTok launched an elections center earlier this year to help voters in the United States learn how to register to vote and who is on their ballot.

The information is available in English, Spanish, and more than 45 other languages.

The platform, which has become a leading source of information for young voters, also labels misleading content.

“Access to authoritative information is critical.” Still, policies meant to curb harmful misinformation about elections are not always consistently enforced.

A New York University report released last month faulted Meta, Twitter, TikTok, and YouTube for amplifying Trump’s false claims about the 2020 election.

The study cited inconsistent misinformation rules and poor enforcement.

A number of organizations have encouraged tech companies to do more to combat voter and election misinformation.

“The platforms owe more to Americans than lip service and half-measures,” said Yosef Getachew, director of Common Cause’s media and democracy program.

“Both foreign and domestic opponents of democracy have weaponized these platforms.”

Election misinformation is even more widespread on smaller platforms popular with conservatives and far-right groups, such as Gab, Gettr, and Trump’s own platform, Truth Social.

But compared with Facebook, these sites have tiny audiences.

Beirich’s group, the Real Facebook Oversight Board, has compiled a list of seven recommendations for Meta to reduce the spread of misinformation ahead of the election.

They include platform changes that would prioritize content from legitimate news outlets over politically biased sites that frequently spread falsehoods, as well as a greater focus on misinformation targeting Spanish-speaking voters and speakers of other languages.

Meta told the Associated Press that it has doubled the number of Spanish-language fact checkers since 2020. In addition, the company opened a Spanish-language fact-checking tip line on WhatsApp, which it also owns.

According to Brenda Victoria Castillo, CEO of the National Hispanic Media Coalition, much of the misleading information targeting non-English speakers appears designed to suppress their vote.