Russia created an election disinformation playbook but Americans evolved it

Isabelle Niu, Kassie Bracken and Alexandra Eaton, The New York Times

Posted at Oct 26 2020 09:40 AM | Updated as of Oct 26 2020 09:49 AM


A deluge of misinformation is hitting the U.S. elections, and this time election experts are more worried that it is coming from Americans.

Back in 2016, Kremlin-linked Russian trolls surprised social media companies with an online disinformation campaign. Russian trolls developed an effective playbook — they used a large network of fake accounts to spread incendiary political content to millions of Americans, took advantage of existing divisions in American society and sowed doubt about the election process. In the years since, the U.S. intelligence community, social media companies and the public have become aware of the threat of foreign disinformation campaigns. But America’s election information problem has evolved.

“We see that playbook being used by political operatives in the U.S. and we see that same playbook being used by individuals in their basements who are angry and frustrated with life,” said Claire Wardle, the U.S. director of First Draft, a nonprofit organization focused on addressing misinformation and disinformation.

Simply put, disinformation is a falsehood created with the intention to cause harm. Misinformation is also false, but created or shared without the intention to deceive others.

An example of how one person could use disinformation to disrupt an election played out in 2019 in Kentucky’s race for governor. A Twitter user with the handle @overlordkraken1 tweeted a false claim that he shredded Republican mail-in ballots, hours after polls closed. Election officials in the state immediately spotted the piece of disinformation.

“There’s so many checks and balances that we’ve built into the system over the past decades that we know where all the ballots are at all times,” said Jared Dearing, the executive director of the Board of Elections in Kentucky. But within hours, a vast network of partisan accounts amplified the lie on Twitter. It became one of the most viral pieces of disinformation in an already close election and cast doubt on the results.

The same pattern is already playing out this year, with the added layer of uncertainty caused by the coronavirus pandemic. Many states are adopting vote-by-mail on a large scale for the first time because of public health concerns. Isolated cases of errors and breakdowns with mail ballots quickly become fodder for unsupported claims of widespread voter fraud.

State election officials are fighting back, but they are spending limited resources on competing with a constant stream of misinformation coming from partisan groups, conspiracy theorists and even President Donald Trump.

Social media companies are trying to find a tricky balance between combating misinformation and protecting free speech. Facebook recently intensified its crackdown on QAnon, after facing criticism for not doing enough to curb the growth of the pro-Trump conspiracy theory group.

Twitter blocked, and then unblocked, links to an unsubstantiated New York Post article about Hunter Biden, citing violations of its policy on hacked materials. Facebook also restricted access to the Post report pending a review by third-party fact-checkers. The decisions prompted accusations of censorship from Republican lawmakers. To date, there is no unified policy among the major tech companies for combating election misinformation.

“We will never cure the problem of misinformation. But right now, we have horrible levels of pollution,” Wardle said. “My hope is over the next 30, 40, 50 years, it gets to a manageable level and that actually platforms, governments, the public have all learned how we can mitigate this.”


Copyright © 2020 The New York Times Company
