
Understanding Foreign Interference in the Election

A new RAND report explores tactics used by Russia and other hostile states to sow doubt about the election. The insights could help state and local governments better understand and respond to the problem.

Russia is using Cold War tactics to undermine faith in the coming election, according to a report just released by the RAND Corporation. The aim is to amplify existing fault lines and undermine the consensus that is a bedrock of democracy, says lead author Marek Posard, Ph.D.

In “From Consensus to Conflict: Understanding Foreign Measures Targeting U.S. Elections,” Posard and his co-authors highlight the concept of “reflexive control,” a technique long used in Russian information warfare that has gained new dimension and power in the Internet age.

Rather than attempting to convince groups or parties that they are adversaries, reflexive control assumes that conflict already exists and seeks to amplify it, a strategy well-suited to an already discordant election season.

“The Russians are trying to divide us and lower the probability of the consensus that is a bedrock of American democracy,” says Posard.

It is not possible to fully trace the origins or spread of such efforts, but division undeniably permeates news stories, chat shows, protests and campaign events alike. Mail-in voting will play an unprecedented role in the 2020 election, and according to a recent Pew survey, 43 percent of Republicans consider it a “major problem,” while 47 percent of Democrats say it is “not a problem at all.”

A Firehose of Falsehoods

Foreign interference in American politics is not new, says Posard, citing examples such as warnings about the problem in George Washington’s farewell address, European interference during the Civil War and the Soviet “active measures” program during the Cold War. 

“The only difference today is that the internet has reduced the transaction cost to run these types of information efforts,” he says. “It's much cheaper to target and customize targeting of individuals to really sow dissent.”

Reflexive control campaigns are ideologically agnostic, as likely to target civil rights activists as white supremacists. The particular group or issue that offers the most opportunities to exploit division at any given time is a moving target.

Techniques include tailored disinformation, fake social media groups, paid advertisements, hacks, leaks, promotion of fringe movements and “narrative laundering,” which moves state-fabricated stories into the mainstream. Russian Internet campaigns have also promoted separatist movements in California and Texas.

“They can disseminate a firehose of various types of falsehoods to see what sticks and gets circulated around,” says Posard. When others use disinformation to create their own content, or share it with others without realizing it came from a bad actor, the effect is corrosive.

Overview of Select Russian Interference and Disinformation Measures


A wide range of active measures can be employed in foreign campaigns to undermine consensus among Americans. (Chart courtesy of RAND Corporation)

Building Consensus

The RAND report was sponsored by the California Governor’s Office of Emergency Services, and the authors had state and local government in mind when they wrote it.

“Local government is a key driver in finding consensus on issues of public concern,” Posard says. “My colleagues and I advise governors, secretaries of state and senior people in the intelligence community, but if I want to get them really fired up I ask about issues in their city or town.”

RAND researchers point to an approach advocated by Lawrence Eagleburger, the only career foreign service officer to have served as secretary of state, as a model for government response. Eagleburger advised meeting Cold War Soviet propaganda campaigns with a posture that neither ignored them nor became obsessed with them.

Hitting back at a specific meme or message that targets a group or issue can have the effect of amplifying it. Instead, Posard recommends emphasizing shared interests, using channels such as social media and public service announcements to highlight instances where groups with different views have found consensus.

Paul Barrett, deputy director of the New York University Center for Business and Human Rights, wrote the center’s 2019 report “Disinformation and the 2020 Election.” He likes the idea of promoting consensus as a strategy, but sees real challenges in achieving success with it. 

“What the Russians seem to understand is that American society is growing only more polarized, at least in part because of the divide-and-conquer strategy of the current president,” he says. “This trend appears to have accelerated since 2016.”

The RAND recommendations address Barrett’s concern in part by calling for the development of evidence-based interventions to keep likely targets from responding to falsehoods. The current report is the first of four that are planned: the second will use a machine learning algorithm to identify online communities especially vulnerable to Russian trolls, the third will be a survey experiment evaluating different types of interventions, and the fourth a follow-up to that experiment.

Posard believes that one big takeaway from the report is that any group could be a potential target for information warfare. “The more we can help people understand that these efforts are underway without wading into partisan debates, the better,” he says. “We just want people to know that these threats do exist and that there is a logic to how they are playing out.”

Carl Smith is a senior staff writer for Governing and covers a broad range of issues affecting states and localities. He can be reached at carl.smith@governing.com or on Twitter at @governingwriter.