Nation State Interference During the US Presidential ‘Pandemic Election’

The 2020 US Presidential election is shaping up to be one of the most bitterly fought campaigns in modern memory, reaching levels of animosity that surpass even those of the Trump-Clinton contest four years ago.

Emotive topics such as COVID-19 and the role of policing will be at the heart of this election, and these contentious issues, alongside the highly polarized nature of the electorate, create fertile ground for misinformation campaigns from nation state actors seeking to influence the outcome. A similar phenomenon was observed recently in the so-called ‘Brexit election’ in the UK in December 2019, following which the UK government’s Intelligence and Security Committee (ISC) described Russia as a “highly capable cyber-actor.”

Extra Dangers Posed by the Pandemic

The threat of nation state actors interfering in the democratic process is heightened further given that this year’s Presidential election is taking place amid a global pandemic. People are spending far more time indoors and using the internet as a greater source of content and entertainment as a result, while many campaigning events are taking place virtually due to the continued restrictions on large gatherings. It is not difficult to conclude that the election battleground’s shift to the digital space will provide additional opportunities to influence the electorate compared with 2016.

“We’re in front of our screens probably more than we were pre-pandemic and that means nation states can up their game on manipulation and misinformation campaigns”

Theresa Payton, CEO and president of Fortalice Solutions and former White House CIO explained: “We’re in front of our screens probably more than we were pre-pandemic and that means nation states can up their game on manipulation and misinformation campaigns as they have an audience tuned in all day long.”

Understanding the additional dangers posed by nation state actors to this particular election cycle, and finding ways to negate them, is therefore vital.

The issue of ‘fake news’ being spread by nation state actors on social media was widely discussed back in 2016, and there are plenty of reasons to think it will resurface, especially given the growing reliance on online channels at the current time. Brandon Hoffman, CISO at NetEnrich, said: “The number one way foreign powers will influence this election is through widespread disinformation campaigns. It is likely the majority of the campaigns will come from platforms that don’t have or don’t require strict vetting of source material. Platforms like video sharing sites and social media will be hotbeds of disinformation campaigns.”

The issue is likely to be exacerbated in this year’s election due to the rise of virtual campaigning and fundraising events as a result of COVID-19 social distancing restrictions, as this provides extra opportunities for manipulation. Payton noted: “We’re now having messages delivered that are not in person or in the normal realm. The DNC convention in America, for example, went on from a remote location with people videoing in – some are recorded and so the opportunity to take snippets of videos and audio, manipulate them using deepfake technology and then play those shorter snippets, is definitely a possibility.”

Addressing Misinformation on Social Media

Social media firms such as Facebook and Twitter have undoubtedly taken greater strides in recent years to address the spread of this kind of content on their platforms. For instance, in June 2020 Facebook began flagging content from state-controlled media outlets, while at the end of last year, Twitter announced a ban on political advertising ahead of the 2020 US election.

Payton welcomed these changes, as they remind people that there is no vetting of the content going up on such platforms, and she acknowledged that major strides have been made in the platforms’ algorithms to detect fake personas. Nevertheless, she believes this is nowhere near enough to prevent the spread of misinformation across these platforms. In Payton’s view, this is because social media firms have little incentive to introduce such measures, given the nature of their business models.

“These platforms were built to connect you and me and allow us to share things that are mutual interests and they want to be the primary platform you stay on all day long. The way they make money is on engagement and interaction because that translates into selling ads to other companies. So the more engagement you have, including arguing, negative posting and things going viral, the more money the social media companies make,” she outlined.

According to Payton, a serious effort to stop misinformation campaigns being spread requires far more collaboration between social media giants. She would like to see them combine to create a fusion center to deal with all global misinformation and manipulation cases, and treat all reports as if they are cyber-incidents. She also believes there should be much greater vetting of companies that buy adverts.

“It’s incumbent upon every person as a voter and a citizen to know what the authoritative source of information is”

Since substantial strides in these directions are unlikely in the foreseeable future, the burden is ultimately on citizens to ensure the content they are seeing or reading is legitimate. Len Shneyder, co-chair of the Election Security Working Group at M3AAWG, said: “It’s incumbent upon every person as a voter and a citizen to know what the authoritative source of information is; if it looks like The Wall Street Journal logo but it isn’t, please don’t assume this is true.”

In such a polarized US election, this is easier said than done. Payton added: “How do you tell people who are now connected to their friends, family and work colleagues pretty much all day long through digital devices to unsee something they’ve just seen? If it ends up confirming a belief system they have, even if you tell them later it was a misinformation campaign, the studies show they shrug their shoulders and say ‘it confirms something I thought all along.’”

Protecting Other Digital Communication Channels

As well as misinformation being spread via social media channels, other forms of digital communication, such as email, are likely to play an increasingly important role in nation state actors’ attempts to influence the election. Shneyder believes the double whammy of a crucial election and the COVID-19 pandemic creates the perfect lures to attract people to these communications. “It’s really troubling because these two things converge, one naturally leads to the other somehow because there’s politicization of just about everything. So again these will influence and inform the lures but the actual mechanism will remain the same,” he said.

From a recipient’s point of view, establishing whether an email message is legitimate is not always easy. Shneyder pointed out that many email senders do not authenticate their messages, and that a very high proportion of the email people receive is sent with nefarious motives, including phishing messages and messages laden with malware.
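For context on what that authentication looks like in practice, below is a minimal sketch assuming a receiving mail server has already recorded SPF, DKIM and DMARC verdicts in an Authentication-Results header (RFC 8601). The sample message, addresses and the simple string parsing are illustrative assumptions, not anything described by Shneyder or M3AAWG.

```python
# Minimal sketch: read the SPF/DKIM/DMARC verdicts a receiving mail server
# recorded in the Authentication-Results header (RFC 8601).
# The raw message below is a contrived example for illustration only.
from email import message_from_string

RAW_MESSAGE = """\
Authentication-Results: mx.example.net;
 spf=pass smtp.mailfrom=news@example.org;
 dkim=pass header.d=example.org;
 dmarc=pass header.from=example.org
From: Campaign Updates <news@example.org>
Subject: Polling day reminder

Remember to check your voter registration status.
"""

def auth_summary(raw: str) -> dict:
    """Return a {mechanism: result} map, e.g. {'spf': 'pass', ...}."""
    msg = message_from_string(raw)
    header = msg.get("Authentication-Results", "")
    results = {}
    for clause in header.split(";")[1:]:          # skip the authserv-id
        clause = clause.strip()
        for mech in ("spf", "dkim", "dmarc"):
            if clause.startswith(mech + "="):
                results[mech] = clause.split("=", 1)[1].split()[0]
    return results

if __name__ == "__main__":
    verdict = auth_summary(RAW_MESSAGE)
    print(verdict)                 # {'spf': 'pass', 'dkim': 'pass', 'dmarc': 'pass'}
    if verdict.get("dmarc") != "pass":
        print("Treat this message with suspicion.")
```

A message that fails these checks is not automatically malicious, but an unauthenticated sender claiming to be a campaign, newspaper or election office is exactly the kind of mismatch worth treating with suspicion.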

In an interview conducted with Infosecurity earlier this year, Shneyder described how this can be an effective means of spreading misinformation: “The notion that someone is deploying a complex algorithm to undermine a strong password, or exploiting a system’s backdoor, is the stuff of Hollywood. Hacking people and convincing them that they’re communicating with someone they know, or think they know, is far simpler and significantly more effective.”

Only recently, it was reported by Russia’s Kommersant business newspaper that a database containing the personal information of several million American voters had been circulating on the Russian dark web. While this story was subsequently denied by the US Cybersecurity and Infrastructure Security Agency (CISA), it highlights the kind of tactics that could be utilized.

Discussing the story, Tim Mackey, principal security strategist at the Synopsys CyRC, said: “American voters should view this dataset as being very similar in scope to the publicly accessible voter rolls in their state, or a variant on the data available from any number of past breaches. While such data is undoubtedly being used by some to stir the electoral propaganda pot, voters can best defend against such influence by ignoring proactive communications, such as unsolicited emails and texts. In effect, such attempts at influence should be treated as if they were another spam or potential phishing attempt.”

Shneyder therefore believes a far greater emphasis should be placed on election officials to ensure communication lines and voter details are as secure as possible.

“You need to essentially have cybersecurity education as part of every worker’s daily indoctrination into the job. They’re going to rely on basic technological services like Google ads and email and calendaring services and portals so they need to secure every kind of communication with two-factor authentication (2FA) at the very least, and they need to use multiple channels to confirm messages.”
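To make that 2FA baseline concrete, here is a minimal sketch assuming the open-source pyotp library (installed with pip install pyotp); the account name, issuer and standalone-script framing are hypothetical illustrations, since real campaign or election systems would enforce this through a full identity provider rather than a script.

```python
# Minimal TOTP (time-based one-time password) sketch using pyotp,
# illustrating the second factor an authenticator app supplies.
# Account and issuer names below are hypothetical.
import pyotp

# At enrolment: generate a per-user secret and store it server-side.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Provisioning URI the user's authenticator app would scan as a QR code.
print(totp.provisioning_uri(name="staffer@campaign.example",
                            issuer_name="Example Election Office Portal"))

# At login: the user enters the six-digit code shown in their app
# and the server checks it against the shared secret.
code = totp.now()                  # stands in for the code the user types
print("Code accepted:", totp.verify(code))   # True within the 30-second window
```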

Encouragingly, there are signs this security issue is gaining recognition ahead of the upcoming election; for example, the University of Chicago has launched an initiative called Election Cyber Surge, essentially a matchmaking service connecting US election officials concerned about cybersecurity with volunteers who are experts in the field.

However, in Shneyder’s view, this is not enough. “I know there is training that happens but that training again is not uniform across all workers, it is essentially a local thing that happens,” he added.

Misinformation During the Pandemic Election

Interference by nation state actors has been a major feature of a number of modern elections. The primary, and simplest, means of interference is the spread of misinformation via social media and other digital communication channels. With the 2020 US Presidential election taking place during the COVID-19 pandemic, there are more opportunities to exploit this avenue: the growth of virtual campaigning creates extra chances to manipulate videos and post them online, while a larger share of the electorate is reliant on digital channels as its primary source of information.

Countering anonymous actors operating from anywhere in the world is immensely difficult, and actually pinpointing the individuals and organizations responsible is rarely possible. Efforts by social media companies and election officials to prevent this activity could and should go further, but it will be impossible to stop every misinformation campaign in today’s hyper-connected world. It is therefore becoming increasingly incumbent upon citizens to check the validity of the information they receive, always adopting a skeptical posture. Whether that is realistic in the current climate is another matter.
