These groups try to hack the vote – so that real criminals can’t
How do we stop bad guys from undermining Election Day? Role-play them.
In one such exercise, dubbed Operation Blackout and held in August, members of a “red team” thought up tactics like sending armed gun-rights activists to polling stations or posting a “deep fake” concession speech by a candidate. A “blue team” led by public officials looked for ways to counter the fear, violence, or disinformation that such tactics can spread.
Why We Wrote This
Between technology and a distrustful public, the risks of disruption and deception in elections have only risen since 2016. Here’s how some groups are building U.S. defenses.
From private companies like Facebook to the Boston firm Cybereason, which organized the virtual Operation Blackout, a number of groups have been working this year to safeguard the U.S. election by conducting war games or “tabletop exercises.” The idea is to build civic defense muscle among everyone from election officials to journalists.
“This exercise had a fantastic goal and it succeeded wonderfully,” says John Odum, a certified ethical hacker and the county clerk in Montpelier, Vermont, who participated in the virtual exercise. “But that’s not to say that the deeper problems can be solved quite so simply.”
Maggie MacAlpine is full of ideas about how to cause chaos on Election Day.
Sitting in a suit jacket and T-shirt against a red background with skulls and her team’s moniker, K-OS, she ticks off possibilities from sending armed gun-rights activists to polling places, to spreading rumors that minority voters are being turned away, to a “deep fake” concession speech by a candidate.
Meanwhile, her “opponents” are strategizing how best to protect the election against such mischief. In this virtual tabletop exercise run by Boston-based firm Cybereason, all the attacks are successfully thwarted. But fending off bad actors in real life could be harder.
“The biggest thing from my perspective is everything that was done on the red team [K-OS] is very possible technically, and we believe would be very, very effective in sowing chaos in the sense of distrust in the results,” said Cybereason co-founder Yonatan Striem-Amit in the debrief afterward. The red team’s measures would also be relatively cheap to implement.
Cybereason’s Operation Blackout, which took place in August, is one of a wide range of simulations being held to help everyone from public safety officials to journalists identify possible Election Day hazards and strengthen defenses against them. They convene a diverse set of participants to try to tackle nightmare scenarios in a holistic way, building a collaborative election security culture.
“This exercise had a fantastic goal and it succeeded wonderfully,” says John Odum, a certified ethical hacker and the county clerk in Montpelier, Vermont, who participated in the virtual Operation Blackout in August. “But that’s not to say that the deeper problems can be solved quite so simply.”
Indeed, while such tabletop exercises provide a valuable opportunity to think through how to anticipate and thwart efforts to undermine the election, they don’t test the hard skills that would be needed to execute a robust defense.
For example, it’s easy for defenders in a tabletop exercise to say they’re hardening their networks against hacking. But in a country with more than 10,000 election administrators, improving the cyberdefenses of every office, vendor, and network – right down to the towns with 100 voters – is complicated.
On Wednesday night, Director of National Intelligence John Ratcliffe alerted voters that Iran and Russia had obtained voter registration information with the apparent intention of trying to undermine public trust in the election.
“This data can be used by foreign actors to attempt to communicate false information to registered voters that they hope will cause confusion, sow chaos, and undermine your confidence in American democracy,” said Mr. Ratcliffe in a joint press conference with FBI Director Christopher Wray, asking voters to do their part in containing such disinformation. “Do not allow these efforts to have their intended effect. If you received an intimidating or manipulative email in your inbox, do not be alarmed and do not spread it.”
From misinformation to physical attacks
Last fall, Cybereason staged a U.S. election simulation in Washington that brought together law enforcement officers from a range of organizations, including the Secret Service and FBI. That exercise, which envisioned autonomous vehicle attacks on polling stations, resulted in a projected 200 people injured and 32 killed – and the cancellation of the election.
Key lessons learned involved establishing alternative communication channels in the event official social media accounts are hijacked, coordinating with private sector companies involved in infrastructure and smart vehicles, and bolstering collaboration among government agencies.
Cybereason is just one of a number of organizations working in this area. The Cybersecurity and Infrastructure Security Agency (CISA), established in 2018 under the Department of Homeland Security, held its third annual “Tabletop the Vote” exercise this summer, bringing together participants from 37 states, including national, state, and local election officials. It included discussions about how best to manage election security amid the pandemic.
The private sector is running similar simulations. Axios reported “unprecedented 2020 war games” in which Facebook, Google, Twitter, and Reddit are working together – and bringing in federal law enforcement and intelligence agencies – to bolster their defenses against foreign interference.
In September testimony before the U.S. House of Representatives’ Homeland Security Committee, FBI Director Christopher Wray underscored the ongoing threat of foreign interference – and not only from Russia. After the 2018 midterm elections, he said, the U.S. government expanded the remit of the Foreign Influence Task Force to include malign operations conducted by China, Iran, and other adversaries.
Crash course for journalists
Much of what could go wrong on Election Day has to do not with technical or logistical failures, but with the manipulation of voters’ perceptions and fears.
Rob Walker, executive director of the Homeland Security Experts Group and a participant in Operation Blackout, says America’s greatest vulnerability is “a malleable public” – a trait Russian disinformation campaigns sought to exploit leading up to the 2016 election.
“2016 should have been a wake-up call for all of us on how vulnerable we are to influence operations at a massive scale,” says Mr. Walker, adding that the digital ramp-up of such operations in recent years requires improving civics education and political discourse.
Earlier this month, on a tip from the FBI, Twitter took down nearly 130 accounts it said appeared to originate in Iran, seeking to “disrupt the public conversation” during the first debate between President Trump and his opponent, former Vice President Joe Biden.
Bad actors are targeting journalists, who can unwittingly abet these operations. Somewhat counterintuitively, when reporters debunk false narratives they often wind up amplifying them, says Claire Wardle, co-founder and director of First Draft, a nonprofit that provides research and training on countering mis- and disinformation.
At the Online News Association confab last fall, Anjanette Delgado, senior news director for digital at the Detroit Free Press, was among a select group of journalists whom First Draft corralled to participate in a simulated information crisis to test their preparedness to deal with misinformation and disinformation.
“Nothing does that like throwing you right in,” says Ms. Delgado, whose newspaper co-sponsored a similar event for her colleagues and other journalists around the Midwest in February.
First Draft has held more than a dozen such trainings across the U.S., simulating a coordinated disinformation campaign around the election and then providing master classes on how to deal with the challenges raised. Participants are then invited to join online forums where they can share tips.
“Bad actors are very coordinated,” from retweeting each other’s messages to sharing tactics in forums, says Dr. Wardle, whose organization has sought to create greater collaboration among journalists, tempering the industry’s traditional competitiveness for scoops to strengthen collective action against a shared threat. “People working in the quality information space are much less likely to be coordinated – it’s all about distinction and originality.”
Improving collaboration is also a key goal for groups like Cybereason.
“I guarantee there’s bad guys doing this type of tabletop exercise right now and they’re not sharing their results with law enforcement, government officials in charge of elections – all the good guys,” says Ms. MacAlpine, an election security specialist and co-founder at Nordic Innovation Labs in Portsmouth, New Hampshire. “What works is getting information about an attack out there so that the good guys can go, ‘Wait a minute, I’ve seen this before.’”
This story was updated on Oct. 22 to reflect breaking news from national intelligence officials.