V. Implications for Policymakers
Cyberspace has long bedeviled policymakers, practitioners, and even novelists alike. As the science fiction author William Gibson admitted when he used the word ‘cyberspace’ in his book Neuromancer: “All I knew about the word ‘cyberspace’ when I coined it, was that it seemed like an effective buzzword. It seemed evocative and essentially meaningless. It was suggestive of something, but had no real semantic meaning, even for me, as I saw it emerge on the page.”277 As with cyberspace generally, and as has been shown throughout this Article, the list of threats facing democratic institutions—including with regard to election security and digital repression—is long, and seemingly only growing longer.278 Indeed, the likes of Henry Farrell and Bruce Schneier have argued that “the open forms of input and exchange that it [democracy] relies on can be weaponized to inject falsehood and misinformation that erode democratic debate.”279 Opinions vary as to whether democracy itself should be considered a cyber threat vector, and as to how best to mitigate the risks, whether by doubling down on democratic institutions or by relying on other actors, including the private sector, to better manage issues such as the spread of disinformation through the EU-organized Code of Practice discussed in Part IV. This Part proceeds by summarizing the policy suggestions made throughout this Article using an analytical framework pioneered by Peter Swire, among others.280
277. JARICE HANSON, THE SOCIAL MEDIA REVOLUTION 113 (2016).
278. See supra Part II.
279. Henry Farrell & Bruce Schneier, Democracy’s Dilemma, BOS. REV. (May 15, 2019), https://perma.cc/RF2K-XHLU.
280. See Peter Swire, A Pedagogic Cybersecurity Framework, 61 COMMC’NS ACM 23, 23–24 (2018) (proposing a multidisciplinary framework for teaching cybersecurity that “organizes the subjects that have not been included in traditional cybersecurity courses, but instead address cybersecurity management, policy, law, and international affairs”).
In 1948, George Kennan, an American diplomat and historian, defined national security as “the continued ability of the country to pursue the development of its internal life without serious interference, or threat of interference, from foreign powers.”281 Yet such a conception of national security is not so clear cut when the goal is protecting democracy itself, as seen in cases of Russian operatives organizing U.S. citizens to engage in activism during the 2016 election cycle.282 Nor is there a settled analytical framework for ascertaining all the steps that must be taken to harden democratic institutions against these attacks. What follows is a suggested path forward. Before turning to the work of Swire, though, it is first necessary to provide some context.
281. Gayle Smith, In Search of Sustainable Security, CTR. FOR AM. PROGRESS (June 19, 2008, 9:00 AM), https://perma.cc/7XEY-U2KW.
282. See, e.g., Shaun Walker, Russian Troll Factory Paid US Activists to Help Fund Protests During Election, GUARDIAN (Oct. 17, 2017, 12:13 PM), https://perma.cc/C8D4-N3EJ (reporting that Russian “trolls” offered $80,000 to U.S. activists in order to support the organization of protests and events about divisive social issues).
Numerous regulatory theorists and governance scholars have considered cyberspace, including the best ways to engender change in this dynamic, interconnected environment. Yochai Benkler, for example, has offered a three-layer structure for considering interventions: (1) the “physical infrastructure,” including the fiber-optic cables and routers making up the physical aspect of cyberspace; (2) the “logical infrastructure,” comprising necessary “software such as the TCP/IP protocol;” and (3) the “content layer,” which includes data and, indirectly, users.283 This conceptualization, while helpful, only takes us so far in better understanding the various cyber threats facing democratic institutions and what to do about them. It largely ignores, for example, the role played by state and non-state actors in shaping the content layer.284 Lawrence Lessig built on this model,285 advocating for “decentralized innovation” that makes use of various regulatory modalities, including interventions supporting these layers.286 However, Andrew Murray has concluded that this approach is “idealistic” and that “the harnessing of one regulatory modality through the application of another is more likely to lead to further regulatory competition, due to the complexity of the network environment.”287 Instead of relying solely on code, then, laws, norms, and markets also have important roles to play in shaping a polycentric response to addressing vulnerabilities in democratic election systems.288
283. Yochai Benkler, From Consumers to Users: Shifting the Deeper Structures of Regulation Toward Sustainable Commons and User Access, 52 FED. COMMC’NS L.J. 561, 562 (2000).
284. See, e.g., Swire, supra note 280, at 24 (explaining that private organizations and national governments influence cybersecurity risks and responses by taking action to mitigate attacks, enacting and enforcing laws, and engaging in dialogue or signing treaties with other nations).
285. See LAWRENCE LESSIG, FREE CULTURE: HOW BIG MEDIA USES TECHNOLOGY AND THE LAW TO LOCK DOWN CULTURE AND CONTROL CREATIVITY 160 (2004) (describing “the interaction between architecture and law” in the context of copyright regulation).
286. See LAWRENCE LESSIG, THE FUTURE OF IDEAS: THE FATE OF THE COMMONS IN A CONNECTED WORLD 85–86 (2001) (arguing that “commons” at the code, content, and physical layers “create the opportunity for individuals to draw upon resources without connections, permission, or access granted by others”).
287. ANDREW D. MURRAY, THE REGULATION OF CYBERSPACE: CONTROL IN THE ONLINE ENVIRONMENT 46 (2007) (“It is highly unlikely that content producers, media corporations and other copyright holders will allow for a neutral system designed to protect cultural property and creativity at the cost of loss of control over their products.”).
288. See id. at 46–47, 124 (“[T]he effectiveness of code-based control mechanisms depends entirely upon their recognition and acceptance within these first-order regulatory environments [competition, society, and hierarchy].”).
One way to think through such a polycentric approach is to make use of Swire’s stack analogy,289 offered in adapted form as Table 3. Under this formulation, the foregoing analysis was concerned with levels seven through ten, but the table highlights the extent to which it is vital to secure the underlying system architecture, including voting machines.
Table 3: Applying Swire’s Expanded OSI Stack to Election Security290
| Layer | Vulnerability | Policy Response(s) |
| --- | --- | --- |
| 1. Physical | Supply chain attack; wiretap; stress equipment | Employ third-party penetration testing and audits; require NIST CSF compliance; consider smart contracts |
| 2. Data Link | Cause delays or noise | End-to-end encryption |
| 3. Network | Domain Name System (DNS) and Border Gateway Protocol (BGP) attacks | Utilize BGP security features as well as DNSSEC |
| 4. Transport | Man-in-the-middle attacks | Defense in depth & security by design techniques |
| 5. Session | Session splicing | Enhanced cyber hygiene |
| 6. Presentation | Attacks on encryption | Stronger encryption (even quantum) |
| 7. Application | Malware | Proactive cybersecurity measures; cyber hygiene |
| 8. Organization | Insider attacks; lack of adequate information sharing (between election officials or with allies) | More robust information sharing; require state-of-the-art technical standards and paper ballots along with risk-limiting audits |
| 9. Government | Weak laws for protecting critical infrastructure, IoT, voting machines, and media outlets | Reform efforts such as the Secure Elections Act; push firms to adopt Disinformation Codes of Conduct; train election officials |
| 10. International | Nation-state cyber-attacks; lack of international agreements to limit the use of cyber-attacks on election infrastructure; inadequate dispute resolution | Agree on new election security international norms (such as through the Paris Call or UN GGE process); ratify a treaty designed to safeguard civilian critical infrastructure; create new cyber threat information sharing forums and joint sanctions regimes for rule breakers |
289. See Swire, supra note 280, at 24 (explaining that Swire’s model adds three “layers” of cybersecurity vulnerabilities to the seven traditional layers of the Open Systems Interconnection model that computer scientists use to conceptualize computer systems).
290. Id. For a description of these cyber-attacks, see Chapter 3 in SCOTT J. SHACKELFORD, MANAGING CYBER ATTACKS IN INTERNATIONAL LAW, BUSINESS, AND RELATIONS (2014).
As Table 3 shows, there is a great deal that both the public and private sectors can do, locally and globally, to make democracy harder to hack. Particularly at levels eight through ten of Swire’s expanded Open Systems Interconnection (OSI) stack, an analogy popular among programmers for illustrating the various levels of a system, there is much more that the United States and other democracies can and should be doing to secure vulnerable election infrastructure and combat digital repression.
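To make the stack analogy more concrete, one way an election office or a civil-society partner might operationalize Table 3 is to encode it as a machine-readable checklist that pairs each layer with its vulnerabilities and candidate mitigations. The short Python sketch below is purely illustrative: the layer entries are abbreviated from Table 3, and the Layer structure and checklist helper are hypothetical constructs rather than part of Swire’s published framework.

```python
# Illustrative only: an abbreviated, machine-readable rendering of Table 3.
# The Layer dataclass and checklist() helper are hypothetical constructs,
# not part of Swire's published framework.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Layer:
    number: int
    name: str
    vulnerabilities: List[str] = field(default_factory=list)
    responses: List[str] = field(default_factory=list)

EXPANDED_STACK = [
    Layer(1, "Physical", ["supply chain attack", "wiretap"],
          ["third-party penetration testing", "NIST CSF compliance"]),
    Layer(7, "Application", ["malware"],
          ["proactive cybersecurity measures", "cyber hygiene"]),
    Layer(8, "Organization", ["insider attacks", "weak information sharing"],
          ["paper ballots", "risk-limiting audits", "robust information sharing"]),
    Layer(9, "Government", ["weak critical infrastructure and IoT laws"],
          ["election security legislation", "training for election officials"]),
    Layer(10, "International", ["nation-state cyber-attacks"],
          ["election security norms (Paris Call, UN GGE)", "joint sanctions regimes"]),
]

def checklist(stack: List[Layer]) -> None:
    """Print a per-layer review checklist for an election office."""
    for layer in sorted(stack, key=lambda l: l.number):
        print(f"Layer {layer.number} ({layer.name})")
        print("  Vulnerabilities: " + "; ".join(layer.vulnerabilities))
        print("  Responses:       " + "; ".join(layer.responses))

checklist(EXPANDED_STACK)
```

Such an encoding could feed the training materials and tabletop exercises discussed below, though nothing in the analysis turns on this particular representation.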
In the United States, despite post-2016 funding, more than two-thirds of counties still report insufficient resources to replace outdated, vulnerable paperless voting machines, so further help is needed.291 Aside from appropriating sufficient funds to replace outdated voting machines and tabulation systems, Congress also should encourage states to follow Colorado’s example292 (and the best practices listed in the EU Compendium)293 by refusing to fund voting machines that use paperless ballots and by requiring risk-limiting audits, which use statistical samples of paper ballots to check whether official election results are correct, to increase confidence in election outcomes. Congress should also require NIST to update its voting machine standards, which state and county election officials rely on in deciding which machines to purchase, as is the case in Australia.294 Further, a National Cybersecurity Safety Board could be created to investigate cyber-attacks on U.S. election infrastructure and issue reports after elections to help ensure that vulnerabilities do not go unaddressed.295 A crash course is also needed for local and county election officials across the nation.296 There is an opportunity for both civil society and higher education to aid in this effort, as Indiana University is doing by helping the Secretary of State’s Office prepare for a wide array of scenarios, conduct tabletop exercises, and create a cybersecurity guidebook for use by newly elected and appointed election officials.297 Other states could engage in similar partnerships, along with pooling resources to create repositories of best practices.
291. See Lawrence Norden & Andrea Córdova McCadney, Voting Machines at Risk: Where We Stand Today, BRENNAN CTR. FOR JUST. (Mar. 5, 2019), https://perma.cc/99U9-PVKD.
292. See Nathaniel Minor, Colorado Is a Pretty Darn Safe Place to Cast a Ballot. This Is How We Got Here, COLO. PUB. RADIO (Oct. 25, 2018), https://perma.cc/4X9P-4HDC (describing Colorado’s ballot-counting and risk-limiting audit systems and observing that the Washington Post called Colorado “the ‘safest’ place to cast a ballot” in the United States).
293. See generally COMPENDIUM, supra note 33 (listing the cyber security best practices).
294. See Eric Geller, New Federal Guidelines Could Ban Internet in Voting Machines, POLITICO (Oct. 30, 2019, 4:03 PM), https://perma.cc/U4K6-469Z (“[The Voluntary Voting System Guidelines]—produced by the Election Assistance Commission and the technical standards agency NIST—is not a set of mandatory federal rules. However, most states require voting equipment to pass VVSG-based testing before they buy it.”).
295. See Scott J. Shackelford & Austin E. Brady, Is It Time for a National Cybersecurity Safety Board? Examining the Policy Implications and Political Pushback, 28 ALB. L.J. SCI. & TECH. 56, 68 (2018) (“Such a model would be an improvement on the existing reliance on Cyber Emergency Response Teams . . . and aide in effective policymaking at both the state and federal level given the lack of hard, verifiable data on the scope and scale of cyber-attacks.”).
296. See, e.g., Indiana University to Help Secure Indiana’s 2020 Elections, IND. UNIV. (Oct. 25, 2019), https://perma.cc/SV86-G6VP (noting that Indiana University will host “regional ‘boot camps’ with [Indiana] county clerk offices to train election officials about how to respond to different forms of cyberattacks, such as phishing, phone scams and impersonation calls”).
297. See id. (“[S]tate legislators have awarded Indiana University $301,958 to partner with the Indiana Secretary of State’s Office to review and improve the state’s election cybersecurity incident response plan.”).
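To illustrate the risk-limiting audits referenced above, the following simplified sketch approximates a BRAVO-style ballot-polling audit for a hypothetical two-candidate contest: ballots are drawn at random and a sequential test statistic accumulates evidence until the chosen risk limit is satisfied or the audit escalates toward a full hand count. The vote totals and the bravo_audit helper are hypothetical; real audits follow the published BRAVO procedure and applicable state rules.

```python
# Simplified, illustrative sketch of a BRAVO-style ballot-polling
# risk-limiting audit for a two-candidate contest. Hypothetical data;
# real audits follow the published BRAVO procedure and state regulations.
import random

def bravo_audit(reported_winner_share, ballots, risk_limit=0.05, seed=2020):
    """Draw ballots at random until the risk limit is met or the pool is
    exhausted (which would trigger a full hand count)."""
    rng = random.Random(seed)
    threshold = 1.0 / risk_limit      # stop once the evidence ratio exceeds 1/alpha
    t = 1.0                           # sequential (Wald SPRT) test statistic
    pool = list(ballots)
    rng.shuffle(pool)
    for drawn, ballot in enumerate(pool, start=1):
        # Multiply by the likelihood ratio for each drawn ballot.
        if ballot == "winner":
            t *= 2 * reported_winner_share
        else:
            t *= 2 * (1 - reported_winner_share)
        if t >= threshold:
            return drawn, True        # sample confirms the reported outcome
    return len(pool), False           # escalate to a full hand count

# Hypothetical contest: 55% reported for the winner among 10,000 ballots.
ballots = ["winner"] * 5500 + ["loser"] * 4500
drawn, confirmed = bravo_audit(0.55, ballots)
print(f"Ballots examined: {drawn}; outcome confirmed: {confirmed}")
```

With a five percent risk limit and a ten-point reported margin, such an audit will typically confirm a correctly reported outcome after examining on the order of a few hundred ballots, which is why the Colorado model scales far better than full recounts.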
Learning lessons from the case studies in Part IV, the U.S. government could build out the capability of DHS to ward off disinformation campaigns similar to Indonesia’s approach, much as California is doing through its Secretary of State’s Office.298 Ahead of the 2020 election cycle, the United States could also work with allies around the world to build from the Paris Call for Trust and Security in Cyberspace and the Christchurch Call with specific actions, perhaps encapsulated in a Call to Safeguard Democracy.299 The UN Group of Governmental Experts and the standing working group should be leveraged in this effort, and new regional cybersecurity hubs created to speed the transfer of information between jurisdictions, as has already been accomplished through the EU’s Cooperation Groups.300 One possibility is a regional approach, such as a “South Pacific Elections Information Sharing and Analysis Center (SPE-ISAC),” a potential solution to the lack of a cohesive Pacific regional cybersecurity group.301
Finally, with regard to disinformation in particular, the U.S. government could work with the EU to globalize the self-regulatory Code of Practice on Disinformation for social media firms (thus avoiding thorny First Amendment concerns).302 It could also work to create new forums for international information sharing along with more effective rapid alert and joint sanctions regimes.303 The international community has the tools to act and to hold accountable those actors that would threaten democratic institutions. Failing the political will to act, pressure from consumer groups and civil society will continue to mount on tech firms, Facebook in particular, and may prove sufficient for them to voluntarily expand their EU efforts globally, in the same way that more firms are beginning to comply with the GDPR globally rather than designing new information systems for each jurisdiction.304
298. See Ben Adler, California Launches New Effort to Fight Election Disinformation, CAPRADIO (Sept. 19, 2018), https://perma.cc/6YSA-FY2L (“Under a recently-passed law, the office will ‘monitor and counteract false or misleading information’ that could ‘suppress voter participation or cause confusion and disruption of the orderly and secure administration of elections.’” (internal citations omitted)).
299. See World Leaders and Tech Giants Sign Ardern’s ‘Christchurch Call’ to Curb Online Extremism, SBS NEWS (May 16, 2019), https://perma.cc/G433-QEUS (explaining that the Christchurch Call is a pledge to eradicate “violent extremist content on the internet” signed by national governments and major technology companies).
300. See supra notes 199–202 and accompanying text.
301. See IND. UNIV. & AUSTL. NAT’L UNIV., supra note 252, at 101–05 (proposing specific features of a potential SPE-ISAC, considering potential benefits of such an approach, and recommending next steps for its implementation).
302. See supra notes 207–211 and accompanying text.
303. See supra notes 226–232 and accompanying text.
304. DIGITALEUROPE, ALMOST TWO YEARS OF GDPR: CELEBRATING AND IMPROVING THE APPLICATION OF EUROPE’S DATA PROTECTION FRAMEWORK 3 (2020), https://perma.cc/72PY-TM5X (PDF) (“[T]he fact that the GDPR has inspired other data protection regimes around the world, at least regarding its principles, has led many organisations to address data protection not only for their EU operations but also globally . . . .”).
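As one concrete illustration of what cross-border threat information sharing could look like at the technical layer, participating election authorities might exchange machine-readable indicators of compromise. The sketch below constructs a minimal record loosely modeled on the STIX 2.1 indicator format already used by many ISACs; the field values, the make_indicator helper, and the election-focused sharing channel are hypothetical rather than drawn from this Article’s sources.

```python
# Minimal, illustrative threat-indicator record loosely modeled on the
# STIX 2.1 "indicator" object that many ISACs exchange. The values below
# are hypothetical placeholders, not real indicators.
import json
import uuid
from datetime import datetime, timezone

def make_indicator(name, pattern):
    """Build a STIX-style indicator dictionary ready to share as JSON."""
    now = datetime.now(timezone.utc).isoformat()
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "name": name,
        "pattern": pattern,
        "pattern_type": "stix",
        "valid_from": now,
    }

# Hypothetical example: a phishing domain targeting election officials.
indicator = make_indicator(
    name="Credential-phishing domain spoofing a county election office",
    pattern="[domain-name:value = 'county-election-login.example']",
)
print(json.dumps(indicator, indent=2))
```

Whether shared through an SPE-ISAC, the EU’s rapid alert system, or a new forum, common formats of this kind lower the technical cost of cooperation, though the harder questions remain political.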