The United States appears to have dodged a bullet in last month's global ransomware outbreak, as the worst impacts were felt on computers overseas. But the WannaCry attack, in which infected machines were locked down until a ransom was paid in Bitcoin, highlights the need to find better solutions for the enormous vulnerabilities baked into many of our IT systems.
This problem is systemic, and there are no easy answers to ensure effective cybersecurity.
Our nation's economy, increasingly inseparable from our digital economy, has rewarded innovation that brings internet connectivity and software to ever-more products, processes, and sectors. However, security and resiliency too often are an afterthought.
This poses a huge and unacceptable risk. And, truth be told, too much of our commercial software is insecure. It's been estimated that the typical piece of software contains 25 errors per 1,000 lines of code. While not every one of those errors is something that a hacker can exploit, or that will cause a fatal error, we're still living with an unacceptable level of insecurity in software that runs our hospitals, transportation networks, and financial systems.
This is especially true as our economy hurtles headlong into the world of the Internet of Things (IoT), which includes appliances and everyday objects embedded with internet connectivity and data-processing capabilities. According to one study, up to 70% of the most common IoT devices contain easily identifiable vulnerabilities.
This is a whole-of-society problem, and meaningfully addressing it will require an across-the-board increase in public understanding of cybersecurity issues—in classrooms, boardrooms, and congressional hearing rooms.
Congress has examined these issues, in fits and starts, for much of the past two decades. The result of those efforts? A simple information-sharing bill Congress passed in 2015.
It's past time to get serious.
First, while Americans overwhelmingly express concern with the security of their data and devices, we must address their sense of helplessness when it comes to protecting themselves. And we need to provide better tools to help them do it.
For instance, despite an obvious demand for greater security, many consumers have difficulty evaluating the relative security of different technology products. Ongoing efforts by the National Telecommunications and Information Administration to drive greater transparency in the upgradability of IoT products, including through better packaging labels, represent a good start.
Industry also should get behind efforts to develop meaningful rating systems for consumers to evaluate competing products. And the federal government should use its purchasing power to shape the market towards more secure products. Additionally, the Federal Trade Commission should expand, rather than retreat from, efforts to hold vendors accountable for misrepresentations in the security claims they make to consumers.
Second, today's mostly anemic cybersecurity policy discussion needs to be broadened to focus on less sexy topics, such as patch management, IT modernization, and digital identity management.
In the wake of WannaCry, many organizations' continued reliance on vulnerable legacy systems, and their failure to install critical security updates, looked like clear mismanagement. But patch management is a complex undertaking, particularly for large organizations and enterprises.
For instance, patches, even from leading vendors, can render an entire IT system inoperable for a period of time. What's more, a piece of capital-intensive equipment may have embedded software: that means you cannot upgrade its operating system without replacing the entire machine.
The United States has what appears to be a longer-term economic problem when costly, critical systems with lifespans measured in decades depend on software that's supported for only four or five years. This economic tension will only grow more acute as we see growth in software-supported digital infrastructure.
Third, a critical part of the nation's response to our pervasive insecurity must come from greater innovation and professionalization across the broader cybersecurity industry. Today, the US has an estimated 350,000 unfilled cybersecurity jobs, and future success requires that we focus our efforts much earlier, including developing better cybersecurity competencies, and wider computer science education, at younger ages.
To be sure, none of these recommendations alone will completely solve the problem. But the sheer scale of the challenge and the scope of possible solutions should not deter us from making progress as quickly as we can.
After all, digital insecurity threatens to impede the growth of the digital economy itself. Just last year, Census Bureau surveys revealed that 45 percent of online households indicated that concerns about online privacy and security had deterred them from conducting financial transactions, buying goods or services online, or posting on social networks.
If our pervasive vulnerability continues, the explosive growth in software which propels the digital economy may ultimately contribute to the very undoing of the digital economy.
Warner, a Democrat, is a former Virginia governor, venture capitalist, and co-founder of Nextel. He serves on the Senate's Banking, Budget, Finance and Intelligence committees.