Nuclear Power Plants Have a 'Blind Spot' for Hackers. Here's How to Fix That.





Malware hunters, regulators, and plant employees are hunting further down the supply chain for vulnerabilities as hackers continue to target critical infrastructure.

Billy Rios likes to hack the machines that make modern society function. Take the Morpho Itemiser 3, a prototype of the device the Transportation Security Administration uses to screen airport travelers for explosives and narcotics.

Rios, a security researcher, decided one day in 2013 to probe the Itemiser 3, which TSA had tested in the lab but never deployed. He bought the device online, took it apart, and found that a password was hardcoded into the equipment. This shortcut allowed technicians easy access to the machine, but also potentially left it prey to hackers who could remotely commandeer it.
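A hardcoded credential like the one Rios found is often discoverable with little more than a string scan of a device's firmware image. The sketch below is purely illustrative — the firmware bytes, key names, and password are invented, not taken from the Itemiser — but it shows the kind of pattern-matching a researcher might start with after dumping a device's storage.

```python
import re

# Hypothetical firmware image; a real analysis would dump this from the device.
firmware = b"\x7fELF...boot code...USER=technician\x00PASS=maint1234\x00...more data"

# Look for printable key=value pairs that resemble embedded credentials.
CRED_PATTERN = re.compile(rb"(USER|PASS|PWD|LOGIN)=([\x20-\x7e]{4,32})")

def find_hardcoded_credentials(blob: bytes):
    """Return (key, value) pairs of credential-like strings in a binary blob."""
    return [(k.decode(), v.decode()) for k, v in CRED_PATTERN.findall(blob)]

for key, value in find_hardcoded_credentials(firmware):
    print(f"possible hardcoded credential: {key} = {value}")
```

A real audit would go much further — disassembling the firmware, tracing how the string is used at login — but even this crude search is often enough to surface a technician backdoor.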


In August 2014, at the annual Black Hat Conference in Las Vegas, Rios presented his findings to the public. “Once someone else discovers the technician password, it [becomes] a backdoor,” Rios said at the time, calling the shortcut “dangerous” for the access it allows a hacker.

These days, companies in charge of some of the United States’ most critical infrastructure hire WhiteScope, Rios’s cybersecurity firm, to breach systems and then explain how they did it, all to prepare for the real thing. He and his team of researchers have picked apart the communication systems used by airplanes and cars. But sometimes Rios’s tests stray into unforeseen territory. It was only shortly after he presented his Itemiser findings four years ago, for example, that he learned how the device was used in the nuclear sector.

For all of the hardcore services WhiteScope offers, Rios, a veteran of the Iraq War and a former incident response leader at Google, has only begun scratching the surface of the complex supply chain that feeds the thousands of digital components that go into a nuclear facility.

A nuclear power plant’s critical systems are well fortified against run-of-the-mill cyberattacks launched from outside a plant. That makes the supply chain, with its often far-flung production sites, a logical target for well-resourced hackers looking for a foothold in a facility. As a result, meticulous regulators, seasoned nuclear plant employees, and cunning penetration, or “pen,” testers like Rios are all playing their part in the ceaseless effort to make the supply chain more cyber-secure.


The supply chain is “a huge blind spot right now,” Rios told me. “We can test the security of someone’s environment and we can test the security of someone’s devices in these environments. But testing an actual supply chain attack is really hard because it involves coordination between a lot of different players.”

Easy Avenue

A typical American nuclear plant has between roughly 1,000 and 2,000 “critical digital assets,” or digital components and support systems that impact safety, security, or emergency preparedness, according to Jim Beardsley, a cybersecurity official at the US Nuclear Regulatory Commission. With many analog components going out of stock, the onus is on nuclear operators and their suppliers to conduct rigorous tests to ensure that equipment installed at plants is bug-free.

Meanwhile, as that vital work continues, hackers are operating on their own schedules, biding their time and looking to infect critical-infrastructure supply chains where they can. The Department of Homeland Security warned in March that Russian government hackers had been targeting the nuclear industry, among others, as part of a broad two-year campaign that looks to exploit “trusted third-party suppliers with less secure networks.”

Publicly reported data breaches at nuclear facilities are rare. Generally, breaches have been limited to IT networks that do not affect critical safety and security systems (a notable exception being Stuxnet, the infamous, advanced computer worm widely believed to have been developed by the US and Israel to strike a uranium enrichment facility in Iran in 2009—more on this in a moment). Nuclear operators carefully isolate critical systems from public-facing networks; many systems sit behind a “diode” that allows data to flow in only one direction, shielding them from outside hacking.
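The one-way “diode” is purpose-built hardware, but its principle can be sketched in software. In the rough illustration below — the host, port, and reading format are invented for the example — the protected side of the network only ever transmits; there is no code path for anything to come back in.

```python
import socket

# A data diode is a hardware device, but its effect resembles a strictly
# one-way publisher: the plant side transmits readings and never listens.

def publish_reading(reading: dict, host: str = "127.0.0.1", port: int = 9000) -> int:
    """Send one telemetry reading out over UDP; return the number of bytes sent.

    There is deliberately no recv() anywhere: as with traffic behind a diode,
    data can leave the protected network, but commands cannot flow back in.
    """
    payload = repr(sorted(reading.items())).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        return sock.sendto(payload, (host, port))
```

The hardware version enforces this physically — often with a transmit-only optical link — so the guarantee holds even if the software on either side is compromised.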


As many nuclear power plants were built decades ago, the industry has long employed analog equipment, gear that has no digital component and is therefore immune to hacking as we know it today. While such equipment will continue to feature in plants for safety and cyber and physical security reasons, more and more gear has digital features whose cybersecurity operators must guarantee.

The International Atomic Energy Agency, the UN’s nuclear watchdog, cautions that updated components for nuclear plants, such as pressure sensors and flow meters, are increasingly coming with embedded software. In some cases, IAEA guidance states, nuclear plant employees “specifying and purchasing instrumentation may not be aware that a supplier’s product contains embedded software.” And product manuals, the IAEA adds, may not clearly indicate as much.

Stuxnet showed what was possible when attackers disguise their malware as a trusted computer program in the supply chain.

The hackers behind Stuxnet, as previously reported in WIRED, stole the digital certificates of two Taiwanese hardware companies and used them to sign computer drivers. The authentic certificates fooled the defenses of Windows operating systems, allowing the malicious code to load. Using portable media to reach an “air-gapped” system with no direct or indirect connections to the internet, the attackers were able to destroy about 1,000 centrifuges at the Natanz enrichment facility.


The incident illustrated how hard it can be to spot compromises to the supply chain, something more recent hacking operations targeting critical infrastructure have exploited.

In 2013 and 2014, for example, members of Dragonfly, an advanced Russian hacking group, infiltrated the websites of industrial control systems (ICS) software vendors. Customers visiting a vendor’s website risked downloading malware that had been bundled with a legitimate software update. While there is no evidence that the vendors had clients in the nuclear industry, experts say that attack vector—one that exploits publicly available software updates—is a logical one in any industry. And the persistence of groups like Dragonfly is a reminder that attackers have time on their side and may wait years before leveraging a backdoor buried in the supply chain.
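One common defense against a trojanized update of this kind is to verify a download against a digest published over a separate, trusted channel, rather than trusting whatever the vendor's website happens to serve. A minimal sketch — the update bytes and publication channel here are invented for illustration:

```python
import hashlib
import hmac

def sha256_hex(payload: bytes) -> str:
    """Hex SHA-256 digest of an update payload."""
    return hashlib.sha256(payload).hexdigest()

def verify_update(payload: bytes, expected_digest: str) -> bool:
    """True only if the payload matches a digest obtained out of band.

    A Dragonfly-style swap of the installer changes the digest, so a
    tampered download fails this check even when the website looks normal.
    """
    return hmac.compare_digest(sha256_hex(payload), expected_digest.lower())

genuine = b"ICS vendor update v2.1 installer bytes"
expected = sha256_hex(genuine)  # as if published by the vendor out of band

print(verify_update(genuine, expected))                # True: clean download
print(verify_update(genuine + b"backdoor", expected))  # False: tampered
```

The check is only as strong as the channel the digest arrives over — which is why code-signing with properly guarded keys, rather than a hash on the same compromised website, is the usual production answer.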

Hackers look for new pathways into a network as environments become more locked down, said Liam O’Murchu, a security specialist at Symantec who was one of the first malware analysts to dissect the Stuxnet worm. “The supply chain attacks that we’re seeing right now,” O’Murchu told me, “are a very easy avenue compared to some of the older avenues that have become more difficult.”

History That Is Not Reflected

The nuclear industry has long paid close attention to supply-chain security. Cybersecurity concerns, the notion that chips, routers, and other gear can be maliciously exploited, have intensified in recent years around high-profile events like Stuxnet, but also as authorities in the US, Britain, and elsewhere have moved to regulate the supply chain.

A set of US Nuclear Regulatory Commission regulations initiated in 2009—but, in general, not due for full implementation until the end of 2017—requires American nuclear plant operators to demonstrate stricter oversight of the cybersecurity of their supply chains. The program is more dynamic than past procurement practices because regulators, operators, and suppliers must continuously assess the cyber-threat environment, according to George Lipscomb, a former NRC inspector.


Ahead of a fresh round of plant inspections, US nuclear operators further scrutinized their supply chains. Over the course of three days in April 2017, Cooper Nuclear Station auditor Talisa Chambers and her colleagues at utility Nebraska Public Power District (NPPD) went through a range of security checks at the production site of a supplier of critical digital equipment. In the process, Chambers told me, they learned new things about the supplier’s security environment.

Prior to the audit, Cooper Station officials did not know the vendor re-tested equipment that arrived on site from sub-suppliers, according to Chambers. Absent that practice, Cooper Station could have required the unnamed vendor to show that those sub-suppliers had cybersecurity controls of their own in place. Working out these policy differences, in other words, is crucial to minimizing the number of blind spots in the supply chain.


Alvin Hays, a half-century veteran of the nuclear industry and senior systems analyst at NPPD who participated in the April 2017 audit, said the vendor performed well overall, but that there were still some security kinks to work out. “There were areas where they didn’t realize they needed to have a policy,” Hays told me. “It was definitely still in draft form.”

In one case, Hays explained, a documented control had an effective date of the audit, meaning it had been assembled expressly for the inspection. Hays said it was NPPD policy to test all the gear it gets for vulnerabilities, even if the equipment comes from a trusted vendor.


Beardsley, the NRC official, told me the commission plans to review the regulations, which analysts credit as the most scrupulous cybersecurity standards in the ICS industry, in 2019, and update them where necessary. But some security experts still worry about the time that has passed since those regulations’ inception in 2009.

“You have more than eight years of industrial cybersecurity history that is not reflected in those regulations,” said Michael Toecker, a cybersecurity engineer for industrial systems. “The longer in the tooth that those get, the more an adversary is adapting.”

The Culture to Make It Better

Despite rigorous equipment tests performed by nuclear facilities, the elusive nature of software bugs means some inevitably do slip through the cracks. The extent to which the nuclear industry can work with outside researchers who identify vulnerabilities that plant officials miss will be key to supply-chain cybersecurity.

A few days after presenting at Black Hat in August 2014, Rios, the pen tester, got a call from an employee at a US nuclear facility asking him for more details on the Itemiser’s password backdoor.

“I had no idea that the same devices used to detect explosives at airports were also used at nuclear facilities,” Rios told me.

Rios isn’t the only security researcher to point out vulnerabilities in commercial devices used at nuclear facilities.

Last July, Ruben Santamarta, principal security consultant at cybersecurity firm IOActive, showed that an attacker could exploit vulnerabilities in radiation monitoring devices to falsify radiation readings. And in over a decade of running vulnerability assessments at power plants, including multiple nuclear facilities, Bryan L. Singer, IOActive’s director of industrial cybersecurity services, told me he has always uncovered some vulnerability, whether computer worms or Trojan viruses disguised as legitimate software, previously unknown to plant officials. According to Singer, many of those vulnerabilities were introduced early in the supply chain.



Discovering such vulnerabilities is not necessarily a cause for concern. In every industry, from nuclear to retail, a good malware hunter strives to find something. A reluctance to go looking for vulnerabilities, however, would be a problem. And while the nuclear industry has made strides in more proactively probing networks for bugs, observers like Tom Parkhouse say, as in other industries, more can be done.

Parkhouse, the top cybersecurity official at British regulator the Office for Nuclear Regulation, said he came across an organization working in the nuclear sector that tied employee bonuses to a clean sheet for reporting cyber vulnerabilities: an employee who flagged no vulnerabilities could be eligible for a bonus. Parkhouse declined to say whether that was a vendor or some other organization, but stressed that the practice is the exception rather than the norm and that he’s had success in stamping it out. (That type of practice is very rare indeed because, industry insiders say, nuclear security culture encourages people to report security concerns whenever they arise.)

“I want people to understand what the risk is rather than deny the problem or overblow it,” Parkhouse said, adding that the UK nuclear sector is accomplishing this in part by sharing threat information as the cyber landscape changes.


Pen testing can make nuclear operators aware of vulnerabilities in the supply chain, though it is ultimately the operator’s decision, guided by regulation, what to do about them.

“It is as simple as: Do people understand their exposure to risk?” said Parkhouse, who in his military career was deployed on a nuclear security site. “Do they have the agility to respond to the unexpected and have they got the culture to make it better or worse?”

The nuclear industry, he added, “recognizes all those things as its inherent responsibilities anyway when it comes to safety.”

How well the industry continues to apply that experience to cybersecurity will determine its ability to keep adapting to the digital age.

“You can’t hardly buy anything now that isn’t digital in some regard,” said Hays. “And because of that we have to be extra vigilant.”

Reporting for this story was supported by a grant from The Pulitzer Center on Crisis Reporting.
