Forty seconds after it launched into the skies of French Guiana in 1996, the Ariane 501, an enormous rocket carrying four satellites, exploded rather unceremoniously. Scientists from the European Space Agency, which spent ten years working on the project, watched in horror as their spacecraft—which had cost $7 billion to develop—tumbled back to earth in a cluster of smoke and flames.
It turned out that the catastrophe was caused by a few lines of faulty code that tried to convert a 64-bit floating-point number into a 16-bit integer (the digital equivalent of trying to park a semi truck in a broom closet). That was enough to trigger a chain reaction that caused the craft to veer so wildly off course that G-forces tore the boosters from the body of the rocket.
We rely on code like that in nearly every aspect of our daily lives. Cars use dozens of microprocessors to get us from A to B. Banks rely on colossal databases and algorithms to store and retrieve our financial data. And one of the main features of the Affordable Care Act, a.k.a. Obamacare, which went into effect late last year, was a website that was supposed to allow people to sign up for health insurance. Unfortunately, the launch of Healthcare.gov was a notorious disaster: The site gave users frequent error messages and had glaring security flaws. “The basic architecture of the site, built by federal contractors overseen by the Department of Health and Human Services, was flawed in design, poorly tested and ultimately not functional,” wrote Michael Scherer of TIME magazine in October.
Though we rely heavily on software engineering, the industry is still struggling to deliver products that don’t fall to pieces immediately after launching. According to the CHAOS Manifesto 2013, a yearly review of the tech industry published by the Standish Group, only 39 percent of software projects were completed on time and within budget, while nearly one in five projects failed. Worldwide, IT failures cost businesses an estimated $3 trillion a year.
“Where cars were in the 1930s and 40s is about where we are in software today,” said Dennis Frailey, a member of the Board of Governors for the Institute of Electrical And Electronics Engineers. “We don’t build [a product] to be high quality, generally speaking. The pressure is to get it out quickly. And this was one of the problems with the healthcare system is that they tried to get it out too quickly and something that massive really needed more time to do it right.”
Software failures can cost governments and businesses millions at a single stroke. In Britain, a melee of finger pointing ensued after an investigation concluded that the country’s nine-year, $17.7 billion effort to produce an electronic health records system would have to be scrapped. In 2005, British food retailer J Sainsbury had to hire an additional 3,000 clerks to stock its warehouses after the company’s automated supply-chain management system failed to move products off the shelves. And residents of Long Beach, California, got a free pass on their parking tickets when one of the city’s antiquated software systems failed to collect $17.6 million in fines.
So how can the industry improve so that commissioning a large software project isn’t like betting millions of dollars on a single spin of a roulette wheel?
Frailey suggests that software engineers should be required to demonstrate a certain level of expertise before offering their services to the public, just like professionals in disciplines like medicine and law.
“Anybody can claim to be a heart surgeon but how do you know whether they’re any good?” Frailey said. “You start by saying, ‘Do you have a medical license?’”
Every state has a licensing board, which grants certification in areas that range from education to medicine and beyond. Last year, 30 states included software engineering on a list of professions that require practitioners to pass state licensing exams in order to work on projects that could affect public safety. The National Council of Examiners for Engineering and Surveying (NCEES) administers the Professional Engineering (PE) exams, which were offered for the first time last April.
By requiring professional certification, the states are recognizing that the people who create programs to monitor and control public infrastructure, like the electrical grids that keep our stoplights and sewers working, should be held to the same standards as the civil engineers who design those systems.
The licensing effort was supported by nearly two-thirds of software engineers surveyed in a 2008 poll. Phillip Laplante, the chairman of the committee charged with developing the exam, explained the reasoning behind the licensing requirements this way: “Don’t you want some level of confidence that the person who wrote that software that is controlling the nuclear plant is who they said they are? That they have the experience they claim to have and are at least minimally qualified? Are you willing to roll the dice?”
Case in point: The 2003 northeast blackout, in which almost 55 million people in the US and Canada lost power, has been partially blamed on a programming error that allowed the outage to spread.
Laplante said he expects all 50 states to require software engineering licenses within the next decade, and possibly much sooner.
Another safety concern unique to the digital age is the rising threat of cyber attacks. The second largest portion of the new PE exam focuses on topics related to safety, security, and privacy.
“We recognize that security vulnerabilities are a real problem. Almost every day, there’s some hacking news story,” Laplante told me. “We want to ensure that the people who are working on these systems are thinking about this all the time and that it’s not an afterthought.”
Last year, the Government Accountability Office (GAO) reported that the number of cyber attacks climbed from 5,503 in 2006 to 48,562 in 2012—an increase of 782 percent. The report warned, “Cyber attacks could have a potentially devastating impact on the nation’s computer systems and networks, disrupting the operations of government and businesses and the lives of private individuals.” The GAO has issued numerous reports and recommendations asking the federal government to make improving cybersecurity a national priority.
Despite all these warnings, in 2012 a group of Republicans in the Senate, led by John McCain, blocked a bill that would have created new standards to oversee cyber threats to the nation’s infrastructure because, they argued, it would be too expensive for corporations to follow the proposed new rules.
That attitude is dangerously shortsighted—as society becomes increasingly dependent on technology the average person (or average congressperson) doesn’t understand, it’s necessary to have regulations in place to minimize coding failures that can leave people without power or worse.
“I do worry that this is going to be a sleeper topic until one day, something very bad happens and it’s traced back to some failing in software that could have been prevented,” Laplante told me. “Everything is connected to software-based systems now and all kinds of stuff can happen.”