Artificial intelligence might seem ephemeral, but the incredible recent advancements in machine learning have an environmental cost.
According to a new study, training one off-the-shelf AI model using a single high-end graphics card (GPU) generates the carbon equivalent of a flight across the United States. Training a different AI model was estimated to pump as much as 626,000 pounds of carbon dioxide into the air, roughly five times the emissions created by one car in America during its entire lifecycle, from manufacture to scrappage, the MIT Technology Review pointed out.
The research was carried out by Emma Strubell, Ananya Ganesh, and Andrew McCallum at the University of Massachusetts, Amherst. A preprint of the paper is available online.
For an AI program to, say, play Quake III, it first needs to be "trained" on a large amount of input data. This means continually running computations on huge collections of data like images, sometimes tens of millions. Like cryptocurrency mining—which in many cases requires running GPUs 24 hours a day—this heavy computation requires a lot of energy.
“I'm not against energy use in the name of advancing science, obviously, but I think we could do better in terms of considering the trade off between required energy and resulting model improvement,” Strubell said in an email.
To quantify the carbon footprint of AI, the researchers trained eight off-the-shelf natural language processing AIs—models designed to generate convincing language, such as OpenAI’s GPT-2 model—for 24 hours using a single Nvidia Titan X GPU, and in one case three Nvidia GTX 1080 Tis. These GPUs are definitely high-end, but they are available to consumers.
The researchers measured the CPU and GPU power consumed while training each AI over 24 hours. Then they pulled the total number of training hours from each AI's original research paper to estimate the total energy consumed during training. Finally, they converted that energy figure into pounds of carbon dioxide using average U.S. emissions data provided by the Environmental Protection Agency.
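That conversion is straightforward arithmetic, and a minimal sketch makes it concrete. The wattages and training hours below are hypothetical placeholders, not figures from the study; the 0.954 pounds of CO2 per kilowatt-hour is an EPA average U.S. emissions factor of the kind the researchers relied on.

```python
# Sketch of the study's energy-to-carbon conversion (illustrative numbers only).
cpu_watts = 100.0             # hypothetical average CPU power draw during training
gpu_watts = 250.0             # hypothetical average GPU power draw during training
total_training_hours = 120.0  # hypothetical figure of the kind pulled from a model's paper

# Total energy consumed, in kilowatt-hours
kwh = (cpu_watts + gpu_watts) * total_training_hours / 1000.0

# Assumed EPA average U.S. grid emissions factor: ~0.954 lb CO2 per kWh
LB_CO2_PER_KWH = 0.954
co2_pounds = kwh * LB_CO2_PER_KWH

print(f"{kwh:.1f} kWh -> {co2_pounds:.1f} lb CO2")
```

With these placeholder numbers, a 120-hour run at 350 watts works out to 42 kWh, or about 40 pounds of CO2; the models in the study trained for far longer, on far more hardware.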
While the realization that training an AI model can produce emissions equivalent to a cross-country flight may be shocking, it's important to remember that once that intensive training process is complete, the model's carbon footprint decreases. But the carbon footprint of artificial intelligence is still an important consideration for researchers and companies going forward.
“I think large tech companies that use AI throughout their products are likely the largest contributors to this type of energy use,” Strubell said. “I do think that they are increasingly aware of these issues, and there are also financial incentives for them to curb energy use. For example, Google's data center efficiency is far above the industry average and they are sourcing much more renewable energy than the US average.”
AI is poised to impact more aspects of our daily lives, including in self-driving cars (assuming that technology ever really gets off the ground). As with cryptocurrency mining, it's important to remember that seemingly abstract computer processes have real effects.