As the number of people around the world who are connecting to the internet continues to mushroom, the physical infrastructure necessary to support all that data is being upgraded and improved. The International Telecommunication Union estimates that by the end of this year, 47 percent of the global population will be online. Earlier this year, Google estimated it handles, on average, about 40,000 search queries every second.
Tech giants like Microsoft and Google are forever updating their data centres—the giant server farms that handle every download and search query thrust into the ether. As the demand for these centres grows, the innovation that goes into building them gets increasingly sophisticated.
Operating this gargantuan network of servers means that these data centres are notoriously power-hungry and pump out an enormous amount of waste heat. At the same time, these facilities often need to be as close as possible to major population centres to reduce the costs associated with transmitting all this data to where it's needed.
Microsoft attempted to reconcile the overheated output with population proximity by launching an experiment in 2015 called Project Natick. Since roughly half the global population lives near a coast, the company reasoned it could tap the cooling power of seawater. Natick involved submerging a self-contained data centre underwater as a test case to see if submersible cloud computing is a viable technology.
The goals of Project Natick were twofold: to determine if ocean waters off the coast of California, at a depth of hundreds of metres, could be used in a heat-exchange system to draw off the thermal energy generated by the humming microprocessors in a submerged data centre; and to determine if wave energy could be captured to provide some of the power needed to crunch the massive quantities of data being handled by the underwater system.
Microsoft declared Natick a success and says it has plans to expand the program, but it is keeping the details of the next phase of the project a secret. The company refused to discuss its future plans with Motherboard.
A few thousand kilometres to the northeast, Google has been running a seawater-cooled data centre near the Finnish city of Hamina since 2011. The idea may be less radical than operating an internet facility deep beneath the waves, but Google's Hamina operation has shown that it's possible to run a power-hungry data centre with minimal environmental impact.
Google's Hamina centre started life as the Summa paper mill, an industrial facility built in the 1950s. Nestled on a quiet bay in the Gulf of Finland, about 130 kilometres northwest of Helsinki, the former mill features a huge seven-by-four-metre tunnel that runs under one of the buildings and directly into the Gulf. Google opened the data centre after a €200 million investment (about US$240 million) that took advantage of the mill's unique architectural feature.
"The data centre here is one of the largest data centres in the world—if not the largest—to be using seawater as a coolant," said Arni Jonsson, senior facilities manager of the Hamina centre, on the phone from the site. The tunnel is connected to a massive intake chamber that feeds directly into the centre's cooling system. From here, the water is drawn by pumps into a series of heat exchangers, sucking the thermal energy from the racks of servers inside the centre before it's discharged back into the Gulf. When the water returns, it's a few degrees warmer, and actually cleaner, than it was when it went in. "It's all free-flow," said Jonsson. "We don't use any energy in getting the water from the sea."
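For a sense of scale, the heat carried off by that free-flowing seawater follows from a standard heat-balance formula, Q = ṁ·c·ΔT. The sketch below uses illustrative figures—the flow rate and the "few degrees" of warming are assumptions, not numbers from Google—with only the physics itself being standard.

```python
# Back-of-envelope estimate of the cooling a free-flow seawater loop provides.
# The flow rate and temperature rise are illustrative guesses, not Google's
# figures; only the relation Q = mass_flow * specific_heat * delta_T is standard.

SEAWATER_DENSITY = 1025.0  # kg/m^3, typical seawater
SEAWATER_CP = 3990.0       # J/(kg*K), specific heat of seawater

def heat_removed_mw(flow_m3_per_s: float, delta_t_k: float) -> float:
    """Thermal power (in megawatts) carried away by seawater that
    warms by delta_t_k kelvin while flowing at flow_m3_per_s."""
    mass_flow = flow_m3_per_s * SEAWATER_DENSITY   # kg/s
    watts = mass_flow * SEAWATER_CP * delta_t_k    # J/s
    return watts / 1e6

# Example: 1 cubic metre per second warming by 3 K ("a few degrees")
print(f"{heat_removed_mw(1.0, 3.0):.1f} MW")
```

Even at a modest one cubic metre per second, a few degrees of warming corresponds to roughly 12 MW of heat removal—which is why seawater is attractive for facilities of this size.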
Elevated water temperatures around power plants, where warmer water re-enters the ecosystem, have been shown to cause algal blooms and dead zones for fish. But at Hamina, the outgoing water is mixed with seawater at the original temperature, mitigating this effect. "We were concerned with the impact of this heat coming back into the bay on the fish living there," said Jonsson. "We are doing a study where we measure the impact of the site on the fish, the quantity of the fish. So far, the impact has at least been positive. We have seen an increase in the fish population." In addition, no chemicals are used in the heat-exchange process, according to Jonsson.
Other environmentally conscious organizations are exploring even more radical concepts for data centre design. Each new effort underscores the delicate balance between industry and nature: these data centres need to exist, and we need to understand how they interact with their local environments.
One resource that all data centres require is vast amounts of water to cool their overheated circuits. Whether it comes from a huge gulf or not, it's incumbent on us to ensure it's managed responsibly.