Super cool: Arctic data centres aren't just for Facebook
Going north: More than an icy blast in your chiller cabinet
Dotted around the near-Arctic are several data centres, each taking advantage of the cold air in that region. We know that low temperatures are great for cooling, but they aren't the only reason operators chose those locations.
Facebook announced its data centre in Luleå, northern Sweden, in 2011. Google rolled out its Hamina data centre in Finland the same year, using seawater from the Gulf of Finland to chill the servers that serve up your search results.
Cold climates certainly play a part in the siting of these data centres. They can reduce operating expenses, because conventional cooling systems slurp up a lot of electricity. Temperatures in near-Arctic areas are so low that companies can simply run cold air or water into their data centres, or use heat exchangers, to bring temperatures down.
But the price and availability of local power sources is at least as important.
“To say that the main reason you should come to Iceland is because of free cooling? I would disagree,” said Tate Cantrell, chief technology officer of data centre firm Verne Global, which opened its Icelandic data centre in 2012.
“That would be like me telling you that the main reason to come and put your data centre in Iceland is because you can go to the Blue Lagoon. That’s not the reason that people are coming to Iceland.”
He would know. The firm's Icelandic data centre is located at a former NATO base near Keflavík, in the southwestern corner of Iceland, not far from Reykjavik and close to two geothermal generation facilities.
There's plenty of power to go around in Iceland. The country generates more electricity per capita than any other, given its sub-350,000 population and its focus on energy-intensive aluminium smelting. Almost all of it comes from hydro and geothermal sources, providing a long-term sustainable supply that is nowhere near capacity yet.
All of this makes power incredibly cheap, Cantrell pointed out. “People are coming to Iceland because we have a campus that’s positioned to connect you to the most sustainable grid on the planet at a cost that’s in some cases a fifth of what you would pay in a major city centre in continental Europe,” he added. With power representing around half of the cost of a data centre over its lifetime, that’s a substantial saving.
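Combining the two figures quoted above gives a rough sense of the scale of the saving. This is a back-of-the-envelope sketch, not Verne Global's own arithmetic: it simply assumes power is half of lifetime cost and priced at a fifth of the continental rate, the best case Cantrell describes.

```python
# Illustrative arithmetic using the article's figures:
# - power is roughly half of a data centre's lifetime cost
# - Icelandic power can cost a fifth of continental city prices (best case)
POWER_SHARE = 0.5       # power's share of lifetime cost
PRICE_RATIO = 1 / 5     # Iceland vs continental-European city price

lifetime_cost_europe = 1.0                       # normalised baseline
power_cost_iceland = POWER_SHARE * PRICE_RATIO   # 0.5 * 0.2 = 0.10
lifetime_cost_iceland = (1 - POWER_SHARE) + power_cost_iceland

saving = lifetime_cost_europe - lifetime_cost_iceland
print(f"Lifetime cost vs baseline: {lifetime_cost_iceland:.0%}")  # 60%
print(f"Saving: {saving:.0%}")                                    # 40%
```

On those assumptions, the cheap grid alone shaves about 40 per cent off a facility's lifetime cost.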
Free air cooling does bring many benefits, though, particularly when it comes to building these data centres. On average, facilities cost around $15m to construct for every megawatt of capacity, according to Andrew Donoghue, European research manager at analyst firm 451 Research. Where outside temperatures are low enough, free air cooling lets builders do away with mechanical chillers altogether, cutting up to 40 per cent of the capital cost.
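Put those two numbers together and the capital saving is substantial. A minimal sketch, using 451 Research's $15m-per-megawatt build cost and the upper-bound 40 per cent cooling share; the 10MW facility size is an illustrative assumption, not a figure from the article:

```python
# Back-of-the-envelope capex comparison based on the figures quoted above.
BUILD_COST_PER_MW = 15_000_000   # USD per MW, per 451 Research
COOLING_CAPEX_SHARE = 0.40       # upper bound quoted for mechanical cooling
FACILITY_MW = 10                 # assumed facility size (illustrative)

conventional = FACILITY_MW * BUILD_COST_PER_MW
free_air = conventional * (1 - COOLING_CAPEX_SHARE)

print(f"Conventional build: ${conventional / 1e6:.0f}m")          # $150m
print(f"Free-air build:     ${free_air / 1e6:.0f}m")              # $90m
print(f"Saving:             ${(conventional - free_air) / 1e6:.0f}m")  # $60m
```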
"The whole mechanical cooling side is a significant proportion of the overall build cost," he said. "You're more able to do that in a cooler climate."
More companies are looking at near-Arctic locations to capitalize on benefits like these. Located in northern Sweden, Hydro66's near-Arctic data centre relies on free air cooling. It sits in the Node Pole, a popular data centre corridor just 50 miles from the Arctic Circle. Facebook's facility is in this region too.
On Hydro66's website are its constantly updated statistics: electricity, it says, is 1.65 times more expensive in London than at its site. It has saved more than a million kilowatt-hours on cooling since it opened in October 2015.
There are still risks in building in cold climates, but connectivity isn't one of them. Finland and Sweden come second and third on the World Economic Forum's Network Readiness Index (NRI), which measures the maturity of connectivity in different countries.
“There’s often been the political will to get high-speed bandwidth across the whole country, so even though the north of Sweden is always relatively underpopulated, it’s not a telecoms wasteland,” said Steve Wallage, managing director of data centre and IT infrastructure analyst firm Broad Group.
Government and local infrastructure firms are also driving the industry, Wallage added. Dominant Scandinavian telco TeliaSonera is extremely supportive of the data centre industry and has competed hard for its business, Wallage claimed. The Finnish government has also supported connectivity, rolling out a new undersea cable between Finland and central Europe.

Slightly higher latency is a risk for some operations in these remote locations, according to Roel Castalien, EMEA marketing chair at the Green Grid. For hyperscale vendors such as Facebook, it makes sense to open large data centres in areas close to growing markets like Europe, he pointed out, but local European firms may find the decision harder.
“If you're not a Scandinavian company, you always have the calculation on latency that you have to make,” Castalien said.
Nevertheless, the latency is perfectly acceptable for many workloads. It was certainly fine for BMW, which moved some of its high-performance computing capability into Verne Global's Icelandic centre in 2012. In 19th place with its three undersea cable connections, Iceland still sits in the top 14 per cent of the NRI. Donoghue points out that with around a 20ms response time to London, Iceland enjoys roughly the same level of latency as New York to Chicago.
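That 20ms figure is roughly what the physics predicts. A quick sanity check, assuming an approximate 1,900km great-circle distance from Iceland to London and light travelling through fibre at about two-thirds of its vacuum speed (real cable routes are longer, so real latency will be somewhat higher):

```python
# Theoretical best-case round-trip time over optical fibre.
C = 299_792_458      # speed of light in vacuum, m/s
FIBRE_INDEX = 1.5    # approximate refractive index of silica fibre

def rtt_ms(distance_km: float) -> float:
    """Best-case fibre round-trip time for a given one-way distance, in ms."""
    one_way_s = distance_km * 1000 * FIBRE_INDEX / C
    return 2 * one_way_s * 1000

# Approximate Iceland-London great-circle distance (assumption)
print(f"Iceland-London (~1,900 km): {rtt_ms(1900):.1f} ms")  # 19.0 ms
```

A theoretical floor of around 19ms, which squares with the roughly 20ms Donoghue quotes once routing overhead is added.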
BMW's crash simulations and other number-crunching jobs are similar to Bitcoin mining in some ways: the focus is on intensive computing, and latency is not a strong consideration. Flash trading systems might have more trouble.
There are two classes of data centre operator in these northern regions. Hyperscale firms like Facebook and Google are processing extreme volumes of data with low operating expenditure and solid infrastructure. The likes of BMW and Verne Global are tackling specific corporate workloads and taking advantage of the carbon-free operations that bolster corporate sustainability reports.
In the same northern climates, there's another, emerging class. Some operators rely on large amounts of replaceable equipment and capitalize on multiple alternative power sources, perhaps doing away with backup generators and relying instead on abundant local power and redundant grids.
"They're looking at driving costs down to $5m-$6m per megawatt or less by stripping everything down to the bare minimum of redundancy," Donoghue said.
This isn't something that a Facebook or a Google would do, but for companies with a singular focus on one workload, such as Bitcoin mining, and the ability to treat commodity ASIC equipment like cattle rather than pets, it can be a good way to justify a near-Arctic build.
“One facility was more or less just a shed. It was one of the warmest days in Iceland that we happened to be there for, and they literally had the doors of the data centre open,” Donoghue concluded. “You call it a data centre, but it would stretch the notion in some people’s minds.” ®