Cloud and hosting provider Rackspace will put the chilly air of England to work cooling servers in a new facility – though its approach differs from other energy-saving techniques used by Google and Facebook.
The company announced on Wednesday that it had broken ground on a new data center site in Crawley, West Sussex, with partner Digital Realty Trust.
With the new bit barn, Rackspace is going to use a free air cooling system that lets it do away with energy-intensive chillers – and instead put the craptacular English weather to work in the service of its humming servers.
Its approach will use technology from Brit infrastructure firm Excool, whose heat exchanger coils use the outside air to chill the inside air, without mixing the environments. This means Rackspace won't have to put external filtering systems in place to deal with particulates in the air – like smoke from a nearby fire, say – or changes in humidity levels from England's famous drizzle.
"For a large part of the year that [air cooling] process is sufficient," Gary Boyd, Rackspace's senior director of data center project engineering, explained to The Reg. "At certain times of the year when it gets warmer we spray additional water onto the face of the heating system. There's a lot less complexity – it's a very simple cooling process. It also means significantly reduced energy consumption."
This approach is something of a halfway house between power-heavy HVAC-based cooling, as found in typical data centers, and the flashy infrastructure-heavy air cooling projects used by companies like Google and Facebook. Facebook, for instance, uses external air to cool all of its internal servers in its Prineville facility, but to do so had to spend lots on infrastructure and site its facility in the dry, high deserts of Oregon. It's hard to find equivalent weather in Blighty's Crawley, Rackspace acknowledged.
"We do think that there is a lot more complexity with the direct air system," Boyd says. "We do think it's more prone to external influence and could potentially harm data control conditions internally – our approach is to keep it separate."
Rackspace expects that the power usage effectiveness – the ratio of total energy drawn by the facility to the energy actually delivered to IT gear, so lower is better – of its new facility will be around 1.17, versus 1.55 for its current bit barn in Slough, he said. (Facebook's facility hovers around 1.09, reflecting the energy gains from its direct air cooling.)
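For the numerically inclined, PUE is simple enough to sanity-check yourself. A minimal sketch – the kilowatt figures below are illustrative only, not numbers Rackspace has published:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power.

    A PUE of 1.0 would mean every watt reaches the servers; anything above
    that is cooling, power distribution and other overhead.
    """
    return total_facility_kw / it_load_kw

# At Crawley's projected PUE of 1.17, a hypothetical 1,000kW IT load
# would draw 1,170kW at the meter - 17 per cent overhead.
crawley_overhead = pue(1170, 1000) - 1    # 0.17

# Slough's 1.55 on the same IT load: 55 per cent overhead.
slough_overhead = pue(1550, 1000) - 1     # 0.55
```

In other words, for every watt the Slough servers burn, the building burns roughly another half-watt on top – which is the gap the free air cooling is meant to close.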
Rackspace's new facility will have 130,000 square feet of IT space, with 10MW of power capacity already secured – a significant upgrade on the 50,000 or so square feet of its Slough facility.
By comparison, Facebook's 30MW server vault in Prineville, Oregon, has about 650,000 square feet of IT space.
Rackspace hopes to open two of four planned data halls in the first half of 2015, and will run the cabling overhead rather than under a raised floor. It will also do hot aisle and cold aisle separation – but then again, most new-build facilities do this if they have the space, as it's one of the easiest ways to save on cooling.
"The addition of another data center in the UK is in response to the demand in the market as well as growth opportunities for the business across Europe," Rackspace's chief operating officer Mark Roenigk said in a canned statement.
Rackspace had not got back to us with employment figures for the new facility at the time of writing.
Though Facebook and Rackspace are rightfully patting themselves on the back for using the air to cut their electricity bills and simplify their infrastructure, they've got nothing on Google – in 2011 the search company revealed that it was harnessing the chilly Baltic sea to cool a vast data center on the southern coast of Finland. ®