Green your data centre – without ending up in the Job Centre
Budget and planet-saving technology ideas
Cunning cooling
Similarly innovative is the concept of “adiabatic” air handling systems. The term adiabatic is defined as: “Of, relating to, or being a reversible thermodynamic process that occurs without gain or loss of heat and without a change in entropy” – or, to take a more comprehensible definition: “[a process] that occurs without transfer of heat or matter between a system and its surroundings”.
Adiabatic air handling also has the benefit that you don't need half as much equipment as you do with a traditional air-con system — and so, as with the UPS removal, you've the potential to save in that respect.
You may well be thinking by now that these examples have become increasingly believable and decreasingly bonkers the further in you've read – and you'd be right: the adiabatic stuff in particular can be plonked in the result-of-ongoing-innovation pot rather than the fruit-loop-research-for-the-sake-of-it one. So let's carry on that progression into the day-to-day activity both you and your mainstream service provider can do in a green sense.
Mainstream greenness for the data centre provider
OK, so let's look at some of the things you're likely to find in the average data centre these days.
First is imposing strict control on airflow by building doors, walls and ceilings around cabinets (the “cold aisle” approach). If you can maximise the delivery of cold air to the cabinets, you maximise how much of it the servers ingest; they run cooler, spin their fans less, generate less heat and last longer, and you waste less cold air by sucking it straight out of the room without it ever passing through a server.
Next up is providing an incentive for the customer to think hard about power consumption, by lowering the default amount of power provision per rack and charging for overages.
If a customer knows it's going to cost them a couple of hundred quid per month per extra kilowatt their equipment draws, they'll be sure to use it efficiently (not least because the cost of putting in more efficient servers is offset by the tangible, automatic financial saving in power costs).
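The sums here are simple enough to sketch. A minimal back-of-the-envelope calculation, assuming an illustrative overage rate of £200 per extra kilowatt per month (the figures are assumptions, not any provider's actual tariff):

```python
# Illustrative overage cost sketch -- the rate is an assumed figure,
# not a quote from any real data centre provider.
RATE_PER_KW_MONTH = 200.0  # assumed charge, GBP per extra kW per month


def annual_overage_cost(extra_kw: float, rate: float = RATE_PER_KW_MONTH) -> float:
    """Yearly cost of drawing extra_kw beyond the rack's default power allocation."""
    return extra_kw * rate * 12


# A rack drawing 3 kW over its allowance costs £7,200 a year,
# which quickly offsets the premium of buying more efficient servers.
print(annual_overage_cost(3))  # 7200.0
```

Seen that way, the incentive is automatic: every kilowatt shaved off the draw shows up directly on the invoice.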
Thirdly, it's becoming more common to see data centre providers using solar generation to some extent – even if the primary use is for powering the non-system-critical services such as lighting and the canteen coffee machine.
It's free power, and if you're spending a few million pounds building a data centre, why wouldn't you spend a few thousand on solar panels? Of course, the likes of Google, being Google-sized, do this in spades and take it to extremes. But solar generation is renewable power for the masses these days, so more and more people are doing it.
Finally, service providers often provide more than just good old co-location (where you buy cabinet space and power). It's common to see hosted services, often badged with the trendy “cloud” label, offered by service providers, because if you've spent millions building a data centre installation, it's actually a relatively modest further step to become a hosted services provider.
And hosted/cloud services means sharing less hardware dynamically among more customers, which by definition brings down both the space requirement and the power footprint.
What you can do – and why
In a similar vein, there's plenty you can do – which is fortunate if, as I mentioned earlier, your provider is charging you through the nose if you want more than a handful of electrons.
First is to consider whether you need data centre space at all: shove your applications in the cloud and it's somebody else's problem (and generally speaking, the larger the provider the greener the systems they can afford to invest in).
If you do decide to host your own, think virtual: a blade-based system with power supplies and fan units shared between the server modules, with VMware or Hyper-V plonked on top running dozens of virtual machines, will have a fraction of the power footprint of the physical server equivalent (and will need fewer cabinets, too).
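The "fraction of the power footprint" claim is easy to sanity-check with some rough arithmetic. A minimal sketch, assuming illustrative draw figures (roughly 350 W for a discrete 1U server and 4,500 W for a fully loaded 16-blade chassis; both numbers are assumptions for the example, not measurements):

```python
# Illustrative consolidation arithmetic -- the wattages below are
# assumed round numbers, not vendor specifications.
PHYSICAL_SERVER_W = 350    # assumed draw of one discrete 1U server
BLADE_CHASSIS_W = 4500     # assumed draw of a loaded 16-blade chassis


def consolidation_ratio(n_workloads: int) -> float:
    """How many times more power n discrete servers draw than one
    blade chassis running the same workloads as virtual machines."""
    return (n_workloads * PHYSICAL_SERVER_W) / BLADE_CHASSIS_W


# Forty workloads: 14,000 W as separate physical boxes versus
# 4,500 W virtualised on one chassis -- roughly a 3:1 saving.
print(round(consolidation_ratio(40), 2))
```

The exact ratio depends entirely on how heavily you load the chassis, but the direction of the saving holds: shared power supplies and fans amortise their draw across every VM they host.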