Once upon a decade ago, green computing was a big thing. Nowadays it is an actual thing, thanks to the usual suspects: virtualization and cloud computing.
Take, for example, the United States' data centres: collectively they chewed up about 70 billion kWh in 2014, about 1.8 per cent of total US consumption, according to a June 2016 study by Berkeley Lab. The authors estimate that US data centre power consumption will grow about four per cent between 2014 and 2020, a time of “drastically increased demand”.
But if data centre operators deployed best energy-efficiency practices - and customers continued shifting more workloads to hyperscale operators - consumption in 2020 could come in a whopping 45 per cent, or 33 billion kWh, below projections.
Data centre power usage increased by 90 per cent between 2000 and 2005; by 24 per cent between 2005 and 2010; and by 4 per cent between 2010 and 2014. A huge uptake of server virtualization between 2005 and 2010 meant that fewer servers were needed to do the same jobs. (The global recession in 2009 also dampened demand.)
Since 2010 the most significant factor in dampening electricity consumption has been the emergence of hyperscale data centres, which accounted for nearly all server shipment growth during the period. Server utilisation rates higher than those of smaller service providers and on-premises operators are the key here.
More power-efficient server, storage and networking nodes, and better data centre cooling designs, also have a big part to play in all sectors of the market.
The Lawrence Berkeley National Laboratory report, United States Data Center Energy Usage Report, is here. ®