IT now 10 percent of world's electricity consumption, report finds

New analysis finds IT power suck has eclipsed aviation


The information and communications technology (ICT) ecosystem now represents around 10 per cent of the world's electricity generation, and it's hungry for filthy coal.

In a report likely to inspire depression among environmentalists, and fluffy statements from tech companies, analyst firm Digital Power Group has synthesized numerous reports and crunched data on the real electricity consumption of our digital world.

In "The Cloud Begins With Coal – Big Data, Big Networks, Big Infrastructure, and Big Power", the research group argues that much of the cost of our digital universe is hidden from us because of the distant nature of cloud services and the lack of information about the power it takes to make our IT gear.

The coal-boosting study was sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity.

"Although charging up a single tablet or smart phone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year," they argue.

This example uses publicly available data on the average power utilization of a telco network, the cost of wireless network infrastructure, and the energy that goes into making a tablet, although it ignores the data centers the video is served out of, and tablet charging.
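A rough back-of-envelope version of that tablet-versus-refrigerators claim can be sketched as follows. Every constant here is an illustrative stand-in, not a figure from the report, which does not publish its network-energy or refrigerator assumptions in this article:

```python
# Back-of-envelope sketch of the report's tablet-video claim.
# All constants are illustrative assumptions, not figures from the report.

HOURS_PER_YEAR = 52              # one hour of video per week
GB_PER_HOUR = 2.5                # assumed data volume of an hour of streamed video
NETWORK_KWH_PER_GB = 5.0         # assumed wireless-network energy intensity
FRIDGE_KWH_PER_YEAR = 350.0      # assumed annual draw of one new refrigerator

# Energy burned in the remote networks to carry a year of weekly video
video_kwh = HOURS_PER_YEAR * GB_PER_HOUR * NETWORK_KWH_PER_GB

# How many refrigerator-years that is equivalent to
fridges = video_kwh / FRIDGE_KWH_PER_YEAR

print(f"Network energy for weekly video: {video_kwh:.0f} kWh/year")
print(f"Equivalent refrigerators: {fridges:.1f}")
```

With these hypothetical inputs the network side alone lands near two refrigerators' worth of annual electricity; the real debate is over the network-intensity figure, which critics of the report argued was far too high.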

In other words – though Google has argued that the cost of a Google search is 0.0003 kWh of energy – the likely cost is higher due to the power cost lurking in the non-Google systems used to deliver the data and perform the search.

The report's figure reflects not just the cost of data centers – according to a 2007 EPA report, US data centers consumed 1.5 percent of US electricity production, a share projected to rise to 3 percent by 2011 – but also the power involved in fabbing chips, the power consumption of digital devices, and the networks they hang off.
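To put those EPA percentages into absolute terms, here is a quick sanity check. The US total used below is an assumed round figure of about 4,000 TWh of annual generation, not a number taken from the report:

```python
# Rough absolute-terms check of the EPA data-center shares.
# The US generation total is an assumed round number, not from the report.

US_GENERATION_TWH = 4000.0

dc_2006_twh = US_GENERATION_TWH * 0.015   # 1.5 percent share
dc_2011_twh = US_GENERATION_TWH * 0.03    # projected 3 percent share

print(f"Data centers at 1.5 percent: {dc_2006_twh:.0f} TWh/year")
print(f"Projected at 3 percent: {dc_2011_twh:.0f} TWh/year")
```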

It finds that the whole spread of technologies draws down about 1,500 terawatt hours of electricity per year, representing roughly 10 percent of the world's generation. And it's going to get worse over time.

"Unlike other industrial-classes of electric demand, newer data facilities see higher, not lower, power densities," the group writes. "A single refrigerator-sized rack of servers in a data center already requires more power than an entire home, with the average power per rack rising."

Unless ARM chips take off in the data center in a phenomenally huge way – and that is doubtful until we see 64-bit chips come along with benchmarks to back them up against AMD/Intel – this will continue to hold true.

This combines with our voracious hunger for more data on smartphones to push data center power usage up faster than efficiency gains can offset it, the report argues. In one example, a Chinese telco increased the power efficiency of its data-carrying network by 50 percent, but still saw the network's power use jump through the roof as more and more people grabbed mobile devices and started browsing.
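The arithmetic behind that apparent paradox is simple: if traffic grows faster than efficiency improves, net consumption still climbs. A minimal sketch, with the traffic multiplier as an illustrative assumption rather than the Chinese telco's actual figure:

```python
# Why a 50 percent efficiency gain can still mean rising power use:
# if traffic grows faster than efficiency improves, net consumption climbs.
# The baseline and traffic multiplier are illustrative assumptions.

baseline_power_kw = 100.0     # assumed network power draw before the upgrade
efficiency_gain = 0.5         # 50 percent less energy per byte carried
traffic_multiplier = 4.0      # assumed growth in data carried

# Half the energy per byte, but four times the bytes
new_power_kw = baseline_power_kw * (1 - efficiency_gain) * traffic_multiplier

print(f"Power before: {baseline_power_kw:.0f} kW, after: {new_power_kw:.0f} kW")
```

With these numbers, a network that halves its energy per byte while carrying four times the data ends up drawing twice the power it did before the efficiency gain.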

And these forces will drive the use of coal, the coal-backed study claims. "80 percent of global ICT electricity use is highly dispersed and not consumed at the visible warehouse-scale data centers," it says. "Cost and availability of electricity for the cloud is dominated by the same realities as for society at large – obtaining electricity at the highest availability and lowest possible cost."

The company highlights numerous examples, including Greenpeace's investigation into the major US tech companies which found that they loved filthy coal, and anecdotal evidence from new Chinese data centers that tout their access to the cheap black stuff as a major selling point for capacity-conscious punters.

The report concludes that until we get a true reflection not only of the power used by our devices, but also of the power sucked down by the networks that deliver our data and the inputs that form the basis of our power generation, we will have very little idea of the exact footprint of our habit for lolcats, frequent emails, brand-new fondleslabs and streaming video – and that's a bad thing. Unless people can get a clear idea of the overall impact of their digital world, the cost to the planet will remain forever obscured. ®
