Boutique data center server and storage maker Rackable Systems has unveiled an homage to Google's original home-grown servers - which were essentially bare motherboards thrown on cookie sheets with rubber mats and stacked in bakery racks.
You can't charge a lot of money for Google's server design, so Rackable's CloudRack racks and related storage trays look a little more rugged, quite a bit more organized, and certainly more professional - not that any of this matters to an upstart, always-right company like Google.
The CloudRack rack and tray design is not just about creating servers that are cheaper than standard rack-mounted or blade servers, inasmuch as they have a lot less metal in them. (Thanks to the formerly exploding and still rapidly growing Chinese economy, metals of just about every kind are increasingly expensive).
Taking the metal skins off the servers not only saves money, it decreases the weight of the rack of servers (meaning less strain on data center floors and on human backs) and makes the gear easier to cool (since there is no metal obstructing air flow across the machinery).
The CloudRack setup is also about making servers more serviceable. (If you have never tried to slide out a dust-encrusted, wire-entangled server from a rack and then open it to fix it, you can't appreciate how much of a pain in the neck this is). And for companies with thousands or tens of thousands of servers, serviceability is a big deal because someone has to run around and fix broken components, and this takes time, and time is money, especially when the time is related to human beings. In a funny way, as Google has discovered in so many ways, less is more.
Rackable is obviously targeting the same cloudy customers with the CloudRack - meaning massively scaled out server infrastructures with either scientific, data warehousing, or Web 2.0 workloads. That's what it has been doing with its other rack designs. And it is not clear if the new CloudRack machinery is more or less expensive than prior designs either, because Rackable hides behind the fact that its setups tend to be heavily customized as an excuse for not providing list pricing for its components. (Which is obviously silly but maybe necessary for Rackable to sell against the tier one, general purpose rack and blade server makers like IBM, Hewlett-Packard, Dell, and Sun Microsystems).
The presumption any customer should take into the deal is that the CloudRack racks should be about the same price as prior racks from Rackable and that the server trays should be cheaper than rack designs of equivalent computing power at equivalent density, because the metal is gone. If Rackable doesn't agree with that assessment, IBM has dense iDataPlex gear, HP has two-server blades for its c7000 chassis, and Dell is thrilled to bring in its Data Center Solutions unit to create custom-made servers and data center designs for you.
The CloudRack comes in 22U or 44U sizes, and a tray that slides into the rack takes up 1U of space, just like a regular rack server. Rather than putting tiny muffin fans and power supply fans in each tray, the rack itself has two or four highly efficient axial fans that span the width of the servers.
A large fan moves air much more efficiently and quietly than a collection of smaller fans (often put in series two or three deep and arrayed many fans wide) that moves the same volume of air. Saeed Atashie, director of server products at Rackable, says a half-rack of standard 1U servers has something on the order of 200 muffin and power supply fans, and these are replaced with two large axial fans in the 22U CloudRack.
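To put Atashie's fan count in perspective, here is a back-of-envelope sketch. The fan counts (roughly 200 small fans versus two axial fans per 22U half-rack) come from Rackable's claim; the per-fan wattages are hypothetical illustrative figures, not anything Rackable has published:

```python
# Back-of-envelope comparison of cooling power for a 22U half-rack.
# Fan counts follow Rackable's claim; wattages are assumed for illustration.
SMALL_FANS = 200          # muffin + power supply fans in 22U of 1U servers
LARGE_FANS = 2            # rack-wide axial fans in the 22U CloudRack
SMALL_FAN_WATTS = 5.0     # hypothetical draw per small muffin fan
LARGE_FAN_WATTS = 150.0   # hypothetical draw per large axial fan

small_total = SMALL_FANS * SMALL_FAN_WATTS   # total draw, conventional design
large_total = LARGE_FANS * LARGE_FAN_WATTS   # total draw, CloudRack design
savings_pct = 100.0 * (1 - large_total / small_total)

print(f"Conventional fans: {small_total:.0f} W")
print(f"CloudRack axial fans: {large_total:.0f} W")
print(f"Hypothetical cooling power reduction: {savings_pct:.0f}%")
```

Under those assumed wattages the consolidation cuts cooling power by roughly two-thirds; the real savings depend on the actual fans, but the direction of the arithmetic is the point.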
Making the rack the only skin for a collection of servers, and therefore having the large and efficient cooling fans be the only things moving air, is obviously simpler. And the heritage of rack servers as an offshoot of tower servers (which do need a skin and their own cooling) is the only explanation as to why rack servers have had metal skins, lots of fans, and dedicated power supplies for all these years.