BusinessWeek novel turns Google's cloud into epic hero
Search giant invented science and the future
Comment In a rather desperate bid to attract wealthy technology advertisers, BusinessWeek lowered itself this month by publishing data center erotica.
The business publication issued an immense cover story titled "Google and the Wisdom of Clouds." The piece covers Google and IBM's creation of a cluster for use by students and researchers. The two companies announced the cluster way back in October, publicizing their efforts to nudge coders toward parallel programming techniques.
BusinessWeek's story, while colorful and sometimes informative, borders on the delusional.
At its core, the piece hangs Google's "cloud computing" approach on this single cluster. You're meant to understand that the cluster points to Google's future where the company may - or may not - give outsiders access to its data centers in much the same way that Sun Microsystems, Amazon.com, Salesforce and others do today. Way beyond that concept, however, you're told that Google has pioneered a new method of giving students and researchers extra horsepower - a feat that may lead to amazing discoveries and a general peace on Earth.
Have we gone too far? Judge for yourself.
In building this machine, Google, so famous for search, is poised to take on a new role in the computer industry. Not so many years ago scientists and researchers looked to national laboratories for the cutting-edge research on computing. Now, says Daniel Frye, vice-president of open systems development at IBM, "Google is doing the work that 10 years ago would have gone on in a national lab."
The story's author, Stephen Baker, has an annoying habit of going back and forth between the cluster and Google's grand cloud - blech - vision, treating the two ideas as one. So, let's try to dodge that issue by separating out the relevant bits.
Google invents national labs. Oh wait
First off, Google and IBM have supplied a few parties with access to a "large cluster of several hundred computers that is planned to grow to more than 1,600 processors," according to the two companies.
So, we're talking about something that any university or corporate customer could buy from IBM, HP, Dell or Sun Microsystems with a few clicks on a web site. Universities and research labs have spent years building similar clusters on their own, and they can tap into far larger systems today.
Google and IBM have then outfitted the hardware with Linux, the Xen hypervisor and Apache's Hadoop software, which is an open-source take on the MapReduce and Google File System (GFS) code used by Google. Yahoo! is now the largest corporate backer of Hadoop. As mentioned, this software helps teach programmers how to spread their jobs across hundreds or thousands of machines.
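For the uninitiated, the model asks the programmer for just two functions: a map step that turns each input record into key-value pairs, and a reduce step that folds together all the values sharing a key, while the framework worries about distribution, sorting and failed machines. As a rough sketch only - this is the textbook word-count exercise written as two stdin-reading scripts in the style of Hadoop's Streaming interface, not anything running on the Google-IBM cluster - it looks like this:

    # mapper.py - read text on stdin, emit one "word<TAB>1" pair per word
    import sys

    for line in sys.stdin:
        for word in line.split():
            print("%s\t%d" % (word, 1))

    # reducer.py - sum the counts for each word; the framework (or `sort`)
    # delivers all pairs for a given word together, so a running total works
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word != current_word:
            if current_word is not None:
                print("%s\t%d" % (current_word, count))
            current_word, count = word, 0
        count += int(value)
    if current_word is not None:
        print("%s\t%d" % (current_word, count))

The same pipeline can be faked on a laptop with cat input.txt | python mapper.py | sort | python reducer.py. The whole point of the cluster is that the framework runs thousands of these mapper and reducer processes in parallel across machines - which is exactly the habit of mind the Google-IBM effort is meant to drill into students.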
Without question, Google is doing some pioneering work in the parallel software field. The suggestion, however, that students and scientists have access to something new as a result is ludicrous.
Yes, the national labs have in the past led crucial computing efforts. But how could IBM's Frye forget his company's own work, or that of, say, HP, Sun, Microsoft, Hitachi, DEC and Cray? The list goes on and on. There has always been a mix of public and private computer science work, and, in fact, much of that work has been done with open source software and open networking protocols.
Is Google doing work that may have taken place at a national lab? Of course. Have national labs and other vendors given up on this type of work too? Er, no. To portray Google and IBM as unique godsends here is just wrong.
What's even more hilarious is Baker's suggestion that Google's cluster is "changing the nature of computing and scientific research." Computer scientists and researchers have been the biggest users of shared clusters and have led much of the work around parallel programming. This is all very commonplace stuff to them.
Lastly, Baker fails to mention key terms like mainframe and time-sharing in his cover story. It took Google CEO Eric Schmidt - in a separate piece - to remind the author that these concepts are decades old. But why bother pointing that out when you can make the need for more horsepower in academia seem like a problem that only Google can solve?
Many [students] were dying for cloud knowhow and computing power - especially for scientific research. In practically every field, scientists were grappling with vast piles of new data issuing from a host of sensors, analytic equipment, and ever-finer measuring tools. Patterns in these troves could point to new medicines and therapies, new forms of clean energy. They could help predict earthquakes. But most scientists lacked the machinery to store and sift through these digital El Dorados.
Who knew?
And now to the cloud.