Letters Readers have responded to the prospect of the IBM/Sony Cell chip with measures of skepticism and enthusiasm, the former by far outweighing the latter.
"I haven't been so excited about a new chip architecture since ... Transmeta!" writes Bruce. "And look how that turned out!"
Referring to the Cell chip's ability to scavenge for resources over the world wide web, Tzetan Mikov sounds a warning note, echoed by many readers.
"This has very little to do with microprocessor or system architecture, but is almost exclusively an issue of software - OS, supporting tools, etc. Software that apparently doesn't still exist, and if it did, there is nothing preventing it from running on more ''conventional' hardware.
"As we know, the success of any new computer hardware has always depended almost entirely on its ability to execute _existing_ software well. The idea that IBM, Sony and Toshiba will create an entirely new software paradigm, environments and tools and make them successful is ludicrous (as you point yourself, Intel attempted a far more humble undertaking with Itanium and hasn't quite succeeded yet).
"Migrating a thread transparently across a network - that I sincerely doubt. I don't know what Cell is, but I am fairly certain that you are putting the wrong emphasis on 'massively distributed, global computing infrastructure'. This is a problem that will not be solved by a chip !
Zillion similar articles
"The processor I have in my desktop computer is the same kind of processor that's used in some supercomputer clusters, so I don't see the novelty," writes Tom Kerrigan. "Even if we had software that allowed a cluster to recruit my desktop's processor for a calculation, there would be no point, because of the communication overhead. Likewise, I have no interest in using a cluster to, say, compress a DVD, because by the time I transferred the 8GB of data to compress, I might as well have compressed it myself.
"I read a zillion similar articles - about 14 years ago - about how the IBM-Apple PowerPC chip was gonna knock Intel and Microsoft into the toilet. Apple went from 30% market share [actually, 9% - ed.] to 1.75% in that time - and Intel and Microsoft are $40 billion/year companies today."
"Speaking of which [writes one chip designer who must remain nameless] "am I the only one who finds a multi-core Itanic funny? I mean, the whole point of the architecture was to exploit instruction level and thread level parallelism in a single core with lots of execution units. Cell seems to do that in a more elegant way, with partitioning at run time rather than compile time."
The whole idea of a global grid gets short shrift. "How many businesses actually need complex programs running on vast supercomputers? It seems to me that apart from oil, weather and the stock market, very few," writes Stu Paulin. "Manufacturing needs local process control of machinery, while most of the businesses in the world actually use computers for analysis and accounting. The 'analysis' does require grunt, but not necessarily (or even likely) a complex algorithm that requires millions of processes. Present day stand-alone boxes handle complex CAD, accounting, publishing, communications, graphic and sound manipulation.
"Another issue is the cost of telecomunications, a significant offset to the cost of local processing. Sure the world had more need than '4 or 5 computers', but I think a belief that it needs the opposite, some kind of unlimited processing power, is equally absurd."
So if they build it, they won't necessarily come. On the other hand, some readers welcome our new Cell overlords.
"Consolidating cycles throughout our single office would, even if only 25% efficient compared to separate boxes, give us much more power," writes a user with 10 boxes to look after. "Also smoothing demand would also make it easier to plan adding capacity. Out in wild blue sky territory it'd be nice to rent out my playstations capacity for those hours when i'm asleep, at work, or otherwise having a life."
"The Cell sounds like Skynet to me... 50 million interconnected Playstation 3s will become self aware and destroy everything!" writes Franki.
Several readers pick apart unsubstantiated claims in the article - and quite right, too.
"You write that 'no American or European technology company has conquered the living room, or really made itself pervasive in any aspect of our lives except ... in computing itself' says Simon. "Surely you'd have to say Nokia and a few others have managed to make themselves pervasive in our lives - and precisely because of the two things you argue - that they worked out the user experience was important and that they were providing a service that met end user needs. Ie, talking to/messaging people on the move) rather than a platform."
And Jim Gillespie debunks the comparison between outsourcing computing cycles and the West's dependency on oil.
"To my mind at least, 'outsourcing' refers to an economic decision taken on the basis of competing costs: you do it if it's cheaper to hire someone to do something than it is to employ someone. Oil is a natural resource which just happens to be most concentrated in the Middle East. Nobody chose to put it there, and so to talk about it as outsourcing makes no sense and reduces the credibility of an otherwise interesting article."
And completely off-topic, Dave Williams reminds us:
"Jeff Goldblum didn't save the world because he was a hacker. He succeeded because he was using Mac OS 7.5.3 on his Powerbook, the most buggy and unreliable OS Apple ever released. Even an alien supercompuiter couldn't cope with such a pile of ****, and so Jeff and Will got to save the world, thanks to Apple." ®