Line up for parallelism
It could be the way of the future
While doing some research for another project I came across an arguably old idea being revisited - or perhaps that should read "disinterred". Parallel processing is, a growing number of people believe, not just the way of the future but the only way real progress is likely to be made.
Parallelism is not remotely new, of course, and in hardware terms it is even old hat. Today's dual-core processors may be touted as the dog's pride and joy, and even as a sign that parallelism is emerging from the primeval soup, but compared to the Inmos Transputer of the early '80s they are still several brain cells short of a nematode worm. The IBM/Sony/Toshiba-spawned Cell processor gives an idea of where things are heading, and one has only to note the recent stories about Cell-based supercomputers, here, and, perhaps more relevant to everyday life, Cell-based BladeServer systems, here, to see that a start has been made on bringing them into the business IT world as well as games machines.
The real issue with parallelism, however, is software. While many agree that parallelised software is a good idea, there is also a general acknowledgement that it is probably too difficult in practice to make it successful, or even widely implemented, in typical business systems.
The other side of that coin is that in the long term it needs to happen, if only because conditional, sequential programming is becoming one of the serious inhibitors of future performance improvement. There is even an argument that multi-core processors of the Intel/AMD variety could make matters worse. The greater the number of cores available, the more management software will need to run to ensure that the tasks spawned by current sequential applications are processed correctly. It is possible to imagine a point at which the management overhead exceeds that of the applications themselves and performance actually starts to degrade.
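To make that trade-off concrete, here is a deliberately crude toy model rather than a measurement: it extends Amdahl's law with a coordination cost that grows with the core count, and every constant in it is an assumption chosen purely for illustration.

# A toy extension of Amdahl's law: runtime on N cores is modelled as a
# serial fraction, a parallel fraction divided across the cores, and a
# coordination (management) cost that grows with the number of cores.
# The constants are illustrative assumptions, not measurements.
def speedup(cores, serial=0.05, overhead_per_core=0.01):
    parallel = 1.0 - serial
    runtime = serial + parallel / cores + overhead_per_core * (cores - 1)
    return 1.0 / runtime

if __name__ == "__main__":
    for n in (1, 2, 4, 8, 16, 32, 64):
        print(f"{n:3d} cores -> {speedup(n):.2f}x")

With these invented numbers the speedup peaks somewhere between eight and sixteen cores and then falls away again, which is the shape of the argument above: past a certain point, the cost of keeping the cores fed and co-ordinated exceeds the benefit of having them.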
There is, however, also an argument that other wheels are turning which could have a part to play in making parallelism work "for the masses". The common response to talk of parallel programming is that it is too difficult. There are two ways of looking at that. One is that "too difficult" is largely a question of education, and that it might prove a great deal easier than currently thought. The other is that, even if it really is too difficult for most developers to get their heads round, the move towards service-based architectures, and even dedicated, application-specific virtual servers, means there is now an architectural approach that allows the few who do understand the intricacies of parallel processing to serve the needs of the many who don't, but who are very likely to need it.
Then mere mortal developers could call up really fast, dedicated parallel-processing application servers as "black-box" services, just like any other service in a loosely-coupled, composite application in an SOA environment. That may well be the way to do it – at least as an intermediate step. But it is a step that will need to be taken, and it will need developers with the right skills.
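As a rough sketch of what that might look like from the mere mortal's side of the fence, the snippet below submits a job to a parallel compute service over plain HTTP and waits for the answer. The endpoint, the payload fields and the response shape are all hypothetical, invented here purely to illustrate the shape of the interaction.

# A minimal sketch of calling a parallel-processing application server
# as a "black-box" service. Everything about the service itself (URL,
# request fields, response format) is an assumption for illustration.
import json
import urllib.request

def run_parallel_job(matrix_a, matrix_b):
    # Package the work up as an ordinary service request...
    payload = json.dumps({"op": "matmul", "a": matrix_a, "b": matrix_b}).encode()
    request = urllib.request.Request(
        "http://compute.example.com/jobs",  # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # ...and let the service worry about how the work is parallelised.
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["result"]

The caller never sees threads, cores or message passing; whether the service behind the URL runs on a Cell-based blade or a rack of commodity multi-core boxes is somebody else's problem, which is precisely the point.®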