Concerns about using Linux on servers to crunch huge data workloads are evaporating, according to a survey.
Fears that the open-source operating system isn't up to the job of processing big data have fallen by 40 per cent in the last 12 months, according to the Linux Foundation's annual Enterprise Linux report.
Last year 20.3 per cent had expressed reservations – today that number is 12.2 per cent. The poll found 75 per cent cited supporting big data as a concern, while 72 per cent plan to run big data sets on Linux. Just 35.9 per cent will pick Windows.
The Linux Foundation surveyed 1,900 individuals, 428 of them at organisations with $500m or more in annual revenue or more than 500 employees. It reckoned these are shops running mixed Linux and Windows systems.
A pleasantly surprised Linux Foundation executive director Jim Zemlin told The Reg he'd raised an eyebrow upon reading how concerns had dropped away during the last year.
Zemlin claimed one reason for the change was that the development process behind Linux "works really, really well". Another, Zemlin said, was that Linux is already suited to such workloads thanks to its HPC heritage – and people had been "grabbing and adapting the code because it's open and available".
One other factor could be Red Hat's 2010 release of Red Hat Enterprise Linux 6, which percolated through the customer base during 2011.
RHEL 6 was updated to exploit the latest multi-core chips, supporting 4,096 cores per system image and 64TB of addressable memory – up from 64 cores and 1TB on RHEL 5. RHEL 6 also switched virtualisation technologies from Xen to KVM.
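For readers curious how those limits look on their own kit, a minimal sketch (assuming a Linux box with /proc mounted and standard coreutils) of checking a host's core count, memory, and whether the KVM module is loaded:

```shell
# Cores visible to this system image (the figure RHEL 6 raised to 4,096)
nproc
# Total addressable memory (the figure RHEL 6 raised to 64TB)
grep MemTotal /proc/meminfo
# Running KVM rather than Xen? The kvm kernel module will be loaded:
lsmod | grep -w kvm || echo "kvm module not loaded"
```

This is a generic sanity check, not anything from the Linux Foundation survey itself.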
According to Zemlin, the biggest problem for Linux today is not refining the technology but a shortage of skilled people to program and support the penguin. He expects 2012 will be a big year for his operation in providing training to crank out enough engineers skilled in Linux. ®