Software-defined everything: So, WHEN is the 'future'?

Claims of invisibility and IT ‘as a service’ need to be tested

Market revenues hitting $3.7bn by 2016

But VMware is far from the only one pushing this software-defined agenda. The trend has since been adopted more widely across the IT industry – from vendors to service providers – spawning comparable monikers such as SDx, the virtual data centre and software-defined everything, each of which has taken hold to a lesser degree.

There is evidence of growing interest in various "software-defined" technologies. For instance, many claim the SDN market – estimated at only around 500 deployments globally at this point – is set for strong growth in the coming years: IDC forecasts revenues will hit $3.7bn by 2016.

There are many firms competing in the field, from startups such as PLUMgrid to relative newcomers like Arista, and of course the more established names of Cisco and Hewlett-Packard.

The software-defined storage market has seen greater activity, too, with EMC's ViPR and NetApp's ONTAP, along with offerings from Nexenta and Nutanix. Despite this early flurry, there is some way to go before the real benefits of software automation are felt.

451 Research analyst John Abbott says SDDC will need the full support of white-labelled hardware, coming from low-priced Asia-based OEMs and new entrants. "The ideal in the future would be that you use white-box systems underneath: commodity x86 servers and x86 storage modules and even white-box x86 switches, rather than the current Cisco or Juniper hardware-dependent switches," he said.

"When you get to that stage you can really start getting some of the benefits of improved utilisation; more flexibility, faster provision of services, and you can then drive down the price of your hardware procurement and standardise that as well."

The approach bears some resemblance to the huge scale-out data centres deployed by big web firms such as Facebook, Amazon and Google, which rely heavily on software – in some cases container-based virtualisation – to manage their vast infrastructure. These companies have led the way with smarter management of data centre resources, allowing servers and other hardware to be shut down or run at lower power levels.
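The power-saving side of that can be illustrated with a toy consolidation example – all names and numbers below are invented, and real web-scale schedulers are vastly more sophisticated: pack workloads onto as few hosts as possible, then power down whatever is left idle.

```python
# A toy illustration (not any web firm's actual code) of the consolidation
# idea above: pack workloads onto as few hosts as possible, then power down
# whatever is left idle. Names and numbers are invented for the example.

def consolidate(workloads: dict[str, int], hosts: dict[str, int]) -> None:
    """First-fit-decreasing bin packing: place the biggest workloads first."""
    free = dict(hosts)  # remaining capacity per host
    placement: dict[str, str] = {}
    for name, size in sorted(workloads.items(), key=lambda w: -w[1]):
        for host, cap in free.items():
            if cap >= size:
                free[host] -= size
                placement[name] = host
                break
    idle = [h for h in hosts if free[h] == hosts[h]]
    print("placement:", placement)
    print("hosts to power down:", idle)


consolidate(
    workloads={"web": 4, "db": 6, "cache": 2},
    hosts={"host-a": 8, "host-b": 8, "host-c": 8},
)
# placement: {'db': 'host-a', 'web': 'host-b', 'cache': 'host-a'}
# hosts to power down: ['host-c']
```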

Some expect enterprise businesses to adopt similar types of infrastructure. According to Gartner, this web-scale architecture will be used by 50 per cent of enterprises by 2017. But while Google can rely on customised hardware because it employs teams of engineers to modify and manage equipment, it is debatable whether most businesses will have either the skills or the inclination to do so.

"It is very tricky because you can't just pull over the ideas lock stock and barrel. It is more a question of how do you take what the web-scale guys do and package it up in a way that traditional enterprise can consume and actually use. And I don't think anyone has really cracked the nut on that," the chief executive of OpenStack start-up CloudScaling and OpenStack board member Randy Bias told us.

"The danger of SDDC for businesses is that it can involve a lot of Frankenstein thinking: 'I am going to marry some of what the web-scale guys do, like automation and scale-out, but I am going to use classic enterprise patterns, and I am not going to give up my ways of doing things'. The result is that you can't recognise all of the value."
