Software-defined everything: So, WHEN is the 'future'?

Claims of invisibility and IT ‘as a service’ need to be tested

Holy Grail needs a plan

Although some of these techniques have attracted the interest of service providers and telcos, such thinking remains a long way off for more traditional businesses. Many continue to rely on complex, heterogeneous infrastructure, and some - such as large banks - are tied into legacy non-x86 systems.

At the same time, certain applications built for the client-server world would need to be re-architected if they are to fit the SDDC model of distributed systems.
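To make that concrete, here is a minimal sketch of one common re-architecture step, assuming a hypothetical Redis-backed session store (the host name and key scheme are illustrative, not drawn from the article): state is moved out of the application process so that any instance in a distributed pool can serve any request, rather than users being pinned to a single server as in the classic client-server design.

import json
import redis

# Hypothetical shared session store; host name and key scheme
# are illustrative assumptions.
store = redis.Redis(host="session-store", port=6379)

def save_session(session_id, data):
    # Client-server apps often held this in process memory, tying a user
    # to one machine; externalising it makes instances interchangeable.
    store.set("session:" + session_id, json.dumps(data), ex=3600)

def load_session(session_id):
    raw = store.get("session:" + session_id)
    return json.loads(raw) if raw else {}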

“For enterprises to get to this Holy Grail they actually need to rethink the entire architecture from top to bottom,” Bias said.

It should also be noted that not every business will take the software-defined route across all of its infrastructure. Vendors such as Oracle and IBM continue to invest in dedicated systems, and many customers see these engineered systems as the preferred approach to ensuring reliability and security for mission-critical workloads.

It may be that some users will deploy a combination of software-defined and hardware-dependent infrastructure, depending on their own application demands.

For those intent on going software-defined, one potential solution for managing pools of virtualised resources has begun to emerge, in the shape of the open-source cloud operating system OpenStack.

OpenStack has gained traction with a number of vendors, including HP and IBM, and is beginning to see more interest at enterprise level despite difficulties implementing the technology. Even VMware, which to a large degree competes with OpenStack, has been keen to show support for the project.
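As a rough illustration of what managing pools of virtualised resources looks like in practice, the sketch below provisions a VM through OpenStack's Python SDK. The cloud name, image, flavour and network names are assumptions for the example; the point is that the scheduler, not the caller, decides which physical host the instance lands on.

import openstack

# Assumes a cloud named "mycloud" is configured in clouds.yaml.
conn = openstack.connect(cloud="mycloud")

image = conn.compute.find_image("ubuntu-22.04")    # hypothetical image
flavor = conn.compute.find_flavor("m1.small")      # hypothetical flavour
network = conn.network.find_network("private")     # hypothetical network

server = conn.compute.create_server(
    name="sddc-demo",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
# Wait for the scheduler to place the VM on whatever host has capacity;
# the caller never needs to know which physical machine that is.
server = conn.compute.wait_for_server(server)
print(server.status)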

Increased uptake could go some way to delivering the interoperability that is one of the main advantages supposedly offered by the software-defined data centre, while avoiding vendor lock-in simply shifting to the software level.

The problem, though, is that OpenStack is still maturing, and at this stage people are not confident they can adopt and implement it without a lot of pain and agony. It may be that the OpenStack model (rather than OpenStack as it stands today) is the approach that prevails.

Along with initiatives such as the Open Compute Project and OpenFlow, OpenStack is an attempt to provide an open, standards-based approach to data-centre design.

This could be key to a software-defined data-centre strategy, offering the workload portability and interoperability that allow applications to be moved easily to wherever they are best suited, whether on-premises or in the cloud.

Bias is generally dismissive of the SDDC term but agrees that, while OpenStack is not yet a standard, it has the potential to form the basis of internal clouds in the future: "OpenStack has a shot at becoming an open standard. Can it become the control plane for the 'software defined data centre'? Certainly it is a possibility."

But, he continues, OpenStack is just part of the puzzle. "OpenStack doesn't meet the need for low-cost hardware, or having an operational service delivery model that assumes virtualised resources such as VMs fail all of the time, and that your applications will route around those failures."
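A minimal sketch of that "route around failures" model, assuming a hypothetical list of replica endpoints: the client treats every instance as disposable and simply skips a dead VM rather than expecting any single one to stay up.

import requests

# Hypothetical replica addresses; in a real deployment these would come
# from a load balancer or service registry.
REPLICAS = [
    "http://10.0.0.11:8080",
    "http://10.0.0.12:8080",
    "http://10.0.0.13:8080",
]

def fetch(path):
    # Try each replica in turn; a failed VM is skipped, not repaired.
    last_error = None
    for base in REPLICAS:
        try:
            resp = requests.get(base + path, timeout=2)
            resp.raise_for_status()
            return resp
        except requests.RequestException as err:
            last_error = err  # instance presumed dead; move on
    raise RuntimeError("all replicas unavailable") from last_error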

Modernising data centre infrastructure is just one challenge on the road towards that SDDC Holy Grail.
