Self-aware storage? It'll be fine. Really - your arrays aren't the T-1000
Smarter, savvier boxes will change the way we look at infrastructure
Awareness
Traditional storage systems don’t have a clue about what is happening around or in them. They just serve files and IOPS while protecting data against hardware failures.
However, next-generation storage systems are smarter and aware of their behaviour, the workloads they are serving or the data they are storing. This is a leap forward and can radically change the role of storage in the infrastructure.
These kinds of systems can actively contribute to lowering infrastructure TCO and, sometimes, they can become active components of the infrastructure and application stack.
Examples? Nimble Storage InfoSight is a tool capable of analysing the behaviour of your whole stack, from the storage layer up through networking, hypervisor and VMs. By leveraging cloud-based analytics, it helps you understand quickly whether something is going wrong, or is about to.
Others, like Data Gravity, work on the stored data itself, analysing its content to extract its full value.
These two characteristics are neither mutually dependent nor complementary, but each enables a different set of features that can be delivered locally or from the cloud.
Except for a few notable cases, we are still at the first-generation stage of these systems, but the potential is massive and I’d like to give you some idea of what will probably happen in the near future.
Smarter storage is just around the corner
One interesting thing I saw last week came from Coho Data, vendor of a scale-out storage solution. Coho is developing the ability to run containerised code on its servers, triggered by events.
For example, if a new movie file lands in a specific portion of the storage, it could be re-encoded into different formats using Coho's APIs and a few lines of code. Similarly, you could build applications that run inside the storage itself, analysing and managing data intelligently according to your exact needs.
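To make that concrete, here is a minimal sketch of what such an event-triggered handler could look like. The `on_object_created` hook, the watched paths and the call to ffmpeg are all assumptions for illustration; Coho's actual event and API surface may look quite different.

```python
# Hypothetical sketch of an event-triggered transcoding handler running
# in a container on the storage node. The hook name, paths and formats
# are assumptions; they are not Coho's documented API.
import os
import subprocess

WATCHED_PREFIX = "/incoming/movies/"              # assumed landing area
OUTPUT_FORMATS = {"webm": "libvpx-vp9", "mp4": "libx264"}

def on_object_created(path: str) -> None:
    """Called by the storage layer whenever a new object is written."""
    if not path.startswith(WATCHED_PREFIX):
        return                                    # ignore unrelated writes
    base, _ = os.path.splitext(os.path.basename(path))
    for ext, codec in OUTPUT_FORMATS.items():
        target = f"/transcoded/{base}.{ext}"
        # Re-encode in place on the storage node; data never leaves the array.
        subprocess.run(
            ["ffmpeg", "-y", "-i", path, "-c:v", codec, target],
            check=True,
        )

if __name__ == "__main__":
    # Local test: simulate the storage event for a sample file
    # (requires ffmpeg on the node).
    on_object_created("/incoming/movies/trailer.mov")
```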
The number of applications is endless:
- A specific back-up procedure that sends your snapshots to the cloud?
- Adding new interfaces and protocols to your storage?
- Implementing in-place data analytics? Data streaming analytics?
- Constantly watching access patterns to spot potential data breaches or leaks? (See the sketch after this list.)
The only limit is your imagination.
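As an illustration of that last item, here is a minimal sketch of access-pattern monitoring, assuming the array can stream access records as simple (client, path, bytes read) tuples. The record format and the threshold are made up for the example; they are not any vendor's API.

```python
# Minimal sketch: flag clients whose read volume looks anomalous.
# The record shape and the 50 GiB/hour threshold are illustrative only.
from collections import defaultdict
from typing import Iterable, Set, Tuple

BYTES_PER_HOUR_LIMIT = 50 * 1024 ** 3   # flag clients reading > 50 GiB/hour

def flag_heavy_readers(records: Iterable[Tuple[str, str, int]]) -> Set[str]:
    """Return clients whose cumulative reads exceed the threshold."""
    totals = defaultdict(int)
    flagged = set()
    for client, _path, nbytes in records:
        totals[client] += nbytes
        if totals[client] > BYTES_PER_HOUR_LIMIT:
            flagged.add(client)
    return flagged

if __name__ == "__main__":
    sample = [
        ("10.0.0.5", "/finance/q3.xlsx", 40 * 1024 ** 3),
        ("10.0.0.5", "/finance/q4.xlsx", 20 * 1024 ** 3),
        ("10.0.0.9", "/docs/readme.txt", 4096),
    ]
    print(flag_heavy_readers(sample))   # {'10.0.0.5'}
```

A real implementation would, of course, window the counters over time and feed the flags back into alerting, but the point is how little code it takes once the storage exposes its own access stream.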
Yes, there are physical limits imposed by CPU usage, but I'm sure that, as with standard hyper-converged infrastructures, vendors will be able to find the right balance or offer specialised nodes.
In fact, from a certain point of view this is very similar to a hyper-converged infrastructure, but with a major difference: no hypervisor and no VMs. A lightweight, dockerised approach can be very powerful while consuming fewer resources. It's like building your own specialised appliance out of commodity hardware and software.