What might HPE do with SimpliVity?

Where could the tech end up?

Sysadmin Blog HPE recently purchased SimpliVity for $650m. Some folks, like me, think this was a heck of a bargain for HPE. Others – most notably SimpliVity's competitors – think SimpliVity wasn't worth all that much anyway.

The debate has gone back and forth for over a week now and it's time to focus on what HPE might actually do with SimpliVity.

On the surface of it, the easiest thing for HPE to do is keep on selling SimpliVity as a hyperconverged solution. SimpliVity has some loyal customers, and with the might of HPE's sales force behind it I'm sure it would do well. But it's not quite that simple.

HPE has a few Azure-in-a-can solutions to sell. Azure Stack will only be available as a pre-canned solution, meaning HPE is one of the few vendors with the resources to sell it. This follows from a pledge that Azure will be the preferred cloudy underpinning for HPE.

Preferred or not, HPE also has some VMware-based HCI solutions to sell. The lineup is something of a confused mess. This is partly a result of the spectacular failure of HP's VMware EVO:RAIL project and partly because HP already bought a hyperconverged solution (LeftHand) that it traditionally tries to peddle with VMware. Oh, and VMware has VSAN now, just for fun.

Unless you just happen to have some insider info on SimpliVity, the purchase doesn't seem to make a great deal of sense, at least on the surface. SimpliVity is partnered with seemingly everyone except HPE for hardware (Dell, Lenovo, Cisco and Huawei at current count). SimpliVity uses a proprietary hardware card to perform its deduplication magic*, which means extra testing and integration work to weld SimpliVity's tech to HPE's boxen.

Perhaps more importantly, up until a few weeks ago, SimpliVity wasn't all that good at Hyper-V. Not good for HPE's pledge to get out there and peddle Azure Stack.

That was, however, a few weeks ago. SimpliVity's Hyper-V support is finally worth talking about. It would make a heck of a storage layer for some Azure Stack racks and I'm sure we'll see those in relatively short order.

Whether we see a new generation of VMware-based SimpliVity nodes is more of an open question, and this depends on who ends up in charge of what.

Odd rumours

I have heard from multiple people that some of the brass hats at HPE honestly believe that the reason they keep losing 3PAR-based deals to Nutanix is that Nutanix has deduplication. If true, it says a lot of things.

First off, it says SimpliVity's salespeople were dropping the ball. If deduplication were actually something the customer cared about, SimpliVity would have crushed Nutanix like a bug. I'm not sure I buy SimpliVity's sales team being that incompetent. And that's not the only reason I'm sceptical of this hypothesis for the acquisition.

Nutanix does inline deduplication only on its flash tier, and does post-process deduplication on the capacity tier. This is not exactly world-changing stuff, and the 3PAR guys could have matched it easily. 3PAR gained deduplication for the 7450 all-flash array back in 2014; if they felt they needed some hybrid deduplication love, I'm sure they could have accomplished this with relatively little effort.
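For readers less familiar with the distinction, here is a minimal sketch of the two approaches. This is illustrative only – it is not Nutanix's or SimpliVity's actual implementation – but it shows the core trade-off: inline dedupe catches duplicate blocks before they hit the disk, while post-process dedupe writes everything first and reclaims duplicates in a background pass.

```python
import hashlib

def block_hash(block: bytes) -> str:
    # A content hash identifies identical blocks regardless of where
    # they were written, which is the basis of all dedupe schemes.
    return hashlib.sha256(block).hexdigest()

class InlineDedupeTier:
    """Inline dedupe: duplicates are eliminated on the write path."""
    def __init__(self):
        self.store = {}      # hash -> one physical copy of the block
        self.refcount = {}   # hash -> number of logical references

    def write(self, block: bytes) -> str:
        h = block_hash(block)
        if h not in self.store:       # only unique data consumes space
            self.store[h] = block
        self.refcount[h] = self.refcount.get(h, 0) + 1
        return h                      # logical address is the hash

class PostProcessTier:
    """Post-process dedupe: write everything, reclaim duplicates later."""
    def __init__(self):
        self.blocks = []              # raw writes land here in full

    def write(self, block: bytes) -> int:
        self.blocks.append(block)     # a full copy is stored up front
        return len(self.blocks) - 1

    def dedupe_pass(self) -> dict:
        # A background job collapses identical blocks into one copy.
        unique = {}
        for b in self.blocks:
            unique.setdefault(block_hash(b), b)
        return unique
```

Either way the final footprint is the same; the difference is when the work happens. Inline dedupe costs CPU (or, in SimpliVity's and 3PAR's case, ASIC/card cycles) on every write, while post-process dedupe temporarily burns capacity and reclaims it later – which is why the expensive flash tier tends to get the inline treatment.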

Nutanix could always go down the "hyperconvergence is easier, unified storage and servers" route. HPE could counter with "run your workloads on Nutanix with dedupe and compression on, then on our servers + external storage and let's have a conversation about workload density". Great catfights would, I am sure, ensue, but I don't see Nutanix having enough of an upper hand over 3PAR to trigger this purchase.

Levelling up

Whether or not Nutanix was the impetus for the purchase, there is some credibility to the idea that some of what motivated the SimpliVity purchase was a desire to beef up the 3PAR line. The dedupe in the 7450 came courtesy of a fourth-generation ASIC. That means that, like SimpliVity, 3PAR offers hardware-assisted deduplication.

I'm willing to bet SimpliVity's deduplication is better than what 3PAR is using now, and I do expect SimpliVity tech to eventually end up in HPE storage arrays. Whether SimpliVity's dedupe tech is $650m worth of better, however, is an open question.

One thing a lot of folks seem to forget is how SimpliVity implemented its dedupe tech – and that it is remarkably efficient in multi-site deployments. SimpliVity has some great customer stories – one of which involves a boat – that it loves to tell to reinforce this, and it plays right into HPE's plans.

The pure-play on-premises data centre is basically dead. Microsoft and Amazon are both trying to put HPE out of business with their public clouds and cut-price tin-shifters like Supermicro and Inspur are more than happy to help.

In order for HPE to still be around 10 years from now, it has to make the hybrid cloud "a thing": convincing enterprises to keep some data locally and shift data to the public cloud only as needed, preferably on a temporary basis. Technology that can bring data back from remote office/branch office (ROBO) sites to the primary data centre, or move workloads between the primary data centre and the public cloud, without racking up exorbitant bandwidth costs would be really useful.

SimpliVity kept its tech restricted to hyperconvergence, but I remember having lengthy conversations with them about how it could be adapted to everything from cloud gateways to Internet of Things networks. A hyperconverged competitor or a level-up for 3PAR are both short-term uses for the SimpliVity tech.

Long term, it could be the basis of a new generation of reasons for HPE to actually exist: their servers and other IT gear could come with the tech inside that makes all this internetworked madness over ancient DSL lines and thready mobile connectivity actually workable. It'll be interesting to see what they do with it. ®

*Yes, Jesse, I hear you climbing the walls, shouting that "it's not just dedupe". Hush. I'm not explaining that all over again.
