Docker death blow to PaaS? The fat lady isn’t singing just yet, folks

Could they work together? Yeah, why not

Logically nestled just above Infrastructure-as-a-Service and just beneath the Software-as-a-Service applications it seeks to support, we find Platform-as-a-Service (PaaS).

As you would hope from any notion of a platform, PaaS comes with all the operating system, middleware, storage and networking intelligence we would want — but all done for the cloud.

However, as good as it sounds, critics say PaaS has failed to deliver in practical terms. PaaS offers a route to higher-level programming with built-in functions spanning everything from application instrumentation to a degree of versioning awareness. So what’s not to like?

Does PaaS reduce complexity via abstraction so aggressively that it fails through lack of fine-grained control? Is PaaS so inherently focused on trying to emulate virtualised hardware that it comes off too heavy on resource usage? Is PaaS just too passé in the face of Docker?

Proponents of Docker say this highly popularised (let’s not deny it) containerisation technology is not just a passing fad and that its lighter-weight approach to handling still-emerging microservices will ensure its longer-term dominance over PaaS.

Dockerites (we’ll call them that) advocate Docker’s additional level of abstraction, which allows containers to share cloud-based operating systems, or, more accurately, a single operating system.

This light resource requirement means the Docker engine can sit on top of a single instance of Linux, rather than a whole guest operating system for each virtual machine, as seen in PaaS.

There’s great efficiency here if we do things “right” – in other words, a well-tuned Docker container shipload can, in theory, run more application instances on the same amount of base cloud data centre hardware.
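The density argument can be made concrete with some back-of-the-envelope arithmetic. A minimal sketch in Python follows; every figure here (host RAM, guest-OS overhead, per-app footprint) is an illustrative assumption, not a benchmark:

```python
# Toy capacity model: how many app instances fit on one host under a
# guest-OS-per-VM approach versus shared-kernel containers.
# All figures are illustrative assumptions, not measurements.

HOST_RAM_MB = 65536        # assumed 64 GB host
GUEST_OS_MB = 1024         # assumed overhead of one full guest OS per VM
APP_MB = 512               # assumed footprint of one app instance
SHARED_KERNEL_MB = 1024    # assumed single Linux instance under Docker

def vm_capacity():
    # Each VM carries its own guest OS plus the app.
    return HOST_RAM_MB // (GUEST_OS_MB + APP_MB)

def container_capacity():
    # One shared kernel; each container adds only the app footprint.
    return (HOST_RAM_MB - SHARED_KERNEL_MB) // APP_MB

print(vm_capacity())         # 42 VM-hosted instances
print(container_capacity())  # 126 containerised instances
```

Under these assumed numbers the shared-kernel model fits roughly three times as many instances on the same hardware, which is the shape of the claim, even if real-world ratios vary.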

Ah, but is it all good news? Docker has management tool issues, according to naysayers. Plus, Docker is capable of practically breaking monitoring systems, so say the IT monitoring tools companies. But then they would say that, wouldn’t they?

Let’s remember, Docker is only one container technology (Google has one too) and containers have been around for almost a decade now, but there’s still a heated and often slightly confused debate going on around this topic.

Things get more complex. We know that deploying and running Docker containers in live production requires the ability to view container performance data. But we need to be able to see that data right next to performance data from the rest of the infrastructure (i.e. the compute, network and storage components) or we can’t optimise for application scale out.
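That side-by-side visibility requirement can be sketched as a simple merge of per-container stats with host-level infrastructure stats. The metric names and values below are hypothetical placeholders, not output from any real monitoring tool:

```python
# Sketch: joining container performance data with host infrastructure
# metrics so both can be inspected in one view. Values are hypothetical.

container_stats = {
    "web-1": {"cpu_pct": 23.0, "mem_mb": 310},
    "web-2": {"cpu_pct": 41.0, "mem_mb": 355},
}

host_stats = {"cpu_pct": 71.0, "mem_mb": 5800, "net_mbps": 120.0}

def unified_view(containers, host):
    """Return one flat report mapping each container and the host to
    its metrics, so scale-out decisions can see both at once."""
    report = {f"container:{name}": stats for name, stats in containers.items()}
    report["host"] = host
    return report

for name, stats in sorted(unified_view(container_stats, host_stats).items()):
    print(name, stats)
```

The point is the join itself: without container and infrastructure metrics in one report, you cannot tell whether a busy container is starving because of its own workload or because the host underneath is saturated.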

Suddenly, Docker ain’t quite so perfectly suited to some kind of Taylor Swift level of ubiquity then? Maybe yes, maybe no.

The big question is: does Docker’s granular isolation and resource consolidation come at the expense of manageability? “Yes, it might do, in some deployment scenarios,” is probably the most sensible answer here.

PaaS proponents argue Docker’s initial promise of a brave new container world has become distended with additional complexity through controls that it needs to stay afloat. Further, anti-Dockerites say from this complexity comes security vulnerability.

But, equally, PaaS inherently presents itself as an abstracted simpler interface that effectively hides much of the intelligence beneath. So could this also sometimes be a bad thing? Nobody likes an over-engineered, modern BMW or Mercedes that’s impossible to fix when it breaks down due to all the embedded technology inside, right?

“In the case of PaaS you don’t have much control over many of the operational aspects associated with managing your application, for example the way it handles scaling, high availability, performance, monitoring, logging, updates. There is also a much stronger dependency on the platform provider in the choice of language and stack,” said Nati Shalom, CTO and founder of cloud middleware company GigaSpaces.

So does Docker effectively replace PaaS or does Docker just drive the development of a new kind of PaaS with more container empathy and greater application agnosticism?

PaaS has been criticised for forcing an “opinionated architecture” down on the way cloud applications are packaged, deployed and managed. Surely we should just use Docker, but with an appropriate level of orchestration control too, right? It’s not going to be that simple, is it?

“Yes, it can be that simple,” argues Brent Smithurst, vice president of product at cross-platform development tools company ActiveState.

“Containers, including Docker, are an essential building block of PaaS. Also, PaaS offers additional benefits beyond application packaging and deployment, including service provisioning and binding, application monitoring and logging, automatic scaling, versioning and rollbacks, and provisioning across cloud availability zones,” he added.

"A good PaaS goes far beyond what Docker alone offers. However, if a PaaS uses Docker, then it can take full advantage of the burgeoning Docker ecosystem," he said.

The confusion is starting to clear then: we should consider PaaS with Docker rather than PaaS instead of Docker.

Clive Longbottom, service director at analyst house Quocirca, says: “In a world where flexibility, performance and management will be all important, PaaS probably ties the user in to too much in the way of engineered constraints."

"However, containers will have to go through the pain virtualisation did: a container will have to be seen and managed in exactly the same way as the physical assets and base level operating system for it all to work. Some management tool and cloud orchestration vendors offer some of this now — the trick is to understand when there is enough of a solution in place to meet your real needs,” he added.
