Ready for pull rate limits? Docker outlines 'next chapter' as Google tells customers how to dodge subscriptions

Pulling containers from Docker Hub for free will be throttled from 1 November

From 1 November, Docker will clamp limits on how many container images free users can pull from its Docker Hub before requests are declined, a move that could cause problems for users of public cloud platforms.

Docker stated in August that anonymous users will be limited to 100 container image pulls per six hours, and authenticated (but free) users limited to 200 pulls per six hours. Subscribers get unlimited pulls.
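For those wondering where they stand, Docker's documentation describes a way to inspect the limit headers directly. The sketch below is based on that published guidance rather than anything in Google's advisory: it assumes the documented ratelimitpreview/test image, and needs curl and jq on hand. A HEAD request returns the RateLimit-Limit and RateLimit-Remaining headers without consuming a pull.

    # Fetch an anonymous pull token for Docker's rate-limit test image
    # (endpoint and image name taken from Docker's rate-limit documentation)
    TOKEN=$(curl -s "https://auth.docker.io/token?service=registry.docker.io&scope=repository:ratelimitpreview/test:pull" | jq -r .token)

    # A HEAD request against the manifest reports the current limit and
    # remaining allowance in response headers, without counting as a pull
    curl -s --head -H "Authorization: Bearer $TOKEN" \
      https://registry-1.docker.io/v2/ratelimitpreview/test/manifests/latest \
      | grep -i ratelimit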

Those limits are generous, but Google's Cloud CI/CD product lead, Michael Winser, explained that "Docker Hub treats GKE [Google Kubernetes Engine] as an anonymous user by default. This means that unless you are specifying Docker Hub credentials in your configuration, your cluster is subject to the new throttling of 100 image pulls per six hours, per IP. And many Kubernetes deployments on GKE use public images... examples include nginx and redis."
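The straightforward workaround is the one Winser hints at: pull with Docker Hub credentials rather than anonymously. A minimal sketch, assuming kubectl access to the cluster and a Docker Hub account (the secret name and credential values below are placeholders, not anything prescribed by Google), is to create a docker-registry secret and attach it to the default service account so pods pick it up as an imagePullSecret:

    # Create a Kubernetes secret holding Docker Hub credentials
    # (replace the placeholder values with your own account details)
    kubectl create secret docker-registry dockerhub-creds \
      --docker-server=https://index.docker.io/v1/ \
      --docker-username=YOUR_DOCKERHUB_USER \
      --docker-password=YOUR_DOCKERHUB_PASSWORD \
      --docker-email=you@example.com

    # Attach the secret to the default service account so image pulls in
    # this namespace are authenticated rather than anonymous
    kubectl patch serviceaccount default \
      -p '{"imagePullSecrets": [{"name": "dockerhub-creds"}]}'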

The issue is worse for users of private GKE clusters since "all image pulls will be routed via a single NAT gateway". Winser warned that "any service that relies on container images may be affected, including Cloud Build, Cloud Run, App Engine etc."

Google uses caching to mitigate the issue, and will further increase cache retention times. Subscribing fixes the problem, but you still need to reconfigure pulls to use Docker credentials. Winser suggested switching to Google's Container Registry, in which case customers pay Google for the associated cloud storage and network traffic. Docker plans start at $5.00 per month for individuals, or $7.00 per month for teams.
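Mirroring a public image into Container Registry keeps pulls off Docker Hub entirely. A hedged sketch of how that might look, assuming the gcloud SDK is installed and authenticated, with MY_PROJECT standing in for your GCP project ID:

    # Pull the public image once from Docker Hub
    docker pull nginx:1.19

    # Retag it into Google Container Registry under your own project
    docker tag nginx:1.19 gcr.io/MY_PROJECT/nginx:1.19

    # Let gcloud set up docker credentials for gcr.io, then push the copy
    gcloud auth configure-docker
    docker push gcr.io/MY_PROJECT/nginx:1.19

Kubernetes manifests then reference gcr.io/MY_PROJECT/nginx:1.19 instead of the Docker Hub image, and pulls are billed to the GCP project rather than counted against Docker Hub's limits.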

Docker itself has posted about its "next chapter", and said that it has "11.3 million monthly active users sharing apps from 7.9 million Docker Hub repositories at a rate of 13.6 billion pulls per month – up 70 per cent year-over-year".

That is a substantial load to carry on behalf of free users, and the company said that "establishing upper limits on 'docker pulls'" is "the first step in our move toward usage-based pricing". That said, Docker CEO Scott Johnston told the press that "we remain committed to requiring 100 per cent free code to cloud experiences for developers. What's evolving is that 100 per cent all-you-can-eat free is not sustainable... We are starting to put limits on the upper bounds." Johnston's intention is that the company will eventually reach a point where anyone beyond individuals or the smallest teams pays for usage, but it is not there yet.

Johnston reminded us that "a year ago, November 2019, we sold off three quarters of the company's employee base, all of the enterprise customers, all of the enterprise proprietary product IP." The idea (aside from reducing cost and raising money) was to "refocus on development teams". Docker has developed integrations with Amazon Web Services (AWS) and Microsoft Azure to automate building and deploying containerised applications. The company intends to build on and increase its partnerships to extend its developer reach.

Docker said that it will work on new tools for application development and container management, integration with more CI/CD environments, collaboration tools, and an extension of its Docker Official Image scheme. It is all a bit vague.

Questioned about whether Docker has a viable business model, Johnston said: "The path to profitability is measured in years... our investors right now are interested in us scaling into the market opportunity... they are quite patient."

Can Docker recapture the innovation that drove adoption in its early days? "Smoothing the integration of these pipeline tools that are between the source code and the cloud actually is innovative because it's a big struggle for development teams," Johnston told The Reg. "But you'll see additional innovation across the tool chain and the development experience." He pointed to the increased capability of Docker Compose to abstract deployment of multiple services and to enable easier switching between clouds. "Look for additional innovation that Compose becomes the de facto way of describing a cloud native app," he said. "That can land on any infrastructure regardless of orchestrator."
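To illustrate the sort of abstraction Johnston is describing, here is a minimal Compose sketch. The service names and file contents are illustrative only, not taken from Docker's roadmap: a single YAML file describes a two-service app, and docker-compose stands it up with one command.

    # Write a minimal two-service Compose file (illustrative names only)
    cat > docker-compose.yml <<'EOF'
    version: "3.8"
    services:
      web:
        image: nginx:1.19
        ports:
          - "8080:80"
        depends_on:
          - cache
      cache:
        image: redis:6
    EOF

    # Bring both services up locally with a single command; the Compose
    # file is the unit Docker pitches as portable across infrastructure
    docker-compose up -d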

The issue is that while the Docker format and Docker Hub are wildly successful, the path to converting that huge user base into paying customers of additional Docker tools is by no means clear, particularly when the big public cloud providers, as well as GitHub, GitLab, and other CI/CD companies, are all focused on the same problem – simplifying the path from writing code to cloud-deployed applications.

Google's latest advice to developers shows how attempts to monetise the Docker platform may also have the effect of reducing its usage, shifting users away from Docker Hub. That said, if reduced free usage is part of the process towards a sustainable Docker business, it may be no bad thing. ®
