Microsoft announced the release of Windows Server 2016 and System Center 2016 at its Ignite event in Atlanta. The commercially supported edition of the Docker Engine is included at no extra cost.
Server 2016 and System Center 2016 are available for download this week and will go on sale on October 1, 2016.
A key feature of Server 2016 is its support for Windows containers, a lightweight form of virtualization designed for the deployment of microservices. This support has enabled the open source Docker Engine, already widely used on Linux, to be ported to Windows.
Docker has been in preview on Windows for some time, but the release of Server 2016 means that it is now generally available.
“There’s a commercial relationship now between Microsoft and Docker,” said Scott Johnston, Docker COO. “We provide the commercially supported Docker Engine, which is the hardened, validated, patched version of the open source product, as part of the Windows Server 2016 SKU. Microsoft will provide enterprise support for that. We’ll have Docker Datacenter support as well, to manage these workloads.”
Despite this relationship, the Docker Engine is not integrated into Windows Server setup, nor is it serviced via Windows Update. “There’s a PowerShell [script] to install,” said Johnston.
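For readers wondering what that PowerShell step looks like: at launch, Microsoft's container documentation described installing the supported engine from the PowerShell Gallery via the DockerMsftProvider module. The sketch below follows that documented procedure; exact module and package names should be verified against current Microsoft docs before use.

```powershell
# Install the Docker provider module from the PowerShell Gallery
# (run from an elevated PowerShell session on Windows Server 2016)
Install-Module -Name DockerMsftProvider -Repository PSGallery -Force

# Install the commercially supported Docker engine package via the provider;
# this also enables the Containers feature if it is not already present
Install-Package -Name docker -ProviderName DockerMsftProvider -Force

# A reboot is required before the Docker service can start
Restart-Computer -Force
```

After the reboot, the engine runs as a Windows service and the familiar `docker` CLI is available on the path.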
Docker was keen to extend its business to Windows, said Johnston. “Sixty-three per cent of the Intel server workloads are run on Windows. Most of the workloads in the enterprise are Windows workloads, not Linux. In Silicon Valley startups it’s easy to think the world has gone 100 per cent Linux and it’s just not true. And on the dev side, 8 million .NET developers.”
Azure CTO Mark Russinovich told the Reg that Docker Windows work started two years ago. “We had our own internal container-type technology, but it made sense to leverage the open ecosystem that Docker had pioneered. That 63 per cent of server workloads on Windows, those same companies are also using Linux some place, so it makes sense to have a common technology.”
In order to make Docker work, Microsoft had to build new container primitives into the Windows kernel. However, it also made many contributions to the Docker Engine itself. “One of our kernel developers, John Howard, is a maintainer with committer rights,” said Russinovich.
What about the Azure Container Service, currently Linux-only: will that now have a Windows option? “We don’t have a timeline yet, but I did demonstrate it at DockerCon,” Russinovich said.
Some types of applications, such as those using Java or even the cross-platform .NET Core, can be run on either Windows or Linux; but Johnston and Russinovich would not be drawn on performance comparisons.
“It’s a technology choice,” says Johnston. “But very soon we’ll see mixed application stacks and you’ll have native Windows and Java or Linux applications in the same application, each taking a role that’s optimal for them.”
“That’s our use case,” says Tyco CTO Daryll Fogal, a Microsoft customer. “We have a bunch of .NET developers, and we’ve got to try and get scalability of platforms. With the tool chain it’s very easy for .NET developers to get going. Second, when you get out of individual VMs and into Docker containers, you get a certain amount of scalability by choosing to be in that environment. We can do a little bit of refactoring and get it running in Docker containers, and then we’ve got time to go do the proper refactoring with a microservices architecture.”
“The other part is this hybrid environment. We acquire a company, they’ve got Java developers. We have to have a hybrid solution.”
Is Microsoft using Docker internally? “It’s hot off the press, we don’t have anything in production,” says Russinovich. “But we strongly believe that every new application should be run in containers. This applies to all Microsoft services as well. You’re going to see integration of Docker containers on Linux and Windows for all our services everywhere, in the fullness of time.”
Why use Docker? “It boils down to the fundamental benefits you get out of containers, which is the ecosystem of images, the developer experience, the reliability of the experience, so you can take something that worked in one environment, you move it to another one and it works, the efficiency you get by leveraging shared resources, and the resource isolation that controls what resources these applications are getting. Then you layer on microservices on top, and that’s what we view as cloud native,” said Russinovich.®