Why the rapid proliferation of cloud native apps requires faster, more efficient toolsets
Kubernetes enables easy, rapid AI app development, making it the industry standard for AI workloads
Sponsored feature The most important trend in cloud native AI development today is the rapidly growing integration of AI/ML workloads with cloud native architectures and practices, often called Cloud Native AI or AI-Native Cloud. This trend signifies a fundamental move away from conventional development practices toward building and deploying AI applications in a way that takes full advantage of the scalability, flexibility, resilience, and efficiency inherent in cloud native approaches.
Why is this so important? It isn't rocket science: it comes down to speed and efficiency, as all good development does, and doing it right often requires new-gen tooling. Using the correct tools – including the cross-platform workload orchestrator Kubernetes – can directly affect an enterprise's bottom line by improving an IT system's key performance indicators (KPIs). These include:
- Scalability and resource efficiency: Cloud native environments provide the elastic infrastructure needed to handle the demanding computational resources of AI/ML workloads, allowing for scaling up or down as needed and optimizing costs.
- Agility and faster innovation: The modularity and automation offered by cloud native practices enable faster development, testing, and deployment of AI applications, accelerating the pace of innovation.
- Portability and flexibility: Containerization and orchestration provide the ability to deploy AI models across different cloud providers or hybrid environments, avoiding vendor lock-in and offering greater flexibility (see the deployment sketch after this list).
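To make the portability and scaling points concrete, here is a minimal sketch using the official Kubernetes Python client. It assumes an existing cluster reachable through a local kubeconfig; the deployment name, namespace, and container image are hypothetical placeholders, not anything the article or Nutanix prescribes.

```python
# Minimal sketch: deploy a containerized model-serving service, then scale it.
# Assumes the official `kubernetes` Python client and a reachable cluster;
# names and the container image are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()          # use the local kubeconfig (in-cluster config also works)
apps_v1 = client.AppsV1Api()

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="inference-api"),     # hypothetical name
    spec=client.V1DeploymentSpec(
        replicas=2,                                          # start small
        selector=client.V1LabelSelector(match_labels={"app": "inference-api"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "inference-api"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="model-server",
                        image="registry.example.com/inference-api:1.0",  # placeholder image
                        ports=[client.V1ContainerPort(container_port=8080)],
                    )
                ]
            ),
        ),
    ),
)

# The same declarative spec works on any conformant cluster: on-prem, public cloud, or edge.
apps_v1.create_namespaced_deployment(namespace="default", body=deployment)

# Scale out when demand spikes, scale back when it subsides.
apps_v1.patch_namespaced_deployment_scale(
    name="inference-api", namespace="default", body={"spec": {"replicas": 5}}
)
```

The same workload definition could equally be written as a YAML manifest and applied with kubectl; the point is that the declarative spec, not the underlying infrastructure, is the portable unit, and replica counts can be dialed up or down as demand changes.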
Sellers of products and services in the cloud – a segment that now accounts for 16 percent to 20 percent of all ecommerce trading in the U.S. – need to take note here, Dan Ciruli, senior director of product management at Nutanix, told The Register. "It doesn't matter what you sell, you are also a software company," Ciruli said. "If you don't modernize your IT operations, you will go out of business. That is why cloud native (development) is so important to business."
Other significant benefits of cloud native development with the right tools – as documented by a growing number of experienced cloud developers – include improved collaboration, stronger governance and reproducibility, and tighter control of costs.
Cloud native platforms and MLOps (machine learning operations) practices promote better collaboration between data scientists, developers, and operations teams, which streamlines the AI lifecycle. Cloud native tools and MLOps workflows facilitate model versioning, tracking, monitoring, and explainability, which are essential for responsible AI development and deployment. Pay-as-you-go models and efficient resource utilization in cloud native environments can lead to significant cost savings for AI development and deployment.
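As one concrete illustration of model versioning and tracking, here is a minimal sketch using MLflow, a widely used open source MLOps tool (the article does not prescribe any specific tool). The model, metrics, and registry name are illustrative, and registering a model by name assumes a tracking server backed by a model registry.

```python
# Minimal sketch of experiment tracking and model versioning with MLflow,
# one common open source MLOps tool; names and values are illustrative.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=42)

with mlflow.start_run(run_name="baseline"):                  # hypothetical run name
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_param("max_iter", 200)                        # record hyperparameters
    mlflow.log_metric("train_accuracy", model.score(X, y))   # record evaluation metrics
    # Store the model as a versioned artifact; registering it by name assumes
    # a tracking server with a model registry backend (e.g. a database store).
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="churn-classifier",            # hypothetical registry entry
    )
```

Each run captures the hyperparameters, metrics, and exact model artifact that produced a result, which is what makes experiments reproducible and auditable across data science, development, and operations teams.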
Case in point: A popular pizza chain goes big-time by adapting to the cloud
Each of the above results was realized in an important early use case involving a new-gen cloud native development project, Ciruli said.
"Marc Andreessen (founder of Netscape) said 25 years ago that 'Software is eating the world,'" Ciruli said.
"I didn't know what that meant until about five or six years ago, when I was at the headquarters of a Kubernetes user. It is a brick-and-mortar company that was established in 1958 and went through a lot of IT changes over the years.
"By the 21st century, they had 6,000 locations across North America, but they were losing market share – not because their products were bad, but because their digital experience was bad. Their app was bad, and the modern generation wants to order things on an app. They realized that they needed to do a bunch of things to revamp their experience, so a customer could order successfully on an app, and so they could track manufacturing all the way through to delivery, where that delivery was all the way to the door."
If you haven't guessed by now, the company was Yum Brands' Pizza Hut.
"The company realized that it needed to be able to do all that sales work and tracking within a minute," Ciruli said. "They needed an API to enable their sales so they could sell through channels. They needed partners to be able to sell their products, and they needed to be able to innovate rapidly. They need to be able to change that up when necessary."
It wasn't just that they needed to be able to do it quickly, Ciruli said; they also could never have downtime, because their 6,000 locations would need those digital services to do business at all. So they built a modern app and a modern back end, and ran both on modern, Kubernetes-based infrastructure.
"They did this, and it saved their business," Ciruli said.
The solution for cloud native developers: Nutanix Kubernetes Platform (NKP)
Cloud native IT, especially involving Kubernetes, is not simply hype. These new-gen tools – only about a decade old – are making AI workloads a reality by providing the flexibility, scalability, and efficiency that modern AI applications demand. Much of this is automated, which puts expedited development within reach of time-pressed IT shops. GenAI and agentic AI are also shortening dev-and-test times by offering cogent coding suggestions.
NKP is a full-featured solution for deploying and managing Kubernetes clusters across environments, including on-premises data centers, edge locations, and public clouds. Key assets included in the new NKP package:
- Kubernetes for AI workloads: Kubernetes, which schedules and scales containerized workloads on demand, is a great fit for AI due to its ability to manage dynamic, resource-intensive tasks. It enables efficient sharing of expensive resources – such as GPUs – and supports the rapid iteration and deployment that AI development requires (a brief GPU-scheduling sketch follows this list). Kubernetes allows AI applications to be truly portable across public clouds, private clouds, and the edge.
- Real-world adoption of Kubernetes for AI: Leading AI companies such as OpenAI, Spotify, and Uber run their models on Kubernetes, highlighting its effectiveness in managing complex computational demands. A related data point: The Cloud Native Computing Foundation (CNCF) 2024 Cloud Native Survey [PDF] found that "it’s a Kubernetes world today: 93% of companies use it in production, are piloting it or are actively evaluating it."
- NAI and Kubernetes integration: Nutanix Enterprise AI (NAI) can run on Kubernetes clusters, making it easy for organizations to put new AI models into production. Nutanix also builds AI into its Kubernetes tooling, such as the AI Navigator chatbot, which assists engineers with troubleshooting and system insights.
- Flexibility and portability: Kubernetes-run containers ensure that AI applications can be deployed across on-prem data centers and public clouds without vendor lock-in. This is important for organizations pursuing hybrid and multicloud strategies that prioritize cost, performance, and data sovereignty.
- Transformative potential of cloud native development: Cloud native-powered AI applications have the potential to transform business operations by enabling better, faster data-driven decisions and higher productivity. The modular and portable nature of Kubernetes-run containers supports this transformation.
- Easy deployment with Nutanix GPT-in-a-Box: With GPT-in-a-Box running on Kubernetes, Nutanix makes it easy for organizations to deploy AI models into production. This solution simplifies AI Day 2 operations and adapts to new advancements in GenAI, providing a validated stack with streamlined operations, infrastructure, and services.
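As a brief illustration of the GPU-sharing point in the first item above, here is a minimal sketch of requesting an accelerator for a model-serving pod with the Kubernetes Python client. It assumes a cluster where the NVIDIA device plugin exposes the nvidia.com/gpu extended resource; the pod name and image are hypothetical placeholders.

```python
# Minimal sketch: ask the Kubernetes scheduler for one GPU for a serving pod.
# Assumes the NVIDIA device plugin is installed so nodes advertise `nvidia.com/gpu`;
# the pod name and image are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()
core_v1 = client.CoreV1Api()

pod = client.V1Pod(
    api_version="v1",
    kind="Pod",
    metadata=client.V1ObjectMeta(name="llm-server"),          # hypothetical name
    spec=client.V1PodSpec(
        containers=[
            client.V1Container(
                name="llm-server",
                image="registry.example.com/llm-server:1.0",  # placeholder image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"}             # schedule onto a node with a free GPU
                ),
            )
        ],
        restart_policy="Never",
    ),
)

core_v1.create_namespaced_pod(namespace="default", body=pod)
```

Because the GPU is requested declaratively, the scheduler places the pod on a node with a free accelerator and releases the device when the pod finishes, which is how expensive hardware ends up shared efficiently across teams and workloads.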
It's a Kubernetes world after all
The proof is in the results. The AI boom is turbocharging cloud native app development, and Kubernetes is the engine. As enterprises rush to build scalable, fast, and flexible AI-powered services, conventional tools fall short. Next-gen platforms such as NKP streamline deployment, maximize uptime, and shorten dev cycles, all of which shows up favorably on the company bottom line. From pizza chains to power players like OpenAI, cloud native AI is saving businesses, modernizing infrastructure, and unlocking agility, because in today's market you either innovate fast or risk being left behind.
Sponsored by Nutanix