How homegrown AI cuts through the hype to deliver real results

Nutanix leverages customer interactions to develop GenAI infra solution and the AI tools to support it

Sponsored feature Across sectors and industries, the velocity with which AI is reshaping the way businesses connect with market opportunities is nowhere more evident than in customer service and support operations.

Even before the advent of widely available generative AI (GenAI) tools, frontline interaction between businesses and their customers was an early use case for pre-AI automation, and the function continues to be a starting point for forays into AI adoption.

Customer service and support encompasses the operational and strategic tasks that ensure smooth and efficient customer contacts – everything from managing inquiries and resolving issues to training support agents and optimising transactions.

For digitally-transformed organisations, AI-enabled customer service can go further by increasing customer engagement, resulting in extended cross-sell/upsell, account expansion and renewal opportunities while reducing ‘cost-to-serve’ expenditure. AI is also making customer support centers more efficient, automating tasks and providing 24/7 coverage.

An ever-growing range of AI-powered applications – chatbots, AI agent tools, automated ticket routing, personalised responses, predictive customer service, real-time translation, self-service portals, sentiment analysis – are being deployed to handle routine inquiries and offer more engaging personalised support.
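To give a sense of what the simplest of these applications automates, here is a toy keyword-based ticket router in Python. The team names and keyword lists are invented for illustration; production systems of the kind described above use trained classifiers or LLMs rather than keyword matching:

```python
# Illustrative sketch only: a toy keyword-scoring ticket router.
# Team names and keywords below are hypothetical examples, not drawn
# from any real product.

ROUTES = {
    "billing": {"invoice", "refund", "charge", "payment"},
    "infrastructure": {"cluster", "node", "storage", "network"},
    "access": {"login", "password", "account", "permission"},
}

def route_ticket(text: str) -> str:
    """Return the team whose keywords best match the ticket text."""
    words = set(text.lower().split())
    scores = {team: len(words & kws) for team, kws in ROUTES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "general"

print(route_ticket("Customer cannot login after password reset"))  # access
```

An AI-era replacement swaps the keyword sets for a learned model, but the routing contract — ticket text in, destination team out — stays the same.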

And yet the bigger picture is not quite as breezy as these benefits might indicate. GenAI’s shift from disruptive arriviste to business imperative has placed stresses on the enterprise customer service function. According to Gartner's Top 5 Priorities for Customer Service in 2025 study, GenAI’s popularisation has put pressure on customer service leaders to gain AI literacy they have not previously needed. More than 75 percent of customer service leaders also "feel pressure from other leaders in their enterprise to implement GenAI," Gartner reports.

From a technology perspective, provisioning and integrating the tools and technologies that businesses need to get AI-enabled platforms up and running can prove cumbersome and complex. Such hurdles significantly inhibit AI uptake at a time when AI must demonstrate speed-to-market returns to justify its initial investment. Apps that take an age to develop and roll out are left floundering as costs overrun and faster-moving rivals pull ahead.

While existing AI development platforms might claim to be super-efficient, the Support Readiness team inside Nutanix reckoned that the factors that hamper timely AI solution delivery went deeper than was being recognised by other software vendors. For example, while using chatbots for customer support has become common practice, Nutanix’s various development team leaders were underwhelmed by the chatbots they saw being employed by many financial or consumer services organisations.

They realised that, in order to truly help customers address complex IT system issues, problems that must be diagnosed and fixed with clear and concise technical detail, support bots needed to draw on a range of data resources in real time. This set off a train of innovative solutions development that has led to Nutanix's SupportGPT platform, built on the Nutanix GPT-in-a-Box 2.0 solution.

Box clever

Nutanix GPT-in-a-Box 2.0 is a turnkey solution that delivers ready-to-run AI infrastructure that allows enterprises to securely develop, implement and operate AI and GPT models.

The latest version comprises a full-stack enterprise AI platform built on web-scale data services to deploy GenAI apps, large language models (LLMs), and AI Operations (AIOps) anywhere – from the core data center through to edge and cloud.

GPT-in-a-Box 2.0 provides access to a comprehensive range of GenAI models and tools to simplify the top GenAI use cases for the enterprise, with a validated AI ecosystem that includes technology partners like NVIDIA and ML tools developer Hugging Face.

“GPT-in-a-Box 2.0 enables organisations to ‘jump-start’ their AI and Machine Learning ventures while maintaining secure control over their data and applications,” says Jason Longpre, VP Worldwide Support at Nutanix. “You can also preselect some models so that you are not starting totally from scratch. IT teams and development teams can run it on the Nutanix platform in a secure environment, so they don’t have any concerns about data leakage.”

Moreover, GPT-in-a-Box 2.0 leverages open source AI and MLOps frameworks on the Nutanix Cloud Platform to deploy a curated set of LLMs. It also includes a unified user interface for foundation model management, API endpoint creation, end-user access key management, and it will integrate Nutanix Files and Objects, plus NVIDIA Tensor Core GPUs.
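As a hedged illustration of the endpoint-and-access-key workflow described above, a client application would typically attach its access key to inference requests as a bearer token. The URL, key, model name, and payload schema below are hypothetical stand-ins, not Nutanix's actual API; the real values come from your own deployment's endpoint management console:

```python
import json
import urllib.request

# Hypothetical values for illustration only -- the real endpoint URL,
# access-key format, and payload schema come from your own deployment.
ENDPOINT = "https://ai.example.internal/api/v1/chat"
ACCESS_KEY = "demo-access-key"

def build_inference_request(prompt: str) -> urllib.request.Request:
    """Construct (but do not send) an authenticated inference request."""
    payload = json.dumps({"model": "llama-3-8b", "prompt": prompt}).encode()
    return urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={
            "Authorization": f"Bearer {ACCESS_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_inference_request("Summarise last night's cluster alerts")
print(req.get_header("Authorization"))  # Bearer demo-access-key
```

Centralising key issuance and revocation in the platform, rather than in each app, is what makes per-user access management practical at enterprise scale.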

Other onboard features include a graphical user interface, role-based access control, auditability, and ‘dark site’ (a data center or IT infrastructure section that’s purposefully unconnected to external networks) support.

It includes almost everything needed to build AI-ready infrastructure, including: Nutanix Cloud Platform infrastructure on GPU-enabled server nodes; Nutanix Files and Objects storage for running and fine-tuning GPT models; and open source software to deploy and run AI workloads, including the PyTorch framework and the Kubeflow MLOps platform. GPT-in-a-Box 2.0 support for NVIDIA GPUDirect and the NX-9151 is under development.

“The response to Nutanix GPT-in-a-Box 2.0 has validated the needs of enterprise customers for on-premises software solutions that simplify the deployment and management of AI models and inference endpoints,” Longpre says. “Enterprise is the new frontier for GenAI.”

Support your local GPT

Further, Nutanix ensures that its own GPT-in-a-Box 2.0 customer support teams use the optimisation tools they build in order to improve the support they provide. While creating the GPT-in-a-Box solution to help IT infrastructure teams scale out their AI capabilities, Nutanix decided its best option was to develop its own GenAI app for system reliability engineers, called SupportGPT.

SupportGPT is a chatbot for providing answers to complex and demanding questions about using Nutanix’s hybrid multicloud software for IT operations.

SupportGPT is used principally to support internal operations. The tool also serves as a repository of experience and expertise that can be drawn on by all Nutanix support teams, cataloguing best practices and ensuring that support processes benefit from an ongoing improvement cycle, Longpre explains.

“With SupportGPT we realised that there are substantial benefits to be gained by leveraging AI to analyse and understand what’s happening in customer support situations in real-time,” says Longpre. “That way the system can continually supply our support teams with more information for faster resolution, and drive additional improvements to prevent future recurrence.”

Nutanix System Reliability Engineers used to search a Nutanix database using keywords taken from questions or customer requests. Once they found a relevant article, they would read it to find the information the customer needed. Now they simply query SupportGPT in natural language and get an answer in seconds, rather than the hours the previous method could take.
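The retrieval step behind such a shift can be sketched in a few lines. The snippet below scores knowledge-base articles against a natural-language question and returns the best match, which a production system would then hand to an LLM for answer generation. The article IDs and text are invented, and the bag-of-words cosine scoring is a stand-in for the learned embeddings real systems use; SupportGPT's internals are not public:

```python
import math
from collections import Counter

# Toy knowledge base -- article IDs and contents are hypothetical.
ARTICLES = {
    "KB-101": "how to expand a storage cluster by adding a node",
    "KB-205": "resolving network latency between cluster nodes",
    "KB-330": "resetting an admin password on the management console",
}

def vectorise(text: str) -> Counter:
    """Bag-of-words term counts (a stand-in for a learned embedding)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str) -> str:
    """Return the ID of the article that best matches the question."""
    q = vectorise(question)
    return max(ARTICLES, key=lambda k: cosine(q, vectorise(ARTICLES[k])))

print(retrieve("why is there latency on the network between my nodes"))
# KB-205
```

The engineer's question never has to share exact keywords with the article title; scoring whole texts against whole texts is what turns hours of manual lookup into seconds.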

“In developing SupportGPT Nutanix's customer service team collaborated across our AI team, SaaS Engineering team and the customer service team,” Longpre explains. “This collaboration ensured that the specific needs and goals of each team were clearly communicated throughout the process. The SupportGPT tool now sees hundreds of queries per day.”

Ongoing SupportGPT application efforts feed into the Nutanix SaaS Engineering’s Learning Management System, which integrates all aspects of the company’s data-driven operations.

“At Nutanix we’re always looking at opportunities to make improvements for both customers and our own employees,” says Longpre. “They are in parallel streams to both impact the customers, and also showcase those attributes within our own organisation.” Nutanix personnel are sometimes asked why the company decided to develop Nutanix SupportGPT tools internally with GPT-in-a-Box 2.0, says Longpre, especially as there’s a perception that Nutanix is not first and foremost an AI specialist.

“We did certainly evaluate other product options before making the decision to build in-house, and in fact the evaluation confirmed our decision to go the internal development route,” Longpre says. “In general, it comes down to seven driving factors that we wanted to be able to totally assure: accuracy, compliance, customisation, extensibility, security, tuning and TCO.”

Longpre adds: “Also, Nutanix’s expertise and involvement in the open source AI community provides our customers with a strong foundation on which to build their AI strategy. Our credentials include participation in an AI standards advisory board, technical leadership in defining the ML Storage Benchmarks and Medicine Benchmarks, and serving as a co-chair of the Kubeflow (MLOps) Training and AutoML working groups at the Cloud Native Computing Foundation.”

Companies are doing it for themselves

Meanwhile, take-up of GPT-in-a-Box 2.0 continues to be boosted by the growing trend of customers wanting to develop their AI apps internally, Longpre reports: “We see multiple factors for this trend, including a desire for greater control and customisation, cost savings, compliance control and the potential for AI agents to handle tasks that previously had to be outsourced to third-party specialists. Customers cannot wait for third parties to come back with prototypes and beta versions bound by feedback and redevelopment loops.”

Longpre adds: “There’s also the fact that in many respects it’s gotten easier to undertake software development in-house. Simplified-use platforms like GPT-in-a-Box enable in-house tech teams to address the challenges of AI software development in a way that would not have been feasible even five years ago.”

GPT-in-a-Box 2.0’s functionality reflects the realisation that GenAI apps are unlike other generations of software development, Longpre points out. “AI development typically pulls in more stakeholders than traditional software development processes that follow a linear, sequential approach,” he says. “More people want to be involved. The rationale is now ‘who else can be brought in to help drive this innovation?’”

Sponsored by Nutanix
