Putting AI challenges in perspective with partnerships
How organizations en route to AI sufficiency can ease their journey by working with tech providers like HPE
Sponsored Feature As artificial intelligence (AI) is deployed more widely across vertical sectors and industries, its capacity to transform business processes, strategic decision-making and customer experiences is being roundly lauded by IT strategists and economic analysts.
Even chief executives once wary of approving the investment AI needs to deliver optimal value are coming to recognize its potential to improve operational efficiency and pave the way for new revenue streams.
Forecasts by venerable market-watchers like PwC support this view. PwC's 'Global Artificial Intelligence Study' reckons that AI could be contributing up to $15.7 trillion to global economies in 2030. Of this, $6.6 trillion could come from increased productivity and $9.1 trillion from 'consumption-side effects', PwC asserts.
The recent roll-out of several generative AI tools is deemed a breakout point for what had previously been a highly-specialized and 'futuristic' branch of computer science. In the UK in 2022 the Office for Artificial Intelligence reported that around 15 percent of businesses had adopted at least one AI technology, which equates to 432,000 companies. Around 2 percent of businesses were piloting AI, and 10 percent planned to adopt at least one AI technology going forward (62,000 and 292,000 businesses, respectively).
It's still complex stuff
Amid this AI fervor, organizations should remember that AI is still a relatively young technology, and it can be challenging to set up for the first time. What's more, the associated return on investment (ROI) is highly dependent on very precisely managed implementation procedures and configurations that are often less robust in the face of errors than conventional IT deployments.
AI poses considerable tests for the IT teams tasked with implementing AI/machine learning initiatives and workloads – challenges that can include overcoming skills gaps and compute constraints. They may also involve resource trade-offs with other enterprise workloads already running on shared IT infrastructure.
"AI is a journey, not a destination – it's not about being adoption-ready or automating processes simply for more efficiency," says Matt Armstrong-Barnes, Chief Technology Officer for Artificial Intelligence at Hewlett Packard Enterprise (HPE). "Rather, it's about the realization of long-term value, enabling better outcomes, and recognizing that AI demands a fundamentally different approach to IT deployment. For enterprise technologists it's a 360-degree all-round learning curve."
Armstrong-Barnes's point is evidenced by Deloitte's latest 'State of AI in the Enterprise' survey of global business leaders. Its respondents identified a raft of challenges that AI sprang on successive phases of their AI implementation projects. Proving AI's business value was an issue cited by 37 percent – projects can prove costly, and a compelling business case can be hard to validate when faced with investment-wary boards and C-suite executives.
Scaling up those AI projects over time can hit further identified hurdles, such as managing AI-related risks (cited by 50 percent of those taking part in the Deloitte survey), lack of executive buy-in (also 50 percent), and lack of maintenance or ongoing support (50 percent again).
"Quite understandably, corporate leaders need to be convinced that AI will pay its way," Armstrong-Barnes says. "This is where working from the outset with a tech partner that has been involved with proven AI implementations for many years helps win the case. Its track record will lend credibility to project proposals and help to convince execs that AI's risks are as manageable as any other IT venture."
And while technology and talent are certainly needed, it's equally important to align a company's culture, structure and ways of working to support broad AI adoption, according to McKinsey, with distinctive characteristics sometimes acting as barriers to AI-driven change.
'If a company has relationship managers who pride themselves on being attuned to customer needs, they may reject the notion that a "machine" could have better ideas about what customers want and ignore an AI tool's tailored product recommendations,' McKinsey suggests.
"I confer with HPE peers and HPE customers frequently about the range of challenges they are encountering with AI deployment," reports Armstrong-Barnes. "Some common characteristics come up again and again. One is an underestimation of how fundamentally different AI deployments are from traditional IT implementations – data management and scaling, in particular, differ significantly from the IT projects organizations have implemented in the past. This means that sometimes, hard-won tech experience has to be learned anew."
Organizations should resist the inclination to experiment with AI pilots instead of deploying the technology directly into a real use case that supports a pressing business need, Armstrong-Barnes explains. "The try-before-you-buy approach seems reasonable – AI is complex and investment-hungry," he says. "But with AI, dry runs and test projects don't really replicate the challenges user organizations will encounter with an actual implementation. What starts 'in the lab' tends to stay in the lab."
At the other end of the adoption scale Armstrong-Barnes sees companies that try to apply AI wherever it can be applied, even where an application is working optimally without AI: "The takeaway here is – just because in AI you have a massive hammer, you should not then see everything as a nut to be cracked."
People and infrastructure not readily available
Even the most advanced AI systems have yet to attain total end-to-end autonomy – they need to be trained and fine-tuned by human expertise. This represents a further challenge for AI-aspirant companies: how best to acquire the necessary skills – retrain existing IT personnel? Recruit new team members with requisite AI knowledge? Or explore options to defer the need for AI expertise to technology partners?
McKinsey reports that AI's potential is being constrained by a shortage of skilled talent. A typical AI project requires a highly proficient team including a data scientist, data engineer, ML engineer, product manager and designer – and there simply aren't enough specialists available to occupy all those open jobs.
"We see enterprise technologists generally having to upgrade their abilities in five key respects," Armstrong-Barnes says. "Principally, they lie in the areas of AI expertise, IT infrastructure, data management, complexity management, and to a lesser degree, the aforementioned cultural barriers. None of these challenges is insurmountable given the right approach and partnership support."
AI also demands powerful hardware to run on. Provisioning high-performance compute platforms remains an abiding challenge because few organizations want – or can afford – to make the necessary investments in their server estates without provable ROI.
"When planning AI implementations, at a very early stage IT planners need to make some key decisions regarding the core enabling technology," says Armstrong-Barnes. "For instance, are you going to buy it, build it – or take a hybrid approach that encompasses elements of both?"
The next important decision relates to partnerships. A defining condition of successful AI delivery is that nobody can go it alone, Armstrong-Barnes points out: "You need the support of technology partners, and the best way to establish those partnerships is through an AI ecosystem. Think of an AI ecosystem as a supportive consortium of expertise that, coming together, will give you access to the right knowhow, data, AI tools, technology and economics to develop and operationalize your AI endeavors."
Armstrong-Barnes adds: "Customers sometimes ask how HPE came to be so experienced in AI use-cases – did we foresee its impact years ago and start preparing well ahead of the market? The fact is we saw AI's impact coming not years but decades ago, have been establishing AI centers of excellence and ecosystems for a long time, and have been making strategic acquisitions to augment our existing expertise in line with customer requirements and growth opportunities."
No train, no gain
One such augmentation is Determined AI, which became part of HPE's HPC and AI solutions offerings in 2021. Determined AI's open-source software addresses the fact that building and training optimized models at scale is an exacting and critical stage of ML development – one that increasingly requires non-technologists like analysts, researchers and scientists to take on the challenges of HPC.
These challenges include setting up and managing a highly parallel software stack and infrastructure that spans specialized compute provisioning, data storage, compute fabrics and accelerator cards.
"Additionally, ML exponents need to program, schedule and train their models efficiently to maximize the utilization of the specialized infrastructure they have set up," says Armstrong-Barnes, "which can create complexity and slow down productivity."
These tasks have to be done, of course, with a rigorous level of competence which, even with the support of overstretched in-house IT teams, is not easily assured.
Determined AI's open source platform for ML model training is designed to close this resource gap, making it easy to set up, configure, manage and share workstations or AI clusters that run on-premises or in the cloud. And on top of premium support, it includes features such as advanced security, monitoring and observability tools – all supported by expertise from within HPE.
"Determined AI is about removing barriers for enterprises to build and train ML models at scale and speed, in order to realize greater value in less time, with the new HPE Machine Learning Development System," Armstrong-Barnes explains. "These capabilities include quite techie stuff necessary to optimize AI/Machine Learning workloads, like accelerator scheduling, fault tolerance, high-speed parallel and distributed training of models, advanced hyperparameter optimization and neural architecture search.
"Add to that disciplinary tasks like reproducible collaboration and metrics tracking – it's a lot to keep on top of. With Determined AI's help project specialists can focus on innovation and fast-track their time to delivery."
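To make the hyperparameter-optimization idea concrete, here is a minimal, self-contained sketch of the principle that platforms like Determined AI automate at scale (the toy objective, search ranges and parameter names below are invented for illustration – this is not Determined's API, which additionally handles parallel trials, accelerator scheduling and fault tolerance):

```python
import random

def validation_loss(learning_rate: float, batch_size: int) -> float:
    """Toy stand-in for training a model and measuring validation loss.
    A real platform would run a full training job here, in parallel,
    on scheduled accelerators. This invented objective is minimized
    around learning_rate=0.01 and batch_size=64."""
    return (learning_rate - 0.01) ** 2 + ((batch_size - 64) / 64) ** 2

def random_search(n_trials: int, seed: int = 0):
    """Sample hyperparameters at random and keep the best trial."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(-4, -1)           # log-uniform over [1e-4, 1e-1]
        bs = rng.choice([16, 32, 64, 128, 256])  # discrete batch sizes
        loss = validation_loss(lr, bs)
        if best is None or loss < best[0]:
            best = (loss, {"learning_rate": lr, "batch_size": bs})
    return best

loss, params = random_search(n_trials=200)
print(f"best loss={loss:.4f} with {params}")
```

Even this naive search launches hundreds of training runs – exactly the kind of repetitive, infrastructure-hungry workload that managed platforms schedule, checkpoint and track so that specialists can focus on the model itself.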
More HPC resource and regulation play their part
The power of HPC is also increasingly being used to train and optimize AI models, as well as combining with AI to augment workloads such as modelling and simulation – long-established tools for speeding time-to-discovery across the manufacturing industry and beyond.
The global HPC market is set for substantial growth over the rest of the 2020s. Mordor Intelligence estimates its value at $56.98 billion in 2023, and expects it to reach $96.79 billion by 2028 – a CAGR of 11.18 percent over the forecast period.
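As a quick sanity check, the compound annual growth rate implied by those two figures over the five-year forecast period can be verified directly:

```python
# Verify the CAGR implied by Mordor Intelligence's figures:
# $56.98bn (2023) growing to $96.79bn (2028), i.e. over five years.
start_value = 56.98
end_value = 96.79
years = 2028 - 2023

# CAGR = (end / start)^(1/years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.2%}")  # ≈ 11.18%, matching the stated figure
```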
"HPE has been building HPC infrastructure for a long time, and now has a HPC portfolio that includes Exascale Supercomputers and density-optimized compute platforms. Some of the biggest HPC clusters are built on HPE innovation," says Armstrong-Barnes. "HPE has unmatched expertise in high-performance hardware platforms."
With the introduction of HPE GreenLake for Large Language Models earlier this year (2023), enterprises – from startups to Fortune 500 companies – can train, tune and deploy large-scale AI using a sustainable supercomputing platform that combines HPE's AI software with the most advanced supercomputers.
Clearly, adopting AI is challenging for organizations of all sizes, but it's not just about the technology, Armstrong-Barnes points out: "Increasingly, all AI adopters will have to stay up to date with emerging AI regulation and compliance requirements. Legislation like the US AI Bill of Rights, the EU AI Act and the forthcoming regulatory proposals set out in the UK Government's AI White Paper – generally expected to inform a compliance-ready AI framework – are prominent examples of this."
For businesses that operate internationally this looks like another hurdle wrapped in red tape, but Armstrong-Barnes suggests that regulatory compliance may not be as onerous as it might appear – with a little help from a well-appointed AI partnership ecosystem.
"Check whether your AI ecosystem partners could also help you with compliance – if you already operate in a heavily-regulated business environment, it could well be that you are already half-way there with your existing observances."
Sponsored by HPE.