Getting on board with AI

How partnerships can provide the key to getting up and running with AI faster than the competition

Sponsored Feature Artificial intelligence (AI) has dominated the business and technology headlines lately. You can't check a news site without seeing a story about how AI is poised to change the way we do business.

Naturally, just about every company is eager to begin or advance their journey into AI, but most don't quite know how to proceed.

AI is already having a major impact on modern business. In fact, according to statistics gathered by AuthorityHacker, 35 percent of businesses have adopted AI, 77 percent of devices are using some form of AI, and nine out of 10 organizations support using AI to gain competitive advantage. As a result, AI is expected to add $15.7 trillion to the global economy by 2030. And as with any new technology, there will be jobs lost and jobs gained. AI could potentially eliminate 85 million jobs by 2025, but on the flip side, it could also create 97 million new jobs.

Companies looking either to adopt AI for the first time, or to expand their existing use of it, face several layers of challenges on both the staffing and technology sides. There are also regulatory and ethical concerns about the technology. And as AI systems are fueled by data, companies inevitably face concerns about ensuring the quality, relevance, and availability of the information they're feeding into the AI algorithms. Ensuring those datasets are accurate, up to date, and as comprehensive as possible is likely to present an enduring challenge. The same is true when it comes to handling the complexities of the requisite hardware, infrastructure and energy provision, and the associated costs.

Overcoming the challenges to reap the rewards

Matt Armstrong-Barnes, Chief Technologist for AI at HPE, believes that organizations often make the mistake of approaching AI without a strategic plan. "They are running at the technology too quickly. They don't have a common strategy," he says. "They create interesting science projects, but they're not adding business value."

First and foremost, companies need to develop an AI strategy that identifies and prioritizes use cases, making sure they're tackling real problems rather than building something that will live and die in the lab. There are, of course, practical questions around this process: "How are you going to build these AI platforms? How are you going to monitor them?" Armstrong-Barnes asks. "How do you make sure they're still operating efficiently? How are you going to know you've achieved the benefits you thought they were going to deliver? How do you allocate the budget to fund initiatives in the right way?"

There's little doubt that asking the right questions and having a solid plan in place can help reduce the time it takes to realize the benefits of AI. But getting any AI system from the experimental model to an actual working model also presents a major challenge. "The biggest challenges are around 'operationalization,' which is how you get an AI system from initial data gathering to constructing a model to production deployment," Armstrong-Barnes explains.

And making sure employees have the proper skills is essential. Attracting and retaining staff with the right attributes, or partnering with an organization that can provide that expertise, will be a major focus. "There's still a lot of misunderstanding around what the technology can do, so education not only builds skills, it also builds buy-in," he adds.

One approach that companies can take to resolve some of the skill set and infrastructure issues is to seek out partnerships, he advises: "You can partner to bring in those skills; partner to access infrastructure, platform, and model services."

An AI-native architecture has many layers. AI infrastructure service components can include GPUs and accelerators, for example, alongside compute, storage and networking elements, containers and virtual machines, and AI libraries. Likewise, AI platform services can incorporate ML applications, plus data, development and deployment services. And let's not forget model services, encompassing foundation models, fine-tuning, vector stores and prompting, alongside AI business services designed to promote trustworthiness by eliminating bias and drift to deliver valuable use cases.
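Those layers can be pictured as a simple stack. The sketch below is an illustrative outline only: the layer and component names follow the description above, but the grouping is an assumption for clarity, not an HPE product specification.

```python
# Illustrative outline of an AI-native architecture, layered from
# business-facing services down to infrastructure. The grouping is an
# assumption based on the description above, not a product spec.
AI_NATIVE_STACK = {
    "ai_business_services": [
        "trustworthiness (bias and drift monitoring)",
        "use case delivery",
    ],
    "model_services": [
        "foundation models", "fine-tuning", "vector stores", "prompting",
    ],
    "ai_platform_services": [
        "ML applications", "data services",
        "development services", "deployment services",
    ],
    "ai_infrastructure_services": [
        "GPUs and accelerators", "compute", "storage", "networking",
        "containers and virtual machines", "AI libraries",
    ],
}

def describe(stack):
    """Summarize each layer of the stack as a one-line string."""
    return [f"{layer}: {', '.join(parts)}" for layer, parts in stack.items()]
```

Partnering, as suggested above, can mean bringing in any one of these layers as a service rather than building it in-house.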

HPE has already built a robust human-focused framework which can be applied to customer requirements, centered on privacy, inclusivity and responsibility, says the company.

"That means you can focus on the data and the business problem," says Armstrong-Barnes.

It's all about the data

Focusing on the data when you design and deploy AI systems can be critical. Organizations are being hit by a tsunami of data every single day. What AI enables them to do is find hidden patterns in that data, which helps to accelerate their ability to derive value from it. Then they can make significantly better-informed decisions about the applications, processes and services they want to build or enhance.

A major component of that data-centric focus is having a solid strategy in place for how to gather, manage, and monitor the data – one which is closely aligned to the business, builds a data culture and includes elements around governance, data quality, privacy and metadata, says HPE.

"You need to understand what the business is trying to do," explains Armstrong-Barnes. "You need to understand how you're driving data quality, who accessed it, how do you dispose of it, what metadata are you storing."

Another problem the data can present is silos. When data is locked away, extracting and getting value out of it can be problematic. And once that data is accessible and available, there comes the issue of using it to train the models that power the AI platforms. When it comes to building AI systems, at a high level there are several stages: data gathering; refining data to make it ready for model construction; building the models; tuning the models; and then deploying them. Each of these stages presents specific challenges.
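At its simplest, that lifecycle can be sketched as a sequence of stages, each feeding the next. The sketch below is a minimal, generic illustration of the five stages listed above; the function names and the toy "mean predictor" model are hypothetical, not any specific HPE workflow.

```python
# Minimal sketch of the five stages described above: data gathering,
# refinement, model construction, tuning, and deployment.
# All functions and the toy model are illustrative stubs.

def gather_data():
    # Stage 1: collect raw observations (stubbed with toy numbers).
    return [1.0, 2.0, None, 4.0, 3.0]

def refine_data(raw):
    # Stage 2: clean the data so it is ready for model construction.
    return [x for x in raw if x is not None]

def build_model(data):
    # Stage 3: construct a model; here, just predict the mean.
    mean = sum(data) / len(data)
    return {"predict": lambda _x: mean, "params": {"mean": mean}}

def tune_model(model, data):
    # Stage 4: evaluate and adjust; a real system would search
    # hyperparameters, here we simply record a validation error.
    error = sum(abs(model["predict"](x) - x) for x in data) / len(data)
    model["params"]["val_error"] = error
    return model

def deploy_model(model):
    # Stage 5: expose the trained model as a callable service.
    return model["predict"]

clean = refine_data(gather_data())
predict = deploy_model(tune_model(build_model(clean), clean))
```

Every function here is a stub; in practice each stage carries its own complexity, from data pipelines to hyperparameter search to serving infrastructure, which is exactly where platform tooling earns its keep.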

But using an AI-native architecture from HPE GreenLake can go a long way to putting the right foundation in place to expedite these processes, says HPE. And the company's Machine Learning Development Environment (MLDE) is also designed to help reduce the complexity and cost associated with machine learning model development.

Training these AI models also requires significant processing power. As companies move to adopt or increase their use of AI, they must first have the technological capacity to handle the load. The HPE GreenLake platform can provide that capacity in the form of the high-performance processing architecture and streamlined data pipeline organizations need to ensure access to the high-quality, relevant data required to build and deploy AI models and workloads.

Successful projects can light the way

It's often helpful to look at companies that have already done a good job of adopting and implementing AI for guidance. One of these is Seattle, WA-based esports team Evil Geniuses. Throughout its 25-year history, the company has entered teams in a variety of esports playing Call of Duty, Fortnite, Halo, Rocket League, and VALORANT. Evil Geniuses' teams have been quite successful. The company's Call of Duty: WWII team won the 2018 Call of Duty Championship, for example, and the VALORANT team won the 2023 VALORANT Champions title.

"We're here to change the face of gaming," says Chris DeAppolonio, CEO for Evil Geniuses. "We're an esports and gaming entertainment organization. We play games professionally across the globe. Technology and data are the backbone of everything we do. Our games are built on ones and zeroes. They're based on data, and how do we process that and create insights from that?"

One of the more pressing concerns facing Evil Geniuses is identifying potential professional gamers. The company processes large quantities of complex data to find talent across the globe, and uses HPE's AI services and solutions to help it do that. "We want to find data on that future pro," he says. And it seems to be working. "We want to win. We want to find better talent. We want to be more efficient with coaches and scouts. We can use insights to unearth the next superstar."

The future for AI - from both a productivity and business benefit perspective - looks promising. "AI is a team sport; it's about skills," says HPE's Armstrong-Barnes. "When it comes to successfully implementing AI systems, one approach is to partner with an organization with a track record in building scalable, efficient and effective AI systems. With a deep heritage in AI going back decades, HPE offers the tools, techniques and skills to accelerate AI initiatives."

Being data-driven and fully understanding its data, and what that data will be used for, will help an organization take a use-case-centric approach to identifying how it can fuse its data with AI techniques to drive business value. Once that understanding is in place, it becomes easier to build on the benefits.

Armstrong-Barnes advises companies to build platforms that let them start small but have all the foundations in place to scale up when required. Then they just have to work out what they want to do and how it's going to add value, letting the platform grow with their needs over time. HPE emphasizes its ability to build 'AI factories' combining hardware, software and services that provide that enterprise scalability, supported by integrated systems which make life easier for end users.

"You want to keep up with your competitors already on the AI journey," he says. "Adding partners into Team AI is a critical success factor when it comes to building an AI-native architecture that scales with your needs and allows you to focus on your data and business challenges instead of the complexities of the underlying foundations."

Sponsored by HPE.
