How DPUs accelerate AI/ML

Hyperscalers spent years evolving data processing unit technology – AI/ML makes it critical for the enterprise too

Sponsored Post Data processing units (DPUs) are advanced programmable network adapters that sit between a server and the rest of the datacenter network, helping to improve server utilization and power consumption.

Hyperscale cloud providers have been using them for years to offload functions like network security (firewalls and encryption), network traffic management and control, load balancing, and NVMe over Fabric storage protocols.

But these benefits are not limited to hyperscalers – DPUs can also be used in smaller datacenters and server farms. That is particularly relevant for hosting and processing the large language models (LLMs) behind generative artificial intelligence (AI) and machine learning (ML) enabled applications, which put considerable strain on system CPUs and GPUs, and where offloading other functions to DPUs can help to improve performance.

DPUs can handle storage and network control plane functions, freeing up more server capacity for AI/ML workload processing. They can also enable multi-tenant use – the ability of multiple users to share large AI training systems – improving resource management and helping users share data more securely.

These benefits can be magnified by integrating high-performance FPGAs, such as the Achronix Speedster7t, into DPUs. FPGAs contribute parallel processing capabilities to aid real-time AI/ML processing, enhance computational power, optimize energy consumption, and reduce latency. This can make the deployment of AI/ML technologies more sustainable and efficient. Moreover, the hardware reconfigurability of FPGAs allows quick adaptation to new or evolving AI algorithms, helping datacenters stay at the cutting edge with minimal operational cost while extending the useful life of their existing hardware.

You can learn more by watching this webinar – The Rise of the DPU – which discusses market trends and technological integration, and analyzes the architectural benefits of putting DPUs rather than CPUs into servers and other systems. It goes on to consider the advantages of adding Achronix Speedster7t FPGAs to DPUs before looking into the crystal ball to consider how the technology will evolve over the next few years, particularly its impact on AI and ML-enabled applications and workloads.

Speakers include Ron Renwick, Director of Product Marketing, and Scott Schweitzer, Director of Product Planning, at Achronix, alongside Baron Fung, Senior Research Director at analyst firm Dell'Oro Group, and Patrick Kennedy, Editor-in-Chief at ServeTheHome, an online resource dedicated to exploring the latest technology in servers, storage, networking, and high-end workstation hardware, as well as open source.

You can watch the webinar by clicking this link.

Sponsored by Achronix.
