
AMD reveals Azure is offering its SmartNICs as-a-service

Still smiling despite Q1 PC chip sales slumping 65 percent and flat server sales


AMD has revealed it's scored a big customer for its Pensando Data Processing Units (DPUs, aka SmartNICs): Microsoft’s Azure cloud, which is offering them as a service.

The DPUs rated a mention from AMD CEO and chair Lisa Su on AMD's Q1 2023 earnings call, as well as in a post explaining that the accelerators power Microsoft's recently announced Accelerated Connections service.

Microsoft suggests deploying Accelerated Connections on the virtual NICs of cloudy firewalls and load balancers to increase their throughput. Now we know there's AMD Inside making that happen – a notable win because, while hyperscalers routinely use DPUs, they usually integrate them into their own services rather than renting them out like other cloudy resources.
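For a sense of where that NIC-level knob lives, below is a minimal sketch using Azure's Python management SDK (azure-mgmt-network). It toggles the long-standing accelerated-networking property when creating a virtual NIC; the exact enrollment flow for the newer DPU-backed Accelerated Connections preview isn't spelled out here, so the resource names and the mapping to that feature are assumptions.

```python
# Sketch: create an Azure virtual NIC with hardware acceleration enabled via
# the azure-mgmt-network SDK. Resource names are hypothetical, and the
# DPU-backed Accelerated Connections preview may use a different flag or
# enrollment step: the long-standing accelerated-networking property is used
# here as a stand-in.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

subscription_id = "<subscription-id>"  # placeholder
client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.network_interfaces.begin_create_or_update(
    "rg-firewall-demo",   # hypothetical resource group
    "fw-nic-0",           # hypothetical NIC name
    {
        "location": "westus2",
        "enable_accelerated_networking": True,  # NIC-level acceleration knob
        "ip_configurations": [{
            "name": "ipconfig1",
            "subnet": {"id": "<subnet-resource-id>"},  # placeholder
        }],
    },
)
nic = poller.result()
print(nic.name, nic.enable_accelerated_networking)
```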

News that Azure has adopted AMD's Pensando silicon for a rent-a-DPU service is welcome in the context of the chip shop's otherwise tepid Q1 results.

Revenue of $5.35 billion was down nine percent year on year, producing a $139 million loss compared to Q1 2022's $786 million of profit.

Revenue from processors destined for client devices declined 65 percent year-over-year to $739 million as AMD "shipped significantly below consumption to reduce downstream inventory."

Su said this quarter represents "the bottom for our client processor business" but added that she expects the entire PC market to contract 10 percent in 2023, leaving 260 million units for suppliers to scrap for.

AMD's Datacenter segment delivered flat revenue of $1.3 billion, with higher sales of EPYC processors to cloud customers offset by lower enterprise server processor sales.

Operating income for the segment was $148 million, or 11 percent of revenue. That's a big dip compared to figures of $427 million and 33 percent for Q1 2022.
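As a quick sanity check, the figures quoted above hang together; the sketch below simply recomputes the margins from the article's own revenue and operating income numbers.

```python
# Back-of-the-envelope check using only figures quoted in this article.
dc_revenue_q1_2023 = 1_300_000_000    # datacenter revenue, Q1 2023 (flat year on year)
dc_op_income_q1_2023 = 148_000_000    # datacenter operating income, Q1 2023
dc_op_income_q1_2022 = 427_000_000    # datacenter operating income, Q1 2022

margin_2023 = dc_op_income_q1_2023 / dc_revenue_q1_2023
print(f"Q1 2023 datacenter operating margin: {margin_2023:.0%}")  # ~11 percent, as reported

# The 33 percent margin reported for Q1 2022 implies a similar revenue base,
# consistent with the segment's revenue being described as flat.
implied_revenue_q1_2022 = dc_op_income_q1_2022 / 0.33
print(f"Implied Q1 2022 datacenter revenue: ${implied_revenue_q1_2022 / 1e9:.1f} billion")  # ~$1.3 billion
```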

Su said the drop was "primarily due to product mix and increased R&D investments to address large opportunities ahead of us." Some of that R&D was on GPUs to ensure AMD can cash in on AI. Q1 was also the first period in which AMD counted expenses accrued by Pensando, which it acquired in mid-2022.

Su didn't spell out what those opportunities are, but did mention that AMD's Bergamo silicon – datacenter CPUs with up to 128 cores intended for cloud-native workloads – will debut "later this quarter."

The CEO also offered her opinion that AMD's Zen 4 architecture, and the Genoa processors built on it, leave the company "extremely well positioned for enterprise where we have been underrepresented." That's a long-standing bugbear for AMD which, despite producing top-notch products, has struggled to convince buyers its silicon has an X-factor worthy of displacing Intel's Xeons.

But Su also warned that adoption of Genoa and Bergamo won't be swift. They share a new platform that includes PCIe 5.0 and DDR5 memory – two technologies new to the cautious folks who build and operate datacenter infrastructure.

The CEO predicted the second half of 2023 would deliver better results, as both client and server sales pick up.

Several analysts asked Su if AMD is positioned to catch the wave of generative AI. Her answer was essentially "yes" – AMD has the processors and GPUs needed to make AI work, so years of strong sales are in prospect.

Su was also asked how hyperscalers' increasing appetite for their own silicon might impact AMD. She responded with an argument that AMD's ability to offer CPUs, GPUs, FPGAs, adaptive SoCs, and DPUs – plus its operation of a semi-custom team that's delivered tech such as CPUs for gaming consoles – means AMD can serve those who want custom chips rather than being threatened by them. ®
