Memory-maker Micron predicts new wave of server consolidation
AI boosted revenue last year, demand for more RAM in devices to help in 2025
Micron has told investors it expects a new round of server consolidation to add to its already strong growth.
The memory giant on Wednesday announced its Q4 FY 2024 results, which saw revenue of $7.75 billion – up 93 percent year over year – and net income of $887 million. Full-year revenue of $25.1 billion was up $9.5 billion from the prior year, and net income swung from last year's $5.9 billion loss to $778 million of black ink.
It's 2024, so it will not surprise you that president and CEO Sanjay Mehrotra attributed some of the sales spike to demand for memory-hungry AI servers. He also had the pleasant duty of reminding investors that Micron has already sold all the high-bandwidth memory (HBM) it plans to make this year and next, and assured them the biz has taken steps to ensure its manufacturing operations can meet demand.
Mehrotra predicted datacenter demand will remain strong in 2025 – helped by growth in the low-single-digit percentage range for traditional servers.
"We expect traditional server demand to benefit from a refresh cycle, as a single latest-generation traditional server can replace multiple older-generation servers to provide valuable space, power and performance improvements to improve datacenter efficiency," Mehrotra explained.
That's a reference to the forthcoming release of next-gen datacenter CPUs from Intel and AMD, which will bring hundreds of cores to servers in 2025. Both chipmakers have predicted their manycore offerings will spark a new wave of server consolidation, and Mehrotra appears to agree.
Micron wins in any event. "We see increasing DRAM and NAND content both in traditional as well as AI servers," Mehrotra noted in prepared remarks.
The CEO was also upbeat about AI PCs, saying manufacturers have already started buying memory because they fear having to compete with datacenter buyers. AI PCs also need more memory and storage than their conventional predecessors.
"Leading PC OEMs have recently announced AI-enabled PCs with a minimum of 16GB of DRAM for the value segment and between 32GB to 64GB for the mid and premium segments, versus an average content across all PCs of around 12GB last year," Mehrotra revealed. The CEO also pointed to AI-enabled smartphones currently shipping with 12GB to 16GB of DRAM, compared to an average of 8GB in flagship phones last year.
- Micron mega-fab mildly endangered by definitely endangered American bats
- Micron told to pay $445M in memory patent infringement case
- Intel nabs Micron exec to oversee foundry business ambitions
- Lenovo and Micron first to implement LPCAMM2 in laptop
Mehrotra told investors Micron's next manufacturing process – the 10nm-class 1-gamma DRAM node – is on track to produce product in 2025, and that its 1-beta DRAM and G8 and G9 NAND nodes are ramping in high volume and "will become an increasing portion of our mix through fiscal 2025." The G8 node produces 232-layer NAND.
Execs didn't mention Micron's troubles in China, where its products were banned from critical infrastructure for vague reasons. The surge in revenue perhaps rendered that matter irrelevant. China was mentioned only in the context of work to expand a factory in Xi'an, which is apparently going well. So is construction of an assembly and test facility in India. Mehrotra also reported "progress on the construction for our new fab in Idaho" and on obtaining permits to build another facility in New York state.
Guidance for Q1 FY 2025 was for revenue of $8.7 billion, plus or minus $200 million – which would represent a roughly $4 billion increase on Q1 FY 2024 revenue. No wonder Micron's share price popped from around $95 to almost $110 in after-hours trading. ®