US biz stockpilers boost SK Hynix top line as memory market undergoes structural change
'Inventory accumulation' as vendors hoard HBM amid tariff and other pressures
South Korean memory maker SK Hynix is reporting a sales bounce due to the demand for AI systems, helped by US businesses stockpiling HBM supplies amid tariff uncertainty.
The memory maker, one of the top three global suppliers, says it recorded revenues of ₩17.6 trillion ($12.3 billion) and ₩7.44 trillion ($5.2 billion) in operating profit for calendar Q1.
That topline figure is up 42 percent on the same period a year ago, and down about 11 percent from the corp's all-time high of ₩19.767 trillion (nearly $14 billion) in the prior quarter.
According to SK Hynix, the memory market ramped faster than expected due to orders for AI-capable systems and what it terms "inventory accumulation demand." The latter seemingly refers to system builders and others in the memory supply chain stockpiling ahead of anticipated tariff increases from the Trump administration, as has already been seen in the PC channel.
SK says it responded to the increased demand by expanding sales of high value-added products such as DDR5 DRAM and its 12-layer HBM3E (high-bandwidth memory), used in AI and high-performance computing (HPC) systems.
On a conference call for investors and analysts, SK Hynix discussed the uncertainty surrounding US tariff policies, and admitted the impact of these is difficult to assess without detailed criteria. Approximately 60 percent of the memory giant's revenue comes from US customers.
SK previously told investors that all the HBM it can manufacture has already been sold, through to the end of 2026.
- Global datacenter electricity use to double by 2030, say policy wonks. Yup, it's AI
- SK hynix has probably already sold most of the HBM DRAM it will make next year
- NAND flash prices plunge amid supply glut, factory output cut
- Continuity of CHIPS and Science Act questioned in a Trump presidency
Increased orders for high-capacity server DRAM are being fueled by AI model development, particularly the impact of DeepSeek, which lowers AI development costs. SK expects customers to keep buying high-capacity memory for AI workloads, as AI models require more memory to deliver precise results.
Richard Gordon, Vice President & Practice Lead for Semiconductors at The Futurum Group, warned that interpreting these results can be tricky at the moment, as the memory market is undergoing a transition.
"Usually, Q1 is a sequentially weak quarter due to seasonality, but the memory cycle and market dynamics can override seasonality," he told The Register.
Seasonal softness was a factor when PCs and smartphones were the main drivers for memory sales, but this is now shifting to datacenters, where seasonality is less of an issue.
"The bigger factor is that the DRAM market is at the early stages of a significant structural change, which SK hynix is especially exposed to since about 80 percent of its revenue is DRAM. In short, the DRAM market is transitioning from a highly commoditized market, where market pricing was heavily affected by supply and demand, to a specialized, almost application-specific market where premium pricing is much more stable (and profits are higher)," Gordon said.
For AI in particular, the industry is shifting to HBM with long-term supply and pricing agreements, given the stiff competition among those building AI datacenter infrastructure, he added.
"For this reason, I don't think the reference to stronger demand coming from inventory accumulation is necessarily to do with tariff uncertainty - it's probably more to do with customers eager to guarantee future supply of HBM, which is critical for AI infrastructure."
Perhaps because of this, SK Hynix is forecasting HBM demand to approximately double compared with last year. As a result, sales of 12-layer HBM3E are expected to account for more than 50 percent of SK's HBM3E revenues in Q2.
Meanwhile, SK Hynix started supplying high-performance LPCAMM2 memory modules for AI PCs to customers in Q1, and plans to ship SOCAMM (Small Outline Compression Attached Memory Module), a low-power DRAM module for AI servers, "when demand ramps up." ®