SK hynix's high bandwidth memory buffet fully booked till 2025

Micron also riding the AI wave with 128 GB DDR5 RDIMMs

Memory chipmaker SK hynix has already sold all the high bandwidth memory (HBM) it will manufacture this year and most of its expected 2025 production, citing increased demand driven by the AI craze. Micron is also getting in on the act with availability of 128 GB DDR5 RDIMMs for servers.

SK hynix told a news conference at its Icheon headquarters in South Korea that it is set to expand its output of memory chips, predicting that global demand is set to increase over the long term thanks to applications such as AI.

In the near term, the company plans to provide customers with samples of its fifth-generation HBM products – 12-layer HBM3E – during May, and start mass production of these in the third quarter, according to The Korea Times.

Domestic rival Samsung revealed this week that it plans to mass produce HBM3E 12-layer products and a 128 GB product based on 1b-nanometer 32 Gb DDR5 within the second quarter, also with an eye on demand driven by AI.

SK hynix CEO Kwak Noh-jung told the conference that his company will be able to produce many more HBM chips after completing new fabrication and advanced packaging facilities, located in South Korea and the US, Nikkei Asia reports.

"The HBM market is expected to grow continuously as the number of parameters and modalities are increasing for the improvement of AI quality," Kwak is reported as saying. The company is the main supplier of HBM chips for Nvidia's GPUs, which are heavily used for AI training purposes.

He forecast that average annual demand is set to increase by 60 percent over the mid to long term as the volume of data and the size of the AI models are both continuing to grow.
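To put that forecast in perspective, a 60 percent average annual increase compounds quickly. A minimal sketch (the baseline and year count are illustrative, not from the source):

```python
# Illustrative compounding of a 60% average annual demand increase
growth_rate = 0.60
demand = 1.0  # normalized baseline demand

for year in range(1, 5):
    demand *= 1 + growth_rate
    print(year, round(demand, 2))

# After four years demand is roughly 6.6x the baseline (1.6**4 ≈ 6.55)
```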

According to TrendForce, Kwak stated that while AI is currently centered on training models in datacenters, it is expected to rapidly expand to on-device AI applications in smartphones, PCs, vehicles, and other end-user devices in the future. This is why SK hynix believes that demand for memory tailored for AI, characterized by "ultra-fast, high-capacity and low-power," is expected to balloon.

Just last week, the company said it plans to invest more than $14.5 billion in a new memory chip manufacturing plant in North Chungcheong Province. This M15X fab was originally slated to make NAND memory chips, but will now produce DRAM instead, with production expected to start in November 2025.

US-based memory maker Micron is also leaning into the AI market with the availability of 128 GB DDR5 RDIMMs for servers based on its 32 Gb DRAM silicon, which it claims delivers 16 percent lower latency as well as higher capacity.
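As a quick sanity check on the capacity math, note that module capacity is quoted in gigabytes while die density is quoted in gigabits. A minimal sketch (the die count is a simple tally; how dies are actually arranged into ranks and packages is not specified in the source):

```python
# How many 32 Gb DRAM dies does a 128 GB RDIMM imply?
MODULE_CAPACITY_GB = 128   # module capacity in gigabytes
DIE_DENSITY_Gb = 32        # per-die density in gigabits

module_capacity_Gb = MODULE_CAPACITY_GB * 8    # convert GB to Gb
dies_needed = module_capacity_Gb // DIE_DENSITY_Gb
print(dies_needed)  # 32 data dies (ECC dies excluded)
```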

These high-speed (up to 5,600 MT/s) memory modules have been engineered to meet the growing needs of datacenter applications such as AI and machine learning (ML), plus high-performance computing (HPC) and in-memory databases, Micron said.
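The quoted 5,600 MT/s figure translates directly into peak per-module bandwidth. A back-of-envelope sketch (assumes the standard 64-bit DDR5 data path, ECC bits excluded):

```python
# Peak theoretical bandwidth of a DDR5-5600 module
transfer_rate_MTs = 5600    # mega-transfers per second
bus_width_bytes = 64 // 8   # 64-bit data path = 8 bytes per transfer

peak_MB_s = transfer_rate_MTs * bus_width_bytes
print(peak_MB_s / 1000)  # 44.8 GB/s per module
```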

The latest RDIMMs are available now directly from Micron, and through select global channel distributors and resellers from June. ®
