Saturday, May 18, 2024

HBM chips are almost sold out for 2025, according to Nvidia supplier SK Hynix.

As companies rapidly expand artificial intelligence services, South Korea’s SK Hynix (000660.KS) stated on Thursday that its high-bandwidth memory (HBM) chips, which are used in AI chipsets, were sold out for this year and practically sold out for 2025.

“As data and (AI) model sizes increase, the HBM market is expected to continue growing,” Chief Executive Officer Kwak Noh-Jung stated during a press conference. “Annual demand growth is expected to be about 60% in the mid-to long-term.”

Until March, SK Hynix was Nvidia’s exclusive supplier of HBM chips; it competes with U.S.-based Micron (MU.O) and domestic giant Samsung Electronics (005930.KS) in the HBM market. Analysts note that major buyers of AI chips are eager to diversify their suppliers in order to better maintain operating margins. Nvidia holds over 80% of the market for AI chips.

Micron has likewise stated that its HBM chips were sold out for 2024 and that the majority of its 2025 supply had already been allocated. It plans to provide clients with samples of its 12-layer HBM3E chips in March.

“As AI functions and performance are being upgraded faster than expected, customer demand for ultra-high-performance chips such as the 12-layer chips appears to be increasing faster than for 8-layer HBM3Es,” stated Jeff Kim, head of research at KB Securities.

This week, Samsung Electronics, which intends to begin production of its 12-layer HBM3E chips in the second quarter, announced that it has finished supply discussions with clients and that this year’s shipments of HBM chips are expected to more than triple. It did not elaborate.

SK Hynix said last month that it would invest 5.3 trillion won ($3.9 billion) in a new DRAM chip factory at home with an emphasis on HBMs, in addition to a $3.87 billion plan to build an advanced chip packaging plant with an HBM chip line in the U.S. state of Indiana.

According to Kwak, investment in HBM differs from previous memory chip industry cycles in that capacity is expanded only after a certain level of demand has been secured.

According to Justin Kim, head of AI infrastructure at SK Hynix, the share of chips designed for AI, like HBM and high-capacity DRAM modules, is predicted to rise from roughly 5% in 2023 to 61% of total memory volume by 2028.

In a conference call following its earnings last week, SK Hynix suggested that if demand for digital gadgets surpasses forecasts, there could be a shortage of standard memory chips for network servers, smartphones, and PCs by year’s end.

The world’s second-largest memory chipmaker and a supplier to Nvidia (NVDA.O) will begin shipping samples of its newest HBM chip, the 12-layer HBM3E, in May and start mass-producing it in the third quarter.
