Saturday, July 27, 2024

AI boom to keep supply of high-end memory chips tight this year, analysts warn

Analysts predict that high-performance memory chips will continue to be in limited supply this year due to growing AI demand.

SK Hynix and Micron, two of the world’s major memory chip suppliers, have sold out of high-bandwidth memory (HBM) chips for 2024, and their 2025 stock is nearly sold out, according to the companies.

In a recent analysis, Kazunori Ito, director of equity research at Morningstar, predicted that the overall memory supply will remain constrained through 2024.

The need for AI chipsets has bolstered the high-end memory chip market, benefiting companies such as Samsung Electronics and SK Hynix, the world’s top two memory chip manufacturers. Nvidia currently sources chips from SK Hynix, but is also exploring Samsung as a potential supplier.

High-performance memory chips are critical for training large language models (LLMs) such as OpenAI’s ChatGPT, which has accelerated AI adoption. LLMs need these chips to remember details from previous conversations and users’ preferences in order to generate humanlike responses to queries.

According to TrendForce, the production cycle for HBM is 1.5 to 2 months longer than that of DDR5 memory, which is typically found in personal computers and servers.

To accommodate rising demand, SK Hynix intends to increase production capacity by investing in advanced packaging facilities in Indiana in the United States, as well as the M15X fab in Cheongju and the Yongin semiconductor cluster in South Korea.

During its first-quarter earnings call in April, Samsung stated that its HBM bit supply in 2024 had “expanded by more than threefold versus last year.” Bit supply refers to the amount of data, measured in bits, that the memory chips can store.

“And we have already finalized conversations with our customers over the committed supply. In 2025, we will continue to expand supply by at least two times or more per year, and we are already in smooth talks with our customers about that supply,” Samsung stated.

Micron did not respond to CNBC’s request for comment.

Intense competition
To remain competitive, major tech companies such as Microsoft, Amazon, and Google are investing billions of dollars in training their own LLMs, boosting demand for AI chips.

“The big buyers of AI chips, such as Meta and Microsoft, have signaled that they intend to continue investing in AI infrastructure. This indicates they will purchase vast quantities of AI chips, including HBM, at least through 2025,” said Chris Miller, author of “Chip War,” a book about the semiconductor industry.

To capitalize on the AI boom, chipmakers are competing fiercely to deliver the most powerful memory chips on the market.

SK Hynix announced at a press conference earlier this month that it would begin mass production of its latest generation of HBM chips, the 12-layer HBM3E, in the third quarter. Samsung Electronics, which was the first in the industry to ship samples of the chip, plans to start mass production in the second quarter.

“Currently, Samsung is ahead in the 12-layer HBM3E sampling process. If they can obtain qualification earlier than their counterparts, I believe they will be able to acquire majority shares by the end of 2024 or 2025,” said SK Kim, executive director and analyst at Daiwa Securities.








