In the past two years, the storage chip market has experienced dramatic fluctuations. Initially, supply exceeded demand, causing prices to plummet to historic lows. However, prices have been on the rise for over a year now.
TrendForce's report shows that, boosted by AI-driven demand for HBM and QLC NAND, the DRAM and NAND Flash industries are expected to post annual revenue growth of 75% and 77%, respectively, in 2024. In 2025, DRAM and NAND revenues are projected to grow a further 51% and 29% year on year, respectively.
Back in 2020, at the onset of the pandemic, global chip shortages led many manufacturers to misjudge the situation and stockpile a large number of chips. This resulted in an oversupply, causing significant inventory pressure for manufacturers. Consequently, storage chip prices remained low from 2022 to the first half of 2023.
However, upstream manufacturers moved quickly to adjust capacity and rebalance supply and demand. On its Q2 2023 earnings call, Samsung announced it would continue cutting production of storage chips, centered on NAND Flash, in the second half of the year; SK Hynix said it would reduce NAND Flash output by 5-10% over the same period; and Micron expanded its NAND Flash wafer production cut from 25% to 30%. Kioxia had begun a 30% production cut in Q4 2022 and expanded it to 50% in 2023.
With the rise of the new AI wave led by ChatGPT, the demand for storage chips from massive servers has surged, making the upper limit of this demand hard to estimate.
Since 2023, the new AI wave has also created fresh demand in the storage chip industry. For a time, AI servers' compute throughput (measured in TOPS, trillions of operations per second) could easily outpace the memory bandwidth (measured in TB/s, terabytes per second) available to feed it, creating the bottleneck known in the AI chain as the "memory wall."
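The "memory wall" idea can be sketched with a simple roofline-style calculation: a kernel's attainable throughput is capped by the smaller of peak compute and memory bandwidth times arithmetic intensity. The accelerator figures below are illustrative assumptions, not any vendor's specifications.

```python
# Roofline-style sketch of the "memory wall": attainable throughput is the
# minimum of peak compute and (memory bandwidth x arithmetic intensity).
# All numbers below are hypothetical, chosen only to illustrate the effect.

def attainable_tops(peak_tops: float, bandwidth_tbs: float, ops_per_byte: float) -> float:
    """Attainable throughput (TOPS) for a kernel with the given arithmetic intensity."""
    # bandwidth (TB/s) * ops/byte gives the memory-bound ceiling in TOPS
    return min(peak_tops, bandwidth_tbs * ops_per_byte)

PEAK_TOPS = 1000.0   # assumed peak compute of the accelerator, TOPS
BANDWIDTH = 3.0      # assumed memory bandwidth, TB/s

# A memory-bound workload (low data reuse, ~2 ops per byte moved):
low = attainable_tops(PEAK_TOPS, BANDWIDTH, 2.0)     # -> 6.0 TOPS
# A compute-bound workload (high data reuse, ~500 ops per byte):
high = attainable_tops(PEAK_TOPS, BANDWIDTH, 500.0)  # -> 1000.0 TOPS

print(low, high)
```

In the memory-bound case the chip delivers less than 1% of its nominal compute, which is why raising bandwidth (the role HBM plays) matters more than adding raw TOPS.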
HBM is itself a type of DRAM, but compared with traditional GDDR and LPDDR it offers significantly higher bandwidth, larger capacity, and lower latency. That speed advantage quickly made HBM the common choice in the AI industry.
HBM was first released in 2013; the latest generation, HBM3E, was launched in August 2023 and set to start shipping in March 2024. According to SK Hynix, the product can process up to 1.18 TB of data per second, equivalent to processing 230 FHD movies in one second.
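A quick back-of-the-envelope check of SK Hynix's comparison: dividing 1.18 TB/s by 230 movies implies a movie size of roughly 5 GB, a plausible figure for an FHD film. The per-movie size is inferred from the claim, not stated by SK Hynix.

```python
# Sanity check of the "1.18 TB/s = 230 FHD movies per second" comparison.
# The implied movie size is an inference from the claim, not a vendor figure.

bandwidth_gb_per_s = 1.18 * 1000  # 1.18 TB/s expressed in GB/s (decimal units)
movies_per_second = 230

implied_movie_size_gb = bandwidth_gb_per_s / movies_per_second
print(round(implied_movie_size_gb, 2))  # -> 5.13 (GB per FHD movie)
```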
Currently, HBM is manufactured almost exclusively by the traditional storage chip giants, with SK Hynix in the lead, followed by Samsung and then Micron.
Even so, the combined capacity of these three companies appears insufficient to meet the high demand for HBM. Micron CEO Sanjay Mehrotra revealed on the company's Q4 2023 earnings call that its HBM capacity for 2024 was already fully booked, and SK Hynix Vice President Kim Ki-tae likewise stated that the HBM the company plans to produce in 2024 is already sold out.
It is worth noting that apart from SK Hynix, Samsung, and Micron, no other manufacturers can mass-produce HBM products. Chinese scholars have expressed concern about this. Huang Letian, Deputy Director of the Integrated Circuit and System Research Center at the Yangtze Delta Region Research Institute of UESTC (Huzhou), stated, "It's like a gun; if you can't supply the bullets, the firing speed doesn't matter. If the HBM issue is not resolved, it will be difficult to improve China's computing power, limiting the development of AI and other industries."
GENIUNEIC is an electronic component supplier with 20 years of experience and an excellent reputation. Our mission is to provide customers with high-quality electronic components and help them manufacture high-quality products.
We specialize in 5G communication technology, electrical automation, automotive electronics, rail transit, smart healthcare, optoelectronics, new energy, and more.