Micron has sold out its entire 2024 HBM3E supply and most of 2025

Micron has the advantage of being the first company to ship HBM3E memory, and it has revealed that its entire supply of the advanced high-bandwidth memory for 2024 is already sold out, with most of its 2025 production allocated as well. Micron’s HBM3E (or HBM3 Gen2, as the company also calls it) is among the first to qualify for NVIDIA’s updated H200/GH200 accelerators, so the DRAM maker looks set to become a major supplier to the green company.

“Our 2024 HBM is sold out and the vast majority of 2025 supply has been allocated,” Micron Technology CEO Sanjay Mehrotra said in remarks prepared for the company’s earnings call this week. “We continue to expect HBM bit share will be on par with our overall DRAM bit share sometime in 2025.”

Micron’s first HBM3E product is an 8-Hi 24 GB stack with a 1024-bit interface, a 9.2 GT/s data transfer rate, and 1.2 TB/s of total bandwidth. NVIDIA’s H200 accelerator for artificial intelligence and high-performance computing will use six of these cubes, providing a total of 141 GB of accessible high-bandwidth memory.
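For context, those headline numbers are straightforward to verify. Below is a minimal back-of-the-envelope sketch of the arithmetic (illustrative only; the variable names are ours, not Micron’s or NVIDIA’s):

```python
# Back-of-the-envelope check of the published HBM3E figures
# (illustrative arithmetic only, based on the numbers quoted above).

interface_width_bits = 1024      # per-stack interface width
data_rate_gtps = 9.2             # giga-transfers per second per pin

# Peak bandwidth per stack: pins * transfers/s, converted from bits to bytes
bandwidth_gbps = interface_width_bits * data_rate_gtps / 8
print(f"Per-stack bandwidth: {bandwidth_gbps:.1f} GB/s")    # ~1177.6 GB/s, i.e. ~1.2 TB/s

# H200 capacity: six 24 GB stacks; NVIDIA's published figure of 141 GB
# of accessible memory sits slightly below the 144 GB raw total.
stacks, stack_capacity_gb = 6, 24
print(f"Raw HBM capacity: {stacks * stack_capacity_gb} GB")  # 144 GB raw, 141 GB accessible
```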

“We are on track to generate hundreds of millions of dollars in revenue from HBM in fiscal 2024 and expect HBM revenues to be accretive to our DRAM and overall gross margins starting in the fiscal third quarter,” Mehrotra said.

The company has also started sampling its 12-Hi 36 GB stacks, which offer a 50% increase in capacity (twelve DRAM dies instead of eight). These known good stacked dies (KGSDs) will ramp in 2025 and will be used in next-generation artificial intelligence products. Meanwhile, NVIDIA’s B100 and B200 do not appear to use the 36 GB HBM3E stacks, at least initially.

Demand for AI servers hit a record last year, and it looks set to remain high this year as well. Some analysts believe that NVIDIA’s A100 and H100 processors (and their various derivatives) accounted for 80% of the entire AI processor market in 2023. Although NVIDIA will face competition from AMD, AWS, D-Matrix, Intel, Tenstorrent, and other companies on the inference side this year, the H200 looks set to remain the processor of choice for AI training, particularly for large companies like Meta and Microsoft that already operate fleets of hundreds of thousands of NVIDIA accelerators. With this in mind, becoming a primary supplier of HBM3E for NVIDIA’s H200 makes a lot of sense for Micron, as it would let the company eventually capture a sizable share of the HBM market, which is currently dominated by SK Hynix and Samsung and of which Micron controlled only about 10% as of last year.

At the same time, because each DRAM device within an HBM stack has a wide interface, it is physically larger than a regular DDR4 or DDR5 IC. As a result, the ramp of HBM3E memory will affect Micron’s bit supply of commodity DRAM, the company said.

“Increased HBM production will limit supply growth for non-HBM products,” Mehrotra said. “Industry-wide, HBM3E consumes approximately three times the wafer supply of DDR5 to produce a given number of bits in the same technology node.”
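To see why this caps commodity DRAM output, consider a fixed wafer budget under the quoted roughly 3:1 ratio. The sketch below uses invented round numbers purely for illustration, not Micron production data:

```python
# Illustrative trade-off under the quoted ~3:1 wafer ratio: every bit of
# HBM3E shipped costs roughly three times the wafer area of a DDR5 bit
# on the same node. All figures are hypothetical round numbers.

WAFER_RATIO = 3.0  # wafers needed per HBM3E bit vs. per DDR5 bit (quoted ratio)

def ddr5_bits_displaced(hbm_bits: float) -> float:
    """DDR5 bit output forgone by producing `hbm_bits` worth of HBM3E."""
    return hbm_bits * WAFER_RATIO

# Shifting capacity worth 10 units of HBM3E bits removes ~30 units of
# potential DDR5 bits from the commodity supply.
print(ddr5_bits_displaced(10))   # 30.0
```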
