Micron begins production of HBM3E memory

Micron Technology said on Monday that it has begun mass production of its HBM3E memory. The company’s HBM3E known good stack dies (KGSDs) will be used in Nvidia’s H200 compute GPU for artificial intelligence (AI) and high-performance computing (HPC) applications, which is set to ship in the second quarter of 2024.

Micron is mass-producing 24 GB 8-Hi HBM3E devices with a data transfer rate of 9.2 GT/s and a peak memory bandwidth of more than 1.2 TB/s per device. HBM3E increases data transfer rates and peak memory bandwidth by 44% over HBM3, which is especially important for bandwidth-hungry processors such as Nvidia’s H200.
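Those two headline numbers are mutually consistent. Assuming the standard 1024-bit interface of an HBM stack (a detail the article does not spell out, but common to all HBM generations), the per-stack bandwidth follows directly from the pin rate:

9.2 GT/s × 1024 bits ÷ 8 bits/byte = 1,177.6 GB/s ≈ 1.2 TB/s per stack

The 44% uplift likewise follows from HBM3’s 6.4 GT/s maximum pin rate: 9.2 ÷ 6.4 ≈ 1.44.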

Nvidia’s H200 relies on the same Hopper architecture as the H100 and delivers the same compute performance, but it pairs that with 141 GB of HBM3E memory offering up to 4.8 TB/s of bandwidth, a significant upgrade over the H100’s 80 GB of HBM3 at up to 3.35 TB/s.
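In relative terms, the article’s own figures work out to 4.8 TB/s ÷ 3.35 TB/s ≈ 1.43, or roughly 43% more bandwidth than the H100, alongside 141 GB ÷ 80 GB ≈ 1.76 times the memory capacity.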

Micron will further strengthen its AI memory roadmap with the release of a 36 GB 12-Hi HBM3E product in March 2024. For now, it remains to be seen where those devices will be used.

Micron produces its HBM3E on its 1β (1-beta) process technology, a significant achievement for the company: it is deploying its latest production node in data center-class products, a testament to its manufacturing expertise.

Starting mass production of HBM3E memory ahead of rivals SK Hynix and Samsung is a major achievement for Micron, which currently holds about 10% of the HBM market. The move is critical for the company because it allows Micron to bring premium products to market earlier than its competitors, potentially increasing its revenue and profit margins while gaining greater market share.

“Micron is achieving a trifecta with this HBM3E milestone: leading time to market, best-in-class performance and a differentiated energy efficiency profile,” said Sumit Sadana, executive vice president and chief commercial officer at Micron Technology. “AI workloads rely heavily on memory bandwidth and capacity, and Micron is well-positioned to support significant future AI growth with our industry-leading HBM3E and HBM4 roadmap and our complete portfolio of DRAM and NAND solutions for AI applications.”

Source: Micron
