Samsung launches 12-Hi 36 GB HBM3E memory stacks with 10 GT/s speed

Samsung announced late Monday that it had completed development of its 12-Hi 36 GB HBM3E memory stacks, just hours after Micron said it had begun mass production of its 8-Hi 24 GB HBM3E memory products. Compared with their predecessor, code-named Icebolt, the new memory stacks, code-named Shinebolt, increase peak bandwidth and capacity by more than 50%, making them the fastest memory devices in the world.

As the description suggests, Samsung’s Shinebolt 12-Hi 36 GB HBM3E stack places twelve 24 Gb DRAM devices on top of a logic die with a 1024-bit interface. The new 36 GB HBM3E stacks feature a data transfer rate of 10 GT/s, resulting in a peak bandwidth of 1.28 TB/s per stack, the industry’s highest per-device (or rather, per-module) memory bandwidth.

At the same time, keep in mind that developers of HBM-supporting processors tend to be cautious, so they will likely run Samsung’s HBM3E at somewhat lower data transfer rates, partly because of power consumption and partly to ensure ultimate stability for artificial intelligence (AI) and high-performance computing (HPC) applications.
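The headline figures follow directly from the bus width and the per-pin data rate. Here is a minimal sanity-check sketch; the 8 GT/s derated rate is a hypothetical illustration of the point above, not a published figure:

```python
# Sanity check of the per-stack bandwidth and capacity figures.
# The 1024-bit bus width is Samsung's spec; the 8 GT/s rate below
# is a hypothetical derated example, not a published figure.

def peak_bandwidth_gbps(bus_width_bits: int, rate_gtps: float) -> float:
    """Peak per-stack bandwidth in GB/s: pins x transfers/s, 8 bits per byte."""
    return bus_width_bits * rate_gtps / 8

BUS_WIDTH = 1024  # bits per HBM3E stack

print(peak_bandwidth_gbps(BUS_WIDTH, 10.0))  # 1280.0 GB/s = 1.28 TB/s at the announced 10 GT/s
print(peak_bandwidth_gbps(BUS_WIDTH, 9.8))   # 1254.4 GB/s (~1.225 TB/s with 1 TB = 1024 GB, as in the table below)
print(peak_bandwidth_gbps(BUS_WIDTH, 8.0))   # 1024.0 GB/s at a hypothetical derated rate
print(12 * 24 // 8)                          # 36: twelve 24 Gb dies yield 36 GB per stack
```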

| Samsung HBM generation | HBM3E (Shinebolt) | HBM3 (Icebolt) | HBM2E (Flashbolt) | HBM2 (Aquabolt) |
|---|---|---|---|---|
| Maximum capacity | 36 GB | 24 GB | 16 GB | 8 GB |
| Maximum bandwidth per pin | 9.8 Gb/s | 6.4 Gb/s | 3.6 Gb/s | 2.0 Gb/s |
| DRAM ICs per stack | 12 | 12 | 8 | 8 |
| Effective bus width | 1024-bit | 1024-bit | 1024-bit | 1024-bit |
| Voltage | ? | 1.1 V | 1.2 V | 1.2 V |
| Bandwidth per stack | 1.225 TB/s | 819.2 GB/s | 460.8 GB/s | 256 GB/s |

To manufacture the Shinebolt 12-Hi 36 GB HBM3E memory stacks, Samsung had to use several advanced technologies. First, the 36 GB HBM3E memory product is based on DRAM devices made on Samsung’s 4th-generation 10nm-class (14 nm) fabrication process, which uses extreme ultraviolet (EUV) lithography.

Second, to ensure that the 12-Hi HBM3E stacks have the same z-height as its 8-Hi HBM3 products, Samsung used its advanced thermal compression non-conductive film (TC NCF), which enabled it to achieve the industry’s smallest gap between memory devices: seven microns (7 µm). By narrowing the gaps between DRAMs, Samsung increased vertical density and reduced die warping. Additionally, Samsung uses bumps of various sizes between the DRAM ICs: smaller bumps are used in signal areas, while larger ones are placed in spots that require heat dissipation, improving thermal management.
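The z-height constraint is what makes those 7 µm gaps matter: squeezing 12 dies into the same package height as an 8-Hi stack forces each DRAM layer to shrink to roughly two-thirds of its former pitch. A rough sketch, where the package height and logic-die thickness are assumed illustrative values rather than figures from the announcement:

```python
# Illustrative z-height budget. PACKAGE_HEIGHT_UM and LOGIC_DIE_UM are
# assumptions for illustration, not figures from Samsung's announcement.

PACKAGE_HEIGHT_UM = 720   # assumed maximum stack height (a commonly cited JEDEC HBM figure)
LOGIC_DIE_UM = 60         # assumed thickness reserved for the base logic die
GAP_UM = 7                # Samsung's stated die-to-die gap with TC NCF

for dies in (8, 12):
    budget = PACKAGE_HEIGHT_UM - LOGIC_DIE_UM
    # Each DRAM layer consumes one die plus one gap of bonding film.
    die_thickness = budget / dies - GAP_UM
    print(f"{dies}-Hi: each DRAM die may be at most ~{die_thickness:.0f} um thick")
```

Under these assumed numbers, moving from 8-Hi to 12-Hi cuts the allowable die thickness from roughly 76 µm to roughly 48 µm, which is why both thinner dies and thinner gaps are needed.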

Samsung estimates that its 12-Hi HBM3E 36 GB modules can increase the average speed of AI training by 34% and expand the number of concurrent users of an inference service by more than 11.5 times, although the company did not elaborate on the size of the LLM it used for these estimates.
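Samsung did not publish its methodology, but the capacity side of the concurrency claim is easy to illustrate: once the model weights are resident in HBM, the remaining capacity bounds how many users’ KV caches fit at once. A toy sketch in which every size is a hypothetical assumption:

```python
# Toy model of inference concurrency vs. HBM capacity. All sizes here are
# hypothetical illustrations; Samsung did not disclose its test setup.

STACKS_PER_GPU = 8        # assumed number of HBM stacks on one accelerator
WEIGHTS_GB = 140          # hypothetical LLM weight footprint (e.g. a 70B model in FP16)
KV_CACHE_PER_USER_GB = 4  # hypothetical KV-cache footprint per concurrent session

for stack_gb in (24, 36):  # 8-Hi (24 GB) vs. 12-Hi (36 GB) stacks
    total = STACKS_PER_GPU * stack_gb
    users = (total - WEIGHTS_GB) // KV_CACHE_PER_USER_GB
    print(f"{stack_gb} GB stacks: {total} GB total, room for ~{users} concurrent sessions")
```

Because the weights are a fixed cost, added capacity yields a disproportionate jump in concurrent sessions; the exact multiplier depends on the model and context length, which is presumably why Samsung’s 11.5× figure is tied to an unnamed LLM.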

Samsung has begun providing HBM3E 12H samples to customers and plans to start mass production in the first half of this year.

Source: Samsung
