New 36GB SK hynix HBM3E 12-High in Volume Production

SK hynix HBM3E 36GB Cover

This week, SK hynix said that its newest 36GB HBM3E 12-High modules are now in volume production. Recently, Micron said its HBM3E 12-High 36GB modules are shipping. It seems like SK hynix is joining the 36GB HBM3E party.

New 36GB SK hynix HBM3E 12-High in Volume Production

Making the new 36GB HBM3E is a neat challenge compared to making 24GB 8-High stacks. SK hynix said it had to make the DRAM dies roughly 40% thinner in order to keep the 12-High stack at the same height as the 24GB generation. In other words, it is using thinner dies to fit 50% more capacity into the same package height.
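
As a rough back-of-the-envelope sketch of that math (the per-die capacity and the normalized thickness values below are illustrative assumptions; only the 40% reduction figure comes from SK hynix):

```python
# Illustrative arithmetic only. Thickness is normalized to the older die,
# and the 3GB (24Gb) per-die capacity is an assumption for this sketch.

OLD_DIE_THICKNESS = 1.00                             # 8-High HBM3E DRAM die (normalized)
NEW_DIE_THICKNESS = OLD_DIE_THICKNESS * (1 - 0.40)   # ~40% thinner, per SK hynix

DIE_CAPACITY_GB = 3                                  # assumed 24Gb (3GB) DRAM die in both stacks

old_stack_height = 8 * OLD_DIE_THICKNESS             # 24GB 8-High, dies only
new_stack_height = 12 * NEW_DIE_THICKNESS            # 36GB 12-High, dies only

print(f"8-High:  {8 * DIE_CAPACITY_GB} GB, relative die-stack height {old_stack_height:.1f}")
print(f"12-High: {12 * DIE_CAPACITY_GB} GB, relative die-stack height {new_stack_height:.1f}")
# 12 x 0.6 = 7.2 vs 8 x 1.0 = 8.0, so the thinner dies keep the 12-High stack
# within the same package height while delivering 50% more capacity.
```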

SK hynix 12-High HBM3E

High bandwidth memory (HBM) package height may sound trivial at first, but maintaining the same height is critical in modern AI accelerators. HBM is placed next to the accelerator die using advanced packaging. Chip designers need to worry about thermal expansion and other factors during operation, but there is a simpler reason height matters: heatsinks or cold plates need to be fitted to the accelerators, and if an HBM stack is taller than expected, it risks cracking a small package that costs as much as an automobile.

NVIDIA GH100

The NVIDIA GH100/GH200 and H100 GPUs pictured do not use this new 12-High HBM3E, but the photo is a great illustration of how HBM is co-packaged alongside the accelerator die. As a fun fact, the GH100 shown above was never productized under that name, as those parts all became the GH200 family. Still, it shows the concept well.

Final Words

Demand for these packages is strong. Many AI workloads are bound by accelerator memory bandwidth and capacity, so newer, faster, and larger HBM packages help bring better AI accelerators to market. Having more than one vendor able to supply the chips also makes it easier for accelerator makers to adopt the new, larger memory stacks.
