Talk about a 'hold my beer' moment. Just yesterday, Micron was thumping its chest over having kicked its 24GB 8-stack High Bandwidth Memory 3E (HBM3E) solution with blistering bandwidth of more than 1.2 terabytes per second (TB/s) into mass production. Now, a day later, Samsung is touting the development of the industry's first 12-stack HBM3E DRAM solution, codenamed Shinebolt.
Samsung's Shinebolt solution is, for the time being, the highest-capacity HBM product to date at 36GB. The chip maker is also laying claim to an all-time-high bandwidth of 1,280 gigabytes per second (GB/s), or 1.28TB/s, which likely edges out Micron's solution and its "more than 1.2TB/s" figure. Micron didn't provide a specific number, but the reason we think Samsung's is a tad higher is that it offers a per-pin bandwidth of 9.8 gigabits per second (Gb/s) versus Micron's 9.2Gb/s.
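For readers who want to sanity-check those headline figures, the math is straightforward: an HBM stack exposes a 1,024-bit interface, so aggregate bandwidth is the per-pin rate times 1,024 pins, divided by 8 bits per byte. Here's a quick back-of-the-envelope sketch (our own arithmetic, not either vendor's official math; the quoted pin rates are "more than" figures, so these come out as floor estimates that land just under the marketing round numbers):

```python
# Rough HBM bandwidth estimate from a per-pin data rate.
# Assumes the standard 1,024-bit HBM stack interface (per JEDEC HBM3).
# Vendor pin rates are quoted as minimums, so these are floor values.

HBM_BUS_WIDTH_BITS = 1024  # data pins per HBM stack

def aggregate_bandwidth_gbps(per_pin_gbits: float) -> float:
    """Total stack bandwidth in GB/s from a per-pin rate in Gb/s."""
    return per_pin_gbits * HBM_BUS_WIDTH_BITS / 8  # 8 bits per byte

samsung_floor = aggregate_bandwidth_gbps(9.8)  # Samsung's quoted pin rate
micron_floor = aggregate_bandwidth_gbps(9.2)   # Micron's quoted pin rate
print(f"Samsung: ~{samsung_floor:.1f} GB/s, Micron: ~{micron_floor:.1f} GB/s")
```

That works out to roughly 1,254GB/s for Samsung and 1,178GB/s for Micron, which is consistent with the gap the two companies' headline numbers imply.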
In any event, Samsung's latest HBM3E solution offers capacity and bandwidth improvements of more than 50% compared to its predecessor codenamed Icebolt, which is an 8-stack HBM3 solution.
"The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need," said Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics. "This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era."
According to Samsung, the use of an advanced thermal compression non-conductive film (TC NCF) enabled it to package its 12-layer memory solution to the same height specification as its 8-layer memory. And over time, Samsung says its special film will offer more benefits, namely the mitigation of chip warping with higher stacks on thinner dies.
"Samsung has continued to lower the thickness of its NCF material and achieved the industry’s smallest gap between chips at seven micrometers (µm), while also eliminating voids between layers. These efforts result in enhanced vertical density by over 20% compared to its HBM3 8H product," Samsung says.
Advances in HBM are also critical to fueling bandwidth-hungry applications in the field of artificial intelligence (AI). To that end, Samsung claims its 12H HBM3E can increase the average speed of AI training by 34% compared to 8H HBM3, while also enabling more than 11.5 times the number of simultaneous users for inference services.
Samsung says it has started sampling its 12H HBM3E to customers and plans to mass produce it in the coming months. Meanwhile, Micron is also sampling 36GB 12H HBM3E for partner solutions that will debut in March, so there's not really a clear lead between the two memory makers.