In case you haven't noticed, the era of artificial intelligence (AI) is being ushered in at a blistering pace, thanks in large part to NVIDIA's high-end GPU and accelerator solutions; its data center business outpaced its gaming division by 4X in revenue last quarter. Of course, it takes a village, and fast memory solutions are a key part of it. Enter Micron, which announced today that it has commenced volume production of its High Bandwidth Memory 3E (HBM3E) solution "to accelerate the growth of AI."
Part of the reason this is notable is that Micron's 24GB 8H HBM3E will be part of NVIDIA's H200 Tensor Core GPUs, which are slated to ship in the second quarter of this year. Additionally, the more players in the HBM market, the merrier; rival SK hynix recently announced that it has already sold out of its HBM allocation for the entirety of 2024.
"Micron is delivering a trifecta with this HBM3E milestone: time-to-market leadership, best-in-class industry performance, and a differentiated power efficiency profile," said Sumit Sadana, executive vice president and chief business officer at Micron Technology. "AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3E and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications."
The performance that Sadana alludes to is a blistering pin speed of 9.2 gigabits per second (Gb/s), resulting in memory bandwidth of more than 1.2 terabytes per second (TB/s). It's also power efficient: Micron claims its HBM3E memory consumes 30% less power than competing products.
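As a back-of-the-envelope check, the per-pin speed and the aggregate bandwidth line up if you assume the standard 1024-bit-wide interface of an HBM stack (the interface width is our assumption here, not a figure from the announcement):

```python
# Rough HBM3E bandwidth check. The 1024-bit stack interface is an
# assumption based on the standard HBM interface width; the 9.2 Gb/s
# pin speed comes from Micron's announcement.
PIN_SPEED_GBPS = 9.2      # per-pin transfer rate, gigabits per second
INTERFACE_WIDTH = 1024    # data pins per HBM stack (assumed)

# Total bits per second across the interface, converted to gigabytes
# per second by dividing by 8 bits per byte.
bandwidth_gbs = PIN_SPEED_GBPS * INTERFACE_WIDTH / 8
print(f"{bandwidth_gbs:.1f} GB/s")  # prints 1177.6 GB/s
```

That works out to roughly 1.2 TB/s per stack, in the same ballpark as Micron's quoted figure.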
This doesn't mean a whole lot for consumers, at least not directly. AMD flirted with HBM on its gaming GPUs (remember Vega?), but the cost-to-performance ratio just isn't competitive with GDDR. While raw bandwidth certainly matters, capacity takes priority in the consumer realm. We also don't want to imagine how much more gaming graphics cards would cost if AMD, Intel, and NVIDIA decided to make a wholesale switch from GDDR to HBM.
Micron says its HBM3E currently offers 24GB of capacity, but the company is also sampling 36GB 12-High HBM3E for partner solutions that will debut in March.