Micron Reveals Plan To Bring GDDR7 Memory To Market For Next-Gen GPUs
Every major graphics product out right now uses GDDR6, GDDR6X, or HBM memory, and the latter option really only exists for supercomputing chips like NVIDIA's Hopper H100, AMD's Instinct MI250X, and Intel's Data Center GPU Max. GDDR6 and GDDR6X are the standards of our day, and they're fundamentally the same thing: GDDR6X just encodes data on the wire differently to achieve higher transfer rates.
The first graphics cards to ship using GDDR6 memory were NVIDIA's Turing family, better known as the original GeForce RTX series. That was back in late 2018, so it's high time for a new memory standard. Thankfully, JEDEC members like Micron and Samsung are already on the ball. In a recent earnings call, Micron CEO Sanjay Mehrotra remarked that his company plans to have GDDR7 available in the first half of next year.
GDDR7 is interesting because, in one specific way, it's something of a compromise between GDDR6 and GDDR6X. Rather than the complex PAM4 signaling used by GDDR6X, it uses PAM3, which is quite different despite the similar name. Where PAM4 uses four voltage levels to carry two bits per symbol, PAM3 uses three levels, packing three bits into every two symbols. That lets GDDR7 offer improved transfer rates over GDDR6X without requiring the exotic signal-integrity hardware that PAM4 demands.
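The efficiency gap between the three signaling schemes comes down to bits carried per transmitted symbol. A minimal sketch of that arithmetic, using an illustrative (not spec-accurate) 12 GBaud symbol rate purely to compare the schemes:

```python
# Bits carried per transmitted symbol for each signaling scheme.
# PAM3's theoretical limit is log2(3) ≈ 1.58 bits; practical encoders
# pack 3 bits into 2 symbols, i.e. 1.5 bits per symbol.
BITS_PER_SYMBOL = {
    "NRZ (GDDR6)": 1.0,     # two voltage levels, 1 bit/symbol
    "PAM3 (GDDR7)": 3 / 2,  # three levels, 3 bits per 2 symbols
    "PAM4 (GDDR6X)": 2.0,   # four levels, 2 bits/symbol
}

def data_rate_gbps(symbol_rate_gbaud: float, scheme: str) -> float:
    """Effective per-pin data rate for a given symbol rate."""
    return symbol_rate_gbaud * BITS_PER_SYMBOL[scheme]

# Illustrative only: the same 12 GBaud link under each scheme.
for scheme in BITS_PER_SYMBOL:
    print(f"{scheme}: {data_rate_gbps(12, scheme):.1f} Gbps per pin")
```

The takeaway: at the same symbol rate, PAM3 moves 50% more data than NRZ while needing one fewer voltage level than PAM4, which is where its signal-integrity advantage comes from.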
According to Micron's competitor Samsung, GDDR7 memory could hit transfer rates as high as 36 Gbps per pin, though we doubt it will be that fast at launch. For perspective, the fastest GDDR6 memory tops out at 24 Gbps, although as far as we're aware that speed class hasn't shipped in any end-user products. The fastest-clocked GPU memory shipping so far is the 22.4 Gbps GDDR6X on the GeForce RTX 4080.
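Per-pin rates only matter in combination with bus width. A quick sketch of how total bandwidth falls out, using hypothetical card configurations for illustration (a 256-bit bus at GDDR6X-class speeds versus a 384-bit bus at Samsung's projected 36 Gbps):

```python
def total_bandwidth_gbs(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Total memory bandwidth in GB/s from per-pin rate and bus width."""
    return per_pin_gbps * bus_width_bits / 8  # 8 bits per byte

# Hypothetical examples, not confirmed product specs:
# a 256-bit card at 22.4 Gbps (roughly RTX 4080 class)
print(total_bandwidth_gbs(22.4, 256))  # 716.8 GB/s
# a 384-bit card at the projected 36 Gbps GDDR7 ceiling
print(total_bandwidth_gbs(36.0, 384))  # 1728.0 GB/s
```

In other words, if 36 Gbps GDDR7 ever ships on a wide bus, total bandwidth could more than double versus today's fastest consumer cards.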
While GDDR7 may be available in the first half of next year, we probably won't see it in graphics cards until the second half of next year, which means that—if the rumors are accurate—Intel's Battlemage will miss out. In recent years, NVIDIA's usually been the first to debut new memory technologies, so expect to see GDDR7 RAM on whatever the successor to Ada Lovelace ends up being called in 2025.