Every so often, a leak or rumor pops up claiming NVIDIA's next flagship graphics card is going to be an absolute beast. Given what we know about the current-generation flagships, the GeForce RTX 3090 and GeForce RTX 3090 Ti, that's like saying water is wet and fire is hot. It's the more specific claims that can be interesting, though, and that's what we have here.
One of the prominent leakers on Twitter, @kopite7kimi, reiterates a previous claim that the GeForce RTX 4090 will be around
twice as fast as the GeForce RTX 3090. There's not much in their tweet to qualify that claim, such as whether it refers to rasterized performance, real-time ray tracing performance, 4K gaming, or whatever the case might be.
That said, the leaker does share some supposed specifications...
According to the Twitter post, the GeForce RTX 4090 will feature an Ada Lovelace GPU (AD102-300) with 16,128 CUDA cores, which extrapolates to 126 streaming multiprocessors (SMs). That's potentially less than some previous rumors, which pegged a full-fat AD102 GPU as flexing 18,432 CUDA cores. However, NVIDIA could be reserving the full-fat AD102, with its roughly 75 percent increase in CUDA cores over the GeForce RTX 3090, for a Ti variant.
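For anyone who wants to check the math, here's a quick back-of-the-envelope sketch. It assumes Ada keeps Ampere's 128 CUDA cores per SM, which is the only way the rumored figures line up:

```python
# Sanity-check the rumored core counts, assuming 128 CUDA cores per SM (as on Ampere).
CUDA_CORES_PER_SM = 128

rtx_4090_cores = 16_128    # rumored AD102-300 configuration
full_ad102_cores = 18_432  # rumored full-fat AD102
rtx_3090_cores = 10_496    # shipping GA102 on the GeForce RTX 3090

print(rtx_4090_cores // CUDA_CORES_PER_SM)             # 126 SMs
print(full_ad102_cores // CUDA_CORES_PER_SM)           # 144 SMs
print(f"{full_ad102_cores / rtx_3090_cores - 1:.0%}")  # ~76%, i.e. roughly a 75 percent jump
```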
The leaker also claims the GeForce RTX 4090 will wield 24GB of fast (21Gbps) GDDR6X memory, all within a 450W power profile. That Total Graphics Power (TGP) rating is in line with some earlier claims, though we've also seen rumors pointing to an even bigger power draw, with the GeForce RTX 3090 Ti serving as a
test vehicle for Ada Lovelace.
As for the memory, if paired with a 384-bit memory bus, we're looking at just over 1TB/s of memory bandwidth, the same as the GeForce RTX 3090 Ti.
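Here's where that figure comes from, assuming the rumored 21Gbps GDDR6X on a 384-bit bus:

```python
# Peak memory bandwidth = per-pin data rate x bus width, converted from bits to bytes.
data_rate_gbps = 21    # rumored GDDR6X speed, gigabits per second per pin
bus_width_bits = 384   # assumed bus width, matching the RTX 3090 / 3090 Ti

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 1008 GB/s, just over 1TB/s
```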
Also interesting is that @kopite7kimi notes they're "disappointed with RDNA 3." That comment drew some ire in the Twitter thread, and we're not sure why they wrote it. Perhaps it's due to the rumor that Ada Lovelace is being manufactured on a 4nm node, which in a loose sense means it's already winning on paper versus RDNA 3. But as it stands, it's a vague comment.
The usual disclaimer applies: don't put too much stock in early rumors, especially with NVIDIA known to throw curveballs at the eleventh hour.