NVIDIA GeForce GTX 1660 Super Turing GPU Rumored With Faster GDDR6 Memory

It appears that we haven't heard the last of NVIDIA's "Super" refreshes for its Turing-based GeForce graphics cards. Earlier this summer, we witnessed the launch of the GeForce RTX 2060 Super, GeForce RTX 2070 Super and the GeForce RTX 2080 Super. Now there's a rumor making the rounds that NVIDIA is looking to spread some Super lovin' to its Turing GeForce GTX cards.

This latest rumor comes from MyDrivers, which suggests that NVIDIA is working to deliver a GeForce GTX 1660 Super, which would slot in between the existing GeForce GTX 1660 (TU116-300) and GeForce GTX 1660 Ti (TU116-400). It's reported that the GeForce GTX 1660 Super will use a version of the TU116-300 die, with the same 1408 CUDA cores.

However, instead of being paired with 6GB of 8Gbps GDDR5 memory, the GeForce GTX 1660 Super is said to gain 6GB of 14Gbps GDDR6 memory. Coupled with a 192-bit memory bus, this would bring maximum memory bandwidth to 336 GB/sec. At this time, the source for this info gives no indication of what clock speeds will be like for the new GPU, but we'd imagine that NVIDIA would provide a nice boost to base and boost GPU clocks, which currently stand at 1530MHz and 1785MHz respectively for the GeForce GTX 1660.
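That 336 GB/sec figure follows directly from the rumored memory specs: per-pin data rate multiplied by bus width, divided by 8 bits per byte. A quick sketch of the arithmetic (the function name here is just illustrative):

```python
def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) times
    bus width (bits), divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# GeForce GTX 1660 today: 8 Gbps GDDR5 on a 192-bit bus
print(memory_bandwidth_gb_s(8, 192))   # 192.0 GB/s

# Rumored GTX 1660 Super: 14 Gbps GDDR6 on the same 192-bit bus
print(memory_bandwidth_gb_s(14, 192))  # 336.0 GB/s
```

So the GDDR6 swap alone would be a 75 percent jump in peak bandwidth over the stock GTX 1660, even with core counts unchanged.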


Positioning the rumored GeForce GTX 1660 Super might get a bit tricky, however. The GeForce GTX 1660 retails for $229, while the GeForce GTX 1660 Ti comes in at $279. That would give the new Super variant a narrow window to hit around the $250 price point. And if MyDrivers' sources are correct, all three variants will be sold simultaneously at retail.

In other Turing news, the same report suggests that we'll soon be seeing a GeForce GTX 1650 Ti with up to 1152 CUDA cores (up from 896). This would further bolster NVIDIA's Turing family at the low end, putting additional pressure on AMD and its Radeon family of graphics cards. However, rumor has it that AMD is working on new budget Navi 14-based GPUs that might play in this space.

For now, we'll have to take these rumors with a healthy dose of salt, but they do make some sense given NVIDIA's existing GPU strategy.
