NVIDIA GeForce RTX 2080 Performance Unveiled, DLSS AI-Powered Anti-Aliasing Spikes FPS At High IQ

This week at Gamescom 2018, NVIDIA held a GeForce Gaming Celebration to unveil its new GeForce RTX 2080 and 2070 series graphics cards, based on its new Turing GPU architecture. Turing ushers in a wealth of new technologies, enabled by new processing engines on the GPU. These include hardware-accelerated, real-time ray tracing and a new form of anti-aliasing called DLSS, or Deep Learning Super Sampling, among a few others we can't talk about just yet.

What’s interesting about DLSS is that it uses Turing’s Tensor cores to enable a deep learning-powered AI technique that dramatically improves image quality and removes jagged edge artifacts in games, with a minimal performance hit. Traditional anti-aliasing, or AA, can lean heavily on GPU processing resources and memory bandwidth, and may consume large amounts of frame buffer memory as well. Anti-aliasing often requires portions of a frame to be rendered multiple times in order to smooth out the edges of objects in a scene, and the end result can be a costly performance hit. NVIDIA notes that its new DLSS technique, however, instead applies a trained neural network to rendered frames (a process called inferencing) to improve image quality without the large performance hit associated with some traditional AA methods. To use NVIDIA's words, “powered by Turing’s Tensor Cores, which perform lightning-fast deep neural network processing, GeForce RTX GPUs also support Deep Learning Super-Sampling (DLSS), a technology that applies deep learning and AI to rendering techniques, resulting in crisp, smooth edges on rendered objects in games.”
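To make the cost of traditional AA concrete, here's a minimal toy sketch of classic supersampling: render at a higher resolution, then filter down. NVIDIA hasn't published DLSS internals, so this illustrates only the conventional approach DLSS aims to sidestep; the "scene" is a made-up diagonal edge, not a real renderer.

```python
import numpy as np

def render_scene(width, height):
    """Toy 'renderer': a diagonal edge that aliases into hard stair-steps
    at low resolution. Stands in for a real game frame; purely illustrative."""
    y, x = np.mgrid[0:height, 0:width]
    return (x + y * 0.7 < width * 0.6).astype(np.float64)

def supersample_aa(width, height, factor=2):
    """Classic supersampling AA: shade factor^2 times as many pixels,
    then box-filter down to the target resolution. That factor^2 shading
    cost is the performance hit DLSS is designed to avoid."""
    hi_res = render_scene(width * factor, height * factor)
    return hi_res.reshape(height, factor, width, factor).mean(axis=(1, 3))

aliased = render_scene(64, 64)      # 4,096 shaded pixels, hard 0/1 edges
smoothed = supersample_aa(64, 64)   # 16,384 shaded pixels for the same output size
```

After filtering, edge pixels take intermediate gray values instead of hard black/white steps, which is exactly the smoothing effect, bought at four times the shading work in this 2x-per-axis case.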

We were able to witness DLSS in action in Epic's Unreal Engine demo called Infiltrator. Infiltrator is an impressive demo to behold in and of itself, but at 4K with traditional TAA (Temporal Anti-Aliasing) enabled, it can be a beast of a GPU crusher, even on something like a GeForce GTX 1080 Ti. However, with DLSS enabled and Turing's Tensor cores doing the work with machine learning to clean things up, frame rates can sail back up to levels well north of 60 FPS at 4K even in something as graphically intense as Infiltrator. Check out our hands-on demo in action…

Taking performance claims one step further, NVIDIA allowed us to show you the following slide detailing GeForce RTX 2080 performance versus a GeForce GTX 1080, with both DLSS enabled and disabled...


As you can see, NVIDIA is claiming roughly a 40 - 60 percent performance uplift for a Turing-based GeForce RTX 2080 versus the GTX 1080 right out of the gate, and that's with DLSS disabled. Flip on DLSS in a compatible game engine, however, and that performance lift jumps to over 2X in some cases, like the Infiltrator demo and Final Fantasy, while offering similar image quality. Other games see smaller, but still sizeable, performance boosts as well.
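To put those percentages in frame-rate terms, here's a quick illustrative calculation. The baseline FPS figure below is hypothetical, not an NVIDIA measurement; only the uplift multipliers come from the claims above.

```python
# Illustrative math only: the 40 FPS baseline is a hypothetical 4K figure,
# not a measured result; the multipliers echo NVIDIA's claimed uplifts.
def projected_fps(baseline_fps, uplift):
    """Apply a claimed relative uplift (0.5 means +50%, 1.0 means 2X)."""
    return baseline_fps * (1.0 + uplift)

gtx1080_fps = 40.0                              # hypothetical GTX 1080 baseline
rtx2080_fps = projected_fps(gtx1080_fps, 0.5)   # mid-range of the 40-60% claim
rtx2080_dlss_fps = projected_fps(gtx1080_fps, 1.0)  # the 2X-with-DLSS case
print(rtx2080_fps, rtx2080_dlss_fps)            # 60.0 80.0
```

In other words, a title that crawls along at 40 FPS in 4K on a GTX 1080 would land around 60 FPS on an RTX 2080 by rasterization gains alone, and around 80 FPS with DLSS engaged, if the claimed multipliers hold.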


We should note that NVIDIA will enable DLSS in games for developers free of charge; a dev just sends NVIDIA their code for processing on an NVIDIA DGX supercomputer. NVIDIA will hand back that code, which reportedly adds just megabytes in incremental size, and enable the feature in its driver for that game. And, as with anything AI and machine learning, the more you feed the beast, the better it gets at the task at hand, in this case across a wider swath of game engines and visuals.

Regardless, NVIDIA's Turing-based GeForce RTX 2080 and 2070 family not only looks strong right out of the gate, but it also has a few seriously powerful tricks up its sleeve, if the developer community decides to take advantage of its Tensor cores for what appears to be an (almost) free anti-aliasing upgrade.
