In addition to introducing a new graphics card based on a new GPU, NVIDIA is also unveiling a number of new features and capabilities coming with the GeForce GTX 680, namely GPU Boost, Adaptive VSync, TXAA, NVENC, and Bindless Textures.
As you can probably surmise from its name, GPU Boost is somewhat akin to the Turbo Boost and Turbo Core technologies in today’s Intel and AMD processors. Like those technologies, GPU Boost automatically raises GPU clock speed to increase performance. GPU Boost essentially monitors the power being consumed by games and applications and adjusts the GPU accordingly, taking into account environmental conditions like GPU temperature.
Traditionally, GPU clock speeds were set based on the most stressful applications and conditions. Set clocks that way, however, and you leave power and performance on the table for applications that don’t consume as much power. NVIDIA looked at the gap between an application’s actual power draw and the GPU’s peak power and called it the “Boost opportunity”. With GPU Boost, for games and applications that run at a lower power profile, clocks can be raised automatically and dynamically to increase performance.
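To make the idea concrete, here is a deliberately simplified sketch of that kind of headroom-driven clock controller. The function name, the temperature limit, and the power target are our own hypothetical stand-ins; the real GPU Boost algorithm runs in hardware and driver firmware and its exact behavior is not published.

```python
BASE_CLOCK_MHZ = 1006   # GTX 680 base clock
BOOST_CLOCK_MHZ = 1058  # GTX 680 default Boost clock
POWER_TARGET_W = 195    # hypothetical board power target
TEMP_LIMIT_C = 70       # hypothetical temperature ceiling

def boost_clock(power_draw_w, temp_c):
    """Scale the clock into the unused power headroom, capped at the
    Boost clock, and fall back to base clock when there is no headroom."""
    if temp_c >= TEMP_LIMIT_C or power_draw_w >= POWER_TARGET_W:
        return BASE_CLOCK_MHZ  # no headroom: stay at the base clock
    headroom = (POWER_TARGET_W - power_draw_w) / POWER_TARGET_W
    clock = BASE_CLOCK_MHZ + headroom * (BOOST_CLOCK_MHZ - BASE_CLOCK_MHZ)
    return round(min(clock, BOOST_CLOCK_MHZ))
```

A lighter workload (lower power draw) leaves more headroom, so the sketch returns a higher clock, which mirrors the "Boost opportunity" described above.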
As we’ve mentioned, the GeForce GTX 680 has a 1006MHz base clock. The majority of the time while gaming, however, it will be clocked higher than that. Reference cards have a default GPU Boost clock of 1058MHz. And if you’re playing a game that doesn’t result in significant GPU power consumption, the GeForce GTX 680 will likely run closer to its peak GPU Boost clock than its base clock most of the time.
We should also point out that GPU Boost doesn’t preclude overclocking. It can’t be disabled, but users will still be able to overclock their GPU—they’ll just have to account for the Boost clock, which will always be some percentage above the base clock. NVIDIA also tells us that GPU Boost (at least initially) will not be available in mobile GeForce 600 series parts.
NVIDIA's Adaptive VSync Technology
The next new feature arriving with Kepler is dubbed Adaptive VSync. With standard VSync enabled, provided the GPU has the necessary horsepower, a game will typically run locked at 60Hz most of the time to prevent on-screen tearing. If the game suddenly slows down, however, and framerates need to drop below 60Hz, standard VSync will drop the framerate to 30Hz (1/2 speed) and only jump back up to 60Hz when the performance is available again. These huge dips and jumps in framerate with VSync enabled usually result in annoying stuttering.
With Adaptive VSync technology, however, when a game has to step down its framerate, VSync is automatically disabled temporarily. The result is that framerates can gradually decrease and increase, without the sharp drop-off to 30Hz, which results in a smoother overall experience.
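The two behaviors described above can be sketched in a few lines. This is an illustrative model only, assuming a 60Hz display; the function names are ours, and the real logic lives in NVIDIA's driver.

```python
REFRESH_HZ = 60  # assumed display refresh rate

def adaptive_vsync(framerate_hz, refresh_hz=REFRESH_HZ):
    """Adaptive VSync: sync only when the GPU can keep up with the
    refresh rate; below that, disable VSync and let framerate fall gradually."""
    return "vsync_on" if framerate_hz >= refresh_hz else "vsync_off"

def standard_vsync_rate(framerate_hz, refresh_hz=REFRESH_HZ):
    """Standard VSync: a frame that misses a refresh waits for the next one,
    so the effective rate snaps down to refresh / n (60 -> 30 -> 20 ...)."""
    n = 1
    while refresh_hz / n > framerate_hz:
        n += 1
    return refresh_hz / n
```

With standard VSync, a game that can only manage 55fps gets locked all the way down to 30Hz, which is exactly the sharp drop-off Adaptive VSync avoids.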
Also coming with Kepler are some new anti-aliasing features. First off, FXAA will now be available through the driver control panel, so users can override in-game options and use FXAA with most games. NVIDIA is also introducing a new anti-aliasing mode dubbed TXAA.
For now, TXAA will be available on Kepler-based cards only. Although we weren’t able to test it just yet (TXAA is coming with a future driver release), NVIDIA claims TXAA delivers roughly 8X MSAA image quality with the performance hit of roughly 2X MSAA. And TXAA2 offers image quality and jaggie reduction beyond 8X MSAA at roughly the speed of 4X MSAA.
TXAA essentially combines the benefits of FXAA and MSAA in a smaller memory footprint. TXAA applies an FXAA-like resolve filter, which is most effective across high-contrast edges in an image. But with TXAA, NVIDIA jitters the pixel offsets to effectively provide more samples than are stored in memory. TXAA2 adds a temporal component, which further improves image quality.
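The jitter-plus-temporal idea can be sketched with a toy resolve function. This shows the general temporal-jitter technique only, not NVIDIA's actual TXAA filter, whose internals are unpublished; all names and constants here are hypothetical.

```python
# Per-frame sub-pixel jitter offsets; cycling through them over several
# frames effectively samples more positions than are stored per frame.
JITTER = [(0.25, 0.25), (-0.25, 0.25), (0.25, -0.25), (-0.25, -0.25)]

def resolve(shade, x, y, frame, history, blend=0.5):
    """Shade the pixel at a per-frame jittered offset and blend it with
    the accumulated history, so edge coverage builds up across frames."""
    jx, jy = JITTER[frame % len(JITTER)]
    sample = shade(x + jx, y + jy)
    if history is None:
        return sample
    return blend * sample + (1 - blend) * history
```

Each frame contributes a differently offset sample, so the blended history converges toward a multi-sample average without keeping all of those samples in memory at once.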
NVIDIA has also incorporated a dedicated hardware video encoding engine into Kepler. The feature is called NVENC, and it is capable of encoding 1080p HD video at up to 4x to 8x real-time speed. NVENC can also encode H.264 High Profile 4.1 video (the Blu-ray standard) and supports Multi-View Video Coding (MVC) for stereoscopic 3D video. What's different about NVENC versus previous GPU encoding solutions is that it doesn't leverage the shader cores; NVENC is dedicated hardware with the sole purpose of encoding video.
Which brings us to Bindless Textures. With pre-Kepler GPU architectures, shaders were limited to 128 simultaneously bound textures. With Kepler, however, the number of unique textures available to shaders at run time is significantly increased; according to NVIDIA, over 1 million simultaneous textures can be used. This technology is not supported in DX11.
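The difference between the two models can be illustrated with plain data structures. This is a conceptual sketch only, assuming a slot table versus a handle table; real bindless access uses GPU texture handles in shader code, not Python dicts, and the names below are our own.

```python
BIND_SLOTS = 128  # pre-Kepler limit on simultaneously bound textures

def bound_lookup(slots, index):
    """Pre-Kepler model: a shader can only reference one of 128 bound slots."""
    if index >= BIND_SLOTS:
        raise IndexError("texture must be bound to one of 128 slots")
    return slots[index]

def bindless_lookup(handles, handle):
    """Kepler model: a shader reads any texture via its handle, so the
    working set is no longer capped by a small bind table."""
    return handles[handle]
```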