NVIDIA GeForce RTX 2080 And RTX 2080 Ti Benchmark Review: Turing Is A Beast
NVIDIA GeForce RTX 2080 and 2080 Ti: Overclocking And Power
We also spent a little time overclocking the new GeForce RTX series cards to see what kind of additional performance we could squeeze out of them with a bit of tweaking. Before we get to our results, though, we need to talk about an update to GPU Boost and some new overclocking-related features.
Turing-based GeForce RTX cards feature GPU Boost 4.0. Like previous-gen GeForce cards, GPU Boost scales frequencies and voltages upwards, power and temperature permitting, based on the GPU's workload at the time. Should a temperature or power limit be reached, however, GPU Boost 4.0 will only drop down to the previous boost frequency -- and not the base frequency -- in an attempt to bring power and temperatures down. Where GPU Boost 3.0 could result in a steep drop-off down to the base frequency when constrained, GPU Boost 4.0 is more granular and should allow for higher average frequencies over time.
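To make the behavioral difference concrete, here's a toy Python sketch of the two back-off strategies. This is purely illustrative, not NVIDIA's actual firmware logic, and the clock values and boost bins are hypothetical numbers chosen for the example:

```python
# Illustrative only -- hypothetical base clock and boost bins, not real Turing values.
BASE_CLOCK_MHZ = 1350
BOOST_BINS_MHZ = [1635, 1700, 1770, 1830, 1890]

def next_clock_v3(current_mhz, over_limit):
    """GPU Boost 3.0-style behavior: a power/temp limit breach can drop
    the GPU straight back to its base clock."""
    return BASE_CLOCK_MHZ if over_limit else current_mhz

def next_clock_v4(current_mhz, over_limit):
    """GPU Boost 4.0-style behavior: back off one boost bin at a time,
    only reaching base clock if every lower bin is exhausted."""
    if not over_limit:
        return current_mhz
    lower_bins = [f for f in BOOST_BINS_MHZ if f < current_mhz]
    return max(lower_bins) if lower_bins else BASE_CLOCK_MHZ

# A limit breach at the top bin: v3 plunges to base, v4 drops just one bin.
print(next_clock_v3(1890, True))  # 1350
print(next_clock_v4(1890, True))  # 1830
```

Because the v4-style scheme sheds frequency in small steps, the average clock over a sustained, thermally-constrained workload ends up higher than with an all-or-nothing drop to base.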
As we mentioned in our previous coverage of the Turing architecture, there are beefier VRMs on GeForce RTX cards versus their predecessors, which should help with extreme overclocking. In addition to being able to handle higher loads, the circuitry has also been optimized to smooth and clean up power delivery across the entire voltage and frequency curve. That cleaner power should improve maximum overclocks, but NVIDIA is also making the overclocking process itself much easier with a new Scanner tool and API.
The NVIDIA Scanner is a one-click overclocking tool with an intelligent testing algorithm and specialized workload designed to help users find the maximum, stable overclock on their cards without having to resort to trial and error. The NVIDIA Scanner will try higher and higher frequencies at a given voltage step and then test for stability with a specialized workload along the way. The entire process should take around 20 minutes, but when it’s done, the Scanner will have found the maximum stable overclock throughout the entire frequency and voltage curve for a given card.
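The search described above can be sketched in a few lines of Python. To be clear, this is a simplified model of a scanner-style sweep, not NVIDIA's actual tool or API: the `stable()` function is a stand-in for the specialized arithmetic workload, and all frequencies, voltages, and step sizes are made-up values:

```python
def stable(freq_mhz, voltage_mv):
    """Stand-in stability test. The real tool runs a specialized workload
    and checks its results for errors; here stability is modeled as a
    simple voltage-dependent frequency ceiling (toy numbers)."""
    return freq_mhz <= 1500 + voltage_mv

def scan_max_stable(voltage_steps_mv, start_mhz=1600, step_mhz=15):
    """For each voltage point, push the clock up one step at a time until
    the stability test would fail, building a max-stable-frequency curve."""
    curve = {}
    for v in voltage_steps_mv:
        f = start_mhz
        while stable(f + step_mhz, v):
            f += step_mhz
        curve[v] = f
    return curve

# Sweep three hypothetical voltage steps and report the resulting curve.
print(scan_max_stable([400, 450, 500]))
```

The end result is the same shape of data the Scanner produces: a per-card maximum stable clock at each point along the voltage curve, found automatically rather than by manual trial and error.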
Unfortunately, the scanner was hanging on our testbed, so we couldn't properly test the auto-scan feature in our benchmarks. In lieu of using the NVIDIA Scanner, we kept things simple and used the frequency offset and power / temperature target sliders also available in a new version of EVGA's excellent Precision X1 utility. First we cranked up the power target to 125% (or 130% in the case of the EVGA card) and kicked up the temperature target as well. Then we bumped up the GPU and memory clocks until the test system was no longer stable. To mix things up, we overclocked the NVIDIA-built GeForce RTX 2080 Ti, but opted for the EVGA-made card when overclocking the GeForce RTX 2080.
In the end, we were able to come in just shy of 2.1GHz with both GPUs, with 130MHz+ increases to their memory speed. While we had them overclocked, we re-ran a couple of benchmarks and saw some nice performance improvements. The GeForce RTX 2080 Ti jumped out even further in front of the other cards and the EVGA GeForce RTX 2080 XC Ultra was able to overtake the MSI card in 3DMark, even though it trailed in its stock configuration.
Our goal was to give you an idea as to how much power each configuration used while idle and also while under a heavy gaming workload. Please keep in mind that we were testing total system power consumption at the outlet here, not the power being drawn by the graphics cards alone. It's a relative measurement that gives you a decent view of how much additional power a graphics card places on a system while gaming.
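The arithmetic behind that relative measurement is simple: subtract the idle at-the-wall reading from the loaded one to estimate the incremental draw the gaming workload (dominated by the GPU) adds. The wattage figures below are hypothetical, for illustration only:

```python
def incremental_draw(system_load_w, system_idle_w):
    """Estimate the extra power a gaming workload places on the system
    from two total-system, at-the-wall readings (not per-card telemetry)."""
    return system_load_w - system_idle_w

# Hypothetical readings: 398 W loaded, 112 W idle.
print(incremental_draw(398, 112))  # 286
```

This is why the numbers are best compared relatively across cards on the same testbed; PSU efficiency and CPU load mean the delta isn't the GPU's exact power draw.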
We should also mention that GPU temperatures on the various GeForce RTX cards we tested are a non-issue. At idle, Turing GPU temps hovered in the mid-30°C range. And under sustained load, even while overclocked, the GPU temperatures never actually hit their default peak targets. Note in the image capture above of the EVGA Precision X1 utility, the RTX 2080 Ti's temperature while overclocked peaked at only 80°C -- the RTX 2080s ran at even lower temps and rarely broke into the 80s. Noise isn't a concern with any of the cards either. At idle the cards are essentially silent -- especially the partner boards, since their fans completely stop. Under load, with their default fan curves, the cards are slightly louder, but we'd still consider them quiet overall and definitely quieter than previous-gen Pascal-based GeForce GTX cards.