GeForce GTX 1650 - Overclocking, Power Consumption, And Noise
We also spent a little time overclocking the EVGA GeForce GTX 1650 XC Ultra, to see what kind of additional performance it had lurking under its hood. Before we get to our results, though, we would like to quickly recap Turing's new GPU Boost algorithm and cover some new overclocking-related features.
Overclocking NVIDIA's Turing
Turing-based GeForce cards like this one feature GPU Boost 4.0. Like previous-gen GeForce cards, GPU Boost scales frequencies and voltages upwards, power and temperature permitting, based on the GPU's workload at the time. Should a temperature or power limit be reached, however, GPU Boost 4.0 will only drop down to the previous boost frequency/voltage stepping -- and not the base frequency -- in an attempt to bring power and temperatures down gradually. Whereas GPU Boost 3.0 could result in a sharp drop-off down to the base frequency when constrained, GPU Boost 4.0 is more granular and should allow for higher average frequencies over time, and hence better and more consistent performance.
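The behavioral difference can be sketched in a few lines of code. To be clear, this is an illustrative model, not NVIDIA's actual implementation, and the frequency steppings below are hypothetical example values:

```python
# Illustrative sketch of throttling behavior (NOT NVIDIA's real algorithm).
# The clock values and steppings here are hypothetical examples.

BASE_CLOCK = 1485  # MHz, example base frequency
BOOST_STEPS = [1485, 1600, 1700, 1800, 1900, 2000]  # example V/F steppings

def throttle_boost_3(current_mhz):
    """GPU Boost 3.0-style behavior: fall straight back to the base clock."""
    return BASE_CLOCK

def throttle_boost_4(current_mhz):
    """GPU Boost 4.0-style behavior: drop only to the previous stepping."""
    idx = BOOST_STEPS.index(current_mhz)
    return BOOST_STEPS[max(idx - 1, 0)]

# One throttle event while running at the top stepping:
print(throttle_boost_3(2000))  # sharp drop to 1485
print(throttle_boost_4(2000))  # only one stepping down, to 1900
```

Because GPU Boost 4.0 sheds one stepping at a time, the card spends more of its time near the top of the curve, which is where the higher average clocks come from.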
EVGA GeForce GTX 1650 XC Ultra Black In Precision X1
As we've mentioned in our previous coverage of the Turing architecture, Turing-based GeForce cards typically have beefier VRMs than their predecessors, which should also help with overclocking, though most of the cards are still power limited to prevent damage and ensure longevity. Unlike some of the entry-level Turing-based cards we've looked at in the past that didn't allow us to increase the power target, though, the EVGA GeForce GTX 1650 XC Ultra DID offer the ability to tweak the Power Target, and we could bump the voltage up by 0.1V as well.
EVGA GeForce GTX 1650 XC Ultra Black Overclocking Data
NVIDIA's OC Scanner tool, which is designed to find the optimal frequency and voltage curve for a particular GPU, hasn't worked for us across multiple test beds, so we couldn't properly test that feature. We were, however, able to crank up the EVGA GeForce GTX 1650 XC Ultra's performance quite a bit manually. In lieu of using the NVIDIA Scanner, we kept things simple and used the frequency and voltage offsets, along with the power and temperature target sliders, to manually push the card well beyond its stock frequencies. First, we cranked up the temperature and power targets and voltage, then we bumped up the GPU and memory clocks until the test system was no longer stable or showed on-screen artifacts.
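The "step up until unstable, then back off" process above can be expressed as a simple search loop. This is a hypothetical sketch of the procedure, not a real tuning tool; `stress_test` stands in for an actual benchmark/artifact-scan pass, and the step size and limit are assumed values:

```python
# Hypothetical sketch of manual overclock dialing (assumes targets/voltage
# were already raised). stress_test is a placeholder for a real stability run.

def find_max_offset(stress_test, step_mhz=15, max_offset=500):
    """Return the highest clock offset (MHz) that still passes stress_test."""
    stable = 0
    offset = step_mhz
    while offset <= max_offset and stress_test(offset):
        stable = offset       # last offset that ran clean
        offset += step_mhz    # try the next stepping
    return stable

# Example with a fake stability boundary at a +375MHz offset:
print(find_max_offset(lambda mhz: mhz <= 375))  # 375
```

In practice each `stress_test` pass is minutes of benchmarking and watching for artifacts, which is why most people (ourselves included) use coarse steps rather than a fine-grained sweep.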
When all was said and done, we were able to take our EVGA GeForce GTX 1650 XC Ultra's GPU boost clock all the way up to 2,040MHz -- an increase of nearly 400MHz over stock. There will probably be a bit more headroom available with some cards, but we bumped into the power limit during some tests. While we had the card overclocked, we re-ran a couple of benchmarks and saw some nice performance improvements. In fact, while overclocked, the EVGA GeForce GTX 1650 XC Ultra ended up outpacing the Radeon RX 570 (in these two tests, at least), whereas it trailed in its stock configuration.
Our goal was to give you an idea of how much power each configuration used while idle and also while under a heavy gaming workload. Please keep in mind that we were testing total system power consumption at the outlet here, not the power being drawn by the graphics cards alone. It's a relative measurement that gives you a decent view of how much additional power draw a graphics card is placing on a system while gaming.
We should also mention that GPU temperature is a total non-issue. At idle, GPU temps hovered in the mid-30°C range, and under sustained load, even while overclocked to over 2GHz, GPU temperatures never came close to the target and peaked at only 59°C. Noise output wasn't an issue either. The TU117 GPU is relatively low-power, and EVGA outfitted this card with a beefy cooler, so the fans barely spin up under load and the card remains quiet under almost all conditions.