NVIDIA GeForce GTX Titan Powered Maingear SHIFT - HotHardware


GPU Boost 2.0:

NVIDIA is also ushering in a new version of GPU Boost with the GeForce GTX Titan, dubbed GPU Boost 2.0. Fundamentally, GPU Boost 1.0 and 2.0 are similar in that they both allow the graphics card’s GPU to ramp up clock speeds and dynamically alter voltages in an effort to increase performance, but the criteria used to determine the boost frequencies and voltages change with GPU Boost 2.0.

With GPU Boost 1.0, which was first introduced with Kepler, a power target was used to determine the peak boost clocks. If a given workload wasn’t fully utilizing available board power and environmental conditions and temperatures were acceptable, the GPU’s voltage and frequency would be boosted to take advantage of any spare power. We’ve got a more detailed explanation of GPU Boost 1.0 in our original review of the GeForce GTX 680 if you’d like to check it out.
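To make the power-target idea concrete, here is a minimal sketch of a Boost 1.0-style control loop. The clock bins, the power target, and the linear power model are all our own illustrative stand-ins, not NVIDIA's actual firmware logic:

```python
BASE_CLOCK_MHZ = 1006   # GTX 680's base clock; other numbers are illustrative
CLOCK_STEP_MHZ = 13     # hypothetical boost-bin granularity
POWER_TARGET_W = 170    # hypothetical board power target

def boost_clock_v1(power_draw_at, max_steps=20):
    """Raise the clock one bin at a time while board power stays under target.

    power_draw_at(clock_mhz) -> estimated board power in watts for the
    current workload at that clock (a stand-in for on-board telemetry).
    """
    clock = BASE_CLOCK_MHZ
    for _ in range(max_steps):
        next_clock = clock + CLOCK_STEP_MHZ
        if power_draw_at(next_clock) > POWER_TARGET_W:
            break  # no spare board power left; hold the current bin
        clock = next_clock
    return clock

# A light workload drawing ~0.16 W per MHz leaves power headroom to boost.
print(boost_clock_v1(lambda mhz: 0.16 * mhz))  # -> 1058
```

The key point of the sketch is the stopping condition: the loop quits as soon as the *power* budget is exhausted, regardless of how cool the GPU may still be running.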

GPU Boost 2.0 works in a similar manner, but in lieu of a strict power target it uses an actual GPU temperature target to determine peak boost frequencies and voltages. NVIDIA was relatively conservative with GPU Boost 1.0: even though the max power target may have been reached with a given workload, the GPU temperature may not have hit its peak thermal threshold. With GPU Boost 2.0, if there is still temperature headroom available, the GPU will continue to ramp clocks and voltage until the temperature target is reached. The end result is that the GPU ultimately runs at higher clocks more often than it would have with GPU Boost 1.0.
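The same sketch with the stopping condition swapped shows the Boost 2.0 difference: the loop keeps climbing until the *temperature* target is reached. Again, the step sizes and the linear thermal model are illustrative assumptions, not NVIDIA's actual algorithm; only the 80-degree default target and 837MHz base clock reflect Titan's published specs:

```python
BASE_CLOCK_MHZ = 837    # GTX Titan's base clock
CLOCK_STEP_MHZ = 13     # hypothetical boost-bin granularity
TEMP_TARGET_C = 80      # Titan's default temperature target

def boost_clock_v2(temp_at, max_steps=30):
    """Raise the clock one bin at a time while GPU temperature is under target.

    temp_at(clock_mhz) -> predicted steady-state GPU temperature at that
    clock for the current workload (a stand-in for the thermal sensor).
    """
    clock = BASE_CLOCK_MHZ
    for _ in range(max_steps):
        next_clock = clock + CLOCK_STEP_MHZ
        if temp_at(next_clock) > TEMP_TARGET_C:
            break  # temperature target reached; hold the current bin
        clock = next_clock
    return clock

# A workload that would have hit Boost 1.0's power cap may still have thermal
# headroom, so the temperature-driven loop keeps climbing toward 80 C.
print(boost_clock_v2(lambda mhz: 50 + 0.1 * (mhz - BASE_CLOCK_MHZ)))  # -> 1136
```

With a well-cooled card the temperature criterion is the looser of the two, which is exactly why Boost 2.0 tends to hold higher clocks for longer.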

We should also mention that NVIDIA allows users to unlock higher voltages on Titan than were available with GPU Boost 1.0, where the maximum voltage was determined by the power target and could not be raised to a level that would impact the long-term reliability of the silicon. The combination of high voltage and high temperature can and will damage silicon, but within reasonable limits, neither factor on its own does much, if any, damage. You can run a chip at higher-than-normal voltages (again, within limits) at low temperatures without meaningfully affecting long-term reliability, and conversely, you can run at higher-than-normal temperatures with lower voltages without measurable impact. It is the two together that will damage a chip.

If you’re comfortable pushing your card beyond what NVIDIA considers normal limits, though, the GTX Titan gives you that ability through third-party tweaking tools like EVGA Precision or MSI Afterburner, provided you accept a warning and acknowledge that your actions may affect the long-term reliability of the GPU. There are still limits in place (NVIDIA isn't going to let customers drag a slider and fry their GPU, after all), but if the performance offered by GPU Boost and basic overclocking isn't enough and you don't mind pushing things further, potentially shortening the life of your card, you can go for it.


Ahh, now I know why I got that driver update through NGE this morning.


This is one monster card. Two in SLI would be nice. :)


I admit that I was iffy on this card at first, but then it started to sink in. 6GB GDDR... 384-bit... 3x1 gaming...

Just wow. And that SHIFT... I don't think I have the right words...


Holy bajeezus that card packs a punch... I don't see any realistic use besides having money to blow on boatloads of high res monitors to run in parallel...

I feel like one of these cards could keep a system maxed out through a couple of generations of GPUs.
