GeForce GTX 690 Review: Dual NVIDIA GK104 GPUs
Date: May 03, 2012
Author: Marco Chiappetta
Intro, Specifications and Related Info
After testing the GeForce GTX 680 last month, it was practically a foregone conclusion that a dual-GPU graphics card based on the same GK104 GPU was coming down the pipeline. The GK104 at the heart of the GeForce GTX 680 was able to best AMD’s current flagship GPU more often than not in our testing, and it did so while consuming less power and with a smaller die. If NVIDIA could stick two of its previous-generation, much larger and more power-hungry GF110 chips onto a single card with the GeForce GTX 590, doing the same with the GK104 would come as no surprise. As you’ll see a little later though, NVIDIA did have a few surprises in store with the GTX 690, even if we all sort of knew the card was coming.

According to a few folks we spoke to at NVIDIA, company CEO Jen-Hsun Huang told the team to spare no expense and build the best graphics card they possibly could, using all of the tools at their disposal. The result was obviously the GeForce GTX 690, but in building the card, the team at NVIDIA employed a number of new components and materials that set the card apart from the company’s previous dual-GPU offerings. In addition to a pair of GK104 GPUs and 4GB of GDDR5 RAM, the GeForce GTX 690 features laser-etched lighting, a magnesium fan housing, and a plated aluminum frame, along with a dual vapor chamber cooler design with ducted airflow channels and a tuned axial fan. The sum total of these elements results in not only NVIDIA’s fastest graphics card to date, but also one of its quietest.

We’ve only had a GeForce GTX 690 in our possession for a few days now, but the cards begin shipping in limited quantities today. So we put our evaluation efforts into high gear and cranked out this article in record time, so that the lucky few who can afford this beast know exactly what they are getting...

Dual GeForce GTX 690 Cards in a Quad-SLI Configuration. An easy 11 on the Geek Porn-o-Meter 
NVIDIA GeForce GTX 690
Specifications & Features
Processing Units  
Graphics Processing Clusters 8
SMXs 16
CUDA Cores  3072
Texture Units 256 
ROP Units 64
Clock Speeds  
Base Clock 915 MHz
Boost Clock 1019 MHz
Memory Clock (Data Rate) 6008 MHz
L2 Cache Size 1024KB (512KB per GPU)
Total Video Memory 4096MB GDDR5 (2048MB per GPU)
Memory Interface 2 x 256-bit
Total Memory Bandwidth 384.4 GB/s (192.2 GB/s per GPU)
Texture Filtering Rate (Bilinear) 234.2 GigaTexels/sec
Physical & Thermal  
Fabrication Process 28 nm 
Transistor Count 7.08 Billion
Connectors 3 x Dual-Link DVI, 1 x mini-DisplayPort
Form Factor Dual Slot
Power Connectors 2 x 8-pin
Recommended Power Supply 650 watts
Thermal Design Power (TDP) 300 watts
Thermal Threshold 98° C

Leading up to this launch, NVIDIA had a little fun with a few members of the tech press. A couple of weeks back, the company sent some pry bars out to members of the press with “For Use In Case of Zombies Or...” laser etched into the bars’ surface, adjacent to an NVIDIA claw logo. Well, when our GeForce GTX 690 arrived we found out exactly what those pry bars were for...



As you can see, the GeForce GTX 690 arrived in some rather unique packaging. The card arrived encased in a wooden crate, branded with a cautionary warning that some “weapons grade gaming power” was enclosed within. Needless to say, that pry bar came in mighty handy, because you know we didn’t waste any time ripping that GTX 690 from its box and installing it in our test system.

Before we give you the low-down on the GeForce GTX 690 though, we want to direct your attention to a few other HotHardware articles related to today's launch...

We've already covered the details of the GK104 GPU powering the GeForce GTX 690 in our GeForce GTX 680 launch article, including things like GPU Boost, NVENC, and TXAA, so we won't go over them again here. If you'd like a deeper look into the technologies employed in the GK104, we'd definitely suggest taking a gander at our GTX 680 coverage. The details of NVIDIA's previous-gen dual-GPU flagship are covered in our GeForce GTX 590 article, and our 3D Vision 2 and 3D Vision Surround articles cover those two NVIDIA technologies, which are obviously supported in a card like the GeForce GTX 690.

The GeForce GTX 690

On Saturday night at the GeForce LAN that took place in Shanghai, NVIDIA’s CEO Jen-Hsun Huang unveiled the company's latest dual-GPU powered, flagship graphics card, the GeForce GTX 690.

As you've probably read by now, the GeForce GTX 690 features a pair of fully-functional GK104 “Kepler” GPUs. If you recall, the GK104 is the same chip that powers the excellent GeForce GTX 680, which debuted just last month. On the GeForce GTX 690, each of the GK104 GPUs is paired with its own 2GB of memory (4GB total) via a 256-bit interface, resulting in what is essentially GeForce GTX 680 SLI on a single card.

On GeForce GTX 680 cards, the GK104 GPU has a base clock speed of 1006MHz, with a Boost clock of 1058MHz. The GeForce GTX 690 has a somewhat lower base GPU clock of 915MHz with a boost clock of 1019MHz. The memory clock on the GeForce GTX 690 is unchanged from the GTX 680, however, and runs at an effective 6Gbps. If you're keeping count, dual GK104 GPUs equates to 3,072 CUDA cores (1,536 per GPU), 256 total texture units, and 64 total ROP units. Total memory bandwidth for the card is 384.4 GB/s (192.2 GB/s per GPU) with a peak bilinear texture filtering rate of 234.2 GigaTexels/sec.
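Those headline throughput figures fall straight out of the clocks and unit counts above. Here's a quick back-of-the-envelope sketch of the arithmetic, using only the published specs:

```python
# Deriving the GTX 690's quoted throughput figures from its published specs.

mem_data_rate_mhz = 6008       # effective GDDR5 data rate (MHz)
bus_width_bits = 256           # memory interface width per GPU
base_clock_mhz = 915           # base GPU clock
texture_units_total = 256      # 128 per GPU, two GPUs

# Memory bandwidth per GPU: transfers/sec x bytes per transfer
bw_per_gpu = mem_data_rate_mhz * 1e6 * (bus_width_bits / 8) / 1e9
print(f"Per-GPU memory bandwidth: {bw_per_gpu:.1f} GB/s")   # ~192.3 GB/s

# Peak bilinear texture fill rate: base clock x total texture units
fill_rate = base_clock_mhz * 1e6 * texture_units_total / 1e9
print(f"Texture fill rate: {fill_rate:.1f} GTexels/s")      # ~234.2 GTexels/s
```

Double the per-GPU bandwidth result and you land on the 384.4 GB/s aggregate figure NVIDIA quotes (NVIDIA rounds 192.256 down slightly to 192.2 GB/s per GPU).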

With those specifications, the GeForce GTX 690 should offer about 90% (or more) of the performance of a GeForce GTX 680 SLI setup, give or take a couple of percentage points depending on the application. We'll tell you for sure on the pages ahead.

NVIDIA GeForce GTX 690, Front and Back

The GPUs on the GTX 690 are linked to each other via a 48-lane PCI Express 3.0 switch from PLX, with a full 16 lanes of electrical connectivity between each GPU and the PEG slot. Previous dual-GPU cards from NVIDIA relied on the company’s own NF200 bridge, but that chip lacks support for PCI Express 3.0, so NVIDIA opted for a third-party solution this time around. For those paying attention, AMD has also used PCIe switches from PLX on its dual-GPU cards.

With the kind of horsepower lurking under the GeForce GTX 690’s hood, it obviously has some heavy-duty power requirements. But considering the GK104’s power efficiency and the GTX 690’s somewhat lower clocks, they won’t be as high as those of previous-gen dual-GPU cards. NVIDIA is claiming a 300W TDP for the GeForce GTX 690, and the card has a dual 8-pin power connector requirement. If you're keeping track, that’s nearly 18% lower than the 365W peak of the GeForce GTX 590.
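As a sanity check on those figures, the PCI Express spec allows 75W from the slot and 150W per 8-pin connector, so the card's available power budget comfortably covers its rated TDP. A quick sketch:

```python
# Power budget sanity check for the GTX 690, using the PCIe spec's
# connector ratings: 75 W from the slot, 150 W per 8-pin plug.

slot_w = 75
eight_pin_w = 150
gtx690_tdp_w = 300
gtx590_tdp_w = 365

budget_w = slot_w + 2 * eight_pin_w
headroom_w = budget_w - gtx690_tdp_w
tdp_drop_pct = (gtx590_tdp_w - gtx690_tdp_w) / gtx590_tdp_w * 100

print(f"Available power budget: {budget_w} W")       # 375 W
print(f"Headroom over rated TDP: {headroom_w} W")    # 75 W
print(f"TDP drop vs. GTX 590: {tdp_drop_pct:.1f}%")  # 17.8%
```

That 75W of headroom above the rated TDP is part of what makes the overclocking results later in this article possible.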

NVIDIA GeForce GTX 690 Exposed

The GeForce GTX 690 isn't just about speeds and feeds, though. NVIDIA has also done some innovating on the cooling and aesthetic fronts with the GeForce GTX 690. Some of the new design features include an exterior frame made from trivalent chromium-plated aluminum and a fan housing made from a thixomolded magnesium alloy, which offers excellent heat dissipation and vibration dampening. The GeForce GTX 690 also has a highly efficient 10-phase, heavy-duty power supply with a 10-layer, two-ounce copper printed circuit board.

The cooler on the GeForce GTX 690 consists of a pair of vapor chambers (one for each GPU), each with a nickel-plated fin stack, and a large center-mounted axial fan with optimized fin pitch and air entry angles. The GeForce GTX 690 also uses low-profile components and a flat, ducted baseplate for unobstructed airflow, which reportedly minimizes turbulence and helps quiet down the card.

Just by looking at the GeForce GTX 690, it’s obvious NVIDIA set out to create a powerful graphics card that also happens to look the part. We should also point out that the cooling hardware is designed in such a way that the fan blows air through the shroud, where half is directed towards the front GPU and ultimately expelled from the system, and the other half cools the rear GPU and exhausts within the case.

The GeForce GTX 690's Outputs: DVI x 3, mini-DP x 1

The GeForce GTX 690 should have more than enough muscle to push multiple displays, and as such, the card is outfitted with a trio of DVI outputs and a single mini-DP output, all of which can be powered simultaneously. Of course, the card supports NVIDIA's 3D Vision Surround technology as well.

Test System and Unigine Heaven v2.5

How We Configured Our Test Systems: We tested the graphics cards in this article on an Asus P9X79 Deluxe motherboard powered by a Core i7-3960X six-core processor and 16GB of G.SKILL DDR3-1600 RAM. The first thing we did when configuring the test system was enter the system UEFI and set all values to their "optimized" or "high performance" default settings and disabled any integrated peripherals that wouldn't be put to use. The hard drive was then formatted and Windows 7 Ultimate x64 was installed. When the installation was complete we fully updated the OS and installed the latest DirectX redist, along with the necessary drivers, games, and benchmark tools.

HotHardware's Test System
Intel Core i7 Powered

Hardware Used:
Intel Core i7-3960X
(3.3GHz, Six-Core)
Asus P9X79 Deluxe
(Intel X79 Express)

Radeon HD 7950
Radeon HD 7970
Radeon HD 6970
GeForce GTX 580/OC
GeForce GTX 580 3GB
GeForce GTX 590
GeForce GTX 680
GeForce GTX 690

16GB OCZ DDR3-1600
Western Digital Raptor 150GB
Integrated Audio
Integrated Network

Relevant Software:
Windows 7 Ultimate x64
DirectX April 2011 Redist
ATI Catalyst v12.2b/v12.4
NVIDIA GeForce Drivers v300.99/v301.33

Benchmarks Used:

Unigine Heaven v2.5
3DMark 11
Batman: Arkham City
Just Cause 2
Alien vs. Predator
Metro 2033
Lost Planet 2
Dirt 3

Unigine Heaven v2.5 Benchmark
Pseudo-DirectX 11 Gaming

Unigine Heaven

Unigine's Heaven Benchmark v2.5 is built around the Unigine game engine. Unigine is a cross-platform, real-time 3D engine, with support for DirectX 9, DirectX 10, DirectX 11, and OpenGL. The Heaven benchmark--when run in DX11 mode--also makes comprehensive use of tessellation technology and advanced SSAO (screen-space ambient occlusion). It also features volumetric cumulonimbus clouds generated by a physically accurate algorithm and a dynamic sky with light scattering.

The new GeForce GTX 690 simply crushes everything in the Unigine Heaven benchmark, which it should considering it costs at least twice as much as any other card in the list and sports two of the most powerful GPUs money can currently buy.

And don't worry, we'll have CrossFire and SLI numbers in the actual games tests...you're almost there...

3DMark 11 Performance

Futuremark 3DMark11
Synthetic DirectX Gaming

Futuremark 3DMark11

The latest version of Futuremark's synthetic 3D gaming benchmark, 3DMark11, is specifically bound to Windows Vista and Windows 7-based systems due to its DirectX 11 requirement, which isn't available on previous versions of Windows. 3DMark11 isn't simply a port of 3DMark Vantage to DirectX 11, though. With this latest version of the benchmark, Futuremark has incorporated four new graphics tests, a physics test, and a new combined test. We tested the graphics cards here with 3DMark11's Extreme preset option, which uses a resolution of 1920x1080 with 4x anti-aliasing and 16x anisotropic filtering.

3DMark 11 paints essentially the same picture as Unigine Heaven from the previous page. The GeForce GTX 690 simply decimates everything in 3DMark 11. It's almost funny to see those short, little bars for relatively powerful graphics cards like the GeForce GTX 580 and Radeon HD 7950.
Just Cause 2 Performance

Just Cause 2
DX10.1 Gaming Performance

Just Cause 2

Just Cause 2 was released in March '10 by developer Avalanche Studios and publisher Eidos Interactive. The game makes use of the Avalanche Engine 2.0, an updated version of the similarly named original. It is set on the fictional island of Panau in Southeast Asia, and you play the role of Rico Rodriguez. We benchmarked the graphics cards in this article using one of the built-in demo runs called Desert Sunrise. The test results shown here were run at various resolutions and settings. This game also supports a few CUDA-enabled features, but they were left disabled to keep the playing field level.

The GeForce GTX 690 continues to dominate every other single graphics card currently available in the Just Cause 2 benchmark. From here on out though, we've got SLI and CrossFire scores as well, which will level the playing field considerably, in terms of price and performance.

Versus a pair of GeForce GTX 680 cards running in SLI mode or a pair of Radeon HD 7970 cards running in CrossFire mode, the GeForce GTX 690's performance falls right in between the two. The GeForce GTX 590 quad-SLI setup doesn't scale properly in this game and as a result falls behind a bit.

Lost Planet 2 Performance

Lost Planet 2
DirectX 11 Gaming Performance

Lost Planet 2

A follow-up to Capcom’s Lost Planet: Extreme Condition, Lost Planet 2 is a third-person shooter that takes place on E.D.N. III again, ten years after the storyline of the first title. We ran the game’s DX11 mode, which makes heavy use of DX11 tessellation, displacement mapping, and soft shadows. There are also areas of the game that use DX11 DirectCompute for things like wave simulation in areas with water. This is one game engine that looks significantly different in DX11 mode when you compare certain environmental elements and character rendering against its DX9 mode. We used the Test B option built into the benchmark tool, with all graphics options set to their High Quality values.

No surprise here. The GeForce GTX 690, once again, smokes every other single graphics card in the Lost Planet 2 benchmark.

When put up against a pair of graphics cards, however, the GeForce GTX 690's performance is right in-line with the competition.
Metro 2033 Performance

Metro 2033
DirectX 11 Gaming Performance

Metro 2033

Metro 2033 is your basic post-apocalyptic first-person shooter with a few rather unconventional twists. Unlike most FPS titles, there is no health meter to measure your level of ailment; rather, you’re left to deal with life, or the lack thereof, more akin to the real world, with blood spatter on your visor and your heart rate and respiration level as indicators. The game is loosely based on a novel by Russian author Dmitry Glukhovsky. Metro 2033 boasts some of the best 3D visuals currently on the PC platform, including a DX11 rendering mode that makes use of advanced depth of field effects and character model tessellation for increased realism. This title also supports NVIDIA PhysX technology for impressive in-game physics effects. We tested the game at resolutions of 1920x1200 and 2560x1600 with adaptive anti-aliasing and in-game image quality options set to their High Quality mode, with DOF effects disabled.

Metro 2033 tells the same story we've seen up to this point. Versus any other single graphics card, the GeForce GTX 690 simply can't be touched.

When put up against other dual and quad-GPU configurations, the GeForce GTX 690 still performs very well in the Metro 2033 test. The GTX 690 trails the 680 SLI setup by a couple of frames per second, but is able to outpace the Radeon HD 7970 configuration at 1920x1200. With the resolution cranked up to 2560x1600, however, the Radeons pull ahead.
Batman: Arkham City Performance

Batman: Arkham City
DirectX Gaming Performance

Batman: Arkham City

Batman: Arkham City is a sequel to 2009’s Game of the Year winning Batman: Arkham Asylum. This recently released sequel, however, lives up to and even surpasses the original. The story takes place 18 months after the original game. Quincy Sharp, the onetime administrator of Arkham Asylum, has become mayor and convinced Gotham to create "Arkham City" by walling off the worst, most crime-ridden areas of the city and turning the area into a giant open-air prison. The game has DirectX 9 and 11 rendering paths, with support for tessellation, multi-view soft shadows, and ambient occlusion. We tested in DX11 mode with all in-game graphical options set to their maximum values, at various resolutions.

The GeForce GTX 690 rocked with Batman Arkham City. As you should expect at this point, the GTX 690 outperforms every other single card by a wide margin here.

The GeForce GTX 690's performance falls right where you'd expect it to when put up against the dual-card configurations--trailing the GeForce GTX 680 SLI setup by a slight margin. AMD still can't get this game to work properly with CrossFire, hence the Radeons' abysmal performance here.
Dirt 3 Performance

Dirt 3
DirectX 11 Gaming Performance

Dirt 3

Dirt 3 is the latest in a string of great racing games from Codemasters. Like its predecessor, 2009's Dirt 2, this game sports impressive visuals with DX11 support. “Ultra” settings for shadow effects, tessellation, and post-processing elements, like depth of field, then become available to the gamer, and in turn crank up the workload on the graphics subsystem. The game engine also makes use of multi-core processors for higher performance on top-end systems. We tested the game configured with its Ultra graphics options and 4X anti-aliasing at resolutions of 1920x1200 and 2560x1600.

It's more of the same in the Dirt 3 benchmark. The GeForce GTX 690 outpaces every other single card by a wide margin. Big surprise, huh?

Versus the various SLI and CrossFire configurations, the GeForce GTX 690's performance seems more tame--although keep in mind, this is one 690 vs. two competing cards. Here, the GeForce GTX 690 falls right in behind the GeForce GTX 680 SLI setup.
Alien vs. Predator Performance

Alien vs. Predator
DirectX 11 Gaming Performance

Alien vs. Predator

The Alien vs. Predator benchmark makes use of the advanced Tessellation, screen space ambient occlusion and high-quality shadow features, available with DirectX 11. In addition to enabling all of the aforementioned DirectX 11 related features offered by this benchmark, we also switched on 4X anti-aliasing along with 16X anisotropic filtering to more heavily tax the graphics cards being tested.

The GeForce GTX 690 put a hurtin' on all of the other single cards in the Alien vs. Predator benchmark, nearly doubling the performance of the GeForce GTX 680.

Versus the similarly priced CrossFire and SLI configurations, the GeForce GTX 690's performance is less dominating. NVIDIA's new flagship trailed the Radeon HD 7970 and 7950 CrossFire setups here, and obviously falls in right behind the slightly higher clocked GeForce GTX 680 SLI setup.
Surround & Eyefinity Tests
Looking back through the numbers, it's clear that even a 30" LCD monitor with a resolution of 2560x1600 doesn't pack enough pixels to make the GeForce GTX 690 break a sweat, so we also did some testing with a triple-monitor setup to evaluate NVIDIA Surround and AMD Eyefinity performance with a trio of similarly priced configurations.

What we have below are tests with Metro 2033, Dirt 3, and Batman Arkham City at high-quality settings, at a resolution of 5760x1080 (the aggregate resolution of three 22" 1920x1080 displays, in a 3x1 landscape configuration).
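For a sense of scale, here's the pixel math behind that aggregate resolution (a rough sketch that ignores bezel compensation, which adds some extra horizontal pixels in practice):

```python
# Pixel workload: 3x1 Surround array of 1080p panels vs. a single
# 30" 2560x1600 display. (Bezel compensation is ignored here.)

surround_px = (3 * 1920) * 1080   # 5760x1080 aggregate
single_30in_px = 2560 * 1600

print(surround_px)                # 6220800
print(single_30in_px)             # 4096000
print(f"{surround_px / single_30in_px:.2f}x more pixels to render")  # ~1.52x
```

In other words, the triple-monitor array asks the GPUs to render roughly half again as many pixels per frame as the 30" panel does.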
Eyefinity vs. Surround View
AMD and NVIDIA Multi-Monitor Testing

The Radeon HD 7970 CrossFire configuration put up the best score in Metro 2033 when running across three monitors, but the GeForce GTX 690 and GeForce GTX 680 SLI take the top two positions in Dirt 3 and Batman.
Overclocking the GeForce GTX 690

NVIDIA claims that the GeForce GTX 690 has plenty of headroom for overclocking thanks to its beefy 10-phase power delivery and somewhat conservative stock frequencies, so we fired up EVGA’s excellent Precision performance tuning utility, which already supports the GeForce GTX 690, to see just how much additional performance we could wring from the card. We also installed a slick GeForce GTX 690 skin that matches the color scheme of the card.

During some conversations we had with a few representatives from NVIDIA in the past, we were told that most GK104 GPUs would likely be able to hit frequencies around 1.2GHz with stock cooling. Our testing proved that to be true.

EVGA's Precision Performance Tuning Utility With GTX 690 Skin

By cranking up the power target by 35% and increasing the GPU Clock Offset by 150MHz, our GeForce GTX 690 would consistently boost to about 1.215GHz with perfect stability and zero visual artifacts. We were also able to crank the memory clock up by another 113MHz for an additional performance boost.
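The quick math on that overclock, assuming Precision's GPU clock offset shifts the base clock directly (how individual boost bins move can vary from card to card):

```python
# Rough overclocking math for our GTX 690 sample. Assumes the +150 MHz
# offset in EVGA Precision applies directly to the 915 MHz base clock.

base_mhz = 915
gpu_offset_mhz = 150
stock_boost_mhz = 1019
observed_boost_mhz = 1215   # what our card sustained under load

print(base_mhz + gpu_offset_mhz)   # 1065 MHz effective base clock
boost_gain_pct = (observed_boost_mhz - stock_boost_mhz) / stock_boost_mhz * 100
print(f"{boost_gain_pct:.1f}% over the stock boost clock")  # ~19.2%
```

A nearly 20% bump in sustained clock speed on stock cooling is substantial for a dual-GPU card, which squares with NVIDIA's claims about the conservative factory frequencies.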

Overclocking the GeForce GTX 690
Putting The Pedal to the Metal

While we had the card overclocked, we re-ran a couple of high resolution tests, and when all was said and done we were able to increase performance considerably. Metro 2033 shows a 7.2% improvement in framerate while overclocked, and performance in Alien vs. Predator--which is mostly bound by memory bandwidth--increased by 7.6%.

Power Consumption, Noise, Temps

Before bringing this article to a close, we'd like to cover a few final data points--namely power consumption, temperatures and noise. Throughout all of our benchmarking and testing, we monitored acoustics and tracked how much power our test system was consuming using a power meter. Our goal was to give you an idea as to how much power each configuration used while idling and while under a heavy workload. Please keep in mind that we were testing total system power consumption at the outlet here, not just the power drawn by the graphics cards alone.

Total System Power Consumption
Tested at the Outlet

With essentially double the resources of a GeForce GTX 680, it should come as no surprise that the GeForce GTX 690 consumes more power than its single-GPU counterpart. But considering the GeForce GTX 690's very strong performance relative to NVIDIA's previous dual-GPU flagship, the GeForce GTX 590, the GTX 690's power consumption characteristics seem downright impressive. The GeForce GTX 690 consumed 39 fewer watts than the GTX 590 while idling and almost 20 fewer watts under load, despite offering significantly better performance. That's a testament to the power efficiency of the GK104 GPU.

All of the work NVIDIA put into designing the cooling hardware for the GeForce GTX 690 also seems to have paid off. Throughout testing, we found the GeForce GTX 690 to be very quiet, especially considering how powerful the card is. While idling, the GTX 690 is essentially silent and can't be heard over a typical CPU or PSU cooling fan. And under load, while the cooling fan does spin up and become audible, it emits a very low tone that's not distracting at all. After hours of benchmarks, we monitored GPU temps on the card with a constant loop of Unigine Heaven running and found that the GPU temp peaked at around 80°C (idle temps were around 38°C). Even with that constant load placed on the GTX 690, the fan on the card maxed out at only 55% of full speed.

Our Summary and Conclusion

Performance Summary:  To put it simply, the GeForce GTX 690 is the fastest single graphics card we have tested to date--by far. In every game or benchmark, regardless of resolution or image quality settings, the GeForce GTX 690 was able to outpace every other graphics card we've ever tested. Of course, with a $1000 price tag and two of the most powerful GPUs on-board, the GeForce GTX 690 should best every other graphics card currently available.

When compared to other multi-GPU configurations, the GeForce GTX 690’s performance is still impressive, but not nearly as dominant. The GeForce GTX 690 ends up coming in just a hair behind the similarly priced dual GeForce GTX 680 SLI configuration across the board, due to the GTX 690’s slightly lower base and boost GPU frequencies. Versus a Radeon HD 7970 CrossFire setup, the GeForce GTX 690 ends up being somewhat faster more often than not, but the Radeons pulled ahead in a few tests.

The NVIDIA GeForce GTX 690 Reference Card

Although AMD has yet to show its hand at the ultra-high end with its dual-GPU card, the Radeon HD 7990, based on what we know so far, it would seem NVIDIA is in a strong position at this point. As we've shown, the GeForce GTX 690 is about as fast as, or faster than, a pair of Radeon HD 7970 cards running in CrossFire mode. That means AMD would have to put together a dual-GPU card clocked higher than the current Radeon HD 7970 to best the GeForce GTX 690. That’s not likely to happen, in our opinion, as AMD's Tahiti GPUs have more demanding power requirements than the GK104. The Radeon’s wider memory bus (384-bit vs. 256-bit) and larger frame buffer (3GB per GPU vs. 2GB per GPU), however, may give it an edge at some very high or multi-monitor resolutions in future games. Hopefully we'll know more in the weeks ahead.

According to NVIDIA, the GeForce GTX 690 will be available in limited quantities from add-in card partners, including ASUS, EVGA, Gainward, Galaxy, Gigabyte, Inno3D, MSI, Palit and Zotac starting today, with wider availability by May 7. Considering the scarcity of the GeForce GTX 680 though, don't expect a ton of these cards to be floating around. Expected pricing is $999--roughly on par with a pair of GeForce GTX 680 cards. At that price, the GeForce GTX 690 is obviously not for everyone. Heck, even if you've got the coin to afford a graphics card like the GeForce GTX 690, it's tough to justify when a pair of GeForce GTX 680s is faster and costs the same. But the high-end graphics card market is never about justified costs or value, it's about performance, and the GeForce GTX 690 offers that up in spades.

And if you ask us, the card looks just freakin' cool as hell too. Kudos to the team at NVIDIA for breaking the mold and using some new high quality materials and an interesting design to further set the GeForce GTX 690 apart.


Pros:
  • Killer Performance
  • Surprisingly Quiet
  • 3D Surround From a Single Card
  • Awesome Aesthetics

Cons:
  • Very Expensive
  • Limited Availability

Content Property of HotHardware.com