NVIDIA GeForce GTX 680 Review: Kepler Debuts

We’ve been hearing about NVIDIA’s Kepler architecture since September 2010. It was at that year’s NVIDIA GPU Technology Conference that company CEO Jen-Hsun Huang first publicly disclosed that Kepler would offer massive performance-per-watt improvements over Fermi and previous-generation architectures, and that GPUs based on Kepler would arrive in 2011. Well, the launch date has obviously slipped. As for how Kepler’s power efficiency looks, we’ll get to that a little later.

The rumor mill kicked into high gear over the last few months, consistently churning out more and more Kepler scuttlebutt (whether true or false) coincident with Radeon HD 7000 series launches. Today, though, we can put the rumors to rest. We’ve had a new Kepler-based GeForce GTX 680 in hand for a couple of weeks now and can finally reveal all of the juicy details.

First up, we have some specifications and a little back story. On the pages ahead, we’ll dive a little deeper and give you the full scoop on Kepler, its new features and capabilities, and of course the GeForce GTX 680’s features and performance.


The NVIDIA GeForce GTX 680, GK104 "Kepler" Based Graphics Card

NVIDIA GeForce GTX 680
Specifications & Features


The GeForce GTX 680’s main features and specifications are listed in the table above. Before we get into the specifics of the card and its GPU, however, we want to direct your attention to a few past HotHardware articles that lay the foundation for what we’ll be showing you here.


GK104 "Kepler" GPU Die Shot

Although the GeForce GTX 680 is built around a new GPU based on a new architecture, the Kepler-based GK104 at the heart of the card leverages technologies first introduced on previous-generation NVIDIA products. As such, we’d recommend checking out these articles for more detailed coverage of the existing NVIDIA technologies that carry over to the new GeForce GTX 680:

In our Fermi and GF100 architecture previews, we discuss Fermi’s architecture and detail its CUDA cores and PolyMorph and Raster engines, among other features. In our GeForce GTX 480 coverage, we dig a little deeper into Fermi and discuss the first graphics card based on the technology. Our GeForce GTX 580 coverage details the GF110, the more refined re-spin of the GF100 GPU. And in our 3D Vision Surround and 3D Vision 2 articles, we cover NVIDIA’s multi-monitor and stereoscopic 3D technologies, both of which are very much a part of the GeForce GTX 680.
 


Comments
inspector 2 years ago

I was looking to do an upgrade to this, but the price and the fact that I need a mobo upgrade (actually a whole system) as well pulled me away. I would have done it if I wasn't heading to college next year and getting a laptop instead.

LBowen 2 years ago

Great performance from this card. I have been looking forward to the benchmarks and I am not surprised. Metro 2033 is going to be a tough cookie for some time, but everything looks really solid. Great review, Marco :)

marco c 2 years ago

Thanks, man. Killer card for sure.

rapid1 2 years ago

Yeah, the Nvidia prices, especially on debut, are very high. I was a member at a website last year that did these great giveaways pretty often. I was hoping to see them do more giveaway contests, but I guess they quit, since I haven't seen any more since last year. The point is, I was looking forward to trying to win a system (I could really use one for work and home) with one of these, since I can't really afford it right now. Oh well, maybe some other great site will do a contest like that. I sure hope so!

rapid1 2 years ago

Off to peruse this review :)

Drake_McNasty 2 years ago

Great review, Marco, this card is a beast. I have to agree with @LBowen, the benchmarks aren't that surprising and neither is the price. You have to pay to play, and Nvidia knows it! I'm happy to see the power consumption and temp results, they are heading in the right direction! No cooking eggs on this card! Looking forward to seeing it in full production and OC'd to the max.

Just out of curiosity, I would love to see how this card performs with video and 3D rendering, and yes, I understand it's a gaming card, not a workstation card.

OMEGADraco 2 years ago

Thanks for that hella nice review. Big Green just gave AMD a slap in the face with lower prices and more performance. I can't wait to get my hands on one of these beautiful cards. It will be really interesting to see what the manufacturers come out with for non-reference cards.

dejasoul100 2 years ago

Yikes...great job NVIDIA. AMD could afford to launch their cards at the exorbitant $599 precisely because NVIDIA hadn't struck yet. Now that Kepler's in town, watch those prices drop.

Gee, AMD can't catch a break, can they? Well it was a fun 3 months or so...

CDeeter 2 years ago

Looks like the Green machine is back on top, huh? Well, I guess they deserve a turn, since it's been a while.

Let the price wars commence!

rapid1 2 years ago

This looks like a nice piece of equipment for sure. The only thing that seemed off to me is that the 6990 from AMD and the 560 Ti from Nvidia both seemed closer than I would like, considering the amount I'd have to spend on a new GPU that's just hit the market.

realneil 2 years ago

Great review. I've read a bunch of them & so far yours is tops.

I'm surprised that the price is at $500.00 for most of them.

As to the performance, it's stellar. I like the advanced new technology that these cards are sporting and I want some of them. (now to figure out how)

marco c 2 years ago

Thanks for all of the compliments, everyone. I really appreciate it. As always, I could have used more time to put some more polish on the piece and provide more detail, but too much tech isn't great either. :)

rrplay 2 years ago

Sure looks like a winner, and it was sweet to see such performance and efficiency from this card. Really appreciate the review, and respect that the fan was not audible and the temps were as low as they were.

Nice to be pleasantly surprised, and the release price seems reasonable too. Definitely looks like the GPU battles may really heat up, which can be a good thing.

*** I may be in the same boat as neil >>> by the time I figure out how to get the extra ca$h, and/or find more time to play some games that would really make a high-end card like this shine. >but then again 2 560 Ti's ain't too shabby either<

Thanks again for the review, Marco! :)

sackyhack 2 years ago

Would this thing be bottlenecked by a Q6600? I know it depends on a lot of factors, but I was told my 470 would be, and yet I still noticed a huge improvement across the board when I upgraded from my 9800. Do you think it'd be worth it to try to squeeze a little more life out of my current build with a Kepler?

 

Oh and I'd really like to see how this baby handle's Witcher 2's Ubersampling option.

AKwyn 2 years ago

@sackyhack Obviously; it doesn't matter how much you overclock it, it will still hit a barrier because the CPU isn't powerful enough to handle the GPU. The same goes for the 580.

If you have one of those 45 or 32nm CPUs, then it should be enough to power the 680. The 45nm may require an overclock, but the 32nm can do it easily without one.

Anyways, great review; while I am somewhat disappointed you didn't go truly in-depth and the results are somewhat expected, it does show that NVIDIA is back on top with this card. Nicely done. Now if only they can come out with non-reference cards with the traditional 6-pin/8-pin config and the cooling to overclock these cards even further.

OMEGADraco 2 years ago

I think we will see the custom cooling solution cards with the 6-pin/8-pin config in less than a month's time, Taylor. I am sort of waiting until I see what comes down the line in that respect before I spend my cash, considering I have until May before I can really have fun with the equipment anyways.

der meister 2 years ago

Wow, I want one of these bad boys!

BJew 2 years ago

You guys never tested 3D Vision performance with a single or SLI configuration. I wanted to see 3D Vision benchmarks for Battlefield 3 so I know whether or not to upgrade from my two GTX 580's. I only get 40 FPS in 3D Vision.

AKwyn 2 years ago

[quote user="BJew"]

You guys never tested 3D Vision performance with a single or SLI configuration. I wanted to see 3D Vision benchmarks for Battlefield 3 so I know whether or not to upgrade from my two GTX 580's. I only get 40 FPS in 3D Vision.

[/quote]

I doubt many people would be interested in 3D Vision; hence the lack of it in many reviews. I imagine it would be similar to running an extremely high resolution, in that you'd lose 20 FPS, or a way heftier 50 FPS. If I had to guess, the similar results from the GTX 680 and the two 580's mean that you'll get mostly the same result in Battlefield 3, maybe 5 more FPS. If you do it in SLI, then the results will be more beneficial, maybe 15-20 FPS. Again, just a guess, but you're not going to get much of a performance increase in 3D Vision. Still, one 680 would be better than those two 580's that you have, and if you overclock it, then maybe you could potentially increase 3D Vision performance by 3-8 FPS.

Marco C 2 years ago

Whether in single or dual-card configurations, 3D Vision performance is going to be limited by VSync on your 3D monitor.
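
As a quick back-of-the-envelope sketch (assuming a standard 120Hz 3D Vision display, since the panel's refresh rate isn't stated in this thread): with VSync enabled and frames alternating between eyes, the per-eye ceiling works out to

    f_max (per eye) = refresh rate / 2 = 120 Hz / 2 = 60 FPS

So a reported 40 FPS sits below that cap and is GPU-limited, while a sufficiently fast setup would simply pin at 60 FPS per eye.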

KMya 2 years ago

This card is a lot different compared to the previous generation.
