NVIDIA GeForce GTX 680 Review: Kepler Debuts

This post has 22 Replies | 2 Followers

Top 10 Contributor
Posts 26,323
Points 1,190,730
Joined: Sep 2007
ForumsAdministrator
News Posted: Thu, Mar 22 2012 9:15 AM

NVIDIA GeForce GTX 680 Review: Kepler Debuts

We’ve been hearing about NVIDIA’s Kepler architecture since about September 2010. It was at that year’s NVIDIA GPU Technology Conference that company CEO Jen-Hsun Huang first publicly disclosed that Kepler would offer massive performance-per-watt improvements over Fermi and previous-gen architectures, and that GPUs based on Kepler would arrive in 2011. Well, the launch date has obviously slipped. As for how Kepler’s power efficiency actually looks, we’ll get to that a little later.

The rumor mill kicked into high gear over the last few months, and has been consistently churning out more and more Kepler scuttlebutt (whether true or false) coincident with Radeon HD 7000 series launches. Today though, we can put the rumors to rest. We’ve had a new Kepler-based GeForce GTX 680 in hand for a couple of weeks now and can finally reveal all of the juicy details...


  • | Post Points: 275
Top 25 Contributor
Posts 3,795
Points 40,670
Joined: Jan 2010
Location: New York
Inspector replied on Thu, Mar 22 2012 11:08 AM

I was looking to upgrade to this, but the price, plus the fact that I'd need a mobo upgrade as well (actually a whole new system), pulled me away. I would have done it if I weren't heading to college next year, where I'm going to get a laptop instead.

  • | Post Points: 5
Top 150 Contributor
Posts 754
Points 8,520
Joined: Mar 2011
Location: Phoenix
LBowen replied on Thu, Mar 22 2012 11:18 AM

Great performance from this card. I have been looking forward to the benchmarks, and I am not surprised. Metro 2033 is going to be a tough cookie for some time, but everything looks really solid. Great review, Marco :)

"I have the power!!"

  • | Post Points: 5
Top 50 Contributor
Posts 2,361
Points 48,680
Joined: Apr 2000
Location: United States, Connecticut
ForumsAdministrator
MembershipAdministrator
Marco C replied on Thu, Mar 22 2012 11:18 AM

Thanks, man. Killer card for sure.

Marco Chiappetta
Managing Editor @ HotHardware.com

Follow Marco on Twitter

  • | Post Points: 5
Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Thu, Mar 22 2012 11:22 AM

Yeah, NVIDIA's prices, especially at debut, are very high. I was a member at a website last year that did great giveaways pretty often. I was hoping to see them do more giveaway contests, but I guess they quit, since I haven't seen any since last year. The point is, I was looking forward to trying to win a system with one of these in it (I could really use one for work and home), since I can't really afford it right now. Oh well, maybe some other great site will do a contest like that. I sure hope so!

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:***ingston 16384MB RAM
  • | Post Points: 5
Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Thu, Mar 22 2012 11:23 AM

Off to peruse this review :)

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:***ingston 16384MB RAM
  • | Post Points: 5
Top 100 Contributor
Posts 843
Points 9,155
Joined: Feb 2011
Location: San Diego

Great review, Marco, this card is a beast. I have to agree with @LBowen: the benchmarks aren't that surprising, and neither is the price. You have to pay to play, and NVIDIA knows it! I'm happy to see the power consumption and temp results; they're heading in the right direction. No cooking eggs on this card! Looking forward to seeing it in full production and OC'd to the max.

Just out of curiosity, I would love to see how this card performs with video and 3D rendering. And yes, I understand it's a gaming card, not a workstation card.

Now you're just mashing it!

  • | Post Points: 5
Top 50 Contributor
Posts 2,865
Points 29,645
Joined: Mar 2011
Location: United States, Connecticut
omegadraco replied on Thu, Mar 22 2012 12:13 PM

Thanks for that hella nice review. Big Green just gave AMD a slap in the face with lower prices and more performance. I can't wait to get my hands on one of these beautiful cards. It will be really interesting to see what the manufacturers come out with for non-reference cards.

  • | Post Points: 5
Top 150 Contributor
Posts 758
Points 10,695
Joined: Sep 2009
eunoia replied on Thu, Mar 22 2012 1:39 PM

.

...pending.

  • | Post Points: 5
Top 500 Contributor
Posts 123
Points 910
Joined: Oct 2011
Location: Canada

Yikes...great job NVIDIA. AMD could afford to launch their cards at the exorbitant $599 precisely because NVIDIA hadn't struck yet. Now that Kepler's in town, watch those prices drop.

Gee, AMD can't catch a break, can they? Well it was a fun 3 months or so...

  • | Post Points: 5
Top 100 Contributor
Posts 1,103
Points 11,175
Joined: Jun 2010
Location: Pennsylvania
CDeeter replied on Thu, Mar 22 2012 3:19 PM

Looks like the Green machine is back on top, huh? Well, I guess they deserve a turn, since it's been a while.

Let the price wars commence!

  • | Post Points: 5
Top 10 Contributor
Posts 4,838
Points 45,830
Joined: Feb 2008
Location: Kennesaw
rapid1 replied on Thu, Mar 22 2012 4:05 PM

This looks like a nice piece of equipment for sure. The only thing that seemed off to me is that the 6990 from AMD and the 560 Ti from NVIDIA both came closer than I would like, considering the amount I'd have to spend on a new GPU that's just hit the market.

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770 ***
GPU:Geforce GTX 770 4GB
Mem:***ingston 16384MB RAM
  • | Post Points: 5
Top 10 Contributor
Posts 8,681
Points 104,280
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
MembershipAdministrator
Moderator
realneil replied on Thu, Mar 22 2012 8:09 PM

Great review. I've read a bunch of them, and so far yours is tops.

I'm surprised that the price is at $500.00 for most of them.

As to the performance, it's stellar. I like the advanced new technology that these cards are sporting and I want some of them. (now to figure out how)

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

  • | Post Points: 5
Top 50 Contributor
Posts 2,361
Points 48,680
Joined: Apr 2000
Location: United States, Connecticut
ForumsAdministrator
MembershipAdministrator
Marco C replied on Thu, Mar 22 2012 8:35 PM

Thanks for all of the compliments, everyone. I really appreciate it. As always, I could have used more time to put some more polish on the piece and provide more detail, but too much tech isn't great either. :)

Marco Chiappetta
Managing Editor @ HotHardware.com

Follow Marco on Twitter

  • | Post Points: 5
Top 50 Contributor
Posts 2,383
Points 31,050
Joined: Nov 2010
Location: Crystal Lake,IL
rrplay replied on Thu, Mar 22 2012 8:54 PM

Sure looks like a winner, and it was sweet to see such performance and efficiency from this card. I really appreciate the review, and respect that the fan was not audible and the temps were as low as they were.

Nice to be pleasantly surprised, and the release price seems reasonable too. It definitely looks like the GPU battles may really heat up, which can be a good thing.

*** I may be in the same boat as neil >>> by the time I figure out how to get the extra ca$h, and/or find more time to play some games that would really let a high-end card like this shine. >but then again 2 560 Ti's ain't too shabby either<

Thanks again for the review, Marco! :)

"Don't Panic ! 'cause HH got's your back!"

  • | Post Points: 5
Top 200 Contributor
Posts 450
Points 5,345
Joined: Jul 2010
Location: Cincinnati
sackyhack replied on Thu, Mar 22 2012 10:29 PM

Would this thing be bottlenecked by a Q6600? I know it depends on a lot of factors, but I was told my 470 would be, and yet I still noticed a huge improvement across the board when I upgraded from my 9800. Do you think it'd be worth it to try to squeeze a little more life out of my current build with a Kepler?

 

Oh, and I'd really like to see how this baby handles Witcher 2's Ubersampling option.

  • | Post Points: 20
Top 10 Contributor
Posts 6,374
Points 80,315
Joined: Nov 2004
Location: United States, Arizona
Moderator

Wow, I want one of these bad boys!

"Never trust a computer you can't throw out a window."

2700K

Z77 GIGABYTE G1.SNIPER

GIGABYTE GTX670

G.Skill Ripjaws X 16gb PC2133

Antec P280

Corsair H100

Asus Blu-ray burner

Seasonic X650 PSU

Patriot Pyro 128gb SSD

  • | Post Points: 5
Top 50 Contributor
Posts 3,236
Points 37,910
Joined: Mar 2010
AKwyn replied on Fri, Mar 23 2012 12:49 AM

@sackyhack Obviously; it doesn't matter how much you overclock it, it will still hit a barrier because the CPU isn't powerful enough to keep up with the GPU. The same goes for the 580.

If you have one of those 45nm or 32nm CPUs, it should be enough to drive the 680. The 45nm part may require an overclock, but the 32nm can do it easily without one.

Anyway, great review. While I'm somewhat disappointed you didn't go truly in-depth, and the results are somewhat expected, it does show that NVIDIA is back on top with this card; nicely done. Now if only they'd come out with non-reference cards with the traditional 6-pin/8-pin config and cooling that would let you overclock these cards even further.

 

"The future starts with you; now start posting more!"

  • | Post Points: 20
Top 50 Contributor
Posts 2,865
Points 29,645
Joined: Mar 2011
Location: United States, Connecticut
omegadraco replied on Fri, Mar 23 2012 11:15 AM

I think we will see the custom-cooling cards with the 6-pin/8-pin config in less than a month's time, Taylor. I'm sort of waiting to see what comes down the line in that respect before I spend my cash, considering I have until May before I can really have fun with the equipment anyway.

  • | Post Points: 5
Not Ranked
Posts 13
Points 125
Joined: Sep 2011
BJew replied on Tue, Mar 27 2012 11:00 PM

You guys never tested 3D Vision performance with a single-card or SLI configuration. I wanted to see 3D Vision benchmarks for Battlefield 3 so I'd know whether or not to upgrade from my two GTX 580s. I only get 40 FPS in 3D Vision.

  • | Post Points: 35
Not Ranked
Posts 8
Points 55
Joined: Mar 2012
KMya replied on Thu, Mar 29 2012 1:33 AM

This card is quite different from the previous generation.

  • | Post Points: 5
Top 50 Contributor
Posts 3,236
Points 37,910
Joined: Mar 2010
AKwyn replied on Fri, Mar 30 2012 12:16 PM

BJew:

You guys never tested 3D Vision performance with a single or SLI configuration. I wanted to see 3D Vision benchmarks for Battlefield 3 so I know whether or not to upgrade from my two GTX 580's. I only get 40 FPS in 3D Vision.

I doubt many people are interested in 3D Vision; hence its absence from many reviews. I imagine it's similar to running an extremely high resolution, in that you'd lose 20 FPS, or a way heftier 50 FPS. If I had to guess, the similar results from the GTX 680 and the two 580s mean you'll get mostly the same result in Battlefield 3, maybe 5 more FPS. If you run it in SLI, the results will be more beneficial, maybe 15-20 FPS. Again, just a guess, but you're not going to get much of a performance increase in 3D Vision. Still, one 680 would be better than those two 580s you have, and if you overclock it, then maybe you could potentially gain another 3-8 FPS in 3D Vision.

 

"The future starts with you; now start posting more!"

  • | Post Points: 5
Top 50 Contributor
Posts 2,361
Points 48,680
Joined: Apr 2000
Location: United States, Connecticut
ForumsAdministrator
MembershipAdministrator
Marco C replied on Wed, Apr 4 2012 12:09 AM

Whether in single- or dual-card configurations, 3D Vision performance is going to be limited by VSync on your 3D monitor.
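To put some rough numbers on that (my own illustration, assuming a typical 120 Hz active-shutter 3D Vision panel, which is not a figure from the review): with VSync on, the display alternates left-eye and right-eye frames, so each eye can never see more than half the refresh rate, no matter how much GPU headroom you have.

```python
def per_eye_fps_cap(refresh_hz: float) -> float:
    """Max per-eye frame rate under VSync on an active-shutter 3D display.

    Frames alternate between the left and right eye, so each eye
    receives at most half the panel's refresh rate.
    """
    return refresh_hz / 2

# A 120 Hz 3D panel caps each eye at 60 FPS with VSync enabled,
# which is why a faster card stops showing gains past that point.
print(per_eye_fps_cap(120))
```

So once a card can sustain the per-eye cap in a given game, extra horsepower goes unused unless VSync is disabled or the refresh rate goes up.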

Marco Chiappetta
Managing Editor @ HotHardware.com

Follow Marco on Twitter

  • | Post Points: 5