NVIDIA Unveils Dual-GPU Powered GeForce GTX 690


News Posted: Sun, Apr 29 2012 12:14 AM

Today at the GeForce LAN taking place in Shanghai, NVIDIA’s CEO Jen-Hsun Huang unveiled the company’s upcoming dual-GPU powered flagship graphics card, the GeForce GTX 690.

The GeForce GTX 690 will feature a pair of fully-functional GK104 “Kepler” GPUs. If you recall, the GK104 is the chip powering the excellent GeForce GTX 680, which debuted just last month. On the upcoming GeForce GTX 690, each GK104 GPU will be paired with its own 2GB of memory (4GB total) via a 256-bit interface, resulting in what is essentially GeForce GTX 680 SLI on a single card.

On GeForce GTX 680 cards, the GK104 GPU has a base clock speed of 1006MHz, with a Boost clock of 1058MHz. The GeForce GTX 690 will have a somewhat lower base GPU clock of 915MHz with a Boost clock of 1019MHz. The memory clock on the GeForce GTX 690 will reportedly remain unchanged from the GTX 680's and run at an effective 6Gbps. With those specifications, the GeForce GTX 690 will likely offer about 90% of the performance of a GeForce GTX 680 SLI setup, give or take a couple of percentage points depending on the application.
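For the curious, here's a quick back-of-envelope sketch in Python (our own arithmetic, not NVIDIA's numbers) showing where that rough 90% estimate comes from, using only the clocks quoted above:

```python
# Back-of-envelope sketch: the GTX 690's per-GPU clocks vs. a stock GTX 680,
# using the figures quoted in the article above.

gtx680 = {"base_mhz": 1006, "boost_mhz": 1058}
gtx690 = {"base_mhz": 915, "boost_mhz": 1019}

base_ratio = gtx690["base_mhz"] / gtx680["base_mhz"]     # ~0.91
boost_ratio = gtx690["boost_mhz"] / gtx680["boost_mhz"]  # ~0.96

# Memory is untouched: a 6Gbps effective data rate on a 256-bit bus.
mem_bandwidth_gb_s = 6 * 256 / 8  # 192 GB/s per GPU, same as a GTX 680

print(f"Base clock ratio:  {base_ratio:.1%}")   # 91.0%
print(f"Boost clock ratio: {boost_ratio:.1%}")  # 96.3%
print(f"Memory bandwidth:  {mem_bandwidth_gb_s:.0f} GB/s per GPU")
```

Since the Boost clock deficit is only about 4% and memory bandwidth is identical, the 90%-of-SLI figure looks like a reasonable ballpark.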


NVIDIA GeForce GTX 690, Front and Back

The GPUs on the GTX 690 will be linked to each other via a PCI Express 3.0 switch from PLX, with a full 16 lanes of electrical connectivity between each GPU and the PEG slot. Previous dual-GPU powered cards from NVIDIA relied on the company’s own NF200 switch, but that chip lacks support for PCI Express 3.0, so NVIDIA opted for a third-party solution this time around. For those keeping track, AMD has used PCIe switches from PLX on its dual-GPU powered cards as well.
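To put the interconnect upgrade in perspective, here's a rough sketch (our math, based on the published PCI Express transfer rates and line encodings) of the usable bandwidth of a x16 link under gen 2, the NF200's limit, versus gen 3:

```python
# Rough sketch of what the PCIe 3.0 switch buys: usable bandwidth per
# direction on a x16 link, gen 2 (the NF200's limit) vs. gen 3 (the PLX part).

def x16_bandwidth_gb_s(transfer_rate_gt_s, encoding_efficiency, lanes=16):
    """Usable GB/s per direction after line-code overhead."""
    return transfer_rate_gt_s * encoding_efficiency * lanes / 8

gen2 = x16_bandwidth_gb_s(5.0, 8 / 10)     # 8b/10b encoding    -> 8.0 GB/s
gen3 = x16_bandwidth_gb_s(8.0, 128 / 130)  # 128b/130b encoding -> ~15.75 GB/s

print(f"PCIe 2.0 x16: {gen2:.2f} GB/s per direction")
print(f"PCIe 3.0 x16: {gen3:.2f} GB/s per direction")
```

In other words, the PLX switch roughly doubles the bandwidth available between each GPU and the rest of the system compared to the older NF200 arrangement.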

With the kind of horsepower likely lurking under the GeForce GTX 690’s hood, it will obviously have some heavy-duty power requirements. But considering the GK104’s power efficiency and the GTX 690’s somewhat lower clocks, they won’t be as high as those of previous-gen dual-GPU cards. NVIDIA is claiming a 300W TDP for the GeForce GTX 690, with a dual 8-pin power connector requirement. That’s roughly 18% lower than the 365W TDP of the GeForce GTX 590.
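A quick sketch of that power math (our arithmetic, using the TDP figures above plus the standard 75W PEG slot and 150W-per-8-pin allowances from the PCIe spec):

```python
# Sketch of the power math: the GTX 690's claimed 300W TDP vs. the GTX 590's
# 365W, plus the headroom its dual 8-pin connectors leave by spec.

gtx690_tdp_w = 300
gtx590_tdp_w = 365

reduction = (gtx590_tdp_w - gtx690_tdp_w) / gtx590_tdp_w
print(f"TDP reduction vs. GTX 590: {reduction:.1%}")  # ~17.8%

# By spec: 75W from the PEG slot plus 150W per 8-pin connector.
available_w = 75 + 2 * 150  # 375W
print(f"Available by spec: {available_w}W ({available_w - gtx690_tdp_w}W of headroom)")
```

That 75W of spec headroom above the rated TDP should also leave a little room for overclocking.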


NVIDIA GeForce GTX 690 Exposed

It seems NVIDIA has also done some innovating on the cooling and aesthetic fronts with the GeForce GTX 690. Some of the card's new design features include (source: NVIDIA):

  • An exterior frame made from trivalent chromium-plated aluminum, providing excellent strength and durability
  • A fan housing made from a thixomolded magnesium alloy, which offers excellent heat dissipation and vibration dampening
  • High-efficiency power delivery with less resistance, lower power and less heat generated using a 10-phase, heavy-duty power supply with a 10-layer, two-ounce copper printed circuit board
  • Efficient cooling using dual vapor chambers, a nickel-plated finstack and center-mounted axial fan with optimized fin pitch and air entry angles
  • Low-profile components and ducted baseplate channels for unobstructed airflow, minimizing turbulence and improving acoustic quality

Just by looking at the GeForce GTX 690, it’s obvious NVIDIA has set out to create a powerful graphics card that also happens to look the part. We should also point out that the cooling hardware is designed in such a way that the fan blows air through the shroud, where half is directed towards the front GPU and ultimately expelled from the system, while the other half cools the rear GPU and exhausts within the case.


The GeForce GTX 690's Outputs: DVI x 3, mini-DP x 1

The GeForce GTX 690 should have more than enough muscle to push multiple displays, and as such, the card is outfitted with a trio of DVI outputs and a single mini-DP output, all of which can be powered simultaneously.

Although we haven’t tested the GeForce GTX 690 just yet, and AMD has yet to show its hand at the ultra-high end with its dual-GPU powered card, the Radeon HD 7990, based on what we know so far it would seem NVIDIA is in a strong position. The GeForce GTX 680 is faster than the Radeon HD 7970 in most cases, and the GeForce GTX 690 will likely perform about as fast as a pair of 680 cards in SLI, so AMD would have to put together a dual-GPU card clocked higher than the current Radeon HD 7970 to compete. That’s not likely to happen given the Tahiti GPU’s more demanding power requirements versus the GK104. The Radeon’s wider memory bus (384-bit vs. 256-bit) and larger frame buffer (3GB per GPU vs. 2GB per GPU) may give it an edge at the ultra-high resolutions and image quality settings these cards are meant for, however. We’re sure we’ll know more in the weeks ahead.
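To illustrate that last point, here's a hedged sketch of the bandwidth math. The GTX 690 figures come from this article; the Radeon HD 7970's 5.5Gbps effective memory data rate is that card's stock specification, not something NVIDIA or AMD stated here:

```python
# Sketch of the per-GPU memory-bandwidth gap. The GTX 690 figures come from
# this article; the HD 7970's 5.5Gbps effective data rate is its stock spec.

def bandwidth_gb_s(bus_width_bits, effective_gbps):
    return bus_width_bits * effective_gbps / 8

gtx690_per_gpu = bandwidth_gb_s(256, 6.0)  # 192 GB/s per GPU
hd7970 = bandwidth_gb_s(384, 5.5)          # 264 GB/s, ~38% more

print(f"GTX 690 (per GPU): {gtx690_per_gpu:.0f} GB/s")
print(f"HD 7970:           {hd7970:.0f} GB/s (~{hd7970 / gtx690_per_gpu - 1:.0%} more)")
```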

According to NVIDIA, the GeForce GTX 690 will be available in limited quantities from NVIDIA’s add-in card partners, including ASUS, EVGA, Gainward, Galaxy, Gigabyte, Inno3D, MSI, Palit and Zotac starting May 3, with wider availability by May 7, 2012. Expected pricing is $999, roughly on par with the price of a pair of GeForce GTX 680 cards.


Beastly card. Funny that NVIDIA beat AMD to the dual-GPU card this generation; it seems like AMD always gets theirs out first. AMD will probably put out the dual-GPU HD 7990 shortly now.

AKnudson replied on Sun, Apr 29 2012 2:14 AM

It's pretty crazy how fast these GPUs are getting; a lot of them are reaching speeds that were standard in laptops, with just as much RAM, only a few years ago. At what point do you combine it all into one object? When do the GPU, CPU, and perhaps even the motherboard get thrown into one?

ZTimpson replied on Sun, Apr 29 2012 10:55 AM

Yeah, it is crazy.

AKnudson, have you checked out Jeff Han? If not, you should! Really smart guy.

realneil replied on Sun, Apr 29 2012 11:03 AM

I'd love to have this card, but realistically, I can't see it happening.

AKnudson:
When do the GPU, CPU, and perhaps even the motherboard get thrown into one?

Ha! Imagine a single block of silicon in your "PC" doing ~everything~ and doing it at speeds native to what memory is capable of... whew, what an experience that would be!

It's probably closer than we think.

pwrntspd replied on Sun, Apr 29 2012 12:43 PM

It's shiny, but... if I had that kinda dough, I'd also have a new car.

rapid1 replied on Sun, Apr 29 2012 4:27 PM

Why is that top right DVI interface different?

realneil replied on Sun, Apr 29 2012 4:45 PM

rapid1:
Why is that top right DVI interface different?

Just a guess, but I think it's for compatibility with some cables. One of my GTX 570s is the same.

EDIT: Follow this link and all four types of DVI connectors are explained there.
NSeabury replied on Sun, Apr 29 2012 8:08 PM

I could be wrong, but I THINK the bottom two are dual-link and the top is single-link (no big deal, as the card can only drive four outputs, which means either both dual-links, or three single-links plus a mini DisplayPort-to-DVI adapter).
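For anyone wondering what single-link versus dual-link actually buys you, here's a rough sketch (our arithmetic, based on the DVI spec's 165MHz per-link pixel clock limit, and assuming roughly 5% overhead for blanking intervals):

```python
# Rough sketch of the single-link vs. dual-link DVI difference: dual link
# doubles the TMDS data pairs, doubling the available pixel clock.

SINGLE_LINK_LIMIT_MHZ = 165      # per-link pixel clock cap in the DVI spec
DUAL_LINK_LIMIT_MHZ = 2 * 165

def pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=1.05):
    """Approximate pixel clock needed, padding ~5% for blanking intervals."""
    return width * height * refresh_hz * blanking_overhead / 1e6

print(f"1920x1200@60: ~{pixel_clock_mhz(1920, 1200):.0f} MHz (single link is enough)")
print(f"2560x1600@60: ~{pixel_clock_mhz(2560, 1600):.0f} MHz (needs dual link)")
```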

madanbabu replied on Mon, Apr 30 2012 11:23 AM

I'd be really interested to see how CUDA performance compares between the 680 and the 690. Damn nice to see this thing out.


Amazing. Would love to pair one with a new Intel processor.
