Next-Gen NVIDIA Specs Leaked?


News Posted: Tue, May 20 2008 4:34 PM

Specs for the next-generation NVIDIA cards have leaked onto the web, but we advise readers not to place too much faith in these specs until the cards actually ship.

On to the specs:

“The GTX 280 enables all features of the D10U processor; the GTX 260 version will consist of a significantly cut-down version of the same GPU.  The D10U-30 will enable all 240 unified stream processors designed into the processor.  NVIDIA documentation claims these second-generation unified shaders perform 50 percent better than the shaders found on the D9 cards released earlier this year.

The main difference between the two new GeForce GTX variants revolves around the number of shaders and memory bus width.  Most importantly, NVIDIA disables 48 stream processors on the GTX 260. GTX 280 ships with a 512-bit memory bus capable of supporting 1GB GDDR3 memory; the GTX 260 alternative has a 448-bit bus with support for 896MB.”
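
The quoted numbers are internally consistent, and the arithmetic is worth spelling out. Below is a minimal sketch in Python; the 32-bit channel per GDDR3 chip is standard, but the 64MB (512Mbit) per-chip capacity is an assumption on our part, not something the leak states.

    # Sanity check of the leaked figures. The 32-bit channel per GDDR3 chip
    # is standard; the 64MB (512Mbit) per-chip capacity is an assumption,
    # not something the leak states.
    BITS_PER_CHIP = 32
    MB_PER_CHIP = 64

    def memory_config(bus_width_bits):
        """Derive chip count and total memory capacity from the bus width."""
        chips = bus_width_bits // BITS_PER_CHIP
        return chips, chips * MB_PER_CHIP

    for name, bus_bits, shaders in (("GTX 280", 512, 240),
                                    ("GTX 260", 448, 240 - 48)):
        chips, total_mb = memory_config(bus_bits)
        print(f"{name}: {shaders} SPs, {bus_bits}-bit bus -> "
              f"{chips} chips, {total_mb}MB")

Run it and the leak's figures fall out exactly: 16 chips and 1024MB for the GTX 280, 14 chips and 896MB for the GTX 260, and 240 - 48 = 192 stream processors on the cut-down part.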

Other rumored specs are that the cards will support PCIe 2.1 and, strangely enough, only DX 10.0, not 10.1 or any later revision.  This could simply be an early driver issue or perhaps just bogus information.  The last interesting feature everyone is buzzing about is PhysX support.

We’re certainly eager to see how well AMD’s Radeon 4000 series stacks up against NVIDIA’s upcoming hardware.



ice_73 replied on Tue, May 20 2008 5:22 PM
d10u processor? first time i heard of it... i thought nvidia named their chips with a g: g80, g92... etc...

Yeah, this naming is news to me too. 

Though I did hear of the weird bandwidths somewhere else before.  Nvidia, wtf are you doing?  448-bit, seriously?  Last gen wasn't bad enough with your silly 320-bit GPUs with 640MB of memory?  896MB... dear lord.  There still has yet to be a demonstrated performance difference between 256-bit and 512-bit, let alone these abominations such as 320-bit, 384-bit, and now the king of the crap hill: 448-bit.  Bah, humbug!
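
To be fair to the odd widths, they do translate directly into peak bandwidth.  Here's a minimal sketch; the 2000MHz effective memory clock is an illustrative placeholder, not a leaked spec.

    # Peak memory bandwidth: bytes per transfer times transfers per second.
    # The 2000MHz effective clock below is an illustrative placeholder only.
    def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
        return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

    for bus_bits in (256, 320, 384, 448, 512):
        print(f"{bus_bits}-bit -> {peak_bandwidth_gb_s(bus_bits, 2000):.0f} GB/s")

At that clock, every extra 64 bits of bus is worth about 16 GB/s, which is presumably the point: the in-between widths let a cut-down die keep most of the flagship's bandwidth without exotic memory clocks.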


peti1212 replied on Tue, May 20 2008 9:01 PM

Looks like it will be a killer graphics card. I'm just waiting for the day when Crysis is playable at max settings at a decent resolution like 1680x1050 at 60FPS on a single card. I can keep dreaming. :)

shanewu replied on Wed, May 21 2008 8:46 AM
Anyone besides me thinking their new naming scheme will NOT help decrease confusion??!?

"Everyone always wants new things. Everybody likes new inventions, new technology. People will never be replaced by machines. In the end, life and business are about human connections. And computers are about trying to murder you in a lake. And to me, the choice is easy." - Michael Scott (The Office)

ryan92084 replied on Wed, May 21 2008 3:42 PM

the D10U is more commonly known as the g200 (previously known as the g100), and the D9 cards would be the g92/94.  they keep trying to make their naming scheme easier for the average consumer, and it's throwing off the enthusiasts who follow the rumors.

Phenom 9850 | Foxconn A79a-s | Visiontek 4870
