NVIDIA GeForce GTX 295 Unleashed



We all knew that NVIDIA wasn't sitting idle since the initial release of AMD's popular ATI Radeon HD 4800 series of products. As today's information will show, it turns out NVIDIA was not only hard at work on a 55nm version of their GT200 GPU, but they have also designed a dual-GPU beast built around two of them.  There is a bit more to the story though, as the information and specifications below will reveal. 


   

NVIDIA GeForce GTX 295 Reference Specifications

Fabrication Process: 55 nm
Core Clock (texture and ROP units): 576 MHz
Shader Clock (stream processors): 1242 MHz
Memory Clock (clock rate / data rate): 1000 MHz / 2000 MHz
Total Video Memory: 1792 MB
Memory Interface: 448-bit per GPU
Total Memory Bandwidth: 224 GB/s
Processor Cores: 480 (240 per GPU)
ROP Units: 28 per GPU
Texture Filtering Units: 160
Texture Filtering Rate: 92.2 GigaTexels/sec
Connectors: 2 x dual-link DVI-I, 1 x HDMI
RAMDACs: 400 MHz
Bus Technology: PCI Express 2.0
Form Factor: dual slot
Power Connectors: 1 x 6-pin, 1 x 8-pin
Max Board Power (TDP): 289 watts

As you can see, the GPU used on the GeForce GTX 295 is built using a 55nm manufacturing process, and the reference specifications call for 576MHz core, 1.24GHz shader, and 2GHz (1GHz DDR) memory clock frequencies.  What's more interesting is that each GPU on the GeForce GTX 295 is outfitted with 240 stream processor cores--for a total of 480--but the memory interface is 448-bits wide with 28 ROP units per GPU, not 512-bits and 32 ROP units like the current GeForce GTX 280.  In other words, the 55nm GPU as it is implemented on the GeForce GTX 295 is essentially a die-shrunk hybrid of the current GeForce GTX 260 and GTX 280.  Max board power is rated at 289 watts, which is right on par with the Radeon HD 4870 X2, and as such the GTX 295 requires both 6-pin and 8-pin supplemental PCI Express power connectors.
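For reference, the headline bandwidth and fill-rate figures in the spec table fall straight out of the clocks and unit counts. A quick sanity-check sketch in Python (variable names are our own, not NVIDIA's):

```python
# Reproduce the GTX 295's bandwidth and texture fill-rate specs from
# its clocks and unit counts, as listed in the table above.

MEM_DATA_RATE_MHZ = 2000   # effective (DDR) memory data rate
MEM_BUS_BITS = 448         # memory interface width, per GPU
CORE_CLOCK_MHZ = 576       # texture/ROP clock
TEX_UNITS_TOTAL = 160      # texture filtering units across both GPUs

# Bandwidth per GPU = bus width in bytes x effective memory clock.
bw_per_gpu_gbs = (MEM_BUS_BITS / 8) * MEM_DATA_RATE_MHZ / 1000
total_bw_gbs = 2 * bw_per_gpu_gbs  # two GPUs on the card

# Texture filtering rate = texture units x core clock.
tex_rate_gtexels = TEX_UNITS_TOTAL * CORE_CLOCK_MHZ / 1000

print(f"{total_bw_gbs:.0f} GB/s, {tex_rate_gtexels:.2f} GTexels/s")
# -> 224 GB/s and 92.16 GTexels/s, matching the 224 GB/s and
#    ~92.2 GigaTexels/sec figures in the table.
```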

    
NVIDIA GeForce GTX 295

The card itself is reminiscent of the dual-GPU powered GeForce 9800 GX2.  A perforated shroud surrounds the entire PCB, with an air inlet at one end near the barrel-type fan.  The fan pulls air into the interior of the card, where it is blown across heatsinks and then ultimately expelled from the system through vents in the card's case bracket.  Outputs on the GeForce GTX 295 consist of a pair of dual-link DVI outputs and an HDMI output, but the pictures also reveal a couple of diagnostic LEDs.  One of the LEDs indicates whether or not the board is receiving sufficient power, and the other shows whether or not a monitor is properly connected and being detected.

      

      
NVIDIA GeForce GTX 295 Reference Card

Just in case you're wondering, yes, two of these cards can be run in tandem in a Quad-SLI configuration. We should also note that a single card can even be configured to run with one of its GPUs dedicated to PhysX calculations while the other handles graphics.  And although the GeForce GTX 295 requires only a single PCI Express x16 slot, the card is actually built from two PCBs with a heavy-duty heatsink sandwiched in between.
 


Comments

peti1212 5 years ago

Awesome. I actually saw this beast in a few systems about 2 weeks ago already. They perform amazingly.

3vi1 5 years ago

Is it possible to use the HDMI and both DVI connections at the same time? I.e. connect three monitors?

ice_73 5 years ago

The 295 consumes less power and is quieter than the 4870 X2; however, all things considered, I think the 4870 X2 holds the top. Two 295s would completely overpower two 4870 X2s (mostly because games don't use all 4 cores), but one 4870 X2 is cheaper and performs better.

On the other hand, NVIDIA has PhysX and CUDA, which could play a role in the games of '09...

Der Meister 5 years ago

Both the ATI and NVIDIA cards have their advantages. I'd say that spending a bit more now and going with the NVIDIA card is a bit more future-proof, while the ATI card is the best bang for the $ at the moment going strictly off price.

ice_73 5 years ago

Basically, that's the way I see it, Der. NVIDIA's got the extreme top performance (when you SLI them), but the best bang in the "normal" high end is still with ATI.

I don't know how well the extra features NVIDIA has will play out, since PhysX has been rather useless so far, but this could turn out to be a bit like the X800 series SM 3.0 fiasco (though on a smaller scale).
