NVIDIA GeForce GTX 295 Unleashed
Date: Jan 08, 2009
Author: Marco Chiappetta and Dave Altavilla
Introduction and Specifications

There's no doubt that the 3D graphics card business is what you'd call a proverbial "tough gig".  There was a time when discrete graphics card options were available from numerous GPU vendors, but over the years, the relentless pace of technology and fierce competition have homogenized the market down to virtually two primary suppliers.  In mainstream 3D graphics, there is but one mantra--keep pace or exceed, execute or die.  It's a simple equation that keeps product refreshes ongoing, a natural progression of the graphics food chain that results in continuously improved product offerings at both the hardware and software levels.

NVIDIA is obviously one of the few companies, along with AMD's ATI graphics division, that has executed amazingly well over the years.  The continuous strike / counter-strike battle that rages on between the two companies affords consumers increasingly more powerful products, as well as more realism in 3D games, as developers take advantage of each new technology update.  In 2008, NVIDIA had the fastest single GPU solution on earth with the GeForce GTX 280, while AMD's ATI Radeon HD 4870 X2 took the most powerful single card performance spot with its dual, 55nm RV770 GPU solution.

Today, fresh out of the gate for 2009, NVIDIA returns AMD's volley with their own optimized, multi-GPU, single card solution that aims to trump their rivals once again.  NVIDIA's GeForce GTX 295 is unleashed today.  With a pair of 55nm GT200B GPUs under its hood in a pseudo-single card, dual slot height configuration, it's direct competition for the ATI Radeon HD 4870 X2.  We'll step you through the technology behind NVIDIA's new single card SLI-enabled beast and then clock it around the benchmark track with some of the latest, most popular game titles on the market.


NVIDIA GeForce GTX 295

NVIDIA GeForce GTX 295
Specifications and Features
Fabrication Process: 55nm
Processor Cores: 480
ROP Units: 56
Texture Filtering Units: 160
Core Clock (MHz): 576 MHz
Shader Clock (MHz): 1242 MHz
Texture Filtering Rate: 92.2 GigaTexels/s

Memory Clock (MHz DDR): 1998 MHz
Total Memory Config: 1792 MB
Memory Interface Width: 448-bit per GPU
Total Memory Bandwidth: 223.8 GB/s

Display Support:
Maximum Digital Resolution: 2560x1600
Maximum VGA Resolution: 2048x1536

Standard Display Connectors:
2 x Dual-Link DVI-I, 1 x HDMI
Feature Support:
NVIDIA SLI-ready: Quad-SLI
NVIDIA PureVideo Technology-HD
NVIDIA PhysX-ready
NVIDIA CUDA Technology
HybridPower Technology
GeForce Boost
Microsoft DirectX 10
OpenGL 2.1

Bus Support: PCI-E 2.0 x16

Standard Graphics Card Dimensions:
Height: 4.376 inches (111 mm)
Length: 10.5 inches (267 mm)
Width: Dual-slot

Thermal and Power Specs:
Maximum GPU Temperature (in C): 105
Maximum Graphics Card Power (W): 289
Power Connectors: 6-pin x1, 8-pin x1

Looking closely at the spec list above, you'll note that the GeForce GTX 295 looks like a double-shot of the GeForce GTX 280 on a single card, but with GTX 260 core clock and memory speeds, as well as the GTX 260's slightly narrower memory interface.  More details on this later, but beyond that we should also point out that the new GeForce GTX 295 has a texture fillrate that comes close to a pair of GeForce GTX 280s in SLI.
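The headline throughput figures in the spec list can be sanity-checked with a little arithmetic.  The following Python sketch reproduces the texture fillrate and total memory bandwidth numbers from the unit counts and clocks quoted above:

```python
# Back-of-the-envelope check of the GTX 295's headline throughput specs,
# using the unit counts and clocks from the table above.

core_clock_mhz = 576        # texture/ROP core clock
texture_units = 160         # 80 per GPU x 2 GPUs
mem_clock_eff_mhz = 1998    # effective GDDR3 data rate
bus_width_bits = 448        # memory interface width, per GPU
num_gpus = 2

# Texture filtering rate: one bilinear texel per filtering unit per clock.
fillrate_gtexels = core_clock_mhz * texture_units / 1000.0
print(f"Texture fillrate: {fillrate_gtexels:.1f} GTexels/s")   # ~92.2

# Bandwidth: bus width in bytes, times data rate, summed across both GPUs.
bandwidth_gbs = bus_width_bits / 8 * mem_clock_eff_mhz * num_gpus / 1000.0
print(f"Memory bandwidth: {bandwidth_gbs:.1f} GB/s")           # ~223.8
```

Both results land right on NVIDIA's quoted 92.2 GigaTexels/s and 223.8 GB/s figures.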

In terms of the underlying technology behind NVIDIA's GT200 family of GPUs, as well as detailed explanations on many of their features, we'd suggest perusing one or more of the following HotHardware articles covering NVIDIA's previous GPU architectures:

Our GeForce 8800 GTX launch article goes in depth on NVIDIA's previous generation G80 GPU architecture and explains NVIDIA's CUDA GPGPU technology.  Also, our GeForce GTX 280 and GTX 260 article covers much of the base NVIDIA GT200 GPU architecture, which is now employed in 55nm form on the new GeForce GTX 295 we'll be covering for you in the pages ahead.

The GeForce GTX 295


We all knew that NVIDIA wasn't sitting idle since the initial release of AMD's popular ATI Radeon HD 4800 series of products. As today's information will show, it turns out NVIDIA was not only hard at work on a 55nm version of their GT200 GPU, but they have also designed a dual-GPU beast built around two of them.  There is a bit more to the story though, as the information and specifications below will reveal. 


Fabrication Process: 55 nm
Core Clock (texture and ROP units): 576 MHz
Shader Clock (Stream Processors): 1242 MHz
Memory Clock (Clock rate / Data rate): 1000 MHz / 2000 MHz
Total Video Memory: 1792 MB
Memory Interface: 448-bit per GPU
Total Memory Bandwidth: 224 GB/s
Processor Cores: 480
ROP Units: 56
Texture Filtering Units: 160
Texture Filtering Rate: 92.2 GigaTexels/sec
Display Connectors: 2 x Dual-Link DVI-I, 1 x HDMI
RAMDACs: 400 MHz
Bus Technology: PCI Express 2.0
Form Factor: Dual Slot
Power Connectors: 1 x 6-pin, 1 x 8-pin
Max Board Power (TDP): 289 watts

As you can see, the GPU used on the GeForce GTX 295 is built using a 55nm manufacturing process and the reference specifications call for 576MHz core, 1.24GHz shader, and 2GHz (1GHz DDR) memory clock frequencies.  What's more interesting is that each GPU on the GeForce GTX 295 is outfitted with 240 stream processor cores--for a total of 480--but the memory interface is 448-bits wide and it's got 28 ROP units per GPU, not 512-bits and 32 ROP units like the current GeForce GTX 280.  The specifications reveal that the 55nm GPU as it is implemented on the GeForce GTX 295 will be sort of a die-shrunk hybrid of the current GeForce GTX 260 and GTX 280.  Max board power is rated for 289 watts, which is right on par with the Radeon HD 4870 X2 and, as such, the GTX 295 requires both 6-pin and 8-pin supplemental PCI Express power connectors.

NVIDIA GeForce GTX 295

The card itself is reminiscent of the dual-GPU powered GeForce 9800 GX2.  There is a perforated shroud that surrounds the entire PCB, with an air inlet at one end near the barrel type fan.  The fan pulls air into the interior of the card, where it is blown across heatsinks and then ultimately expelled from the system through vents in the card's case bracket.  Outputs on the GeForce GTX 295 consist of a pair of dual-link DVI outputs and an HDMI output, but the pictures also reveal a couple of diagnostic LEDs.  One of the LEDs indicates whether or not the board is receiving sufficient power and the other shows whether or not a monitor is properly connected and being detected.


NVIDIA GeForce GTX 295 Reference Card

Just in case you're wondering, yes, two of these cards can be run in tandem in a Quad-SLI configuration. And we should also note that a single card can even be configured to run with one of its GPUs dedicated to PhysX calculations while the other handles graphics.  Although the GeForce GTX 295 requires only a single PCI Express x16 slot, it is actually two PCBs with a heavy-duty heatsink sandwiched in between.

What's Under The Hood


We couldn't help but pull a GeForce GTX 295 apart upon its arrival in the lab. With dual 448-bit memory interfaces, a pair of GT200B GPUs, and 1,792MB of frame buffer memory, the GeForce GTX 295 is arguably the most complex single desktop graphics adapter ever designed. And no, the Voodoo5 6000 doesn't count--it was never officially released.



NVIDIA GeForce GTX 295 Disassembled

As you can see, with the GeForce GTX 295, each GPU (and its frame buffer memory) resides on its own PCB. And sandwiched in between the PCBs is a single, large heatsink and fan that cools the GPUs and RAM. The rear of each PCB features a cut-out that allows air to be pulled in from either side, where it is funneled through the center of the card and ultimately exhausted from the system. The entire assembly, with the exception of the backside of the second PCB, is encased in a perforated, metal shell that not only protects the components inside, but also helps dissipate heat.

Running along the bottom of the card, towards the case bracket at the front, a ribbon cable is visible. This cable is an SLI bridge connector that links the two GPUs on the card. And on each PCB the large heatspreader installed on the 55nm GT200B GPU is the most dominant feature, surrounded by 14 GDDR3 memory chips.
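Those chip counts line up neatly with the card's memory specs.  Assuming 512Mbit (64MB) GDDR3 devices with 32-bit interfaces--typical for boards of this class, though NVIDIA doesn't spell out the part density--fourteen chips per PCB accounts for both the total capacity and the 448-bit per-GPU bus:

```python
# Reconciling the visible chip count with the GTX 295's memory specs.
# Chip density (64MB) and per-chip bus width (32-bit) are assumptions
# typical of GDDR3 parts of this era, not figures NVIDIA publishes here.

chips_per_pcb = 14       # GDDR3 chips counted around each GPU
pcbs = 2
chip_capacity_mb = 64    # assumed 512Mbit devices
chip_bus_bits = 32       # assumed per-chip interface width

total_mb = chips_per_pcb * pcbs * chip_capacity_mb
per_gpu_bus_bits = chips_per_pcb * chip_bus_bits
print(total_mb, per_gpu_bus_bits)  # 1792 448
```

Both numbers match the 1792MB total frame buffer and 448-bit per-GPU memory interface in the official spec list.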

OK, now to put this thing back together--there are a couple dozen tiny screws lying around here somewhere...

Retail Cards From EVGA and ASUS


As we began working on this article, some of NVIDIA's partners contacted us to let us know that they were ready with cards in full retail trim, just in time for today's hard launch.



EVGA GeForce GTX295+

The first card you see pictured here is EVGA's GeForce GTX 295+. As you probably suspect, the EVGA GeForce GTX 295+ is a straight-up reference design, save for the custom decals affixed to the fan shroud. In the box, EVGA throws in a pair of PCI Express power adapters (one 6-pin, one 8-pin), a DVI-to-VGA adapter, an S/PDIF digital audio pass-through cable to complement the card's HDMI output, a driver CD, a case badge, and a user's manual.

Unfortunately, due to time constraints leading up to CES, we weren't able to fully test EVGA's offering in time for this article. We will be following up with a full review and evaluation ASAP, however. So stay tuned.



Retail Ready Asus ENGTX295

We also received Asus' aptly named ENGTX295. Asus was able to get us their sample a few days earlier than EVGA, so we were able to put this card through a full round of testing. We have included the results alongside those of NVIDIA's reference card throughout the benchmarking segment.

Like EVGA, Asus didn't modify NVIDIA's reference design with their ENGTX295. The card is essentially identical to NVIDIA's reference design, but for the custom Asus decals installed on the fan shroud. Asus' ENGTX295 accessory bundle is similar to EVGA's as well. Included with the card we found a pair of PCI Express power adapters (one 6-pin, one 8-pin), a DVI-to-VGA adapter, an S/PDIF digital audio pass-through cable, a driver CD, and a couple of user's manuals. But Asus also throws in a pleather CD wallet and a coupon good for a 10% discount on some games that are part of NVIDIA's "The Way It's Meant To Be Played" program.

Our Test System and 3DMark Vantage


HOW WE CONFIGURED THE TEST SYSTEMS: We tested all of the graphics cards used in this article on an Asus Striker II Extreme motherboard powered by a Core 2 Extreme QX9770 quad-core processor and 4GB of Corsair RAM. The first thing we did when configuring this test system was enter the system BIOS and set all values to their "optimized" or "high performance" default settings. Then we manually configured the memory timings and disabled any integrated peripherals that wouldn't be put to use. The hard drive was then formatted, and Windows Vista Ultimate SP1 was installed. When the installation was complete we fully updated the OS, and installed the latest DX10 redist and various hotfixes, along with the necessary drivers and applications.

HotHardware's Test Systems
Intel and NVIDIA Powered

Hardware Used:
Core 2 Extreme QX9770 (3.2GHz)

Asus Striker II Extreme
(nForce 790i SLI Ultra chipset)

Radeon HD 4870 1GB
Radeon HD 4850 X2
Radeon HD 4870 X2
GeForce 9800 GX2
GeForce GTX 260 Core 216
GeForce GTX 280
GeForce GTX 295
Asus ENGTX295

4096MB Corsair DDR3-1333 C7
(4 X 1GB)

Integrated Audio
Integrated Network

Western Digital "Raptor" 150GB
(10,000RPM - SATA)

Relevant Software:

Windows Vista Ultimate SP1
DirectX November 2008 Redist

NVIDIA Forceware v180.87
ATI Catalyst v8.12b

Benchmarks Used:
3DMark Vantage v1.0.1
Unreal Tournament 3 v1.3*
Crysis v1.21*
Left 4 Dead*
Enemy Territory: Quake Wars v1.5*
FarCry 2
Fallout 3*

* - Custom Benchmark

Futuremark 3DMark Vantage
Synthetic DirectX Gaming

3DMark Vantage

The latest version of Futuremark's synthetic 3D gaming benchmark, 3DMark Vantage, is specifically bound to Windows Vista-based systems because it uses some advanced visual technologies that are only available with DirectX 10, which isn't available on previous versions of Windows.  3DMark Vantage isn't simply a port of 3DMark06 to DirectX 10, though.  With this latest version of the benchmark, Futuremark has incorporated two new graphics tests, two new CPU tests, and several new feature tests, in addition to support for the latest PC hardware.  We tested the graphics cards here with 3DMark Vantage's Extreme preset option, which uses a resolution of 1,920x1,200 with 4x anti-aliasing and 16x anisotropic filtering.

According to 3DMark Vantage's "Extreme" test, the new GeForce GTX 295 is the clear victor, besting every other graphics card we tested by a significant margin.  Both the overall and GPU scores put the GeForce GTX 295 well out in front of its nearest competition, the Radeon HD 4870 X2.

If we tunnel a little deeper and look at the individual game tests, we can see how the GeForce GTX 295 managed such a high overall score.  In both GPU test 1 and test 2, the GeForce GTX 295 put up framerates significantly higher than any other card we tested.

Unreal Tournament 3


Unreal Tournament 3
DirectX Gaming Performance

Unreal Tournament 3

If you're a long-time PC gamer, the Unreal Tournament franchise should need no introduction.  UT's fast-paced action and over-the-top weapons have been popular for as long as Epic has been making the games.  For these tests, we used the latest addition to the franchise, Unreal Tournament 3.  The game doesn't have a built-in benchmarking tool, however, so we enlisted the help of FRAPS here.  These tests were run at resolutions of 1,920 x 1,200 and 2,560 x 1,600 with no anti-aliasing or anisotropic filtering enabled, but with UT3's in-game graphical options set to their maximum values, with color correction and motion blur enabled.
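For benchmarks like this one, where FRAPS does the measuring, the reported number is just an average framerate over the recorded run.  As a quick illustration of the underlying arithmetic (the timestamp values here are made up for the example, not our measured data):

```python
# How an average FPS figure falls out of per-frame timestamps, as captured
# by a frame-counting tool like FRAPS during a fixed benchmark run.

def average_fps(timestamps_ms):
    """Average FPS over a run, given cumulative per-frame timestamps in ms."""
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    frames = len(timestamps_ms) - 1   # frame intervals between timestamps
    return frames / elapsed_s

# Example: 4 frames rendered over 50 ms works out to an 80 FPS average.
print(average_fps([0.0, 12.0, 25.0, 37.0, 50.0]))  # 80.0
```

The same average-over-a-fixed-demo approach applies to every custom benchmark in this article.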

The GeForce GTX 295 outpaced all comers once again in our custom Unreal Tournament 3 benchmark.  At 1920x1200, all of the high-end cards were grouped fairly close together, but once we upped the resolution to 2560x1600, the GTX 295 finished well ahead of every other card.

Enemy Territory: Quake Wars


Enemy Territory: Quake Wars
OpenGL Gaming Performance

Enemy Territory:
Quake Wars

Enemy Territory: Quake Wars is based on a radically enhanced version of id's Doom 3 engine and viewed by many as Battlefield 2 meets the Strogg, and then some.  In fact, we'd venture to say that id took EA's team-based warfare genre up a notch or two.  ET: Quake Wars also marks the introduction of John Carmack's "MegaTexture" technology, which employs large environment and terrain textures that cover vast areas of maps without the need to repeat and tile many smaller textures.  The beauty of MegaTexture technology is that each unit only takes up a maximum of 8MB of frame buffer memory.  Add to that HDR-like bloom lighting and leading edge shadowing effects, and Enemy Territory: Quake Wars looks great, plays well, and works high end graphics cards vigorously.  The game was tested with all of its in-game options set to their maximum values, with soft particles enabled, in addition to 4x anti-aliasing and 16x anisotropic filtering.

Our Enemy Territory: Quake Wars benchmark results look much like the Unreal Tournament 3 results on the previous page.  In this OpenGL based game, the GTX 295 and 4870 X2 perform similarly at 1920x1200--with a slight edge going to NVIDIA's new flagship.  But with the resolution increased to 2560x1600, the GTX 295's margin of victory increases from about 4.5% to roughly 6.7%.
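The "margin of victory" percentages quoted throughout this article are simple relative differences in average framerate.  For clarity, here is the calculation; the framerates in the example are hypothetical placeholders, not our measured results:

```python
# How the margin-of-victory percentages in this article are computed:
# the winner's average FPS relative to the runner-up's, minus 100%.

def margin_pct(winner_fps, runner_up_fps):
    """Relative performance advantage of the winner, as a percentage."""
    return (winner_fps / runner_up_fps - 1.0) * 100.0

# Hypothetical example: 96 FPS vs. 90 FPS is about a 6.7% margin.
print(f"{margin_pct(96.0, 90.0):.1f}%")  # 6.7%
```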

Crysis v1.21


Crysis v1.21
DirectX 10 Gaming Performance


If you're at all into enthusiast computing, the highly anticipated single-player FPS smash-hit Crysis should require no introduction. Crytek's game engine produces some stunning visuals that are easily the most impressive real-time 3D renderings we've seen on the PC to date.  The engine employs some of the latest techniques in 3D rendering, like Parallax Occlusion Mapping, Subsurface Scattering, Motion Blur and Depth-of-Field effects, as well as some of the most impressive use of shader technology we've seen yet.  In short, for those of you that want to skip the technical jib-jab, Crysis is a beast of a game.  We ran the full game patched to v1.21 with all of its visual options set to 'Very High' to put a significant load on the graphics cards being tested.  A custom demo recorded on the Ice level was used throughout testing.

Talk about a photo finish.  In our custom Crysis benchmark, the GeForce GTX 295 and Radeon HD 4870 X2 put up nearly identical scores.  Less than a single frame per second separates the two cards here, at both resolutions.  And both the GTX 295 and Radeon HD 4870 X2 finish well ahead of the other cards we tested.

FarCry 2


FarCry 2
DirectX Gaming Performance

FarCry 2

Like the original, FarCry 2 is one of the more visually impressive games to be released on the PC to date.  Courtesy of the Dunia game engine developed by Ubisoft, FarCry 2's game-play is enhanced by advanced environment physics, destructible terrain, high resolution textures, complex shaders, realistic dynamic lighting, and motion-captured animations.  We benchmarked the graphics cards in this article with a fully patched version of FarCry 2, using one of the built-in demo runs recorded in the "Ranch" map.  The test results shown here were run at various resolutions with 4x anti-aliasing enabled, but with no anisotropic filtering.

The new GeForce GTX 295 was back to its winning ways in the FarCry 2 benchmark.  In this game, the GeForce GTX 295 and Radeon HD 4870 X2 look like they perform similarly, but those few frames per second separating the two cards equate to an approximate 9.3% advantage for NVIDIA's new flagship.

Fallout 3


Fallout 3
DirectX Gaming Performance

Fallout 3

Fallout 3 is an action role-playing game released by Bethesda Game Studios. It is the third major game in the Fallout series, and it has received a positive response from critics, who have praised its open-ended gameplay and flexible character-leveling system. Fallout 3 has been compared to the 2007 game BioShock for its setting and use of elements from mid-twentieth century American culture. We tested the game at resolutions of 1,920 x 1,200 and 2,560 x 1,600 with 4x anti-aliasing and 16x anisotropic filtering enabled and all in-game graphical options set to their maximum values.

Fallout 3 showed very little variation between all of the cards we tested, with the exception of the 9800 GX2. Like FarCry 2 on the previous page, Fallout 3 seems to benefit from frame buffers larger than 512MB when running at high resolutions with high levels of anti-aliasing and anisotropic filtering enabled. The 512MB available to each GPU on the 9800 GX2 just isn't enough for this test once the resolution is increased to 2560x1600.

Left 4 Dead


Left 4 Dead
DirectX Gaming Performance

Left 4 Dead

Left 4 Dead is a co-operative, survival horror, first-person shooter that was developed by Turtle Rock Studios, which was purchased by Valve part-way into development. Like Half-Life 2, the game uses the Source engine; however, the visuals in L4D are far superior to anything seen in the Half-Life universe to date. The game pits four Survivors of an apocalyptic pandemic against hordes of aggressive zombies. We tested the game at resolutions of 1,920 x 1,200 and 2,560 x 1,600 with 4x anti-aliasing and 16x anisotropic filtering enabled and all in-game graphical options set to their maximum values.

The new GeForce GTX 295 and Radeon HD 4870 X2 finished right on top of each other in our custom Left 4 Dead benchmark, when running at a resolution of 1920x1200.  However, with the resolution increased to 2560x1600, the GeForce GTX 295 once again pulls out in front, extending its margin of victory to about 17.8%.

Power Consumption


We'd like to cover a few final data points before bringing this article to a close. Throughout all of our benchmarking and testing, we monitored how much power our test systems were consuming using a power meter. Our goal was to give you an idea as to how much power each configuration used while idling and under a heavy workload. Please keep in mind that we were testing total system power consumption at the outlet here, not just the power being drawn by the motherboards alone.

Total System Power Consumption
Tested at the Outlet

Manufacturing the GT200 GPU on a 55nm process seems to have done a lot of good in terms of power consumption.  While idling at the Windows desktop, the card consumes only a few more watts than the single GPU GeForce GTX 280.  Keep in mind, the GTX 295 has nearly double the number of memory chips on board along with the second GPU.  And under load, the GeForce GTX 295 consumes about 40 fewer watts than the Radeon HD 4870 X2, despite outperforming it in the vast majority of our benchmarks.

We should also note that although the GTX 295 can consume a good amount of power, the card's cooler is relatively quiet.  You can hear the fan spinning up in some situations, but by no means would we consider it loud.

Our Summary and Conclusion


Performance Summary: Throughout our entire suite of benchmarks, the new GeForce GTX 295 proved to be an excellent performer.  In the vast majority of our tests, the GeForce GTX 295 was clearly the fastest card overall, sometimes besting the Radeon HD 4870 X2 by almost 18%.  And in the few tests where the Radeon managed to pull ahead, its margin of victory was small at best.  Power consumption was also good, relatively speaking, coming in well below that of the Radeon HD 4870 X2 while running under the stress of a full 3D workload.


It took a few months, but NVIDIA was able to battle back and recapture the overall 3D performance crown from a resurgent AMD with the new GeForce GTX 295. As our tests have clearly shown, the GeForce GTX 295 is currently the fastest desktop graphics card on the planet.

And keep in mind, NVIDIA was able to do this with an existing GPU design that was simply manufactured using a more advanced 55nm fabrication process. Somewhere deep inside NVIDIA, there is a new GPU architecture lurking that's likely to raise the performance bar even further at some point in the not too distant future. Of course, sticking with an existing design means there aren't any new features to speak of being introduced with the GeForce GTX 295. Whereas AMD supports DX10.1, NVIDIA still does not. NVIDIA, however, has the added benefit of CUDA and PhysX support.

In response to the GeForce GTX 295, it seems AMD will be making some changes to the Radeon HD 4870 X2's pricing. The GeForce GTX 295 is set to debut at $499, with immediate availability. That would put it right about on par with the 4870 X2 at yesterday's pricing. AMD has informed us that its board partners will likely be dropping the price of the X2 down to $449, with a few also offering a $50 mail-in rebate--bringing the final price down to $399.

Regardless of pricing though, NVIDIA's new baby is impressive. The new GeForce GTX 295 was undoubtedly a very complex piece of hardware to design, but NVIDIA pulled it off and they have recaptured the 3D performance crown as a result. If you've got the coin and are looking for the ultimate in gaming performance, look no further.

Pros:
  • Excellent Performance
  • Relatively Quiet
  • PhysX and CUDA Support

Cons:
  • Pricey
  • Dependent on driver profiles to perform well in all games

Content Property of HotHardware.com