Gigabyte GV-NX79X512DB-RH
Date: Jun 13, 2006
Author: Marco Chiappetta
Introduction, Specifications & Bundle

Now that the dual-GPU GeForce 7950 GX2 has hit the streets, the GeForce 7900 GTX is no longer NVIDIA's flagship desktop graphics adapter. But that doesn't mean it should suddenly be disregarded and brushed aside as an afterthought for a high-end gaming rig. In fact, the 7900 GTX may even become a more attractive option for some as its price inevitably drops over the coming weeks and months. It's not like the 7900 GTX has suddenly become a bad performer just because a more powerful and more expensive option has come along.

We've got a new GeForce 7900 GTX on the test bench today, courtesy of Gigabyte. The GV-NX79X512DB-RH we'll be looking at here doesn't stray much from NVIDIA's reference design, but it does offer some things that its competition doesn't, and it is also one of the least expensive 7900 GTX boards currently available.


Gigabyte's GeForce 7900 GTX Features and Specifications
Powered by NVIDIA's Flagship GPU

NVIDIA CineFX 4.0 Shading Architecture

Vertex Shaders

  • Support for Microsoft DirectX 9.0 Vertex Shader 3.0
  • Displacement mapping
  • Geometry instancing
  • Infinite length vertex programs

Pixel Shaders

  • Support for DirectX 9.0 Pixel Shader 3.0
  • Full pixel branching support
  • Support for Multiple Render Targets (MRTs)
  • Infinite length pixel programs

Next-Generation Texture Engine

  • Accelerated texture access
  • Up to 16 textures per rendering pass
  • Support for 16-bit floating point format and 32-bit floating point format
  • Support for non-power of two textures
  • Support for sRGB texture format for gamma textures
  • DirectX and S3TC texture compression
  • Full 128-bit studio-quality floating point precision through the entire rendering pipeline with native hardware support for 32bpp, 64bpp, and 128bpp rendering modes

64-Bit Texture Filtering and Blending

  • Full floating point support throughout entire pipeline
  • Floating point filtering improves the quality of images in motion
  • Floating point texturing drives new levels of clarity and image detail
  • Floating point frame buffer blending gives detail to special effects like motion blur and explosions

NVIDIA Intellisample 4.0 Technology

  • Advanced 16x anisotropic filtering (with up to 128 taps)
  • Blistering-fast antialiasing and compression performance
  • Gamma-adjusted rotated-grid antialiasing removes jagged edges for incredible image quality
  • Transparent multisampling and transparent supersampling modes boost antialiasing quality to new levels
  • Support for normal map compression
  • Support for advanced lossless compression algorithms for color, texture, and z-data at even higher resolutions and frame rates
  • Fast z-clear

API Support

  • Complete DirectX support, including the latest version of Microsoft DirectX 9.0 Shader Model 3.0
  • Full OpenGL support, including OpenGL 2.0

NVIDIA Digital Vibrance Control (DVC) 3.0 Technology

  • DVC color controls
  • DVC image sharpening controls

NVIDIA SLI Technology

  • Patented hardware and software technology allows two GPUs to run in parallel to scale performance
  • Scales performance on over 60 top PC games and applications

NVIDIA UltraShadow II Technology

  • Designed to enhance the performance of shadow-intensive games

NVIDIA PureVideo Technology

  • Adaptable programmable video processor
  • High-definition MPEG-2 and WMV9 hardware acceleration
  • Spatial-temporal de-interlacing
  • Inverse 2:2 and 3:2 pull-down (Inverse Telecine)
  • 4-tap horizontal, 5-tap vertical scaling
  • Overlay color temperature correction
  • Microsoft Video Mixing Renderer (VMR) supports multiple video windows with full video quality and features in each window
  • Integrated HDTV output

Composited Desktop Hardware Engine

  • Video post-processing
  • Real-time desktop compositing
  • Accelerated antialiased text rendering
  • Pixel shader-driven special effects and animation

Advanced Display Functionality

  • Dual integrated 400MHz RAMDACs for display resolutions up to and including 2048x1536 at 85Hz
  • Dual DVO ports for interfacing to external TMDS transmitters and external TV encoders
  • Full NVIDIA nView multi-display technology capability

Advanced Engineering

  • Designed for PCI Express x16
  • Designed for high-speed GDDR3 memory


Gigabyte includes a very good assortment of software and accessories with the GV-NX79X512DB-RH. Along with the card itself, we found a complete user's manual, two CDs, a DVD, and a variety of cables and adapters. On the software front, Gigabyte throws in an obligatory driver CD, another CD with a full version of Cyberlink's excellent PowerDVD 6, and a full version of Serious Sam II on DVD. Bundled hardware includes an S-Video / HD Component output adapter, a pair of DVI-to-DB15 adapters, and a dual-Molex to 6-pin PCI Express power adapter.  Overall, we'd consider this a well-balanced bundle.  You've got games, utilities, and adapters to appease a large portion of Gigabyte's target audience.

Inspecting the Card

As we mentioned earlier, the Gigabyte GV-NX79X512DB-RH doesn't stray very far from NVIDIA's reference design and specifications. In fact, this board was equipped with the very same BIOS revision as the reference GeForce 7900 GTXs we tested back in March of this year.  About the only thing on the GV-NX79X512DB-RH that differentiates it from a reference board is the "Gigabyte" sticker smack dab in the middle of its cooling fan. 




The GV-NX79X512DB-RH is powered by NVIDIA's G71 GPU and 512MB of GDDR3 memory. By default, the GPU is clocked at 650MHz and the memory is clocked at 1.6GHz (800MHz DDR). If you'd like a more detailed description of the technologies employed in the G71, we recommend taking a look at this article, where we explain what makes the G71 tick.
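The memory figures above are easy to sanity check: GDDR3 is double data rate, so an 800MHz base clock yields a 1.6GHz effective data rate, and multiplying that by the bus width gives peak memory bandwidth. A minimal sketch, assuming the reference 7900 GTX's 256-bit memory interface (a spec not stated above):

```python
# Peak memory bandwidth from the base clock and bus width.
# Assumption: the 256-bit bus width is the reference 7900 GTX spec,
# not a figure quoted in this review.

def peak_bandwidth_gbps(base_clock_mhz: float, bus_width_bits: int,
                        pumps: int = 2) -> float:
    """Effective data rate (MT/s) times bytes moved per transfer, in GB/s."""
    effective_mts = base_clock_mhz * pumps        # DDR: two transfers per clock
    bytes_per_transfer = bus_width_bits // 8      # 256-bit bus -> 32 bytes
    return effective_mts * bytes_per_transfer / 1000.0

print(peak_bandwidth_gbps(800, 256))  # 800MHz GDDR3 -> 1.6GHz effective, 51.2 GB/s
```

Swapping in 1GB / dual-GPU figures shows why the 7950 GX2 pulls ahead: each of its two GPUs gets its own memory pool and bus.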

The Gigabyte GV-NX79X512DB-RH is equipped with a two-slot cooler, similar to the one used on the Quadro FX 4500 and 512MB GeForce 7800 GTX. The cooler has a large copper heat-plate in the center, flanked by two large banks of aluminum fins. Connecting the heatsink's fins to the heat-plate are four liquid-filled heat-pipes (two per side). In the center of the cooler is a large, variable-speed fan that blows air over the heatsink's fins, where it's then exhausted from the system through vents in the card's mounting plate.

This card sports dual dual-link DVI outputs and a video output, along with the same SLI connector found on other SLI-enabled GeForce cards. Gigabyte, however, includes a small cover, presumably to keep the SLI connector safe.  The DVI and TV outputs can be used to power dual independent displays simultaneously, or four displays should two 7900 GTX cards be installed in a single system.

Our Test System & 3DMark06

HOW WE CONFIGURED THE TEST SYSTEMS: We tested the NVIDIA based cards on an Asus A8N32-SLI nForce4 SLIX16 chipset based motherboard. The ATI powered cards, however, were tested on an A8R32-MVP motherboard based on the CrossFire Xpress 3200 chipset. Both systems used the same AMD Athlon 64 FX-60 dual-core processor and 2GB of low-latency Corsair XMS RAM, though. The first thing we did when configuring these test systems was enter each BIOS and load its "High Performance Defaults."  The hard drive was then formatted, and Windows XP Professional with SP2 was installed. When the installation was complete, we installed the latest chipset drivers available, installed all of the other drivers necessary for the rest of our components, and removed Windows Messenger from the system.  Auto-Updating and System Restore were also disabled, the hard drive was defragmented, and a 1024MB permanent page file was created on the same partition as the Windows installation. Lastly, we set Windows XP's Visual Effects to "best performance," installed all of the benchmarking software, and ran the tests.

The HotHardware Test Systems
AMD Athlon 64 FX Powered

Hardware Used:
Processor - AMD Athlon 64 FX-60 (2.6GHz x 2)
Motherboards - Asus A8N32-SLI (nForce4 SLIX16 chipset), Asus A8R32-MVP (ATI CrossFire Xpress 3200)
Video Cards - Gigabyte GV-NX79X512DB-RH, GeForce 7950 GX2, Reference GeForce 7900 GTX, Radeon X1900 XTX
Memory - 2048MB Corsair XMS PC3200 RAM
Audio - Integrated on board
Hard Drive - Western Digital "Raptor" 36GB - 10,000RPM - SATA

Relevant Software:
OS - Windows XP Professional SP2
Chipset Drivers - nForce Drivers v6.85
DirectX - DirectX 9.0c (March Redist)
Video Drivers - NVIDIA Forceware v91.29, ATI Catalyst v6.5

Benchmarks Used:
3DMark06 v1.0.2 (Synthetic, DX)
Splinter Cell: Chaos Theory v1.05 (DX)
FarCry v1.33* (DX)
F.E.A.R. v1.05 (DX)
Half Life 2* (DX)
Quake 4 v1.2* (OpenGL)

* - Custom Test (HH Exclusive demo)
Performance Comparisons with 3DMark06 v1.0.2
Details: http://www.futuremark.com/products/3dmark06/

Futuremark recently launched a brand-new version of their popular benchmark, 3DMark06. The new version of the benchmark is updated in a number of ways, and now includes not only Shader Model 2.0 tests, but Shader Model 3.0 and HDR tests as well. Some of the assets from 3DMark05 have been re-used, but the scenes are now rendered with much more geometric detail and the shader complexity is vastly increased as well. Max shader length in 3DMark05 was 96 instructions, while 3DMark06 ups the number of instructions to 512. 3DMark06 also employs much more lighting, and there is extensive use of soft shadows. With 3DMark06, Futuremark has also updated how the final score is tabulated. In this latest version of the benchmark, SM 2.0 and HDR / SM3.0 tests are weighted and the CPU score is factored into the final tally as well.
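The idea of a weighted final tally can be sketched simply. Note that the weights and combining function below are purely illustrative placeholders; Futuremark's actual 3DMark06 formula and coefficients are not given in the text above.

```python
# Illustrative only: 3DMark06 folds SM2.0, HDR/SM3.0, and CPU results into one
# weighted overall score. These weights are invented for the example and are
# NOT Futuremark's real coefficients.

def weighted_score(sm2: float, sm3_hdr: float, cpu: float,
                   w_sm2: float = 0.4, w_sm3: float = 0.4, w_cpu: float = 0.2) -> float:
    """Combine sub-scores with fixed weights into a single overall number."""
    return sm2 * w_sm2 + sm3_hdr * w_sm3 + cpu * w_cpu

print(weighted_score(2500, 2400, 1000))  # -> 2160.0
```

The practical consequence is visible in the results below: a card can win the SM2.0 sub-tests yet lose the overall score if it trails in the SM3.0/HDR or CPU components.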

Gigabyte's GV-NX79X512DB performed almost exactly like NVIDIA's reference GeForce 7900 GTX, which is to say it performed quite well. The new 7950 GX2 was significantly faster, but that is to be expected considering the GX2 has two similar GPUs and double the amount of on-board frame buffer memory.  And of course, the GX2 is much more expensive to boot. 

If we focus on 3DMark06's individual results, we see that the Gigabyte GV-NX79X512DB also performed well in the Shader Model 2.0 tests, outpacing a Radeon X1900 XTX by a couple of hundred points. 

The same can't be said in the more taxing Shader Model 3.0 tests, however.  Here, the Radeon X1900 XTX turns the tables on the Gigabyte GV-NX79X512DB and outpaces the 7900 GTX by a couple of percentage points. And as you can clearly see, the new GeForce 7950 GX2 simply can't be touched.

Splinter Cell: Chaos Theory v1.05

Performance Comparisons with Splinter Cell: Chaos Theory v1.05
Details: http://www.splintercell3.com/us/

SC: Chaos Theory
Based on a heavily modified version of the Unreal Engine, enhanced with a slew of DX9 shaders, lighting and mapping effects, Splinter Cell: Chaos Theory is gorgeous with its very immersive, albeit dark, environment. The game engine has a shader model 3.0 code path that allows the GeForce 6 & 7 Series of cards, and the new X1000 family of cards, to really shine, and a recent patch has implemented a shader model 2.0 path for ATI's X8x0 generation of graphics hardware. For these tests we enabled the SM 3.0 path on all of the cards we tested. However, High Dynamic Range rendering was disabled so that we could test the game with anti-aliasing enabled (a future patch should enable AA with HDR on the X1K family). We benchmarked the game at resolutions of 1,280 x 1024 and 1,600 x 1,200, with 4X anti-aliasing and 16X anisotropic filtering enabled.

The Gigabyte GV-NX79X512DB, reference GeForce 7900 GTX, and Radeon X1900 XTX were evenly matched in the Splinter Cell: Chaos Theory benchmark.  At both resolutions, only a couple of frames per second separated all of the single-GPU powered boards.  The Radeon X1900 XTX slightly outpaced both of the GeForce 7900 GTXs we tested, but the differences in performance can hardly be considered significant.  In a real-world gaming scenario, the XTX's advantage in this game would not be perceived by most gamers.

FarCry v1.33

Performance Comparisons with FarCry v1.33
Details: http://www.farcry.ubi.com/

If you've been on top of the gaming scene for some time, you probably know that FarCry was one of the most visually impressive games to be released on the PC last year. Courtesy of its proprietary engine, dubbed "CryEngine" by its developers, FarCry's game-play is enhanced by Polybump mapping, advanced environment physics, destructible terrain, dynamic lighting, motion-captured animation, and surround sound. Before titles such as Half-Life 2 and Doom 3 hit the scene, FarCry gave us a taste of what was to come in next-generation 3D gaming on the PC. We benchmarked the graphics cards in this article with a custom-recorded demo run taken in the "Catacombs" area checkpoint, at various resolutions with 4X AA and 16X aniso enabled concurrently.

The Gigabyte GV-NX79X512DB also performed very well in our custom FarCry benchmark, but it wasn't quite able to catch the Radeon X1900 XTX. And the GeForce 7950 GX2, with its dual G71s and 1GB of frame buffer memory, finished way out in front. The Gigabyte GV-NX79X512DB finished four frames per second behind the XTX at the lower resolution, and nine frames per second behind at 1600x1200.  77+ frames per second at 1600x1200 with 4X anti-aliasing and 16X anisotropic filtering enabled is a good score, though.  It's just not as good as the X1900 XTX's.

Half Life 2

Performance Comparisons with Half-Life 2
Details: http://www.half-life2.com/

Half Life 2
Thanks to the dedication of hardcore PC gamers and a huge mod-community, the original Half-Life became one of the most successful first person shooters of all time.  So, when Valve announced Half-Life 2 was close to completion in mid-2003, gamers the world over sat in eager anticipation. Unfortunately, thanks to a compromised internal network, the theft of a portion of the game's source code, and a tumultuous relationship with the game's distributor, Vivendi Universal, we all had to wait until November '04 to get our hands on this classic. We benchmarked Half-Life 2 with a long, custom-recorded timedemo in the "Canals" map that takes us through both outdoor and indoor environments. These tests were run at resolutions of 1,280 x 1,024 and 1,600 x 1,200 with 4X anti-aliasing and 16X anisotropic filtering enabled concurrently.

Our custom Half Life 2 benchmark was completely CPU bound with all of the high-end graphics cards we tested. We'll be moving on to Half Life 2: Lost Coast with HDR in future articles, though.  Here, the Radeon X1900 XTX is technically the fastest of the bunch, but its leads aren't major. The slight differences in performance shown in the graph above can likely be attributed to game engine optimizations and reduced CPU overhead in ATI's drivers more than anything else.

F.E.A.R. v1.05

Performance Comparisons with F.E.A.R
More Info: http://www.whatisfear.com/us/

One of the most highly anticipated titles of 2005 was Monolith's paranormal thriller F.E.A.R. Taking a look at the minimum system requirements, we see that you will need at least a 1.7GHz Pentium 4 with 512MB of system memory and a 64MB graphics card (a Radeon 9000 or GeForce4 Ti-class card or better) to adequately run the game. Using the full retail release of the game patched to v1.05, we put the graphics cards in this review through their paces to see how they fared with a popular title. Here, all graphics settings within the game were set to their maximum values, but with soft shadows disabled (soft shadows and anti-aliasing do not currently work together). Benchmark runs were then completed at resolutions of 1152x864 and 1600x1200, with anti-aliasing and anisotropic filtering enabled.

All of the single-GPU powered cards performed similarly in the F.E.A.R. benchmark. The dual-GPU GeForce 7950 GX2 was in a league of its own, however.  Disregarding the GX2 results, we see the Gigabyte GV-NX79X512DB performing at the same level as NVIDIA's reference GeForce 7900 GTX and roughly on-par with the Radeon X1900 XTX.  The Radeon X1900 XTX was two frames per second slower than the GV-NX79X512DB at the lower resolution, but once we cranked the test system up to 1600x1200, the Radeon pulled ahead by a single frame per second.

Quake 4 v1.2

Performance Comparisons with Quake 4
Details: http://www.quake4game.com/

Quake 4
id Software, in conjunction with developer Raven, recently released the latest addition to the wildly popular Quake franchise, Quake 4. Quake 4 is based upon an updated and slightly modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar.  Like Doom 3, Quake 4 is also an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows, but unlike Doom 3, Quake 4 features some outdoor environments as well. We ran these Quake 4 benchmarks using a custom demo with the game set to its "High-Quality" mode, at resolutions of 1,280 x 1,024 and 1,600 x 1,200 with 4X AA and 8X aniso enabled simultaneously.

Gigabyte's GV-NX79X512DB performed very well in our custom Quake 4 benchmark.  Its performance was on-par with NVIDIA's reference GeForce 7900 GTX but a good amount faster than ATI's flagship Radeon X1900 XTX. Once again, the new GX2 was completely in a league of its own, but that is to be expected. The Gigabyte GV-NX79X512DB finished this test with a frame rate about 16% higher than the XTX at 1280x1024 and about 14% higher at 1600x1200.

Overclocking Gigabyte's GTX

As we neared the end of our testing, we spent a little time overclocking the new Gigabyte GV-NX79X512DB using the clock frequency sliders available within NVIDIA's Forceware drivers, after enabling the "Coolbits" registry tweak. To find the card's peak core and memory frequencies, we slowly raised their respective sliders until we began to see visual artifacts on-screen while running a game or benchmark, or our test system became unstable.

Overclocking Gigabyte's GeForce 7900 GTX
(Fast 3D Video Card) + Overclocking = Even Faster Card

Gigabyte GV-NX79X512DB Overclocked: 698MHz GPU / 876MHz Memory
Gigabyte GV-NX79X512DB Stock:
650MHz GPU / 800MHz Memory


When all was said and done, we were able to take the Gigabyte GV-NX79X512DB up from its default GPU and memory clock speeds of 650MHz / 800MHz (1.6GHz DDR) all the way up to 698MHz / 876MHz, increases of 48MHz and 76MHz, respectively.  While we had the card overclocked, we re-ran a couple of benchmarks to see how the increased clock speeds affected performance, and we saw a 245-point jump in 3DMark06 and a 3 frame-per-second increase in F.E.A.R.
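Expressed as percentages, the overclocking headroom reported above works out as follows; this is a quick sketch of the arithmetic, nothing more:

```python
# Percent gain from the stock to overclocked frequencies quoted in the text.

def pct_gain(stock_mhz: float, overclocked_mhz: float) -> float:
    """Relative increase of the overclocked frequency over stock, in percent."""
    return (overclocked_mhz - stock_mhz) / stock_mhz * 100.0

print(round(pct_gain(650, 698), 1))  # GPU:    650 -> 698MHz, about 7.4%
print(round(pct_gain(800, 876), 1))  # Memory: 800 -> 876MHz, 9.5%
```

That roughly 7-10% clock headroom lines up with the modest benchmark gains we measured: real-world frame rates rarely scale one-to-one with core clock.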

Our Summary & Conclusion

Performance Summary: The Gigabyte GV-NX79X512DB performed just like NVIDIA's reference GeForce 7900 GTX throughout our entire battery of benchmarks, which is to say it performed very well. In 3DMark06 and Quake 4 the GV-NX79X512DB outperformed a Radeon X1900 XTX, but in HL2, Splinter Cell, and FarCry, ATI's flagship was the fastest of the single-GPU powered cards. In the F.E.A.R. benchmark the GV-NX79X512DB and XTX traded victories.  The new GeForce 7950 GX2, with its dual GPUs and 1GB of frame buffer memory, however, was clearly the highest performer overall.

If you're in the market for a high-end graphics adapter, or are considering adding a second 7900 GTX to your system for some SLI action, look into Gigabyte's GV-NX79X512DB. This card doesn't offer super-high clock speeds like some of the "pre-overclocked" GeForce 7900 GTXs currently available, but with a little overclocking it'll perform right alongside these higher clocked cards.  Where Gigabyte's card shines, however, is in regard to its bundle and its price point.  Gigabyte bundles full versions of PowerDVD 6 and Serious Sam II with the GV-NX79X512DB, and this card is one of the least expensive GeForce 7900 GTXs on the market at under $480. The GeForce 7900 GTX may no longer be NVIDIA's flagship graphics adapter, but the Gigabyte GV-NX79X512DB is still one heck of a performer.  We're giving this card a solid 8.5 on the Heat Meter.

  • Great Performance
  • Good Overclocker
  • Decent Bundle
  • Competitive Price
  • Still Relatively Expensive
  • No Longer NVIDIA's Flagship


Content Property of HotHardware.com