Gigabyte GeForce 8800 GT GV-NX88T512HP
Date: Mar 03, 2008
Author: Shane Unrein
Introduction, Features and Specs

Gigabyte is no stranger to innovation. Year after year, we see the company build unique features into its products to increase performance, improve power utilization, decrease noise, and enhance the user experience. As enthusiasts, we love to see innovations that improve products and differentiate them from the rest of the pack. Gigabyte has done just that with one of its GeForce 8800 GT video cards, the GV-NX88T512HP.

Most companies selling 8800s just use the NVIDIA reference design with a different sticker on the cooler, and sometimes the cards will get overclocked as well. Gigabyte wasn't happy with that idea, though, when it designed its latest 8800 GT TurboForce Edition card, which it called the GV-NX88T512HP. The deviations from NVIDIA's reference design are intriguing to say the least and go well beyond the obvious. Sure, Gigabyte used a very appealing Zalman cooler and even overclocked the GV-NX88T512HP, but that just scratches the surface.

We'll elaborate on what other features make the GV-NX88T512HP special a little bit later. For now, we'll whet your appetite by telling you that Gigabyte overclocked this card's core, memory, and shader clocks to 700 MHz, 920 MHz (1840 MHz DDR), and 1700 MHz, respectively. Recall that NVIDIA's reference clocks are 600 MHz, 900 MHz (1800 MHz DDR), and 1500 MHz, respectively. We have two GV-NX88T512HPs in our labs, so we'll be testing the card's performance not only in single-GPU mode but also in an SLI configuration. If that's not enough to get you to keep reading, then maybe the fact that Gigabyte includes the ability to boost this card's GPU voltage will.
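For readers who like to see the math, the factory bump works out like this. This is a quick illustrative calculation using only the clocks quoted above:

```python
# Illustrative arithmetic: the GV-NX88T512HP's factory overclock
# versus NVIDIA's reference 8800 GT clocks (values from the article).

reference = {"core": 600, "memory": 900, "shader": 1500}   # MHz
turboforce = {"core": 700, "memory": 920, "shader": 1700}  # MHz

for domain in reference:
    gain = (turboforce[domain] / reference[domain] - 1) * 100
    print(f"{domain}: {reference[domain]} -> {turboforce[domain]} MHz (+{gain:.1f}%)")
```

The core and shader domains get the biggest relative bumps (roughly 17% and 13%), while the memory overclock is a modest 2% or so — worth keeping in mind when interpreting the benchmark results later.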

Gigabyte 8800 GT TurboForce (GV-NX88T512HP)
Features & Specifications

Fabrication: 65nm

Number of Transistors: 754 Million

Stream Processors: 112

Memory Interface: 256-bit

Frame Buffer Size: 512 MB

HDCP Support: Yes

HDMI Support:

Outputs:
2x Dual-Link DVI-I
7-Pin TV Out

Bus Technology: PCI Express 2.0

Max Board Power: 110 Watts

NVIDIA unified architecture:

Fully unified shader core dynamically allocates processing power to geometry, vertex, physics, or pixel shading operations, delivering up to 2x the gaming performance of prior generation GPUs.

Full Microsoft DirectX 10 Support:
World's first DirectX 10 GPU with full Shader Model 4.0 support delivers unparalleled levels of graphics realism and film-quality effects.

NVIDIA SLI Technology:
Delivers up to 2x the performance of a single graphics card configuration for unequaled gaming experiences by allowing two cards to run in parallel. The must-have feature for performance PCI Express graphics, SLI dramatically scales performance on today's hottest games.

NVIDIA Lumenex Engine:
Delivers stunning image quality and floating point accuracy at ultra-fast frame rates.
16x Anti-aliasing: Lightning fast, high-quality anti-aliasing at up to 16x sample rates obliterates jagged edges.

128-bit floating point High Dynamic-Range (HDR):
Twice the precision of prior generations for incredibly realistic lighting effects - now with support for anti-aliasing.

NVIDIA Quantum Effects Technology:
Advanced shader processors architected for physics computation enable a new level of physics effects to be simulated and rendered on the GPU - all while freeing the CPU to run the game engine and AI.

NVIDIA nView Multi-Display Technology:
Advanced technology provides the ultimate in viewing flexibility and control for multiple monitors.

Dual 400MHz RAMDACs:
Blazing-fast RAMDACs support dual QXGA displays with ultra-high, ergonomic refresh rates - up to 2048x1536@85Hz.

Dual Dual-link DVI Support:
Able to drive the industry's largest and highest resolution flat-panel displays up to 2560x1600.

NVIDIA PureVideo HD Technology:
The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity, smooth video, accurate color, and precise image scaling for movies and video.

Discrete, Programmable Video Processor:
NVIDIA PureVideo HD is a discrete programmable processing core in NVIDIA GPUs that provides superb picture quality and ultra-smooth movies with low CPU utilization and power.

Hardware Decode Acceleration:
Provides ultra-smooth playback of H.264, VC-1, WMV and MPEG-2 HD and SD movies.

HDCP Capable:
Designed to meet the output protection management (HDCP) and security specifications of the Blu-ray Disc and HD DVD formats, allowing the playback of encrypted movie content on PCs when connected to HDCP-compliant displays.

Spatial-Temporal De-Interlacing:
Sharpens HD and standard definition interlaced content on progressive displays, delivering a crisp, clear picture that rivals high-end home-theater systems.

High-Quality Scaling:
Enlarges lower resolution movies and videos to HDTV resolutions, up to 1080i, while maintaining a clear, clean image. Also provides downscaling of videos, including high-definition, while preserving image detail.

Inverse Telecine (3:2 & 2:2 Pulldown Correction):
Recovers original film images from films-converted-to-video (DVDs, 1080i HD content), providing more accurate movie playback and superior picture quality.

Bad Edit Correction:
When videos are edited after they have been converted from 24 to 25 or 30 frames, the edits can disrupt the normal 3:2 or 2:2 pulldown cadences. PureVideo HD uses advanced processing techniques to detect poor edits, recover the original content, and display perfect picture detail frame after frame for smooth, natural looking video.

Video Color Correction:
NVIDIA's Color Correction Controls, such as Brightness, Contrast and Gamma Correction let you compensate for the different color characteristics of various RGB monitors and TVs ensuring movies are not too dark, overly bright, or washed out regardless of the video format or display type.

Integrated SD and HD TV Output:
Provides world-class TV-out functionality via Composite, S-Video, Component, or DVI connections. Supports resolutions up to 1080p depending on connection type and TV capability.

Noise Reduction:
Improves movie image quality by removing unwanted artifacts.

Edge Enhancement:
Sharpens movie images by providing higher contrast around lines and objects.
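As an aside on the dual dual-link DVI entry in the spec list above, a rough back-of-the-envelope calculation shows why 2560x1600 panels require dual-link connections. The 10% blanking overhead here is our assumption (roughly in line with CVT reduced blanking), not a figure from the spec sheet:

```python
# Sketch: why 2560x1600 needs dual-link DVI.
# Assumptions: 60 Hz refresh and ~10% blanking overhead (our estimate);
# single-link TMDS tops out at a 165 MHz pixel clock.

def pixel_clock_mhz(h, v, refresh_hz, blanking_overhead=0.10):
    """Approximate pixel clock in MHz for a given display mode."""
    return h * v * refresh_hz * (1 + blanking_overhead) / 1e6

clk = pixel_clock_mhz(2560, 1600, 60)
print(f"~{clk:.0f} MHz pixel clock")       # ~270 MHz, well past the single-link limit
print("dual-link needed:", clk > 165)
```

With roughly 270 MHz required against a 165 MHz single-link ceiling, a second TMDS link is the only way to drive a 30" panel at native resolution over DVI.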




We were happy to see that Gigabyte chose a rather small box to house its GV-NX88T512HP. The box is not much bigger than it needs to be to protect the card and accessories. The box features a character from Neverwinter Nights 2, which, as you can see in the first shot below, is included with the card. As expected, the back of the box touts the features and specs of the card found inside.


Along with the GV-NX88T512HP itself, Gigabyte includes the following accessories: a full version of Neverwinter Nights 2, a driver/utility CD, a manual, a quick install guide, two DVI-to-VGA adapters, a PCI Express power connector, and a component/S-video out cable. We can't say that the inclusion of Neverwinter Nights 2 is terribly exciting, but maybe some of you will find it appealing. It is worth noting that the game has garnered an average score of over 80% in reviews from respected sites across the web.

Closer Look: The Card and Its Unique Features


Closer Look: Gigabyte GV-NX88T512HP
Unique and Innovative Board

We mentioned in the introduction that the GV-NX88T512HP is no ordinary 8800 GT. Now, it's time to show you what we meant. The first thing you'll notice when you see the GV-NX88T512HP is that the cooler is definitely not the regular reference heatsink and fan combo. Instead, Gigabyte opted to utilize the Zalman VF830-AlCu for temperature control. Like many of you, we are fans of Zalman products, so we see this as a promising choice. Additionally, the PCB is not reference either. As you can see in the third image, the GV-NX88T512HP's PCB is actually more than an inch shorter than NVIDIA's reference design.


We are all for cards getting smaller but still packing the same amount of muscle. If there's anything negative to say about the GV-NX88T512HP's design versus the reference design, it's that the Zalman cooler on the GV-NX88T512HP sticks out much farther than the reference cooler. The size of the VF830-AlCu practically makes this a two-slot card.


In the next four images, we wanted to give you a closer look at the Zalman cooler, which is made of copper and aluminum. You can see that the cooler features two heatpipes that run through the array of fins. On top of all that metal, we find a relatively big fan. It looks like a pretty typical Zalman cooler if you ask us, but that is usually a good thing. One issue with this cooler, in addition to its size, that may concern some readers is that it only covers the GPU, leaving the memory chips with no direct cooling. We really don't think it's anything to be too concerned about, however, as the cooler does a fine job keeping the GPU cool, and residual airflow does cool the RAM as well.


Before we close out the board tour, let's take a look at the connectors on the card. There is nothing surprising here. The GV-NX88T512HP has an HDTV out and two dual-link DVI connectors (yes, that blue connector really is a regular DVI connector). On the other side of the card, you can see the 6-pin PCI Express power connector. You also get a good look at the higher-quality ferrite core chokes (an upgrade over the reference design), which lead us to our next topic.


Unique Features

As we have already mentioned, Gigabyte decided to take overclocking one step further with the GV-NX88T512HP by allowing users to increase GPU voltage via custom software. No crazy volt mods here; just use Gigabyte's Gamer HUD software, and you can change the voltage from 1.05V to 1.20V. The default voltage is 1.10V. It would be nice to see the voltage go up to at least 1.30V, but we think it's a nice addition to this unique card regardless.
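To put that voltage range in perspective, here is a rough first-order sketch — an estimate, not a measurement — of how GPU dynamic power scales with voltage and clock, using the classic CMOS rule that dynamic power is proportional to V²f. The 800 MHz figure is the best stable core clock we reached in our own overclocking tests later in this review:

```python
# First-order estimate (not a measured figure) of how the Gamer HUD
# voltage range could affect GPU dynamic power, using the classic
# P ~ C * V^2 * f scaling rule for CMOS logic.

def relative_dynamic_power(v, f, v_ref=1.10, f_ref=700):
    """Dynamic power relative to the card's defaults (1.10 V, 700 MHz core)."""
    return (v / v_ref) ** 2 * (f / f_ref)

# Max HUD voltage (1.20 V) at our best stable core clock (800 MHz):
print(f"~{relative_dynamic_power(1.20, 800):.2f}x default dynamic power")
```

In other words, a 0.10 V bump plus a 100 MHz core overclock could plausibly raise the GPU's dynamic power by about a third, which helps explain why Gigabyte paired the voltage feature with a beefier cooler.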

You may be wondering how Gigabyte achieves this voltage change feature. Since Gigabyte uses a custom PCB, the company was able to include an extra chip on the board, and that chip is dubbed Voltage Gear Overdrive. In the slide below, you can see a close-up of this chip and where it is located on the card.

So far, we've covered three main ways that the GV-NX88T512HP is different from a reference 8800 GT: the shorter PCB, the Zalman cooler, and the inclusion of the Voltage Gear Overdrive chip. Gigabyte didn't stop there, though. The GV-NX88T512HP also features Gigabyte's Ultra Durable 2 design initiatives. The following slides describe the benefits of Ultra Durable 2.


"Using high quality components on graphics cards is the key factor for having a long lasting, stable and reliable product. GIGABYTE is again setting a new standard by using Ferrite Core Chokes, Low RDS (on) MOSFET and Lower ESR Solid Capacitors, providing the stability and reliability from their high-end graphics solution. In addition, GIGABYTE Ultra Durable 2 Edition graphics cards now feature a sophisticated layout topology. This state-of-the-art power design from GIGABYTE delivers the ideal integration of thermal, electric characteristics, digital signals, power circuitry and optimum placement of components for enhanced graphics performance."
- from www.gigabyte-usa.com


HH Test System and 3DMark06 Performance

For testing the Gigabyte GV-NX88T512HP, we used an Intel Core 2 Duo E6400 (2.13GHz) on an Abit Fatal1ty FP-IN9 SLI motherboard. We also used 2GB of Corsair DDR2 (TWIN2X1024A-5400UL) and a 120GB Maxtor SATA hard drive. The GV-NX88T512HP was compared to an NVIDIA GeForce 8600 GT, an NVIDIA GeForce 8800 GT 256MB, an NVIDIA GeForce 8800 GT 512MB, an ATI Radeon HD 3850, and an ATI Radeon X1950 Pro. Note: Because we are using both DirectX 9 and DirectX 10 capable cards, we will not be enabling DX10 effects in the DX10 games, in order to provide an apples-to-apples comparison.

HotHardware Test System
Intel C2D Powered

Hardware Used:
Processor - Intel Core 2 Duo E6400 (2.13GHz)
Motherboard - Abit Fatal1ty FP-IN9 SLI (nForce 650i SLI chipset)
Video Cards - Gigabyte GV-NX88T512HP, ATI Radeon HD 3850, NVIDIA GeForce 8600 GT, NVIDIA GeForce 8800 GT 256MB, NVIDIA GeForce 8800 GT 512MB, ATI Radeon X1950 Pro
Memory - 2048MB Corsair XMS2 DDR2-675MHz
Audio - Integrated on board
Hard Drive - Maxtor DiamondMax Plus 9, 120GB - 7200RPM - SATA

Relevant Software:
Operating System - Windows Vista
Chipset Drivers - nForce Drivers v8.43
DirectX - DirectX 10
Video Drivers - NVIDIA Forceware v169.25, ATI Catalyst v8.1

Benchmarks Used:
Synthetic (DX) - 3DMark06 v1.1.0
DirectX - Crysis SP Demo
DirectX - Half-Life 2: Episode 2
DirectX - Company of Heroes v1.71
OpenGL - Enemy Territory: Quake Wars v1.4

Performance Comparisons with 3DMark06
Details: http://www.futuremark.com/products/3dmark06/

3DMark06 is the latest version of Futuremark's popular synthetic benchmark. It is updated in a number of ways and now includes not only Shader Model 2.0 tests but also Shader Model 3.0 and HDR tests. Some of the assets from 3DMark05 have been re-used, but the scenes are now rendered with much more geometric detail, and the shader complexity is vastly increased as well. Max shader length in 3DMark05 was 96 instructions, while 3DMark06 ups the number of instructions to 512. 3DMark06 also employs much more lighting, and there is extensive use of soft shadows. With 3DMark06, Futuremark has also updated how the final score is tabulated. In this latest version of the benchmark, the SM 2.0 and HDR / SM 3.0 tests are weighted, and the CPU score is factored into the final tally as well.
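To illustrate how a weighted composite score like this behaves, here is a small sketch. The harmonic-mean structure and the weights below are our illustrative assumptions, not Futuremark's published formula:

```python
# Sketch of how a weighted composite score can be formed from sub-scores.
# The weights and harmonic-mean structure are illustrative assumptions,
# NOT Futuremark's actual 3DMark06 formula.

def composite_score(sm2, hdr_sm3, cpu, w=(1.0, 1.0, 0.5)):
    """Weighted harmonic mean of sub-scores; a heavier weight = more influence."""
    return sum(w) / (w[0] / sm2 + w[1] / hdr_sm3 + w[2] / cpu)

# Hypothetical sub-scores for two cards: a clearly faster SM 3.0 result
# lifts the composite even when the SM 2.0 test is CPU-bound and the
# SM 2.0 scores are nearly identical.
print(composite_score(4500, 4200, 2500))
print(composite_score(4550, 4800, 2500))
```

This is why, in our results below, the SLI setup's big SM 3.0 gain still only moves the overall score modestly: the CPU-bound SM 2.0 and CPU sub-scores drag the composite back toward the pack.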

Despite its overclock, the GV-NX88T512HP doesn't score all that much better than the reference 512MB 8800 GT. Interestingly, the GV-NX88T512HP SLI configuration doesn't offer much of a performance boost either, because the SM 2.0 test is CPU-bound. The SM 3.0 test, however, does show a significant gain. Although the overall gains weren't huge, the GV-NX88T512HP is the best performer in our test line-up when it comes to 3DMark06.

Company of Heroes Performance


Performance Comparisons with Company of Heroes
Details: http://www.companyofheroesgame.com/

Company of Heroes
Relic Entertainment's World War II era real-time strategy game Company of Heroes was originally released as a DirectX 9 title for Windows, but recent updates to the game have incorporated support for new DirectX 10 features that improve image quality and enhance the game's finer graphical details. The game features a built-in performance test, which we used to attain the results below. Our Company of Heroes tests were run at resolutions of 1280x1024 and 1600x1200 with 4x anti-aliasing and all of the game's image-quality related options set to high.

As expected, the GV-NX88T512HP proves to be the best performer in Company of Heroes. It outperforms the reference 512MB 8800 GT by 4-8 FPS. The SLI configuration also shows some nice gains at 1600x1200.

ET: Quake Wars Performance


Performance Comparisons with ET: Quake Wars
Details: http://www.enemyterritory.com/

Enemy Territory: Quake Wars
Enemy Territory: Quake Wars is based on id's radically enhanced Doom 3 engine and is viewed by many as Battlefield 2 meets the Strogg, and then some. In fact, we'd venture to say that id took EA's team-based warfare genre up a notch or two. ET: Quake Wars also marks the introduction of John Carmack's "Megatexture" technology, which employs extremely large environment and terrain textures that cover vast areas of maps without the need to repeat and tile many small textures. The beauty of megatexture technology is that each unit only takes up a maximum of 8MB of frame buffer memory. Add to that HDR-like bloom lighting and leading-edge shadowing effects, and Enemy Territory: Quake Wars looks great, plays well, and works high-end graphics cards vigorously. The game was tested with all of its in-game options set to their maximum values, with soft particles enabled in addition to 8x anti-aliasing and 16x anisotropic filtering.

You probably noticed the absence of some of the cards in this graph. We had some issues with the Radeons here, so we omitted the results from this test. As you can see, the results scale well for the cards we tested. The GV-NX88T512HP outperforms the reference 512MB 8800 GT by a couple of FPS, and it blows away the 256MB 8800 GT. The SLI configuration really shows what it can do by offering an over 50% boost in performance.

Half-Life 2: EP 2 Performance


Performance Comparisons with Half-Life 2: Episode 2
Details: http://www.half-life2.com/

Half-Life 2: Episode 2
Thanks to the dedication of hardcore PC gamers and a huge mod community, the original Half-Life became one of the most successful first person shooters of all time. And thanks to an updated game engine, gorgeous visuals, and intelligent weapon and level design, Half-Life 2 became just as popular. Episode 2 offers a number of visual enhancements, including better looking transparent texture anti-aliasing. These tests were run at resolutions of 1280x1024 and 1600x1200 with 4x anti-aliasing and 8x anisotropic filtering enabled concurrently. Color correction and HDR rendering were also enabled in the game engine. We used a custom recorded timedemo file to benchmark all cards in this test.

Once again, the GV-NX88T512HP beats out the competition in our test system, but this time it doesn't win all of the tests. The Radeon HD 3850 dominates the 1280x1024 no AA / no AF test by a wide margin. The GV-NX88T512HP puts a mark in the win column for the three other settings. The GV-NX88T512HP SLI configuration doesn't show off much until we run the 1600x1200 4x AA / 8x AF / HDR test. At this setting, the pair of GV-NX88T512HPs beat the rest of the pack by 18 FPS or more.

Crysis Performance


Performance Comparisons with Crysis
Details: http://www.ea.com/crysis

If you're at all into enthusiast computing, the highly anticipated single-player demo of the hot new FPS smash-hit Crysis should require no introduction. Crytek's game engine visuals are easily the most impressive real-time 3D renderings we've seen on a computer screen to date. The engine employs some of the latest techniques in 3D rendering, like Parallax Occlusion Mapping, Subsurface Scattering, Motion Blur and Depth-of-Field effects, as well as some of the most impressive use of Shader technology we've seen yet. In short, for those of you that want to skip the technical jib-jab, Crysis is HOT. We ran the SP demo with all of the game's visual options set to 'High' to put a significant load on the graphics cards being tested.

With Crysis and our test system, a 512MB video buffer is a must to get anywhere near playable framerates. You can see that the GV-NX88T512HP dominates this test, and that the SLI configuration doesn't really flex its muscles until we run the game at 1600x1200.

Overclocking the GV-NX88T512HP


Overclocking the GV-NX88T512HP
Going beyond the stock settings...

Recall that the reference 512MB 8800 GT's GPU clock is 600 MHz. Its memory clock is 900 MHz (1800 MHz DDR), and its shader clock is 1500 MHz. The GV-NX88T512HP comes factory overclocked at 700 MHz (GPU), 920 MHz (1840 MHz DDR) for the memory, and 1700 MHz (shader clock). But we used Gigabyte's Gamer HUD software to overclock the GV-NX88T512HP even further. Once we hit a roadblock, we upped the voltage to 1.2V and were able to reach even higher clocks. Our final stable overclock was 800 MHz for the GPU, 1030 MHz (2060 MHz DDR) for the memory, and 1775 MHz for the shaders, which is pretty impressive.
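Those memory clocks translate into peak theoretical bandwidth as follows. This is simple clock × bus-width arithmetic using the card's 256-bit interface from the spec sheet:

```python
# Peak theoretical memory bandwidth implied by each memory clock:
# effective DDR rate x bus width, using the card's 256-bit interface.

def bandwidth_gb_s(mem_clock_mhz, bus_bits=256, ddr_multiplier=2):
    """Peak theoretical bandwidth in GB/s (1 GB = 1e9 bytes here)."""
    return mem_clock_mhz * 1e6 * ddr_multiplier * (bus_bits / 8) / 1e9

print(f"reference:   {bandwidth_gb_s(900):.1f} GB/s")   # stock 8800 GT
print(f"factory OC:  {bandwidth_gb_s(920):.1f} GB/s")
print(f"our best OC: {bandwidth_gb_s(1030):.1f} GB/s")
```

Our 1030 MHz memory overclock lifts peak bandwidth from the reference 57.6 GB/s to roughly 65.9 GB/s, a healthy 14% more headroom for high resolutions and anti-aliasing.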

In the graphs above, you can see how much the overclock helped with performance. The performance gained in 3DMark06 isn't that impressive, but the results in ET: Quake Wars are decent. At 1280x1024, the results are 11% higher, and at 1600x1200, the results are 7.5% higher.

Performance Summary and Conclusion


Performance Summary: The Gigabyte GV-NX88T512HP performed much as we had expected. Based on our test results, we can conclude that the GV-NX88T512HP provides excellent performance. That performance can be attributed to the generous factory overclock on the card, which gives it an advantage over any "reference" GeForce 8800 GT. SLI performance was also good overall and showed solid gains, especially at the higher resolutions.

There is quite a bit to like about the GV-NX88T512HP. It features a nice factory overclock, the ability to increase the GPU voltage via Gigabyte's Gamer HUD software, and high quality electrical components. In addition to the aforementioned items, the GV-NX88T512HP also features a custom Zalman cooler and a PCB that is shorter than the reference design, it offers excellent performance, and it carries a 3-year warranty - all for a price of around $250, which seems about average for an overclocked 8800 GT. The only feature on this card that would make us hesitate before outright recommending it is its cooler. After spending a good amount of time with this card, we feel the cooler's size and noise level may not appeal to some users. At the same time, Zalman makes quality coolers, and we know many of you are fans of its products. Additionally, we don't think most people have systems that are quiet enough to notice the fan noise of this card, but it is louder than we had expected, so we wanted to make you all aware.

So, why didn't Gigabyte just go with a reference cooler? Well, we didn't ask, but we don't mind speculating a little. For one, the Zalman cooler does a better job cooling (7°C cooler at idle and 3°C cooler under load), which is important when you allow users to boost voltage. And secondly (and perhaps more importantly), the GV-NX88T512HP's PCB is not a reference length, which probably means the reference cooler wasn't an option.

Overall, Gigabyte has left us quite impressed with its efforts on the GV-NX88T512HP. We hope that the company continues to innovate and motivate its competitors to do the same. If you want one of the most innovative and best performing 8800 GTs on the market, and don't need the lowest profile or absolute quietest card, then we recommend that you check out the Gigabyte GV-NX88T512HP.


Pros:
•  Excellent performance
•  Factory overclocked
•  Good overclocker
•  Gamer HUD software provides voltage control
•  3-year warranty
•  High quality electrical components
•  DirectX 10 support

Cons:
•  Cooler is somewhat loud
•  2-slot card

Content Property of HotHardware.com