Gigabyte GV-NX78T256V-B 7800GT
Date: Nov 23, 2005
Author: Robert Maloney


The big headlines in today's video card-related articles almost always come down to seeing double, whether it's two NVIDIA GeForce cards connected in SLI or two ATi Radeons running in CrossFire mode.  Perhaps it's justified, in that people always want to see higher framerates and increased performance, regardless of the cost.  Just look back at what they did to our pal, Steve Austin.  When they "rebuilt" the Six Million Dollar Man, they didn't just give him the powers of an everyday man.  They made him "better, stronger, faster".

While those kinds of articles always make for exciting reading, the average PC user knows that at the end of the day, their purchases will invariably fall below these heady systems that get all the fanfare.  For most enthusiasts, it's simply not feasible to consistently upgrade to the fastest or most powerful hardware available.  So today, we're not looking at dual-core, dual-GPU monster machines, but instead at a powerful video card from Gigabyte that hummed right through our benchmark suite.  We present the Gigabyte GV-NX78T256V-B...

Gigabyte GV-NX78T256V-B Specifications
A card for the masses
Core GPU Clock - 405MHz
256MB - GDDR3 Memory @ 1 GHz (500MHz DDR)

NVIDIA CineFX 4.0 Shading Architecture
• Vertex Shaders
• Support for Microsoft DirectX 9.0 Vertex Shader 3.0
• Displacement mapping
• Geometry instancing
• Infinite length vertex programs
• Pixel Shaders
• Support for DirectX 9.0 Pixel Shader 3.0
• Full pixel branching support
• Support for Multiple Render Targets (MRTs)
• Infinite length pixel programs
• Next-Generation Texture Engine
• Accelerated texture access
• Up to 16 textures per rendering pass
• Support for 16-bit floating point format and 32-bit floating point format
• Support for non-power of two textures
• Support for sRGB texture format for gamma textures
• DirectX and S3TC texture compression

• Full 128-bit studio-quality floating point precision through the entire rendering pipeline with native hardware support for 32bpp, 64bpp, and 128bpp rendering modes

API Support
• Complete DirectX support, including the latest version of Microsoft DirectX 9.0 Shader Model 3.0
• Full OpenGL support, including OpenGL 2.0

64-Bit Texture Filtering and Blending
• Full floating point support throughout entire pipeline
• Floating point filtering improves the quality of images in motion
• Floating point texturing drives new levels of clarity and image detail
• Floating point frame buffer blending gives detail to special effects like motion blur and explosions

NVIDIA Intellisample 4.0 Technology
• Advanced 16x anisotropic filtering (with up to 128 taps)
• Blistering-fast antialiasing and compression performance
• Gamma-adjusted rotated-grid antialiasing removes jagged edges for incredible image quality
• Transparent multisampling and transparent supersampling modes boost antialiasing quality to new levels
• Support for normal map compression
• Support for advanced lossless compression algorithms for color, texture, and z-data at even higher resolutions and frame rates
• Fast z-clear

NVIDIA UltraShadow II Technology
• Designed to enhance the performance of shadow-intensive games

NVIDIA Digital Vibrance Control (DVC) 3.0 Technology
• DVC color controls
• DVC image sharpening controls

NVIDIA SLI Technology
• Patented hardware and software technology allows two GPUs to run in parallel to scale performance
• Scales performance on over 60 top PC games and applications

NVIDIA PureVideo Technology
• Adaptable programmable video processor
• High-definition MPEG-2 and WMV9 hardware acceleration
• Spatial-temporal de-interlacing
• Inverse 2:2 and 3:2 pull-down (Inverse Telecine)
• 4-tap horizontal, 5-tap vertical scaling
• Overlay color temperature correction
• Microsoft Video Mixing Renderer (VMR) supports multiple video windows with full video quality and features in each window
• Integrated HDTV output

Composited Desktop Hardware Engine
• Video post-processing
• Real-time desktop compositing
• Accelerated antialiased text rendering
• Pixel shader-driven special effects and animation

Advanced Display Functionality
• Dual integrated 400MHz RAMDACs for display resolutions up to and including 2048x1536 at 85Hz
• Dual DVO ports for interfacing to external TMDS transmitters and external TV encoders
• Full NVIDIA nView multi-display technology capability

Advanced Engineering
• Designed for PCI Express x16
• Designed for high-speed GDDR3 memory

Operating Systems
• Windows XP/Windows XP 64
• Windows ME
• Windows 2000
• Macintosh OS X

Software Bundle
• SpellForce - The Order of Dawn
• Xpand Rally
• Power Director 3
• Gigabyte Driver CD

Adapter/Cable Bundle
• VIVO cable w/ breakout box
• DVI adaptor
• Power cord



The GV-NX78T256V-B is one of Gigabyte's latest video cards powered by an NVIDIA GPU, the 7800GT.  The 7800GT differs from the 7800GTX by having four fewer pipelines and one less vertex shader, as well as core and memory speeds that are typically about 30MHz slower for the GPU and 100MHz slower for the memory, comparing NVIDIA's reference specifications. There doesn't seem to be any way to "turn on" those extra pipelines just yet, so if you want the extra bandwidth you'll need to pay the price for the GTX model. However, the lower price point of the 7800GT has made it a prime candidate for SLI users: buying two 7800GTs instead of two 7800GTXs can often result in savings close to $200.  In keeping with our cost-conscious theme, we've only included single-card performance on the pages to come, comparing the 7800GT to the card it supplanted, the 6800GT, as well as to one from the competition: ATi's Radeon X850XT.
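That memory clock gap translates directly into peak memory bandwidth.  As a rough sketch, using the reference clocks mentioned above (7800GT memory at 500MHz DDR, or 1GHz effective; 7800GTX roughly 100MHz higher) and assuming the 7800 series' 256-bit memory bus, a detail not stated in the spec sheet:

```python
# Peak memory bandwidth sketch for the 7800GT vs. 7800GTX.
# Assumption (not from the review): both cards use a 256-bit memory bus.

def memory_bandwidth_gbs(effective_clock_mhz, bus_width_bits=256):
    """Peak bandwidth in GB/s: bus width in bytes x transfers per second."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

gt_bw = memory_bandwidth_gbs(1000)    # 7800GT:  1GHz effective  -> 32.0 GB/s
gtx_bw = memory_bandwidth_gbs(1200)   # 7800GTX: 1.2GHz effective -> 38.4 GB/s
print(f"7800GT:  {gt_bw:.1f} GB/s")
print(f"7800GTX: {gtx_bw:.1f} GB/s")
```

That works out to roughly a 20% bandwidth advantage for the GTX, which is the gap you pay extra for.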

Inspection of the contents


The Gigabyte GV-NX78T256V-B Close-up
When in Rome, do as the Romans do.



The Gigabyte GV-NX78T256V-B looks to be strictly based on the NVIDIA GeForce 7800 GT reference design, with nothing more than some fancy graphics placed over the standard copper heatsink/fan combo. There doesn't seem to be the same attention to detail spent on this card as on some of Gigabyte's previous models, such as the custom fanless heatsink on the GV-RX80L512V X800XL or the truly beautiful GV-NX68T256DH GeForce 6800 GT, with its dark PCB and golden HSF.  Perhaps Gigabyte simply wanted to bring this card out quickly and will come out with a new, custom revision sometime down the road; only time will tell.

The layout is strictly by the numbers.  The oversized heatsink covers not only the GPU, running at a fairly modest 400MHz, but the memory chips as well.  Air is funneled in via the relatively small fan and forced out of the vents towards the rear of the card.  It's not the quietest card, as the fan spins quite quickly in order to generate the airflow needed to cool the components, but it's no ear-buster either.  Finally, as with all other current GeForce models, there's a gold connector along the top of the card that is used for connecting two cards in SLI mode.  Interestingly, two identical cards are no longer needed to run in SLI -- starting with the ForceWare 80 drivers, users can pair cards from different vendors.


We laid the Gigabyte GV-NX78T256V-B down directly next to the MSI NX6800GT that will be used for benchmark comparisons, and size-wise the two cards are nearly identical.  Thus, if you had space issues with the length of the 6800GT, you'll undoubtedly run into the same issues with the 7800GT.  Both cards take up only one slot, as the heatsink is heavy, yet slim enough to avoid touching a card placed nearby.  With so little attention paid to third-party add-in cards these days, this is barely even an issue, but when using SLI it's good to know there's decent room for airflow between the two graphics cards.  Typical of most PCI-e cards, a 6-pin power connector is placed right at the upper edge.  Most newer power supplies come with at least one 6-pin power cable, but Gigabyte has provided an adapter for those who don't have one.

Gigabyte's Software Bundle
A reason for concern?



The bundle, much like the card itself, provides sufficient resources without doing anything to actually stand out and make potential buyers take notice.  Bundled applications include Cyberlink's Power Director 3 and PowerDVD 6, the latter of which is found on Gigabyte's driver disc.  There are none of the other custom applications we've become accustomed to, such as overclocking or temperature/voltage monitoring software.  The game "collection" is also a bit light, throwing in some mostly unknown titles such as SpellForce - The Order of Dawn, an RPG that typically doesn't push the graphics barrier, and Xpand Rally, a racing game that looks great, although rally racing has a smaller following in the U.S. than it does in Europe.


On the hardware front, there's basically the minimum that one needs to get by: a power cord that converts two MOLEX power cables into a 6-pin PCI Express connector, and a DVI-to-VGA adaptor.  Gigabyte also threw in an HDTV/VIVO cable, allowing the card to be used for video editing. As you can see, there's little here to make Gigabyte's GV-NX78T256V-B stand out from the competition.

Test Setup and 3DMark05


HOW WE CONFIGURED THE TEST SYSTEM: We tested the Gigabyte GV-NX78T256V-B on an ASUS P5N32-SLI motherboard, based on the nForce4 SLI X16 Intel Edition chipset, powered by an Intel Pentium 4 550 processor and 1GB of low-latency Corsair XMS2 DDR2 memory. The first thing we did when configuring this test system was enter the BIOS and load the "High Performance Defaults."  The hard drive was then formatted and Windows XP Professional with SP2 was installed. When the installation was complete, we installed the latest nForce4 chipset drivers, installed all of the other necessary drivers for the rest of our components, and removed Windows Messenger from the system. Auto-Updating and System Restore were then disabled, the hard drive was defragmented, and a 1536MB permanent page file was created on the same partition as the Windows installation. Lastly, we set Windows' Visual Effects to "best performance," installed all of the benchmarking software, and ran the tests.

The HotHardware Test System
The Everyday Man's System

Hardware Used:
Processor - Intel Pentium 4 550 (3.4GHz)
Motherboard - ASUS P5N32-SLI Deluxe (nForce4 SLI X16 chipset)
Video Cards - Gigabyte GV-NX78T256V-B; MSI NX6800GT (GeForce 6800 GT); ATI Radeon X850XT PE
Memory - 1024MB Corsair XMS2 DDR2
Audio -
Hard Drive - Seagate Barracuda V (7,200RPM - SATA)

Relevant Software:
Operating System - Windows XP Professional SP2
Chipset Drivers - nForce Drivers v7.13
DirectX - DirectX 9.0c
Video Drivers - NVIDIA ForceWare v81.85; ATI Catalyst v5.10

Benchmarks Used:
Synthetic (DX) - 3DMark05 v1.2.0
DirectX - Halo v1.06
DirectX - Splinter Cell: Chaos Theory v1.04
DirectX - FarCry v1.33*
DirectX - Half Life 2*
OpenGL - Doom 3 v1.3*
OpenGL - Chronicles of Riddick v1.1*

* - Custom Test (HH Exclusive demo)

Performance Comparisons with 3DMark05 v1.2.0
Details: http://www.futuremark.com/products/3dmark05/

3DMark05 is the latest installment in a line of synthetic 3D graphics benchmarks from Futuremark Corp. that dates back to late 1998.  The benchmark itself isn't based on a playable game engine, but it does boast fairly impressive DX9-driven visuals and lighting effects.  It's also a versatile DirectX 9 benchmarking tool that we feel is relevant in terms of presenting our readers with a full performance profile for 3D graphics and host processors. It's not a metric that you can exclusively gauge real-world performance with, but rather a complement to our entire suite of application-based testing.  We ran 3DMark05's default test (1,024 x 768) on all of the cards and configurations we tested.

As you can see in the scores above, the Gigabyte GV-NX78T256V-B easily outperformed not only the older MSI NX6800GT, but picked up an extra 10% over ATi's X850XT as well.  It's a clear indication of how the additional pipelines of the 7800GT (20) keep it from getting bogged down like the X850XT or 6800GT (16 pipes each).  The X850XT is clocked higher on both core and memory, but still can't quite compete in this synthetic benchmark.
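The pipeline-versus-clock tradeoff can be sketched as theoretical pixel fillrate (pipelines times core clock).  The clocks below are commonly cited reference values for these parts, an assumption on our part rather than figures from the charts; note that raw fillrate alone puts the X850XT roughly even with the 7800GT, so the 3DMark05 gap also reflects the newer GPU's higher per-pipeline shader throughput:

```python
# Theoretical pixel fillrate = pixel pipelines x core clock.
# Clocks are assumed reference values: 7800GT 400MHz, 6800GT 350MHz, X850XT 520MHz.

def fillrate_gpix(pipes, core_mhz):
    """Peak pixel fillrate in gigapixels per second."""
    return pipes * core_mhz * 1e6 / 1e9

cards = {
    "7800GT (20 pipes @ 400MHz)": fillrate_gpix(20, 400),  # 8.0 Gpix/s
    "6800GT (16 pipes @ 350MHz)": fillrate_gpix(16, 350),  # 5.6 Gpix/s
    "X850XT (16 pipes @ 520MHz)": fillrate_gpix(16, 520),  # ~8.3 Gpix/s
}
for name, rate in cards.items():
    print(f"{name}: {rate:.1f} Gpix/s")
```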

Halo v1.06 Performance

Halo 1.06 Performance Tests
All Patched & Ready To Go!

For many gamers, the PC release of Halo marked the end of a long wait, because it was originally released as an Xbox exclusive a few years back. No additional patches or tweaks are needed to benchmark with Halo, as Gearbox has included all of the necessary information in its README file. The Halo benchmark runs through four of the cut-scenes from the game, after which the average frame rate is recorded. We patched the game using the latest v1.06 patch and ran this benchmark twice, once at 1280 x 1024 and then again at 1600 x 1200. Anti-aliasing doesn't work properly with Halo, so all of the tests below were run with anti-aliasing disabled.


Although Halo is getting on in age as a viable benchmark, the framerates of the four individual cut-scenes still make for some good comparisons.  With the Gigabyte GV-NX78T256V-B, we were able to break 100 frames per second at 1280x1024.  The Radeon X850XT was more than 12 frames back, equaling approximately a 10% difference in performance.  Furthermore, there was better than a 20% difference between the 7800GT and the 6800GT.  Roughly the same breakdown was seen at 1600x1200 as well.

Splinter Cell : Chaos Theory


Performance Comparisons with Splinter Cell: Chaos Theory v1.04
Details: http://www.splintercell3.com/us/

SC: Chaos Theory
Splinter Cell: Chaos Theory uses a heavily modified version of Epic's Unreal Engine, enhanced with a slew of DX9-class shaders, dynamic lighting, and mapping effects. The game produces some gorgeous imagery with its very immersive, albeit dark, environments. We should note that the game engine has a Shader Model 3.0 code path that allows the GeForce 6 and 7 series of cards to really shine, but it wasn't until the latest v1.04 patch was released that a Shader Model 2.0 path was added to take advantage of ATI's X8x0 hardware. For the test results listed below, we turned off HDR rendering and parallax mapping to somewhat level the playing field, and benchmarked the game at resolutions of 1280 x 1024 and 1600 x 1200, both with and without anti-aliasing and anisotropic filtering.


With Splinter Cell: Chaos Theory, we see that with the current generation of cards, anti-aliasing and anisotropic filtering no longer create as large a performance drop as they did in the past.  Sure, there is a definite drop-off in frame rates, but even the 6800GT is able to produce an almost playable 34.27fps at 1600x1200.  More impressively, the GV-NX78T256V-B at 1600x1200 with 4xAA/16x aniso outruns both the NX6800GT and the X850XT when they aren't using any additional pixel processing at all.

Far Cry v1.33 Comparisons


Performance Comparisons with FarCry v1.33
Details: http://www.farcry.ubi.com/

FarCry is one of the most visually impressive games to be released on the PC platform to date, courtesy of its proprietary engine, dubbed "CryEngine" by its developers. FarCry's game-play is enhanced by graphics with Polybump mapping, advanced environment physics, destructible terrain, dynamic lighting, motion-captured animation, and surround sound. The game is rife with lush, brightly lit outdoor environments, and high-poly models that are capable of taxing most of today's high-end 3D graphics hardware. We benchmarked the cards in this review with a custom-recorded demo run taken in the "Catacombs" area checkpoint, at various resolutions without anti-aliasing or anisotropic filtering enabled, and then with 4X AA and 16X aniso enabled concurrently.



Far Cry is a far cry from the norm, as we see the X850XT putting up numbers much more comparable to the GV-NX78T256V-B's at 1280x1024, and then pushing ahead at higher resolutions.  At the lower settings, all cards were in the upper 80s, leading us to believe that those tests were more CPU-limited.  Only when AA was applied did the 6800GT-based card drop off some.  At 1600x1200, both of the NVIDIA GPU-based cards started to fall behind even further.

Doom 3 Testing


Performance Comparisons with Doom 3 - Single Player
Details: http://www.doom3.com/

Doom 3
Unlike the vast majority of titles currently shipping on the PC platform, which are based on Microsoft's DirectX API, id Software's Doom 3 is powered by an OpenGL-based game engine. The Doom 3 engine is capable of producing extremely realistic visuals through the use of highly detailed textures and a unified lighting and shadowing system, with complex animations and scripting that generate real-time, fully dynamic per-pixel lighting and stencil shadowing. We ran this batch of single-player Doom 3 benchmarks using one of our own custom-recorded demos with the game set to its "High Quality" mode, at resolutions of 1280 x 1024 and 1600 x 1200 without anti-aliasing enabled, and again with 4X AA and 16X aniso enabled simultaneously.


The GeForce cards redeemed themselves in Doom 3, long known as an NVIDIA-friendly benchmark.  Gigabyte's entry quickly asserted itself, handily winning each and every set of tests by margins of nearly 10-15 frames over the competition.  ATi's X850XT usually took up the rear, well behind the other two cards at 1280x1024, and more comparable to the 6800GT than to the newer 7800GT at 1600x1200.

Chronicles of Riddick


Performance Comparisons with Chronicles of Riddick: Escape From Butcher Bay
Details: http://www.riddickgame.com/

Chronicles of Riddick
Starbreeze Studios is responsible for creating the surprisingly impressive Chronicles of Riddick: Escape From Butcher Bay. Riddick is a first-person shooter powered by the company's proprietary Starbreeze Engine. This engine is currently being used for all projects at Starbreeze and is geared toward indoor or semi-indoor game environments, with per-light-source light-mapping and high-quality shading with normal mapping. According to Starbreeze Studios, the engine has been under development for over seven years and is currently in use on the PC, PS2, Xbox, and GameCube platforms, and is being ported to next-generation platforms as well. Chronicles of Riddick: Escape From Butcher Bay has impressive visuals, a mature story line, and has proven to be a tough challenge.



In many ways, Chronicles of Riddick mimics what we saw in the Doom 3 testing.  OpenGL games simply run better on GeForce-based video cards, leaving ATi's Radeon X850XT in the dust. To put it into numbers, the GV-NX78T256V-B betters the X850XT by 35-40% at 1280x1024, and by approximately 28% at 1600x1200, with or without any additional pixel processing.  As with Doom 3, however, even the NX6800GT outperforms the X850XT, so we will have to look at ATi's newer releases in the future to see how they stack up.

Half-Life 2 Benchmarks


Performance Comparisons with Half-Life 2
Details: http://www.half-life2.com/

Half Life 2
Half-Life 2 is a game that needs no introduction. The underlying technology behind HL2, however, may not be as well known. The Source Engine powering Half-Life 2 is responsible for the game's realistic visuals, with support for Shader Model 2.0 shaders, bump mapping, and cube and environment mapping. Half-Life 2 also makes use of dynamic lights, vertex lighting and light maps, and water with refraction and Fresnel effects, among a host of other rendering features. We benchmarked Half-Life 2 with a long, custom-recorded timedemo taken in the "Canals" map that takes us through both outdoor and indoor environments. These tests were run at resolutions of 1280 x 1024 and 1600 x 1200 with 4X anti-aliasing and 16X anisotropic filtering enabled concurrently.


Half-Life 2 provides another twist, with the GV-NX78T256V-B falling in behind the Radeon X850XT in three out of the four tests.  The variances ranged anywhere from 5 to 11%, with the largest delta coming in the 1600x1200 test with anti-aliasing and anisotropic filtering applied.  Looking at the numbers, the latest generation of cards, including the 7800GT, is quite capable of handling Half-Life 2's rendering and shading techniques.  The lowest frame rate we recorded was 63fps -- still a full third faster than the year-older 6800GT.

Overclocking Results


Overclocking the GV-NX78T256V-B
Who needs the big, bad 7800GTX?

Without any tools provided by Gigabyte, overclocking the GV-NX78T256V-B was achieved using the Coolbits registry tweak.  We went into the driver settings and raised the core and memory clock speeds until we began to see artifacts or experienced other instability.  At that point, we backed off a few MHz until we had a completely artifact-free environment.  Our final results: we moved the GPU clock up to 473MHz, and the memory to 565MHz (1130MHz effective).  To see what kind of gains this achieved, we re-ran two benchmarks: one synthetic test and one real-world game.
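For reference, that headroom works out to sizable percentage gains over the stock clocks.  A quick sketch of the arithmetic, taking the stock speeds from the spec sheet (405MHz core, 500MHz memory):

```python
# Overclocking headroom on the GV-NX78T256V-B, using clocks from the text.
# Stock: 405MHz core / 500MHz memory (per Gigabyte's spec sheet).
# Overclocked: 473MHz core / 565MHz memory (1130MHz effective).

def gain_pct(stock_mhz, oc_mhz):
    """Percentage increase from stock to overclocked speed."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

core_gain = gain_pct(405, 473)   # ~16.8% core overclock
mem_gain = gain_pct(500, 565)    # ~13.0% memory overclock
print(f"Core:   +{core_gain:.1f}%")
print(f"Memory: +{mem_gain:.1f}%")
```

Clock gains in the mid-teens line up reasonably well with the 10-15% benchmark improvements we measured below.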




In the end, we wound up making a fast card faster.  The 7800GT had already ruled much of the testing, but raising the core and memory speeds as far as we did netted us an additional 10-15% boost in performance.  In keeping with the cost-conscious mentality we started out with, we'd point out that at 473MHz/1130MHz, the GV-NX78T256V-B is running at higher clocks than a typical 7800GTX.

Performance Summary & Conclusion


Performance Summary: The Gigabyte GV-NX78T256V-B, like other 7800GT cards, proved to be one of the faster video cards available today. It bests NVIDIA's previous mid-level offering, the 6800GT, by 20-30 percent in most tests. And it out-performed the ATi Radeon X850XT in most cases, only falling slightly behind in a couple of our DirectX-based game benchmarks, namely FarCry and Half Life 2.

We started off looking for a video card that would satisfy the average enthusiast's appetite for performance. Have we found it in the Gigabyte GV-NX78T256V-B?  From a performance point of view, some might say so.  The card, powered by the GeForce 7800GT GPU, has shown itself to be quite capable of dishing out high frame rates. There's a large enough improvement in performance over last year's mid-range offering to help justify selling off a 6800GT and using the cash to upgrade. And we surely loved the single-slot design and quieter operation of the GV-NX78T256V-B compared to ATi's Radeon X850XT.

But what didn't we like?  Well, the price, for one.  From what we've seen online, Gigabyte's card is retailing for approximately $340-$380, which is somewhat more expensive than other manufacturers' GeForce 7800GTs.  Adding to that dilemma, there's nothing about the card or its bundle that's a "must-have."  A standard-looking reference design coupled with a relatively weak software bundle isn't inspiring when faster, pre-overclocked cards from BFG are available for a similar price, or when titles like Call of Duty 2 are included with cards from eVGA and XFX.  In the end, we're giving the Gigabyte GV-NX78T256V-B a HotHardware Heat Meter rating of 7.5, and hope Gigabyte has a new revision of this card in the works with higher clock speeds and a better bundle.

Pros:
• Good performance for the price
• Good overclocker
• Single-slot solution
• SLI capable

Cons:
• Priced higher than other 7800GTs
• Relatively weak bundle
• Doesn't stand out from the crowd


Content Property of HotHardware.com