GeForce 8800 GTX and GTS SLI Performance Update
Date: Dec 18, 2006
Section: Graphics/Sound
Author: Marco Chiappetta
Introduction, Specifications and Related Info

A few short weeks ago, NVIDIA unveiled the GeForce 8800 series of graphics cards and the nForce 600 family of core logic chipsets on the very same day. The flagship parts in each of these product categories, the GeForce 8800 GTX and the nForce 680i SLI, formed the basis of what NVIDIA has dubbed its "Ultimate Gaming Platform". The feature set and initial testing of the GeForce 8800 series of cards and the nForce 680i SLI chipset definitely gave credence to this title, but at the time of the launch NVIDIA hadn't quite worked all of the kinks out of the G80's early drivers. As such, we only included a handful of preliminary SLI benchmarks in our coverage of the launch.

Since then, however, NVIDIA has released a couple of fresh sets of ForceWare drivers for the GeForce 8800 series, so we thought it was a good time to revisit the GTX and GTS to see how a pair of each type of card performed in a dual-GPU SLI configuration. So, armed with dual GeForce 8800 GTX and GeForce 8800 GTS cards and a top-of-the-line testbed, we set out to do just that...

NVIDIA GeForce 8800 Series
Features & Specifications
NVIDIA unified architecture:
Fully unified shader core dynamically allocates processing power to geometry, vertex, physics, or pixel shading operations, delivering up to 2x the gaming performance of prior generation GPUs.


GigaThread Technology:
Massively multi-threaded architecture supports thousands of independent, simultaneous threads, providing extreme processing efficiency in advanced, next generation shader programs.

Full Microsoft DirectX 10 Support:
World's first DirectX 10 GPU with full Shader Model 4.0 support delivers unparalleled levels of graphics realism and film-quality effects.

NVIDIA SLI Technology:
Delivers up to 2x the performance of a single graphics card configuration for unequaled gaming experiences by allowing two cards to run in parallel. The must-have feature for performance PCI Express graphics, SLI dramatically scales performance on today's hottest games.

NVIDIA Lumenex Engine:
Delivers stunning image quality and floating point accuracy at ultra-fast frame rates.
16x Anti-aliasing: Lightning fast, high-quality anti-aliasing at up to 16x sample rates obliterates jagged edges.

128-bit floating point High Dynamic-Range (HDR):
Twice the precision of prior generations for incredibly realistic lighting effects - now with support for anti-aliasing.

NVIDIA Quantum Effects Technology:
Advanced shader processors architected for physics computation enable a new level of physics effects to be simulated and rendered on the GPU - all while freeing the CPU to run the game engine and AI.

NVIDIA ForceWare Unified Driver Architecture (UDA):
Delivers a proven record of compatibility, reliability, and stability with the widest range of games and applications. ForceWare provides the best out-of-box experience and delivers continuous performance and feature updates over the life of NVIDIA GeForce GPUs.

OpenGL 2.0 Optimizations and Support:
Ensures top-notch compatibility and performance for OpenGL applications.

NVIDIA nView Multi-Display Technology:
Advanced technology provides the ultimate in viewing flexibility and control for multiple monitors.

PCI Express Support:
Designed to run perfectly with the PCI Express bus architecture, which doubles the bandwidth of AGP 8X to deliver over 4 GB/sec. in both upstream and downstream data transfers (see the quick bandwidth arithmetic just after this feature list).

Dual 400MHz RAMDACs:
Blazing-fast RAMDACs support dual QXGA displays with ultra-high, ergonomic refresh rates - up to 2048x1536@85Hz. 

Dual Dual-link DVI Support:
Able to drive the industry's largest and highest resolution flat-panel displays up to 2560x1600.

Built for Microsoft Windows Vista:
NVIDIA's fourth-generation GPU architecture built for Windows Vista gives users the best possible experience with the Windows Aero 3D graphical user interface.

NVIDIA PureVideo HD Technology:
The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity, smooth video, accurate color, and precise image scaling for movies and video.

Discrete, Programmable Video Processor:
NVIDIA PureVideo HD is a discrete programmable processing core in NVIDIA GPUs that provides superb picture quality and ultra-smooth movies with low CPU utilization and power.

Hardware Decode Acceleration:
Provides ultra-smooth playback of H.264, VC-1, WMV and MPEG-2 HD and SD movies.

HDCP Capable:
Designed to meet the output protection management (HDCP) and security specifications of the Blu-ray Disc and HD DVD formats, allowing the playback of encrypted movie content on PCs when connected to HDCP-compliant displays.

Spatial-Temporal De-Interlacing:
Sharpens HD and standard definition interlaced content on progressive displays, delivering a crisp, clear picture that rivals high-end home-theater systems.

High-Quality Scaling:
Enlarges lower resolution movies and videos to HDTV resolutions, up to 1080i, while maintaining a clear, clean image. Also provides downscaling of videos, including high-definition, while preserving image detail.

Inverse Telecine (3:2 & 2:2 Pulldown Correction):
Recovers original film images from films-converted-to-video (DVDs, 1080i HD content), providing more accurate movie playback and superior picture quality.

Bad Edit Correction:
When videos are edited after they have been converted from 24 to 25 or 30 frames, the edits can disrupt the normal 3:2 or 2:2 pulldown cadences. PureVideo HD uses advanced processing techniques to detect poor edits, recover the original content, and display perfect picture detail frame after frame for smooth, natural looking video.

Video Color Correction:
NVIDIA's Color Correction Controls, such as Brightness, Contrast and Gamma Correction let you compensate for the different color characteristics of various RGB monitors and TVs ensuring movies are not too dark, overly bright, or washed out regardless of the video format or display type.

Integrated SD and HD TV Output:
Provides world-class TV-out functionality via Composite, S-Video, Component, or DVI connections. Supports resolutions up to 1080p depending on connection type and TV capability.

Noise Reduction:
Improves movie image quality by removing unwanted artifacts.

Edge Enhancement:
Sharpens movie images by providing higher contrast around lines and objects.
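
As a quick sanity check of the PCI Express bandwidth claim above, the short Python sketch below walks through the standard published rates for a PCIe 1.x x16 slot and AGP 8X. These are generic interface figures, not NVIDIA-supplied numbers, so treat it as back-of-the-envelope arithmetic only.

```python
# Back-of-the-envelope check of the "over 4 GB/sec" PCI Express claim, using
# the standard published rates for PCIe 1.x and AGP 8X (illustrative only).

PCIE_LANE_RATE_GTPS  = 2.5      # PCIe 1.x signaling rate per lane (GT/s)
PCIE_ENCODING_FACTOR = 8 / 10   # 8b/10b encoding overhead
PCIE_LANES           = 16       # x16 graphics slot

# Per-direction bandwidth of a PCIe 1.x x16 slot, in GB/s
pcie_x16_gbs = PCIE_LANE_RATE_GTPS * PCIE_ENCODING_FACTOR * PCIE_LANES / 8
# = 2.5 * 0.8 * 16 / 8 = 4.0 GB/s each way (8 GB/s aggregate)

# AGP 8X: 66 MHz base clock, 8x data strobing, 32-bit bus, shared both ways
agp_8x_gbs = 66e6 * 8 * 4 / 1e9   # roughly 2.1 GB/s total

print(f"PCIe x16: {pcie_x16_gbs:.1f} GB/s per direction")
print(f"AGP 8X:   {agp_8x_gbs:.1f} GB/s shared")
print(f"Ratio:    {pcie_x16_gbs / agp_8x_gbs:.1f}x")
```

The numbers line up with the bullet point: roughly 4 GB/s in each direction for PCIe x16, or about double what AGP 8X offered in total.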


NVIDIA G80 Wafer

The GeForce 8800 GTX GPU

The GeForce 8800 GTS GPU

We've already covered the architectural features and benefits of the NVIDIA G80 GPU at the heart of the GeForce 8800 GTS and GeForce 8800 GTX, so we won't go into detail again here. To familiarize yourself with NVIDIA's GeForce 8800 series cards, previous product offerings, and their multi-GPU platform as a whole, please take some time to peruse the articles listed below:

The details and performance data in the articles listed above will give you much of the background information necessary to better understand the GeForce 8800 GTX and GTS SLI configurations we'll be showcasing here today. If you're unclear about anything on the following pages, please refer back to these articles for a multitude of related details.

Our Test System and 3DMark06

HOW WE CONFIGURED THE TEST SYSTEMS: We tested all of the graphics cards used in this article on an EVGA nForce 680i SLI based motherboard powered by a Core 2 Extreme X6800 dual-core processor and 2GB of low-latency Corsair RAM (X1950 CrossFire testing was done on an Abit AW9D-MAX). The first thing we did when configuring the test system was enter the BIOS and set all values to their default settings. Then we manually configured the memory timings and disabled any integrated peripherals that wouldn't be put to use. The hard drive was then formatted, and Windows XP Pro with SP2 and the October DX9 update was installed. When the installation was complete, we then installed the latest chipset drivers available, installed all of the other drivers necessary for the rest of our components, and removed Windows Messenger from the system.  Auto-Updating and System Restore were also disabled, the hard drive was defragmented, and a 1024MB permanent page file was created on the same partition as the Windows installation. Lastly, we set Windows XP's Visual Effects to "best performance," installed all of the benchmarking software and ran the tests.

The HotHardware Test System
Core 2 Extreme Powered

Hardware Used:
Processor - Core 2 Extreme X6800 (2.93GHz)
Motherboard - EVGA nForce 680i SLI (nForce 680i SLI chipset)
Video Cards - GeForce 8800 GTX (x2), GeForce 8800 GTS (x2), Radeon X1950 XTX (CF Master)
Memory - 2048MB Corsair PC2-6400C3 (2 x 1GB)
Audio - Integrated on board
Hard Drive - Western Digital "Raptor" 74GB - 10,000RPM - SATA

Relevant Software:
OS - Windows XP Pro SP2
Chipset Drivers - nForce Drivers v9.53
DirectX - DirectX 9.0c (October Redist.)
Video Drivers - NVIDIA ForceWare v97.44, ATI Catalyst v6.12

Benchmarks Used:
Synthetic (DX) - 3DMark06 v1.0.2
DirectX - FarCry v1.4*
DirectX - F.E.A.R. v1.08
DirectX - Half-Life 2: Episode 1*
OpenGL - Prey v1.2*
OpenGL - Quake 4 v1.3*

* - Custom Test (HH Exclusive demo)
Performance Comparisons with 3DMark06 v1.0.2
Details: http://www.futuremark.com/products/3dmark06/

3DMark06
3DMark06 is the latest addition to the 3DMark franchise. This version differs from 3DMark05 in a number of ways, and now includes not only Shader Model 2.0 tests, but Shader Model 3.0 and HDR tests as well. Some of the assets from 3DMark05 have been re-used, but the scenes are now rendered with much more geometric detail and the shader complexity is vastly increased as well. Max shader length in 3DMark05 was 96 instructions, while 3DMark06 ups the number of instructions to 512. 3DMark06 also employs much more lighting, and there is extensive use of soft shadows. With 3DMark06, Futuremark has also updated how the final score is tabulated. In this latest version of the benchmark, SM 2.0 and HDR / SM3.0 tests are weighted and the CPU score is factored into the final tally as well.
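
To make the weighting idea a bit more concrete, here's a rough Python sketch of how a composite score like 3DMark06's can be assembled from the SM2.0, HDR/SM3.0, and CPU sub-scores. The weights and the harmonic-style blend are illustrative placeholders of ours, not Futuremark's published constants, so don't use this to reproduce official scores.

```python
def composite_score(sm2_score, sm3_hdr_score, cpu_score,
                    graphics_weight=1.7, cpu_weight=0.3):
    """Illustrative weighted blend of 3DMark06-style sub-scores.

    The two graphics suites are averaged and then combined with the CPU
    result using a weighted harmonic mean, so a weak CPU score drags the
    total down even when the GPUs are very fast.  Weights are placeholders,
    not Futuremark's actual constants.
    """
    graphics = (sm2_score + sm3_hdr_score) / 2
    total_weight = graphics_weight + cpu_weight
    return total_weight / (graphics_weight / graphics + cpu_weight / cpu_score)

# Purely hypothetical sub-scores, just to show the behavior:
print(round(composite_score(sm2_score=5000, sm3_hdr_score=5200, cpu_score=2400)))
```

Because the CPU result is blended into the total this way, a run can remain partially CPU-bound even after a second GPU is added, which is worth keeping in mind when reading the SLI numbers below.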

The GeForce 8800 cards showed massive improvements in 3DMark06 moving from one card to two. The GeForce 8800 GTX's score jumped by over 2500 points, and the GTS' score improved by over 3700 points. Why the larger improvement with the GTS? More than likely because the faster GTX pair runs into a CPU limitation in this benchmark, which caps how much a second card can add. The Radeon X1950 XTX's score improved when set up in a CrossFire configuration as well, but it obviously wasn't enough to keep pace with NVIDIA's current flagship cards.

Half Life 2: Episode 1

Performance Comparisons with Half Life 2: Episode 1
Details: http://www.half-life2.com/

Half Life 2: Episode 1
Thanks to the dedication of hardcore PC gamers and a huge mod-community, the original Half-Life became one of the most successful first person shooters of all time.  So, when Valve announced Half-Life 2 was close to completion, gamers the world over sat in eager anticipation. Upon its release, HL2 was universally lauded, and the sequel won an array of "Game of the Year" awards. Armed with the latest episodic update to HL2, Episode 1, we benchmarked the game with a long, custom-recorded timedemo that takes us through both outdoor and indoor environments. These tests were run at resolutions of 1,280 x 1,024 and 1,600 x 1,200 with 4X anti-aliasing and 16X anisotropic filtering enabled concurrently, and with color correction and HDR rendering enabled in the game engine as well.
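
For readers who want to run this kind of test themselves, the sketch below shows one common way to script a Source-engine timedemo from the command line and pull the average framerate out of the console log. The install path, demo name, and log wording here are assumptions on our part rather than part of our actual test harness, so adjust them for your own setup.

```python
import re
import subprocess

# Hypothetical paths and demo name -- adjust for your own installation.
GAME_EXE = r"C:\Games\HL2\hl2.exe"
LOG_FILE = r"C:\Games\HL2\episodic\console.log"

# "-condebug" mirrors console output to console.log, and console commands can
# be passed on the command line with a leading "+", so "+timedemo" queues up
# the pre-recorded demo as soon as the game loads.
args = [GAME_EXE, "-game", "episodic", "-novid", "-condebug",
        "-width", "1600", "-height", "1200",
        "+timedemo", "hh_custom_demo"]

# Blocks until the game exits; quit the game once the demo finishes playing.
subprocess.run(args, check=True)

# The engine appends a summary line containing the average framerate to the
# console log; the exact wording varies between builds, so treat this regex
# as a starting point.
log_text = open(LOG_FILE, encoding="latin-1", errors="ignore").read()
match = re.search(r"([\d.]+)\s*fps", log_text)
if match:
    print(f"Average framerate: {float(match.group(1)):.2f} fps")
else:
    print("No framerate summary found in console.log")
```

In practice we record our own demos in-game ("record <name>" / "stop" at the console) and play the same demo back on every card, which is what makes the numbers on this page directly comparable.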

 

Despite its relatively advanced graphics, Half Life 2: Episode 1 was essentially CPU limited at the lower resolution, hence the similar scores among the dual-GPU platforms.  With the resolution raised to 1600x1200, however, the CPU limitation is somewhat alleviated and the GeForce 8800 series cards pull way ahead of the Radeon X1950 XTX CrossFire configuration.

FarCry v1.4

 

Performance Comparisons with FarCry v1.4
Details: http://www.farcry.ubi.com/

FarCry
If you've been on top of the gaming scene for some time, you probably know that FarCry was one of the most visually impressive games to be released on the PC in the last few years.  Courtesy of its proprietary engine, dubbed "CryEngine" by its developers, FarCry's gameplay is enhanced by Polybump mapping, advanced environment physics, destructible terrain, dynamic lighting, motion-captured animation, and surround sound. Before titles such as Half-Life 2 and Doom 3 hit the scene, FarCry gave us a taste of what was to come in next-generation 3D gaming on the PC. We benchmarked the graphics cards in this article with a fully patched version of FarCry using a custom-recorded demo run taken in the "Catacombs" area checkpoint. The tests were run at various resolutions with 4X AA and 16X aniso enabled concurrently.

 

FarCry is another game that is basically CPU-bound at lower resolutions, which is why the 8800 GTX / GTS SLI and Radeon X1950 XTX CrossFire configurations put up similar scores at 1280x1024. As the resolution is increased though, and the GPUs become the limiting factor in performance, the GeForce 8800 series cards walk away with another clear victory. The performance of both the GeForce 8800 GTX and 8800 GTS cards scaled significantly moving from a single-GPU to a dual-GPU SLI setup.

F.E.A.R v1.08

 

Performance Comparisons with F.E.A.R
More Info: http://www.whatisfear.com/us/

F.E.A.R
One of the most highly anticipated titles of 2005 was Monolith's paranormal thriller F.E.A.R. Taking a look at the game's minimum system requirements, we see that you will need at least a 1.7GHz Pentium 4 with 512MB of system memory and a 64MB graphics card in the Radeon 9000 or GeForce4 Ti class or better to adequately run the game. Using the full retail release of the game patched to v1.08, we put the graphics cards in this article through their paces to see how they fared with a popular title. Here, all graphics settings within the game were set to their maximum values, but with soft shadows disabled (soft shadows and anti-aliasing do not currently work together). Benchmark runs were then completed at resolutions of 1,280 x 960 and 1,600 x 1,200, with anti-aliasing and anisotropic filtering enabled.

 

Score yet another victory for the 8800s. In a single-card configuration, the Radeon X1950 XTX puts up a good fight in comparison to the 8800 GTS, but installing a master card and enabling CrossFire doesn't result in the kind of scaling we see with the NVIDIA-based cards.  The GeForce 8800 GTX and 8800 GTS scaled much better than the Radeons in this test, hence the much larger margins of victory in the dual-GPU SLI versus CrossFire tests.

Quake 4 v1.3

Performance Comparisons with Quake 4
Details: http://www.quake4game.com/

Quake 4
id Software, in conjunction with developer Raven, recently released the latest addition to the wildly popular Quake franchise, Quake 4. Quake 4 is based upon an updated and slightly modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar.  Like Doom 3, Quake 4 is also an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows, but unlike Doom 3, Quake 4 features some outdoor environments as well. We ran these Quake 4 benchmarks using a custom demo with the game set to its "High-Quality" mode, at resolutions of 1,280 x 1,024 and 1,600 x 1,200 with 4X AA and 8X aniso enabled simultaneously.

 

The GeForce 8800 GTX and GeForce 8800 GTS cards put on another dominant performance in our custom Quake 4 benchmark. All of the cards put up good numbers in both single- and dual-GPU configurations, but the new GeForces simply outclassed the older Radeon X1950 XTX. Moving from one card to two resulted in sizable performance gains across the board, but the GeForce 8800 GTX and GTS just have too much horsepower for the Radeon X1950 XTX to keep pace once the resolution is raised above 1280x1024.

Prey v1.2

 

Performance Comparisons with Prey
Details: http://www.prey.com/

Prey
After many years of development, Take-Two Interactive recently released the highly anticipated game Prey. Prey is based upon an updated and modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar.  Like Doom 3, Prey is also an OpenGL game that uses extremely high-detailed textures and a plethora of dynamic lighting and shadows.  But unlike Doom 3, Prey features a fair share of outdoor environments as well.  We ran these Prey benchmarks using a custom-recorded timedemo with the game set to its "High-Quality" graphics mode, at resolutions of 1,280 x 1,024 and 1,600 x 1,200 with 4X AA and 16X anisotropic filtering enabled simultaneously.

 

The GeForce 8800 GTX and GeForce 8800 GTS SLI configurations simply tore through our custom Prey benchmark. At both resolutions, regardless of whether or not the game was being run in single-card or SLI mode, the GeForce 8800 cards dominated.

XHD Resolutions: EP1

 

Performance Comparisons with Half-Life 2: Episode 1 XHD
Details: http://www.half-life2.com/

Half Life 2: Episode 1
Thanks to the dedication of hardcore PC gamers and a huge mod-community, the original Half-Life became one of the most successful first person shooters of all time.  So, when Valve announced Half-Life 2 was close to completion, gamers the world over sat in eager anticipation. Upon its release, HL2 was universally lauded, and the sequel won an array of "Game of the Year" awards. Armed with the latest episodic update to HL2, Episode 1, we benchmarked the game with a long, custom-recorded timedemo that takes us through both outdoor and indoor environments. These tests were run at resolutions of 1920 x 1200 and 2560 x 1600 with 4X anti-aliasing and 16X anisotropic filtering enabled concurrently, and with color correction and HDR rendering enabled in the game engine as well.

 

The GeForce 8800 GTX continued its winning ways as we moved up to XHD resolutions with our custom Half Life 2: Episode 1 benchmark. The GTS also performed extremely well, but at the highest resolution it fell victim to the Radeon X1950 XTX CrossFire configuration, due to the latter's superior memory bandwidth.

XHD Resolutions: F.E.A.R.

Performance Comparisons with F.E.A.R XHD
More Info: http://www.whatisfear.com/us/

F.E.A.R
One of the most highly anticipated titles of 2005 was Monolith's paranormal thriller F.E.A.R. Taking a look at the game's minimum system requirements, we see that you will need at least a 1.7GHz Pentium 4 with 512MB of system memory and a 64MB graphics card in the Radeon 9000 or GeForce4 Ti class or better to adequately run the game. Using the full retail release of the game patched to v1.08, we put the graphics cards in this article through their paces to see how they fared with a popular title. Here, all graphics settings within the game were set to their maximum values, but with soft shadows disabled (soft shadows and anti-aliasing do not currently work together). Benchmark runs were then completed at resolutions of 1920 x 1200 and 2560 x 1600 with anti-aliasing and anisotropic filtering enabled.

 

Although the framerates are somewhat lower, the results of the built-in F.E.A.R. benchmark essentially mirror those of the HL2: Episode 1 results on the previous page. Once again, the GeForce 8800 GTX and 8800 GTX SLI configurations are simply in a league of their own, outpacing all of the other configurations by wide margins at both resolutions. The GeForce 8800 GTS and GTS SLI rigs also put up some good numbers, but the Radeon X1950 XTX's superior amount of memory bandwidth allowed it to pull ahead of the GTS at 2560x1600 in both single and dual-card configurations.

XHD Resolutions: Quake 4

Performance Comparisons with Quake 4 XHD
Details: http://www.quake4game.com/

Quake 4
id Software, in conjunction with developer Raven, recently released the latest addition to the classic Quake franchise, Quake 4. Quake 4 is based upon an updated and slightly modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar.  Like Doom 3, Quake 4 is also an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows, but unlike Doom 3, Quake 4 features some outdoor environments as well. We ran these Quake 4 benchmarks using a custom demo with the game set to its "High-Quality" mode, at resolutions of 1920 x 1200 and 2560 x 1600 with 4X AA and 8X aniso enabled simultaneously.

 

Our custom Quake 4 benchmark was no match for the GeForce 8800 GTX SLI configuration. At both XHD resolutions, the dual-GTX SLI configuration put up some impressive numbers, clearly besting the competition. The GeForce 8800 GTS also performed very well, especially at 1920x1200.  At 2560x1600 though, the dual-GTS configuration was only about 3.8 FPS faster than the Radeon X1950 XTX CrossFire system.

XHD Resolutions: Prey

Performance Comparisons with Prey XHD
Details: http://www.prey.com/

Prey
After many years of development, Take-Two Interactive recently released the highly anticipated game Prey. Prey is based upon an updated and modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar.  Like Doom 3, Prey is also an OpenGL game that uses extremely high-detailed textures and a plethora of dynamic lighting and shadows.  But unlike Doom 3, Prey features a fair share of outdoor environments as well.  We ran these Prey benchmarks using a custom-recorded timedemo with the game set to its "High-Quality" graphics mode, at resolutions of 1920 x 1200 and 2560 x 1600 with 4X AA and 16X anisotropic filtering enabled simultaneously.

 

The GeForce 8800 GTX has no trouble with our custom Prey benchmark when running at XHD resolutions. At 1920x1200 and 2560x1600, the GeForce 8800 GTX SLI configuration clearly outpaced the competition. The GeForce 8800 GTS also outpaced everything but the GTX at 1920x1200, though the Radeon X1950 CrossFire rig took second place at 2560x1600 by a paltry 1.4 frames per second.

Our Summary and Conclusion

Performance Summary: In every benchmark we ran, regardless of the resolution or the level of anti-aliasing or anisotropic filtering being used, the GeForce 8800 GTX SLI configuration put up the best scores bar none, usually outpacing its competition by wide margins. The GeForce 8800 GTS SLI configuration was also an excellent performer, typically losing out only to the more powerful GTX. In a couple of instances though, the Radeon X1950 XTX CrossFire configuration was slightly faster.

As of today, a GeForce 8800 GTX SLI / nForce 680i SLI combination is clearly the most powerful platform available for hardcore PC gamers. Performance is top-notch, and the feature set offered by the configuration can't be matched by anything else currently. The same basically rings true with the GeForce 8800 GTS, albeit at a somewhat lower performance level. When NVIDIA dubbed the GeForce 8800 GTX and nForce 680i SLI the foundation of the Ultimate Gaming Platform, they weren't kidding. Who needs a wimpy console when a PC can put up triple-digit framerates at a resolution of 2560x1600 with high levels of anti-aliasing and anisotropic filtering enabled?

Pros:
  • Extreme Performance
  • Unified Architecture
  • Full DX10 Support
  • Enhanced Image Quality
  • New AA Modes
  • High Quality Anisotropic Filtering

Cons:
  • Hefty Power Requirements
  • Pricey
  • Cards Run Hot
  • GTX Requires 4 Power Leads for SLI

 

Content Property of HotHardware.com