NVIDIA GeForce 8600 GTS and 8600 GT
Date: Apr 17, 2007
Author: Marco Chiappetta
Introduction and Related Information


Today's announcement by NVIDIA should come as no surprise to PC enthusiasts. The company's modus operandi has been to launch a new flagship GPU and over the course of the next year or so scale back the design to flesh out a complete top-to-bottom line-up of products, all with essentially the same feature set.

We saw this scenario play out with the GeForce 6 series, which began with the high-end GeForce 6800 Ultra and ended with the GeForce 6100 series IGPs. The GeForce 7 series too began its life with the GeForce 7800 GTX and is now comprised of a whole host of products at various price points, many of which will soon be overshadowed by their GeForce 8 series counterparts. "What GeForce 8 series counterparts?", you ask. Well, that's what we're here to talk about.

NVIDIA is announcing a handful of products today, ranging from the new mid-range GeForce 8600 GTS and GT to the OEM-only GeForce 8300 GS. As you probably expect, these new cards sport the same base feature set as NVIDIA's current flagship DX10-class GeForce 8800 GTX and differ mainly in the level of performance they're able to deliver.  NVIDIA does have a couple of tricks up their collective sleeves with this launch as well, like a new, more powerful PureVideo HD engine, which we'll also tell you about on the page ahead. 

NVIDIA GeForce 8 Series
Specifications and Features

NVIDIA unified architecture:
Fully unified shader core dynamically allocates processing power to geometry, vertex, physics, or pixel shading operations, delivering up to 2x the gaming performance of prior generation GPUs.

GigaThread Technology:
Massively multi-threaded architecture supports thousands of independent, simultaneous threads, providing extreme processing efficiency in advanced, next generation shader programs.

Full Microsoft DirectX 10 Support:
World's first DirectX 10 GPU with full Shader Model 4.0 support delivers unparalleled levels of graphics realism and film-quality effects.

NVIDIA SLI Technology:
Delivers up to 2x the performance of a single graphics card configuration for unequaled gaming experiences by allowing two cards to run in parallel. The must-have feature for performance PCI Express graphics, SLI dramatically scales performance on today's hottest games.

NVIDIA Lumenex Engine:
Delivers stunning image quality and floating point accuracy at ultra-fast frame rates.
16x Anti-aliasing: Lightning fast, high-quality anti-aliasing at up to 16x sample rates obliterates jagged edges.

128-bit floating point High Dynamic-Range (HDR):
Twice the precision of prior generations for incredibly realistic lighting effects - now with support for anti-aliasing.

NVIDIA Quantum Effects Technology:
Advanced shader processors architected for physics computation enable a new level of physics effects to be simulated and rendered on the GPU - all while freeing the CPU to run the game engine and AI.

NVIDIA ForceWare Unified Driver Architecture (UDA):
Delivers a proven record of compatibility, reliability, and stability with the widest range of games and applications. ForceWare provides the best out-of-box experience and delivers continuous performance and feature updates over the life of NVIDIA GeForce GPUs.

OpenGL 2.0 Optimizations and Support:
Ensures top-notch compatibility and performance for OpenGL applications.

NVIDIA nView Multi-Display Technology:
Advanced technology provides the ultimate in viewing flexibility and control for multiple monitors.

PCI Express Support:
Designed to run perfectly with the PCI Express bus architecture, which doubles the bandwidth of AGP 8X to deliver over 4 GB/sec. in both upstream and downstream data transfers.

Dual 400MHz RAMDACs:
Blazing-fast RAMDACs support dual QXGA displays with ultra-high, ergonomic refresh rates - up to 2048x1536@85Hz. 

Dual Dual-link DVI Support:
Able to drive the industry's largest and highest resolution flat-panel displays up to 2560x1600.

Built for Microsoft Windows Vista:
NVIDIA's fourth-generation GPU architecture built for Windows Vista gives users the best possible experience with the Windows Aero 3D graphical user interface.

NVIDIA PureVideo HD Technology:
The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity, smooth video, accurate color, and precise image scaling for movies and video.

Discrete, Programmable Video Processor:
NVIDIA PureVideo HD is a discrete programmable processing core in NVIDIA GPUs that provides superb picture quality and ultra-smooth movies with low CPU utilization and power.

Hardware Decode Acceleration:
Provides ultra-smooth playback of H.264, VC-1, WMV and MPEG-2 HD and SD movies.

HDCP Capable:
Designed to meet the output protection management (HDCP) and security specifications of the Blu-ray Disc and HD DVD formats, allowing the playback of encrypted movie content on PCs when connected to HDCP-compliant displays.

Spatial-Temporal De-Interlacing:
Sharpens HD and standard definition interlaced content on progressive displays, delivering a crisp, clear picture that rivals high-end home-theater systems.

High-Quality Scaling:
Enlarges lower resolution movies and videos to HDTV resolutions, up to 1080i, while maintaining a clear, clean image. Also provides downscaling of videos, including high-definition, while preserving image detail.

Inverse Telecine (3:2 & 2:2 Pulldown Correction):
Recovers original film images from films-converted-to-video (DVDs, 1080i HD content), providing more accurate movie playback and superior picture quality.

Bad Edit Correction:
When videos are edited after they have been converted from 24 to 25 or 30 frames, the edits can disrupt the normal 3:2 or 2:2 pulldown cadences. PureVideo HD uses advanced processing techniques to detect poor edits, recover the original content, and display perfect picture detail frame after frame for smooth, natural looking video.

Video Color Correction:
NVIDIA's Color Correction Controls, such as Brightness, Contrast and Gamma Correction let you compensate for the different color characteristics of various RGB monitors and TVs ensuring movies are not too dark, overly bright, or washed out regardless of the video format.

Integrated SD and HD TV Output:
Provides world-class TV-out functionality via Composite, S-Video, Component, or DVI connections. Supports resolutions up to 1080p.

Noise Reduction:
Improves movie image quality by removing unwanted artifacts.

Edge Enhancement:
Sharpens movie images by providing higher contrast around lines and objects.
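The Inverse Telecine and Bad Edit Correction entries above are easier to picture with the 3:2 cadence spelled out: 24fps film is spread across 60 video fields per second by repeating alternate frames three and two times, and reversing that cadence recovers the original frames. The sketch below is a toy illustration of the cadence only, not NVIDIA's actual algorithm (which must also detect cadence breaks and bad edits on the fly):

```python
def telecine_32(film_frames):
    """Apply 3:2 pulldown: alternate frames contribute 3 and 2 interlaced
    fields, turning 24 film frames into 60 video fields per second."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

# Four film frames A,B,C,D become ten fields: AAABBCCCDD
print("".join(telecine_32("ABCD")))  # -> AAABBCCCDD

def inverse_telecine(fields):
    """Recover the original film frames by collapsing the 3-2 repetition."""
    frames, i = [], 0
    while i < len(fields):
        frames.append(fields[i])
        i += 3 if len(frames) % 2 == 1 else 2
    return frames

print("".join(inverse_telecine(list("AAABBCCCDD"))))  # -> ABCD
```

Note that 24 input frames yield exactly 60 fields (12 × 3 + 12 × 2), which is why a disrupted cadence from a bad edit is detectable at all.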


We've already covered the architectural features and benefits of the NVIDIA GeForce 8 Series GPUs in previous articles, so we won't go into detail again here. To familiarize yourself with the features inherent to NVIDIA's GeForce 8x00 series cards, nForce chipsets, and their multi-GPU platform as a whole, please take some time to peruse the articles listed below:

The details and performance data in the articles listed above will give you much of the background information necessary to better understand the new 8600 series graphics cards we'll be showcasing here today. If you're unclear about anything on the following pages, please look back to these articles for a multitude of background data related to today's subject matter.

The New GeForce 8 Line-Up


As we've already mentioned, NVIDIA is fleshing out the GeForce 8 series with a number of new products over the course of the next few weeks. We've got a couple of GeForce 8600 GTS and GeForce 8600 GT cards in the lab and will feature them a little later in this article, but we wanted to show you what else NVIDIA has on tap.


The GeForce 8600 GTS is designed to hit the $199-$229 "sweet spot" of the market where large numbers of mainstream graphics cards are sold. Next in line is the $150-ish GeForce 8600 GT, which is a very similar product, albeit with lower GPU and memory frequencies, and even a different PCB. Then comes the sub-$130 GeForce 8500 GT, followed by the OEM-only GeForce 8400 GS and 8300 GS cards.


NVIDIA GeForce 8600 GTS

The GeForce 8600 GTS is built around a new unified, DX10-class GPU which is manufactured at 80nm and features 32 stream processors and 8 ROPs. If you remember, the flagship GeForce 8800 GTX sports 128 stream processors and the GeForce 8800 GTS 96. NVIDIA's reference specifications for the GeForce 8600 GTS call for a GPU clock speed of 675MHz, a shader (SP) clock speed of 1.45GHz, and 256MB of GDDR3 memory clocked at 1.0GHz (2GHz DDR) utilizing a 128-bit interface, for 32GB/s of bandwidth. NVIDIA's partners are free to offer GeForce 8600 GTS cards with higher out-of-box frequencies, however. In fact, the retail-ready cards from XFX and Asus we tested were clocked much higher than NVIDIA's reference specifications. Cards sport a slim, single-slot cooler, a pair of dual-link DVI outputs, an HD/TV video output, and require a 6-pin PCI Express supplemental power connection (NVIDIA recommends a 350W PSU for single card configurations and 450W for SLI). GeForce 8600 GTS cards are HDCP capable, but it's up to the discretion of board partners to include the necessary crypto-ROMs. And as the edge connector along the top of the PCB suggests, the 8600 GTS is SLI-ready, of course.
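The bandwidth figures quoted throughout follow directly from the memory clock and bus width: effective (double data rate) transfer rate times the bus width in bytes. A quick sketch of the arithmetic:

```python
def memory_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int, ddr_multiplier: int = 2) -> float:
    """Peak memory bandwidth in GB/s: effective transfer rate x bus width in bytes."""
    effective_mts = mem_clock_mhz * ddr_multiplier     # mega-transfers per second
    bytes_per_transfer = bus_width_bits // 8           # a 128-bit bus moves 16 bytes per transfer
    return effective_mts * bytes_per_transfer / 1000.0 # MB/s -> GB/s (decimal, as spec sheets quote it)

# GeForce 8600 GTS reference: 1.0GHz GDDR3 (2GHz effective) on a 128-bit bus
print(memory_bandwidth_gbs(1000, 128))  # -> 32.0
# GeForce 8600 GT reference: 700MHz memory, same 128-bit bus
print(memory_bandwidth_gbs(700, 128))   # -> 22.4
```

The same formula reproduces the 8500 GT's 12.8GB/s from its 400MHz DDR2 on a 128-bit interface.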

NVIDIA GeForce 8600 GT

The GeForce 8600 GT and GTS are very similar in terms of their specifications, but there are a couple of notable differences in their implementations. The GeForce 8600 GT is outfitted with the same 32 stream processors and 8 ROPs, and 256MB of GDDR3 frame buffer memory riding a 128-bit interface. The GT's reference specifications, however, call for a 540MHz GPU clock, a 1.18GHz shader clock, and a 700MHz memory clock (22.4GB/s). The GeForce 8600 GT's lower frequencies also lower its power requirements; as you can see, the card does not require supplemental power. The differences between the GT and GTS also allowed NVIDIA to use a smaller PCB.

NVIDIA GeForce 8500 GT

As its name suggests, the GeForce 8500 GT is a step down from the 8600 GT. The GeForce 8500 GT is powered by a GPU that's outfitted with 16 stream processors but retains the 8 ROPs. NVIDIA's reference specifications recommend a GPU clock speed of 450MHz with a shader clock speed of 900MHz and 400MHz memory (12.8GB/s), also utilizing a 128-bit interface. Unlike the 8600 cards, though, the GeForce 8500 GT uses DDR2 memory and most cards will likely be outfitted with twice the amount of RAM at 512MB. Outputs include a standard 15-pin VGA out, DVI out, and an HD/TV video output. GeForce 8500 GT cards lack an SLI edge connector, but SLI is supported over the PCI Express interface.

Retail GeForce 8600 Cards: Asus, XFX, MSI


We got our hands on GeForce 8600 GTS and 8600 GT cards from Asus, XFX, and MSI for the purposes of this article. The Asus card did not arrive in full retail trim, but the XFX products were fully boxed and ready to hit store shelves.


Asus EN8600GTS

Asus' version of the GeForce 8600 GTS is dubbed the EN8600GTS S.T.A.L.K.E.R. Edition. It sports the same PCB, features, and single-slot cooler as the reference card pictured on the previous page. Asus does, however, take the liberty of installing a custom fan shroud complete with imagery from the recently released game S.T.A.L.K.E.R.: Shadow of Chernobyl. Asus plans to include the game in the full retail bundle, but it wasn't shipped along with our sample. The EN8600GTS also differs from the reference design in regard to its GPU and memory clock speeds. Asus has cranked the GPU clock up to 745MHz and the memory clock up to 1.145GHz.



XFX GeForce 8600 GTS XXX Edition

In typical XFX fashion, the company has an entire line-up of GeForce 8600 GTS cards at the ready. The top-of-the-line model is pictured here, the XFX GeForce 8600 GTS XXX Edition. There are two other GTS cards also planned, each with lower clock speeds.

Model Number     Description                                        MSRP

8600 GTS
PV-T84G-UDFX     GF 8600GTS 675M 256MB DDR3 DUAL DVI TV PCI-E       $199.99

8600 GT
PV-T84J-UDFX     GF 8600GT 540M 256MB DDR3 DUAL DVI TV PCI-E        $149.99

8500 GT
PV-T86J-UAFX     GF 8500GT 450M 256MB DDR2 TV DVI PCI-E             $129.99
PV-T86J-UALX     GF 8500GT 450M 256MB DDR2 TV DVI PCI-E             $109.99

As the 730M in its description implies, the XFX GeForce 8600 GTS XXX Edition sports a 730MHz core GPU clock; its memory is clocked at 1.13GHz. The XFX GeForce 8600 GTS XXX Edition also sports a custom fan shroud and a heavy duty retention bar that runs the length of the top edge of the PCB. Included in the box along with the card, we also found XFX's typical assortment of video and power cables, and adapters, along with a full version of Tom Clancy's Ghost Recon: Advanced Warfighter.



XFX GeForce 8600 GT XXX Edition

Here we also have XFX's GeForce 8600 GT XXX Edition. This card has a bit more XFX flavor, so to speak. XFX's high-end 8600 GT sports a black PCB with neon-green DVI outputs. The small, single-slot cooler is also adorned with a custom XFX decal and, as you probably expect, it too is pre-overclocked right out of the box. This card sports a 620MHz GPU with 800MHz memory.

We should also note that XFX is making some changes to their warranty policy to better serve the enthusiast community. In the days leading up to this launch, XFX had this to say regarding the company's warranty coverage...

"We understand that today’s enthusiast gamers are interested in trying to maximize the performance of all of their components to give them that extra competitive advantage.

  • XFX North America has revised its warranty policy to alleviate the concerns that enthusiasts may have against these efforts by enthusiasts and maintain the product’s protection as long as there is no physical damage on the graphics card or any original components are missing.
  • The Double Lifetime Protection ensures that the coverage will be transferable to a second owner, which adds additional value for the XFX card.

These changes will cover all 6 series and higher XFX video cards. This new warranty policy will kick off on the 17th, coinciding with the new 8600 series launch."



MSI NX8600 GTS

Lastly we have MSI's NX8600 GTS. This was the only card of the bunch to feature a dual-slot cooler. At first, that may seem like a disadvantage, but the MSI NX8600 GTS was much quieter than the Asus card or either of the XFX cards. MSI also pre-overclocks their GTS right out of the gate with a 700MHz core and 1.05GHz memory. The higher than reference clocks give the MSI NX8600 GTS an edge over "stock" cards, but as you'll see on the pages ahead, it finished a tad behind the XFX and Asus offerings.

DX10 Screen Shots and IQ


When we covered the initial launch of the GeForce 8800 series, we detailed the new image quality-related features inherent in the Lumenex engine, so we won't go into detail again here. We did, however, run a couple of preliminary tests to ensure the GeForce 8600 was producing the same kind of images as the GeForce 8800.

GeForce 8600 GTS
1280x1024 (4X AA / 16X Aniso)

GeForce 8800 GTS
1280x1024 (4X AA / 16X Aniso)

In the handful of games we tested, the new GeForce 8600 cards produced images that were identical to those of a GeForce 8800. The screenshots above, taken with Futuremark's 3DMark06, illustrate that point. The two images are identical right down to the last pixel.


World In Conflict

And although DirectX 10 games that can exploit NVIDIA's GeForce 8 series hardware haven't arrived just yet, we wanted to present some eye candy from a couple of upcoming titles to demonstrate what the API is capable of. The screenshots above are from CryTek's upcoming shooter Crysis and Massive Entertainment's World In Conflict. Flipping through the images above is sure to induce some jaw-dropping. Of course, we can't vouch for the GeForce 8600's performance in these titles, because they aren't ready yet. But the DX10 features offered by the 8600's GPU mean it'll be able to render these future titles as the developers intended.

PureVideo HD Updates


In the early part of this article, we mentioned that NVIDIA has updated the PureVideo HD video engine integrated into the GeForce 8600 / 8500 GPUs and has incorporated a few new features specifically designed to enhance playback of cutting-edge HD content. The name of the engine remains PureVideo HD; however, the GeForce 8600 is capable of even more than the GeForce 8800 in regard to HD video playback.

PureVideo HD Updates
New Features For the 8600s

Video Processor Block Diagram

This high-level overview shows that the GeForce 8600 and GeForce 8500 GPUs are equipped with a second generation PureVideo HD video processor and a new BSP engine. BSP in this instance stands for "Bitstream Processor". Not pictured in this diagram is another new addition dubbed the AES128 engine. As its name implies, the AES128 engine handles on-board hardware decryption of HD movie content.

The combination of the second generation PureVideo HD and BSP / AES128 engines means the new architecture can handle 100% of the CABAC and H.264 video decoding necessary to provide full specification playback of Blu-ray and HD DVD content, without straining the resources of the host CPU.

The second-gen PureVideo HD engine handles CPU offloading of IDCT, motion compensation, and deblocking functions for MPEG-2, VC-1, and H.264 playback. The BSP engine handles H.264 CABAC (Context Adaptive Binary Arithmetic Coding) and CAVLC (Context Adaptive Variable Length Coding) reverse entropy decoding, and the AES128 engine takes care of decryption.
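The division of decode labor described above can be summarized in one table. The mapping below is purely illustrative shorthand for the article's description; the names are not a real NVIDIA driver API:

```python
# Hypothetical summary of the decode-stage offload described in the text;
# keys mirror the article's engine names, not any real driver interface.
PUREVIDEO_HD_GEN2_OFFLOAD = {
    "PureVideo HD engine": ["IDCT", "motion compensation", "deblocking"],  # MPEG-2, VC-1, H.264
    "BSP engine": ["CABAC reverse entropy decode",                         # H.264 bitstreams
                   "CAVLC reverse entropy decode"],
    "AES128 engine": ["HD content decryption"],                            # Blu-ray / HD DVD
}

def stages_on_gpu():
    """All decode stages the GPU handles, leaving the CPU free for playback UI etc."""
    return [stage for stages in PUREVIDEO_HD_GEN2_OFFLOAD.values() for stage in stages]

print(stages_on_gpu())
```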

Our Test Systems and Video Performance


HOW WE CONFIGURED THE TEST SYSTEMS: We tested all of the graphics cards used in this article on either an EVGA nForce 680i SLI motherboard (NVIDIA GPUs) or an Intel D975XBX2 board (ATI GPUs) powered by a Core 2 Extreme X6800 dual-core processor and 2GB of low-latency Corsair RAM. The first thing we did when configuring the test system was enter the BIOS and set all values to their default settings. Then we manually configured the memory timings and disabled any integrated peripherals that wouldn't be put to use. The hard drive was then formatted, and Windows XP Pro with SP2 and the April '07 DX9 update was installed. When the installation was complete, we then installed the latest chipset drivers available, installed all of the other drivers necessary for the rest of our components, and removed Windows Messenger from the system.  Auto-Updating and System Restore were also disabled, the hard drive was defragmented, and a 1024MB permanent page file was created on the same partition as the Windows installation. Lastly, we set Windows XP's Visual Effects to "best performance," installed all of the benchmarking software, and ran the tests.

The HotHardware Test System
Core 2 Extreme Powered

Hardware Used:
Processor - Core 2 Extreme X6800 (2.93GHz)

Motherboards - EVGA nForce 680i SLI (nForce 680i SLI chipset)
               Intel D975XBX2 (975X Express chipset)

Video Cards - GeForce 8800 GTS 320M (2)
              XFX GeForce 8600 GTS XXX (2)
              Asus EN8600GTS
              XFX GeForce 8600 GT
              GeForce 8600 GTS
              GeForce 8600 GT
              GeForce 7950 GT
              Radeon X1950 XTX (2)
              Radeon X1950 Pro (2)

Memory - 2048MB Corsair PC2-6400C3 (2 x 1GB)

Audio - Integrated on board

Hard Drive - Western Digital "Raptor" 74GB - 10,000RPM - SATA

Relevant Software:
OS - Windows XP Pro SP2
Chipset Drivers - nForce Drivers v9.53
DirectX - DirectX 9.0c (April '07 Redist.)
Video Drivers - NVIDIA Forceware v158.16
                ATI Catalyst v7.3

Benchmarks Used:
3DMark06 v1.0.2 (Synthetic DX)
S.T.A.L.K.E.R. (DirectX) *
F.E.A.R. v1.08 (DirectX)
Half Life 2: Episode 1 (DirectX) *
Prey v1.2 (OpenGL) *
Quake 4 v1.3 (OpenGL) *

* - Custom Test (HH Exclusive demo)

For our first round of tests we took a look at Digital Video processing performance between the two competing core GPU architectures. "PureVideo HD" technology is at work here for NVIDIA and "AVIVO" for ATI.

WMV-HD Decode Acceleration
PureVideo Performance Explored

To illustrate CPU utilization when playing back WMV HD content, we used the Performance Monitor built into Windows XP. Using the data provided by Windows Performance Monitor, we created a log file that sampled the percent of CPU utilization every second, while playing back the 1080p version of the "Amazing Caves" video available for download on Microsoft's WMVHD site. The CPU utilization data was then imported into Excel to create the graph below. The graph shows the CPU utilization for a GeForce 8600 GTS and a Radeon X1950 Pro using Windows Media Player 10.
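The logging-and-averaging step is simple to reproduce. As a sketch, here is how a per-second Performance Monitor CSV export (the column names here are hypothetical) could be summarized in stdlib Python instead of being imported into Excel:

```python
import csv
import io

# Hypothetical excerpt of a Performance Monitor log: one system-wide
# CPU-utilization sample per second during video playback.
log = """time_s,cpu_percent
0,12.5
1,18.0
2,9.5
3,22.0
4,11.0
"""

def summarize(csv_text):
    """Return (average, peak) CPU utilization from a per-second sample log."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    samples = [float(r["cpu_percent"]) for r in rows]
    return sum(samples) / len(samples), max(samples)

avg, peak = summarize(log)
print(f"avg {avg:.1f}%  peak {peak:.1f}%")  # avg 14.6%  peak 22.0%
```

Averaging the samples is what lets a "spiky" trace still be compared card-to-card in a single number.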


Both of the cards produced a "spiky" graph here, but the peaks and valleys produced by the GeForce 8600 GTS weren't quite as pronounced, which resulted in slightly lower CPU utilization in comparison to the Radeon X1950 Pro.

DVD Video Quality: HQV Benchmark with PowerDVD 7

Next up, we have the HQV DVD video benchmark from Silicon Optix. HQV is comprised of a sampling of SD video clips and test patterns that have been specifically designed to evaluate a variety of interlaced video signal processing tasks, including decoding, de-interlacing, motion correction, noise reduction, film cadence detection, and detail enhancement. As each clip is played, the viewer is required to "score" the image based on a predetermined set of criteria. The numbers listed below are the sum of the scores for each section. We played the HQV DVD using the latest version of InterVideo's WinDVD 8, with hardware acceleration and PureVideo extensions enabled.


Both NVIDIA and ATI have made great strides with regard to standard DVD movie playback, to the point where they have almost rendered the HQV benchmark useless. The new GeForce 8600 GTS comes within 2 points of a perfect score in HQV, with the ATI solution not too far behind.

3DMark06 Performance


Performance Comparisons with 3DMark06 v1.0.2
Details: www.futuremark.com/products/3dmark06/

3DMark06 is the latest addition to the 3DMark franchise. This version differs from 3DMark05 in a number of ways, and now includes not only Shader Model 2.0 tests, but Shader Model 3.0 and HDR tests as well. Some of the assets from 3DMark05 have been re-used, but the scenes are now rendered with much more geometric detail and the shader complexity is vastly increased as well. Max shader length in 3DMark05 was 96 instructions, while 3DMark06 ups the number of instructions to 512. 3DMark06 also employs much more lighting, and there is extensive use of soft shadows. With 3DMark06, Futuremark has also updated how the final score is tabulated. In this latest version of the benchmark, SM 2.0 and HDR / SM3.0 tests are weighted and the CPU score is factored into the final tally as well.
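A weighted tally of this kind is typically a weighted harmonic mean, which penalizes an unbalanced system (say, a fast GPU paired with a weak CPU) more than a simple average would. The sketch below illustrates the idea only; the weights and sub-scores are hypothetical, not Futuremark's published constants:

```python
def weighted_harmonic(scores_and_weights):
    """Weighted harmonic mean of (score, weight) pairs: a low sub-score
    drags the total down harder than it would in an arithmetic mean."""
    total_w = sum(w for _, w in scores_and_weights)
    return total_w / sum(w / s for s, w in scores_and_weights)

# Hypothetical sub-scores with graphics weighted far heavier than CPU
print(round(weighted_harmonic([(4000, 1.7), (2500, 0.3)])))  # -> 3670
```

Note that when all sub-scores are equal the mean simply returns that score, so the weighting only matters when the components are mismatched.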


We've got a lot of numbers to digest from this point forward. We've included benchmark scores from stock and pre-overclocked GeForce 8600 GTS and GT cards, in addition to similarly priced last-gen products, and current higher-end parts. We've also got some multi-GPU SLI and CrossFire numbers to consider.

As you can see, the new GeForces performed well in the 3DMark06 benchmark. The GTS cards outpaced all of the last-gen products with the GT finishing up just behind them. The more expensive X1950 XTX and 8800 GTS cards took the top spots thanks to their higher fillrates, larger frame buffers, and increased memory bandwidth.



If we tunnel a bit deeper and look at the individual shader model 2.0 and HDR / shader model 3.0 benchmarks, we see a similar breakdown. In both single-card and dual-card SLI configurations, the new GeForce 8600 GTS is able to outpace the similarly priced last-gen 7950 GT and Radeon X1950 Pro.

Half Life 2: Episode 1 Performance


Performance Comparisons with Half-Life 2: Episode 1
Details: www.half-life2.com/

Half Life 2:
Episode 1
Thanks to the dedication of hardcore PC gamers and a huge mod-community, the original Half-Life became one of the most successful first person shooters of all time. And thanks to an updated game engine, gorgeous visuals, and intelligent weapon and level design, Half Life 2 became almost as popular. Armed with the latest episodic update to HL2, Episode 1, we benchmarked the game with a long, custom-recorded timedemo that takes us through both outdoor and indoor environments. These tests were run at resolutions of 1,280 x 1,024 and 1,600 x 1,200 with 4X anti-aliasing and 16X anisotropic filtering enabled concurrently, and with color correction and HDR rendering enabled in the game engine as well.


Our custom Half Life 2: Episode 1 benchmark proved to be a bit of a stumbling block for the GeForce 8600 GTS and GT cards. In this test, the cards put up playable framerates at 1280x1024 with AA and aniso enabled, but performance fell off quite a bit at the higher resolution. Overall, they were outpaced by all of the competition in this game, including the GeForce 7950 GT and the Radeon X1950 Pro.

F.E.A.R. Performance


Performance Comparisons with F.E.A.R.
More Info: www.whatisfear.com/us/

One of the most highly anticipated titles of recent years was Monolith's paranormal thriller F.E.A.R. Taking a look at the game's minimum system requirements, we see that you will need at least a 1.7GHz Pentium 4 with 512MB of system memory and a 64MB graphics card in the Radeon 9000 or GeForce4 Ti-classes or better, to adequately run the game. Using the full retail release of the game patched to v1.07, we put the graphics cards in this article through their paces to see how they fared with a popular title. Here, all graphics settings within the game were set to their maximum values, but with soft shadows disabled (Soft shadows and anti-aliasing do not work together currently). Benchmark runs were then completed at resolutions of 1,280x1,024 and 1,600x1,200, with anti-aliasing and anisotropic filtering enabled.


The new GeForce 8600 GTS and GT put up some decent numbers in F.E.A.R. at a resolution of 1280x1024, but with the resolution cranked up to 1600x1200, the fastest of the 8600 GTS cards just hit the 30 FPS mark. Once again, the last-gen GeForce 7950 GT and Radeon X1950 Pro put up the better scores in the single card configurations. Running a pair of GTS cards in an SLI configuration, however, resulted in a much better score than Radeon X1950 Pro CrossFire at the lower res, but the Radeons pulled ahead once again at 1600x1200.

Quake 4 Performance


Performance Comparisons with Quake 4
Details: www.quake4game.com

Quake 4
id Software, in conjunction with developer Raven, recently released the latest addition to the wildly popular Quake franchise, Quake 4. Quake 4 is based upon an updated and slightly modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar. Like Doom 3, Quake 4 is also an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows, but unlike Doom 3, Quake 4 features some outdoor environments as well. We ran these Quake 4 benchmarks using a custom demo with the game set to its "High-Quality" mode, at resolutions of 1,280 x 1,024 and 1,600 x 1,200 with 4X AA and 8X aniso enabled simultaneously.


Our custom Quake 4 benchmark reported more of the same.  The new GeForce 8600 series cards put up perfectly playable framerates at both resolutions with anti-aliasing and anisotropic filtering enabled, but their performance was overshadowed by the last-generation parts.

Prey Performance


Performance Comparisons with Prey
Details: www.prey.com

After many years of development, Take-Two Interactive recently released the highly anticipated game Prey. Prey is based upon an updated and modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar. Like Doom 3, Prey is also an OpenGL game that uses extremely high-detailed textures and a plethora of dynamic lighting and shadows. But unlike Doom 3, Prey features a fair share of outdoor environments as well. We ran these Prey benchmarks using a custom recorded timedemo with the game set to its "High-Quality" graphics mode, at resolutions of 1,280 x 1,024 and 1,600 x 1,200 with 4X AA and 16X anisotropic filtering enabled simultaneously.


The results reported by our custom Prey benchmark look much like those from Quake 4 on the previous page, which is to be expected considering they share the same game engine. In comparison to the Radeon X1950 Pro, the new GeForce 8600 GTS cards fare a bit better, but yet again they are outperformed by the older hardware.

S.T.A.L.K.E.R. Performance


Performance Comparisons with S.T.A.L.K.E.R.
Details: www.stalker-game.com
The highly anticipated game S.T.A.L.K.E.R. makes use of a proprietary DX9 game engine, dubbed "X-Ray" by its developers. It features an advanced DX9 renderer with Deferred Shading capabilities, which allows the engine to draw a vast amount of dynamic light sources with correct materials and light 'feedback'. Because S.T.A.L.K.E.R. does not have a built-in benchmarking tool, we tested the game using FRAPS at resolutions of 1,280x1,024 and 1,600x1,200 with the in-game anti-aliasing and anisotropic filtering options set to their maximum values, and with full dynamic lighting enabled. We should also note that 'grass shadows' were disabled due to the further slowdowns this feature causes during gameplay.


The new GeForce 8600 GTS and GT cards' performance relative to the GeForce 7950 GT and Radeon X1950 Pro is better in S.T.A.L.K.E.R. than in the previous titles we tested, but none of the mainstream cards put up very fluid framerates here. We need to reiterate that we tested this game at its most taxing settings, however. With reduced image quality settings the game would no doubt perform better.

We would also like to point out the performance scaling of the 8600 GTS SLI configuration versus the CrossFire rigs. Enabling CrossFire actually reduced performance, whereas enabling SLI caused significant performance increases. We suspect SLI performance scaling will be better still in future driver releases with this game, based on conversations we had with representatives from NVIDIA. ATI's driver team will likely be doing some tweaking as well.

Power Consumption and Noise


We have a few final data points to cover before bringing this article to a close. Throughout all of our benchmarking, we monitored how much power our test system was consuming using a power meter and also took some notes regarding its noise output. Our goal was to give you all an idea as to how much power each configuration used and to explain how loud the configurations were under load. Please keep in mind that we were testing total system power consumption here, not just the power being drawn by the video cards alone.

Total System Power Consumption & Acoustics
It's All About the Watts and Decibels


Whether sitting at idle or being taxed by a heavy graphics workload, the new GeForce 8600 series cards had the lowest power consumption of the bunch. The GeForce 7950 GT was in line with the pre-overclocked GTS cards, but the GTs put up numbers well below the rest here. The Radeon X1950 Pro wasn't all that much more power hungry, but under load there was a 10-20+ watt delta separating it from NVIDIA's new mainstream offerings.

We'd also like to talk a little about the noise generated by the new slim-line, single slot coolers on the GeForce 8600 cards. For the most part, the fans on the coolers spun at a low speed, and were barely audible over our test system's PSU, CPU heatsink, and case fans. After some extended benchmarking sessions, though, the fans did spin up to a higher RPM and produced a noticeable report. We wouldn't call the fans loud by any means, but they were clearly audible.

Our Summary and Conclusion


Performance Summary: NVIDIA's new GeForce 8600 GTS and GT cards put up decent performance numbers throughout our suite of benchmarks, but they were outpaced in a number of situations by the Radeon X1950 Pro and GeForce 7950 GT. In 3DMark06 and S.T.A.L.K.E.R. the 8600s performed on par with or better than the last-gen cards we tested. However, in virtually all of the other tests the new GeForce 8600 GTS and GT cards fell short of the mark set by their older mainstream counterparts.

The new GeForce 8600 GTS and GeForce 8600 GT cards are a bit of a mixed bag. On one hand, they offer all of the cutting-edge features of the GeForce 8800 series, and then some. These new cards offer full support for DirectX 10, CSAA, and high-quality anisotropic filtering, and they feature an enhanced PureVideo HD engine that puts them a step ahead of even the GeForce 8800. The GeForce 8600 GTS and GT will also hit the market before any competitive DX10 offerings from ATI, they run relatively cool and quiet, and their power consumption numbers are lower than those of current mainstream graphics cards. On the other hand, the GeForce 8600 GTS and GT don't perform on the same level as last generation's GeForce 7950 GT and Radeon X1950 Pro, despite currently similar price points. We should also note that the $220+ price tags of pre-overclocked GeForce 8600 GTS cards are within striking distance of the significantly more powerful GeForce 8800 GTS 320MB, which could be had for about $250-$279 at the moment.

Given their in-game performance and streamlined design, we suspect prices on GeForce 8600 GTS and GT cards will come down quite a bit in the coming weeks. Performance will likely increase over time with more driver tuning as well, which will further enhance the products' value. As it stands today, however, it's difficult to recommend the GeForce 8600 GTS or GT to gamers in light of their gaming performance alone, unless, that is, DX10 support is on your short list of requirements and these cards fit your budget. For home theater enthusiasts, though, the GeForce 8600 GTS is arguably the video card to own. It's got the most advanced video engine currently available, and at a price that won't break the bank. You can expect GeForce 8600 GTS cards to be available almost immediately, with the 8600 GT and 8500 GT following over the course of the next few weeks.

Pros:
  • Mainstream DX10
  • Affordable
  • Cool & Quiet
  • Low Power Consumption
  • Next-Gen Video Engine

Cons:
  • Gaming Performance

Discuss This Article in HotHardware's PC Hardware Forum

Content Property of HotHardware.com