XFX GeForce 6600 256MB DDR2
Date: Nov 08, 2005
Author: Matt Beauvais
Introduction, Specifications & Bundle

Around this time last month, ATI made a lot of noise with the launch of their new X1000 series of graphics cards. With much of the buzz still continuing a month later, it's little surprise that NVIDIA is feeling the need to showcase something new. With their 7800 boards covering the enthusiast market, and the new 6800 GS tackling the mainstream, NVIDIA has re-worked an existing product to compete specifically against ATI's new entry-level offering, the Radeon X1300 Pro. The card we have for you today isn't exactly new technology; it's a GeForce 6600, but some modifications have been made to boost performance.

DDR2 memory, as many of you may recall, was first used by NVIDIA on their GeForce FX 5800. Then ATI had their turn with the 256MB Radeon 9800 Pro. Early DDR2 RAM chips ran relatively hot though, so hot in fact that they required the use of ramsinks. Since then, DDR2 hasn't been widely used on graphics cards, and GDDR3 has taken over. But here we are, a few years later, and DDR2 is back! And NVIDIA has put 256MB of it on a GeForce 6600. In addition to the new memory, the GeForce 6600 DDR2 reference specifications call for a 350MHz core and 400MHz memory (DDR 800MHz). This is a considerable step up from the previous 300MHz / 250MHz version of the GeForce 6600.
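If you're wondering where the "DDR 800MHz" figure comes from, the arithmetic is straightforward: DDR-type memory transfers data on both clock edges, so the effective data rate is double the memory clock. A quick sketch, assuming the reference 128-bit memory bus of the GeForce 6600:

```python
# Back-of-the-envelope memory math for the GeForce 6600 DDR2.
# Assumes the 128-bit reference bus; clock figures come from the article.

def effective_rate_mhz(memory_clock_mhz: float) -> float:
    """DDR/DDR2 transfers on both clock edges, so the effective
    data rate is twice the memory clock."""
    return memory_clock_mhz * 2

def peak_bandwidth_gb_s(memory_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth = effective rate x bus width in bytes."""
    return effective_rate_mhz(memory_clock_mhz) * 1e6 * (bus_width_bits / 8) / 1e9

print(effective_rate_mhz(400))           # 800.0 -- the "DDR 800MHz" figure
print(peak_bandwidth_gb_s(400, 128))     # 12.8 GB/s theoretical peak
```

Real-world bandwidth will come in below that theoretical ceiling, but it's a useful yardstick when comparing this card against the older 250MHz (DDR 500MHz) GeForce 6600.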

What we have on tap for you today is XFX's version of the "new" GeForce 6600 DDR2, which actually comes in clocked at 400MHz / 400MHz. The 50MHz increase to the core clock should offer a nice performance boost, and with FarCry thrown into the bundle as well, it's clear XFX is looking to differentiate this product from competing products in its class.

Specifications & Features of The GeForce 6600
NVIDIA's Newest Value Priced PCI-Express GPU
  • Vertex Shaders
    Support for Microsoft DirectX 9.0 Vertex Shader 3.0
    Displacement mapping
    Geometry Instancing
    Infinite length vertex programs*
  • Pixel Shaders
    Support for DirectX 9.0 Pixel Shader 3.0
    Full pixel branching support
    Support for Multiple Render Targets (MRTs)
    Infinite length pixel programs*
  • Next-Generation Texture Engine
    Up to 16 textures per rendering pass
    Support for 16-bit floating point format and 32-bit floating point format
    Support for non-power of two textures
    Support for sRGB texture format for gamma textures
    DirectX and S3TC texture compression
  • Full 128-bit studio-quality floating point precision through the entire rendering pipeline with native hardware support for 32bpp, 64bpp, and 128bpp rendering modes


  • Full floating point support throughout entire pipeline
  • Floating point filtering improves the quality of images in motion
  • Floating point texturing drives new levels of clarity and image detail
  • Floating point frame buffer blending gives detail to special effects like motion blur and explosions


  • Advanced 16x anisotropic filtering
  • Blistering-fast anti-aliasing and compression performance
  • New rotated-grid anti-aliasing removes jagged edges for incredible edge quality
  • Support for advanced lossless compression algorithms for color, texture, and z-data at even higher resolutions and frame rates
  • Fast z-clear
  • High-resolution compression technology (HCT) increases performance at higher resolutions through advances in compression technology


  • Designed for PCI Express x16
  • Designed for high-speed memory
  • 128-bit advanced memory interface
  • 0.11 micron process technology
  • Advanced thermal management and thermal monitoring
  • Designed to enhance the performance of shadow-intensive games, like id Software's Doom 3
  • Dedicated on-chip video processor
  • MPEG video encode and decode
  • WMV9 decode acceleration
  • Advanced adaptive de-interlacing
  • High-quality video scaling and filtering
  • DVD and HDTV-ready MPEG-2 decoding up to 1920x1080i resolutions
  • Dual integrated 400 MHz RAMDACs for display resolutions up to and including 2048x1536 at 85Hz.
  • Dual DVO ports for interfacing to external TMDS transmitters and external TV encoders
  • Microsoft Video Mixing Renderer (VMR) supports multiple video windows with full video quality and features in each window
  • Full NVIDIA nView multi-display technology capability


  • DVC color controls
  • DVC image sharpening controls


  • Operating Systems
    Windows XP
    Windows ME
    Windows 2000
    Windows 9X
    Macintosh OS, including OS X
    Linux


  • API Support
    Complete DirectX support, including the latest version of Microsoft DirectX 9.0
    Full OpenGL support, including OpenGL 1.5

* The operating system or APIs can impose limits, but the hardware does not limit shader program length.



The bundle you see here is basic, but complete. Drivers, a manual, and an S-Video cable are included with the XFX GeForce 6600 DDR2 256MB, along with two DVI-to-VGA adapters. But gamers need something to show off the power of their new card, right? Well, that's why the DVD version of Far Cry has been included with the package. Though FarCry can now be found almost everywhere for a relatively low price, it's still an excellent, solid title that should showcase the capabilities of this card very well. About the only thing missing is some sort of DVD playback software, but given this card's low selling price, that isn't a major problem.

The XFX GeForce 6600 256MB DDR2

The XFX GeForce 6600 256MB DDR2
A Steal @ $100?




This is what $100 will get you these days: an 8-pipe, SM3.0-compliant GPU clocked at 400MHz, with 256MB of DDR2 memory running at an effective 800MHz. As you can see, the XFX GeForce 6600 256MB DDR2 is not a large graphics card. It is about the same size as ATI's X1300 Pro and older GeForce 6600 cards. While inspecting the card, one of the first things we noticed was the small aluminum cooler affixed to the GPU. For a budget card, you can't expect a big chunk of copper to be attached to the core. However, right on the box it does state "Extreme Overclocking" as a feature, so at first glance we had to wonder how much overclocking was possible with the included cooler. More on this later. Smaller fans usually run at higher RPMs to cool effectively, which tends to give a cooling solution a loud, high-pitched buzz. This cooler, however, is actually fairly quiet. Later in the review, we'll cover more on the cooler's performance.
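The "8-pipe GPU at 400MHz" figure translates into a simple theoretical pixel fillrate ceiling. This is a naive pipes-times-clock estimate; real throughput comes in lower, since the 6600's back-end and memory bandwidth both constrain it:

```python
# Naive peak fillrate estimate from the figures quoted in the article:
# 8 pixel pipelines at a 400MHz core clock. This is only the
# pipes-times-clock ceiling; actual fillrate is lower in practice.

def peak_fillrate_mpixels(pipes: int, core_mhz: int) -> int:
    """Theoretical pixels per second, in megapixels."""
    return pipes * core_mhz

print(peak_fillrate_mpixels(8, 400))  # 3200 MPixels/s theoretical
```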

Unlike previous implementations of DDR2 memory on a graphics card, there are no ramsinks to be found here. This is not a problem though; DDR2 memory has matured significantly since its inception, and the current generation of DDR2 chips has lower power requirements and operates at lower temperatures.

Since this is a low-cost graphics card, dual DVI ports are not included. Also in line with NVIDIA's lower-end cards is the absence of an SLI bridge connector. To run two of these cards in SLI mode, all you need to do is plug them in and configure them correctly in the driver control panel. Finally, the blue PCB is a nice touch. With all the green and red we see these days, it's a welcome change.

Our Test System & 3DMark05

The HotHardware Test System
AMD Athlon 64 FX Powered

Hardware Used:
Processor - AMD Athlon 64 3000+ (1.8GHz)
Motherboard - ASUS A8N Deluxe (nForce4 SLI chipset)
Video Cards - XFX GeForce 6600 DDR2
              ASUS Extreme N6600GT Silencer
              ATI Radeon X700 Pro
              ATI Radeon X1300
Memory - 1024MB Mushkin Redline PC3200 RAM
Audio - Integrated on board
Hard Drives - Western Digital 120GB - 7,200RPM - ATA
              Western Digital 160GB - 7,200RPM - ATA

Relevant Software:
Operating System - Windows XP Professional SP2 (Patched)
Chipset Drivers - nForce Drivers v6.82
DirectX - DirectX 9.0c
Video Drivers - NVIDIA Forceware v81.85
                ATI Catalyst v5.10a beta

Benchmarks Used:
3DMark05 v1.2.0 (Synthetic DX)
Unreal Tournament 2004* (DirectX)
FarCry v1.33 (DirectX)
Half Life 2* (DirectX)
Doom 3 v1.3 (OpenGL)
Quake 4* (OpenGL)

* - Custom Test (HH Exclusive demo)

Performance Comparisons with 3DMark05 v1.2.0
Details: http://www.futuremark.com/products/3dmark05/

3DMark05 is the latest installment in a long line of synthetic 3D graphics benchmarks, dating back to late 1998. 3DMark05 is a synthetic benchmark that requires a DirectX 9.0 compliant video card, with support for Pixel Shaders 2.0 or higher, to render all of the various modules that comprise the suite. To generate its final "score", 3DMark05 runs three different simulated game tests and uses each test's framerate in the final tabulation. Fillrate, memory bandwidth, and compute performance all have a measurable impact on performance in this benchmark. We ran 3DMark05's default test (1,024 x 768) on all of the cards and configurations we tested, and have the overall results posted for you below.

3DMark05 is extremely stressful on the video subsystem, so the scores you see here will be fairly close to the scores you'll see elsewhere, despite differences in CPU and memory configurations. The XFX GeForce 6600 DDR2 comes in ahead of the ATI Radeon X1300 Pro, but behind the higher-priced ATI Radeon X700 Pro and GeForce 6600 GT.

Unreal Tournament 2004

Head-to-Head Performance With Unreal Tournament 2004
Epic's Smash Hit!

Unreal Tournament 2K4
Epic's "Unreal" games have been wildly popular ever since the original Unreal was released in the late '90s. Unreal, Unreal Tournament, and then Unreal Tournament 2003 rapidly became some of our favorites, both for benchmarking and for killing a few hours when our schedules allowed it! Epic recently released the latest addition to the franchise, Unreal Tournament 2004. We used the retail version of the game to benchmark these cards at resolutions of 1024x768 and 1280x1024, without any anti-aliasing, and with 4X AA and 8X anisotropic filtering.


Our first gaming benchmark gives us an impression of how well the XFX GeForce 6600 DDR2 performs. Unreal Tournament 2004 is starting to show its age, as all the cards put up similar scores. When we raise the resolution to 1280x1024 and apply 4X anti-aliasing and 8X anisotropic filtering, a noticeable performance gap appears. The XFX GeForce 6600 DDR2 hangs in at 2nd place, except at our most stressful settings, where the ATI Radeon X700 Pro comes in slightly ahead.

FarCry v1.33

Performance Comparisons with FarCry v1.33
Part of the $100 package

Far Cry
If you've been on top of the gaming scene for some time, you probably know that FarCry is one of the most visually impressive games to be released for the PC. Courtesy of its proprietary engine, dubbed "CryEngine" by its developers, FarCry's game-play is enhanced by Polybump mapping, advanced environment physics, destructible terrain, dynamic lighting, motion-captured animation, and surround sound. Before titles such as Half-Life 2 and Doom 3 hit the scene, FarCry gave us a taste of what was to come in next-generation 3D Gaming on the PC. We benchmarked the graphics cards in this review with the Ubisoft Regulator demo, at various resolutions without anti-aliasing or anisotropic filtering enabled, and then with 4X AA and 8X aniso enabled concurrently.


XFX's GeForce 6600 DDR2 continues to put up some excellent numbers considering its price, though anti-aliasing and anisotropic filtering will of course drag framerates down. Without those features enabled, the XFX GeForce 6600 DDR2 ranks 3rd in our testing, with the ATI Radeon X1300 Pro coming in last. Over a year after its release, FarCry continues to be a stressful benchmark for all but today's high-end cards.

Half Life 2

Performance Comparisons with Half-Life 2
Details: http://www.half-life2.com/

Half Life 2
Thanks to the dedication of hardcore PC gamers and a huge mod-community, the original Half-Life became one of the most successful first person shooters of all time. So, when Valve announced Half-Life 2 was close to completion in mid-2003, gamers the world over sat in eager anticipation. Unfortunately, thanks to a compromised internal network, the theft of a portion of the game's source code, and a tumultuous relationship with the game's distributor, Vivendi Universal, we all had to wait until November 2004 to get our hands on this classic. We benchmarked Half-Life 2 with a long, custom-recorded timedemo in the "Canals" map that takes us through both outdoor and indoor environments. These tests were run at resolutions of 1024 x 768 and 1280 x 1024, first without any anti-aliasing or anisotropic filtering, and then with 4X anti-aliasing and 8X anisotropic filtering enabled concurrently.


To enjoy the adventures of Gordon Freeman in all its graphical glory, you need something a little more powerful than the cards we have on the table today. However, the cards we do have will let you run the game at 1024x768 with AA and AF at excellent speeds, and also allow you to raise the resolution while keeping acceptable framerates. ATI's Radeon X1300 Pro continues to fall behind the XFX GeForce 6600 DDR2 here. The X700 Pro, on the other hand, manages to outrun the XFX 6600 DDR2 when using 4X AA and 8X AF at 1280x1024.

Doom 3

Performance Comparisons with Doom 3
Fighting demons on a budget

Doom 3
id Software's games have long been pushing the limits of 3D graphics. Quake, Quake 2, and Quake 3 were all instrumental in the success of 3D accelerators on the PC. Now, many years later, with virtually every new desktop computer shipping with some sort of 3D accelerator, id is at it again with the visually stunning Doom 3. Like most of id's previous titles, Doom 3 is an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows. We ran this batch of Doom 3 single player benchmarks using the standard "Demo1" demo, with the game set to its "High-Quality" mode, at resolutions of 1024 x 768 and 1280x1024 without anti-aliasing enabled and then again with 4X AA and 8X aniso enabled simultaneously.


Doom 3, as well as being a hit game, has been something of a subliminal advertisement for NVIDIA's line of cards. The XFX GeForce 6600 DDR2 runs this game extremely well for a card in its price range. Both the ATI Radeon X700 Pro and the ATI Radeon X1300 Pro fall behind XFX's offering. When benchmarking the X1300 Pro, we used the recently released Catalyst 5.10a beta drivers, which brought memory controller optimizations to ATI's Radeon X1000 series cards and boosted performance in OpenGL games across the board. Even with these optimizations, though, the XFX GeForce 6600 DDR2 holds the lead. With the XFX card being the least expensive of the bunch, this is great news for budget gamers looking for a decent Doom 3 experience.

Quake 4

Performance Comparisons with Quake 4
Can XFX handle the Strogg?

Quake 4
id Software, in conjunction with developer Raven, recently released the latest addition to the wildly popular Quake franchise, Quake 4. Quake 4 is based upon an updated and slightly modified version of the Doom 3 engine, and as such performance characteristics between the two titles are very similar. Like Doom 3, Quake 4 is also an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows, but unlike Doom 3, Quake 4 features some outdoor environments as well. We ran these Quake 4 benchmarks using a custom demo with the game set to its "High-Quality" mode, at resolutions of 1024 x 768 and 1280 x 1024 without anti-aliasing enabled and then again with 4X AA and 8X aniso enabled simultaneously.


NVIDIA has traditionally held a big performance lead with the Doom 3 engine, and in OpenGL games in general, and that trend continues today. As with the Doom 3 benchmark, we used the Catalyst 5.10a beta drivers here. Even with the new drivers, though, NVIDIA still holds the lead among these budget cards. The X700 Pro and X1300 Pro are competitive, but XFX's GeForce 6600 256MB DDR2 was clearly superior.

F.E.A.R. Performance

Performance Comparisons With F.E.A.R
Candy for the Mind and Eyes...

One of the most highly anticipated titles of 2005, Monolith's new psychological thriller F.E.A.R. promises to be as thrilling to the mind as it is to the eyes. Taking a look at the minimum system requirements, we see that you will need at least a 1.7GHz Pentium 4 with 512MB of system memory and a 64MB graphics card that is Radeon 9000 or GeForce4 Ti-class or better. Using the newly released single-player demo, we put these cards through their paces to see how they fared with this promising new title. All graphics settings within the game were set to "Medium" and trilinear filtering was enabled. Benchmark runs were then completed at resolutions of 1024x768 and 1280x1024.


Monolith's F.E.A.R. is not a forgiving game when it comes to performance. If you want the best graphics, you need the best card, which would be NVIDIA's GeForce 7800 GTX or ATI's Radeon X1800 XT. Our budget cards put up some respectable framerates, however. The XFX GeForce 6600 comes in 2nd, bested only by the faster and more expensive 6600 GT. While the XFX 6600 DDR2 averages 36FPS at 1024x768, you should know that the minimum framerate was 22FPS. To keep F.E.A.R. playable, we recommend dropping the resolution to 800x600 and forgoing anti-aliasing and anisotropic filtering.

Overclocking the XFX GF6600

Overclocking the XFX GeForce 6600 256MB DDR2
Speeding It Up A Bit

Remember that small cooler we talked about earlier? Well, let's see how it handles a few rounds of overclocking. Like we said, the box claims this card is built for "Extreme Overclocking", so we were expecting the card to live up to this claim. The XFX 6600 DDR2 is already clocked 50MHz above the reference 6600 DDR2 core speed, so we were interested to see how much further it would go. We started out by enabling the hidden overclocking features in the NVIDIA control panel, using Coolbits of course. Using the built-in clock speed detector, the core was automatically set to 425MHz, with 850MHz memory. To test for artifacts, we used ATITool, which has an artifact tester that is compatible with NVIDIA cards. No artifacts were detected after 20 minutes, so we decided to play a few levels of Quake 4. Even after 20 minutes of Quake 4, no artifacts were seen. So, we decided to go higher.

We decided to see how far we could go with ATITool, so we started the artifact tester and raised the clock speeds 5MHz every 2 minutes. Artifacts finally started appearing at around 530MHz. This is an amazing achievement for the XFX GeForce 6600 DDR2, as well as for the little cooler that's been assigned to disperse the heat. To overclock the memory, we put the core back down to its stock speed and performed the same 5MHz increase every 2 minutes until artifacts were detected. 885MHz is what the new DDR2 memory on this card was able to dish out. We then set the card to a 515MHz core and 885MHz memory, and tested for a full 60 minutes without an artifact. We also played a little more Quake 4, again without any artifacts. To give you an idea of the performance benefits, we benchmarked the same Quake 4 timedemo with our overclocked settings, and also ran the same demo with the core clocked at 350MHz and the memory at 400MHz. This, of course, gives you an idea of how much faster the XFX card is than the reference GeForce 6600 DDR2, and how much faster the XFX 6600 DDR2 can become with a little tweaking.
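The step-and-test method above can be sketched in a few lines. This is purely illustrative: `find_max_stable` and its simulated artifact check are our own names (ATITool exposes no such API), and the artifact ceiling is hard-coded to the ~530MHz we observed rather than measured:

```python
# Sketch of the 5MHz step-and-test overclocking loop described above.
# The artifact test is simulated with a fixed ceiling; a real run would
# poll an artifact scanner (e.g. ATITool) for roughly 2 minutes per step.

STEP_MHZ = 5

def find_max_stable(start_mhz: int, fails_at) -> int:
    """Raise the clock one step at a time until the next step would
    produce artifacts, then back off two steps for everyday headroom."""
    clock = start_mhz
    while not fails_at(clock + STEP_MHZ):
        clock += STEP_MHZ  # a real run would soak-test here before stepping
    return clock - 2 * STEP_MHZ

# Simulated run: our sample artifacted at ~530MHz core.
print(find_max_stable(400, lambda mhz: mhz >= 530))  # 515
```

Note how the 515MHz result falls out of backing off from the 530MHz artifact point; your own card's ceiling will differ.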

It's a nice improvement, that's for sure. The XFX GeForce 6600 DDR2 is a bit faster than the standard GeForce 6600 DDR2, and when overclocked, delivers some nice results. Playing Quake 4 at 1280x1024 becomes a much smoother experience. We feel the memory is holding the core back somewhat, but 885MHz is all we could manage. We had a small fan blowing on our card to keep the heat down, as we wanted to see just how far this card could overclock. Depending on the airflow in your case and the room temperature, overclocking results can vary. Also, you shouldn't expect the same results you see here from every card; each card is different, so the results could be better or worse. We feel, however, that most of you should get at least some kind of decent overclock with the XFX GeForce 6600 DDR2.

Our Summary & Conclusion

After spending some quality time with XFX's latest budget offering, we were left with a very good impression of this card. It's certainly not going to be setting any benchmark records, but for $100 you'd be hard pressed to find a better value.

The new GeForce 6600 DDR2, in general, looks to be an excellent performer for the money, but XFX wasn't satisfied with that and raised the core clock speed an extra 50MHz. And raising the core is what this card is all about. As you saw from our overclocking results, this card lived up to its self-proclaimed "Extreme Overclocking" status. We honestly didn't expect that little aluminum cooler to allow for such high clock speeds. The new DDR2 RAM also allows for some decent overclocking. While it would have been nice to see it running at more than 885MHz, we certainly can't complain. We were surprised, not just by the overclocking results, but by the entire package in general.

The performance of the XFX GeForce 6600 DDR2 falls somewhere between that of the recently released Radeon X1300 Pro and the GeForce 6600 GT. Considering both of those cards are more expensive, XFX's offering is quite a deal. With the exception of F.E.A.R. with AA and aniso enabled, all of the games we benchmarked today are perfectly playable on the XFX GeForce 6600 DDR2.

To state it simply, the XFX GeForce 6600 256MB DDR2 is a standout product at its price point. Our expectations were exceeded on almost every level. What it all comes down to is the price / performance ratio of this card. $100 can now buy you playable framerates in just about all of today's games.  And to sweeten the deal even further, the card comes with a free copy of FarCry on DVD. Because of this product's great price / performance ratio and good bundle, we have decided to give the XFX GeForce 6600 256MB DDR2 a 9 out of 10 on our Heat Meter, as well as an Editor's Choice award.

Pros:
  • Excellent performance
  • Extreme overclocking potential
  • Far Cry on DVD

Cons:
  • No DVD playback software


Content Property of HotHardware.com