ATI Radeon X1800 XT CrossFire Evaluation
Date: Dec 20, 2005
Author: Marco Chiappetta
Intro, Specifications & Related Info

The onslaught of ATI announcements this quarter continues today, with the official unveiling of the Radeon X1800 XT Master Edition graphics card -- the "missing link" necessary to run a pair of X1800 cards in a dual-GPU CrossFire configuration. From the end of September '05 through to today, ATI has announced and released a top-to-bottom line-up of graphics products in rapid succession, in an effort to bolster their product lines in time for the peak holiday buying season.

But ATI's intentions aren't just about getting products on store shelves in time for Santa to place them under the tree. In frank conversations with representatives from ATI, we were also told that the company's recent sequence of releases is about delivering on its promises. It's no secret that ATI has had a rough time bringing CrossFire and the X1K family of products to market, and they've suffered because of it. But the company seems committed to resolving their past problems and re-ascending to the top of the 3D food-chain, the position aggressively wrestled from their grasp by NVIDIA's GeForce 7 series and SLI technology. The next step to achieving that goal is delivering Radeon X1800 CrossFire, and the company is poised to do just that. Read on and feast your eyes on the best ATI has to offer...for now.

ATI Radeon X1800 & CrossFire
Features & Specifications
Features - ATI Radeon X1800
• 321 million transistors on a 90nm fabrication process
• Ultra-threaded architecture with fast dynamic branching
• Sixteen pixel shader processors
• Eight vertex shader processors
• 256-bit 8-channel GDDR3/GDDR4 memory interface
• Native PCI Express x16 bus interface
• Dynamic Voltage Control

Ring Bus Memory Controller
• 512-bit internal ring bus for memory reads
• Programmable intelligent arbitration logic
• Fully associative texture, color, and Z/stencil cache designs
• Hierarchical Z-buffer with Early Z test
• Lossless Z Compression (up to 48:1)
• Fast Z-Buffer Clear
• Z/stencil cache optimized for real-time shadow rendering
• Optimized for performance at high display resolutions, including widescreen HDTV resolutions

Ultra-Threaded Shader Engine
• Support for Microsoft DirectX 9.0 Shader Model 3.0 programmable vertex and pixel shaders in hardware
• Full speed 128-bit floating point processing for all shader operations
• Up to 512 simultaneous pixel threads
• Dedicated branch execution units for high performance dynamic branching and flow control
• Dedicated texture address units for improved efficiency
• 3Dc+ texture compression
  ◦ High quality 4:1 compression for normal maps and two-channel data formats
  ◦ High quality 2:1 compression for luminance maps and single-channel data formats
• Multiple Render Target (MRT) support
• Render to vertex buffer support
• Complete feature set also supported in OpenGL 2.0

Advanced Image Quality Features
• 64-bit floating point HDR rendering supported throughout the pipeline
  ◦ Includes support for blending and multi-sample anti-aliasing
• 32-bit integer HDR (10:10:10:2) format supported throughout the pipeline
  ◦ Includes support for blending and multi-sample anti-aliasing
• 2x/4x/6x Anti-Aliasing modes
  ◦ Multi-sample algorithm with gamma correction, programmable sparse sample patterns, and centroid sampling
  ◦ New Adaptive Anti-Aliasing feature with Performance and Quality modes
  ◦ Temporal Anti-Aliasing mode
  ◦ Lossless Color Compression (up to 6:1) at all resolutions, including widescreen HDTV resolutions
• 2x/4x/8x/16x Anisotropic Filtering modes
  ◦ Up to 128-tap texture filtering
  ◦ Adaptive algorithm with Performance and Quality options
• High resolution texture support (up to 4k x 4k)
• Multi-GPU technology
• Four modes of operation:
  ◦ Alternate Frame Rendering (maximum performance)
  ◦ Supertiling (optimal load-balancing)
  ◦ Scissor (compatibility)
  ◦ Super AA 8x/10x/12x/14x (maximum image quality)
  ◦ Program compliant

Avivo Video and Display Engine
• High performance programmable video processor
  ◦ Accelerated MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264 decoding (including DVD/HD-DVD/Blu-ray playback), encoding & transcoding
  ◦ DXVA support
  ◦ De-blocking and noise reduction filtering
  ◦ Motion compensation, IDCT, DCT and color space conversion
  ◦ Vector adaptive per-pixel de-interlacing
  ◦ 3:2 pulldown (frame rate conversion)
• Seamless integration of pixel shaders with video in real time
• HDR tone mapping acceleration
  ◦ Maps any input format to 10 bit per channel output
• Flexible display support
  ◦ Dual integrated dual-link DVI transmitters
  ◦ DVI 1.0 / HDMI compliant and HDCP ready
  ◦ Dual integrated 10 bit per channel 400 MHz DACs
  ◦ 16 bit per channel floating point HDR and 10 bit per channel DVI output
  ◦ Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color)
  ◦ Complete, independent color controls and video overlays for each display
  ◦ High quality pre- and post-scaling engines, with underscan support for all outputs
  ◦ Content-adaptive de-flicker filtering for interlaced displays
  ◦ Xilleon™ TV encoder for high quality analog output
  ◦ YPrPb component output for direct drive of HDTV displays
  ◦ Spatial/temporal dithering enables 10-bit color quality on 8-bit and 6-bit displays
  ◦ Fast, glitch-free mode switching
  ◦ VGA mode support on all outputs
• Compatible with ATI TV/Video encoder products, including Theater 550

ATI Graphics x 2

A lot of things had to happen before ATI was able to release CrossFire into the wild. ATI needed to design and implement the compositing engine, because the necessary logic was not present in the X800 and X1K families of GPUs. They had to ensure compatible motherboards were available, and, among numerous other things, they also had to engineer the software that enables the technology. We've covered much of what went into CrossFire over the past year, so we won't re-hash it all here. But if you're not completely familiar with ATI's multi-GPU strategy, we recommend taking a look at these recent articles, which contain a wealth of background information that laid the foundation for what we're going to showcase here today.

The ATI CrossFire solution we'll be spotlighting in this article consists of three main components: an ATI CrossFire Edition "master" Radeon X1800 XT card, a reference-design motherboard based on the ATI Radeon Xpress 200 CrossFire Edition chipset, and a second, standard ATI Radeon X1800 XT graphics card. And because the Radeon X1800 XT Master card will also work in conjunction with the Radeon X1800 XL, we've paired them together for some benchmarking as well. We've actually covered all of these platform technologies in previous HotHardware articles linked above.

In our first "hands-on" look at ATI's Radeon X850 XT CrossFire configuration, we mentioned that CrossFire, like SLI, was a proprietary solution that required an ATI chipset-based motherboard, along with the requisite graphics cards, to function. This situation has changed since the initial release, however. Unlike NVIDIA's SLI, which is still a completely proprietary solution that "officially" requires two matched GeForce cards and an nForce4 SLI chipset-based motherboard, CrossFire will now function on compatible Intel chipsets as well. And there are no hardware limitations that'll prevent CrossFire from working on other chipsets either, provided they've got the proper slots and PCI Express implementations.

The Master Card & Its Accessories

On the surface, the Radeon X1800 XT CrossFire Edition Master card looks just like the standard X1800 XT. The cards share the same dual-slot cooler and red PCB, and are exactly the same size, but underneath the Master Edition's cooling apparatus lies a group of chips that make CrossFire a reality.

The ATI Radeon X1800 XT Master Card
All Up In There



Like the standard edition, the Radeon X1800 XT CrossFire Edition Master card is fairly long and may have clearance issues in some systems. Its 16-pipe Radeon X1800 XT GPU and 512MB of GDDR3 memory populate the front side of the PCB, and are adorned with a dual-slot copper/aluminum hybrid cooler. The cooler exhausts heat from the system through vents in the mounting plate, which helps keep internal temperatures down when running a pair of cards in a single system.

The GPU on our Master Edition card was clocked at 625MHz and its memory at 720MHz (1440MHz DDR), just shy of the 625MHz / 750MHz (1500MHz DDR) of the standard Radeon X1800 XT. The 30MHz memory clock disparity should have a minimal effect on performance, so we won't dwell on it here. And CrossFire doesn't require a matched pair of video cards to function, so the difference in memory clock speed shouldn't pose a compatibility problem either. In fact, the Radeon X1800 XT CrossFire Edition Master card will work alongside a Radeon X1800 XL as well. In this type of configuration, though, half of the Master card's RAM gets disabled, and users are required to re-boot their systems to enable CrossFire (no re-boot is required with X1800 XT CrossFire). We tested an X1800 XL / X1800 XT CrossFire configuration on the following pages, but we question the value of spending $600 on an X1800 XT Master card, only to have half of its frame buffer memory disabled anytime a user wishes to enable CrossFire.


To bring CrossFire to the X1800, ATI put together a new compositing engine that's similar to, but superior to, the one used on the older Radeon X850 XT Master cards. If you remember, because X850 cards were equipped with single-link DVI outputs, X850 CrossFire was limited to a maximum resolution of 1600x1200 with a lowly refresh rate of 60Hz. Graphics cards in the X1K family of products, however, are equipped with dual-link DVI outputs. Dual-link DVI means more bandwidth, which the new compositing engine capitalizes on to offer higher resolutions than last generation's CrossFire implementation.

The compositing engine on the Radeon X1800 XT CrossFire Edition Master card consists of a handful of chips. The biggest chip in the group, in the middle of the picture, is a Xilinx Spartan XC3S400 FPGA (Field Programmable Gate Array), a higher-end part than the one ATI used to enable CrossFire on the X850 XT. The XC3S400 is the chip that's programmed to do the actual compositing work. In total, it has roughly 400K logic gates on board, which is fairly low-end by today's standards for an FPGA, and at a cost somewhere south of $7, it doesn't affect the board's overall retail price too adversely. We should note that it's re-programmable via firmware as well, so ATI could theoretically incorporate more features into X1800 CrossFire moving forward. To the left of the Xilinx FPGA is the flash ROM chip, which contains the necessary programming and configuration code.

Above and to the left of the Xilinx FPGA in the picture are a pair of Silicon Image SiI 163B TMDS receivers. These are the chips that receive data from the slave card: the information transmitted from the slave card's dual-link DVI output through the custom dongle pictured above. Each SiI 163B TMDS receiver is clocked at 165MHz and is capable of processing images at resolutions of up to 1600x1200 @ 60Hz, but because there are two of them working in tandem, the maximum resolution is doubled to 2560x1600. The three smaller Silicon Image chips to the right are a pair of SiI PanelLink TMDS transmitters and an Analog Devices RAMDAC, which drive any displays connected to the output on the CrossFire dongle.
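As a rough sanity check on those resolution limits, you can estimate the pixel clock a display mode needs and compare it against one or two 165MHz TMDS links. This is our own back-of-the-envelope sketch, and the ~20% blanking overhead is our ballpark assumption, not a figure from ATI or Silicon Image:

```python
# Rough sketch: estimate a mode's required pixel clock and check it
# against the bandwidth of one or two 165 MHz TMDS links.

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.20):
    """Active pixels per second, padded ~20% for blanking intervals (our assumption)."""
    return width * height * refresh_hz * blanking_overhead / 1e6

def links_needed(width, height, refresh_hz, link_mhz=165):
    """How many 165 MHz TMDS links a mode needs."""
    clock = required_pixel_clock_mhz(width, height, refresh_hz)
    return 1 if clock <= link_mhz else 2

# Single-link, X850-era CrossFire topped out around 1600x1200 @ 60Hz
print(links_needed(1600, 1200, 60))   # fits in one link
print(links_needed(2560, 1600, 60))   # needs both links working in tandem
```

With that overhead, 1600x1200 @ 60Hz lands just under a single 165MHz link (~138MHz), while 2560x1600 @ 60Hz (~295MHz) only fits once the two receivers' bandwidth is combined.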

Our Test Systems & 3DMark05

HOW WE CONFIGURED THE TEST SYSTEMS: We used two different test systems for this article. We tested our NVIDIA-based cards on an Asus A8N32-SLI motherboard, based on the nForce4 SLIX16 chipset, powered by an AMD Athlon 64 FX-55 processor and 1GB of low-latency Corsair XMS RAM. The ATI-based cards were tested on an ATI reference Radeon Xpress 200 motherboard with the same processor and RAM. The first thing we did when configuring these test systems was enter each BIOS and load the "High Performance Defaults." The hard drive was then formatted, and Windows XP Professional with SP2 was installed. When the installation was complete, we installed the latest available chipset drivers, installed all of the other necessary drivers for the rest of our components, and removed Windows Messenger from the system. Auto-Updating and System Restore were also disabled, the hard drive was defragmented, and a 768MB permanent page file was created on the same partition as the Windows installation. Lastly, we set Windows XP's Visual Effects to "best performance," installed all of the benchmarking software, and ran the tests.

The HotHardware Test System
AMD Athlon 64 FX Powered

Hardware Used:
Processor - AMD Athlon 64 FX-55 (2.6GHz)
Motherboards - Asus A8N32-SLI (nForce4 SLIX16 chipset); ATI Reference CrossFire MB (ATI Radeon Xpress 200 CF Edition)
Video Cards - GeForce 7800 GTX 512MB (x2); GeForce 7800 GTX (x2); Radeon X1800 XT
Memory - 1024MB Corsair XMS PC3200 RAM
Audio - Integrated on board
Hard Drive - Western Digital "Raptor" 36GB - 10,000RPM - SATA

Relevant Software:
Operating System - Windows XP Professional SP2 (Patched)
Chipset Drivers - nForce Drivers v6.82
DirectX - DirectX 9.0c
Video Drivers - NVIDIA Forceware v81.89; ATI Catalyst v5.13

Benchmarks Used:
Synthetic (DX) - 3DMark05 v1.2.0
DirectX - Splinter Cell: Chaos Theory v1.04
DirectX - F.E.A.R. v1.02
DirectX - FarCry v1.33*
DirectX - Half Life 2*
OpenGL - Doom 3 v1.3 (Single Player)*
OpenGL - Quake 4*

* - Custom Test (HH Exclusive demo)

Performance Comparisons with 3DMark05 v1.2.0
Details: http://www.futuremark.com/products/3dmark05/

3DMark05 is the latest installment in a long line of synthetic 3D graphics benchmarks dating back to late 1998. It requires a DirectX 9.0-compliant video card, with support for Pixel Shaders 2.0 or higher, to render all of the various modules that comprise the suite. To generate its final "score," 3DMark05 runs three different simulated game tests and uses each test's framerate in the final tabulation. Fillrate, memory bandwidth, and especially compute performance all have a measurable impact on results in this benchmark. We ran 3DMark05's default test (1,024 x 768) on all of the cards and configurations we tested, and have the overall results posted for you below.

Both CrossFire configurations performed very well while running the default 3DMark05 test. The X1800 XL CrossFire configuration finished just behind a pair of 256MB GeForce 7800 GTX cards running in SLI mode, while the X1800 XT CrossFire configuration pulled ahead of the same SLI setup by about 500 points. The highest performing configuration here, however, was a pair of 512MB GeForce 7800 GTX cards running in SLI, though the pair of X1800 XTs was right behind it, finishing the test with a score only 366 points lower than NVIDIA's flagship gaming platform.

Splinter Cell: Chaos Theory

Performance Comparisons with Splinter Cell: Chaos Theory v1.04
Details: http://www.splintercell3.com/us/

Based on a heavily modified version of the Unreal Engine, enhanced with a slew of DX9 shaders, lighting and mapping effects, Splinter Cell: Chaos Theory is gorgeous with its very immersive, albeit dark, environment. The game engine has a shader model 3.0 code path that allows the GeForce 6 & 7 Series of cards, and the new X1000 family of cards, to really shine, and a recent patch has implemented a shader model 2.0 path for ATI's X8x0 generation of graphics hardware. For these tests we enabled the SM 3.0 path on all of the cards we tested. However, High Dynamic Range rendering was disabled so that we could test the game with anti-aliasing enabled. We benchmarked the game at resolutions of 1,280 x 1024 and 1,600 x 1,200, both with and without anti-aliasing and anisotropic filtering.



As expected, the dual-card configurations smoked all of the single-card configurations in the Splinter Cell: Chaos Theory benchmark. Here, a pair of 512MB GeForce 7800 GTX cards was top dog at both resolutions, followed by either the X1800 XT CrossFire rig or the 256MB GeForce 7800 GTX SLI rig, depending on the test settings. The pair of 256MB GeForce cards performed better than the X1800 XTs when no additional pixel processing was used, but the X1800 XTs running in CrossFire mode performed better when anti-aliasing and anisotropic filtering were enabled. The Radeon X1800 XL CrossFire configuration performed well also, but was outpaced by every other dual-GPU configuration.

We should also note that Splinter Cell: Chaos Theory was not completely stable while testing it on CrossFire. We didn't keep a log of errors, but the game crashed on the CrossFire rig at least once out of every 6-8 benchmark runs. After a crash, the game's options were reset to their defaults and the Catalyst Control Center wouldn't open without a re-boot. After resetting all of the in-game options and verifying the drivers were correctly configured, we were able to continue testing, though, so hopefully this is just a sign of driver immaturity that'll be worked out in "retail-ready" product.

F.E.A.R. v1.02


Performance Comparisons with F.E.A.R.
More Info: http://www.whatisfear.com/us/

One of the most highly anticipated titles of 2005, Monolith's new paranormal thriller F.E.A.R. promises to be as thrilling to the mind as it is to the eyes. Taking a look at the minimum system requirements, we see that you will need at least a 1.7GHz Pentium 4 with 512MB of system memory and a 64MB graphics card of Radeon 9000 or GeForce4 Ti class or better to adequately run the game. Using the full retail release of the game patched to v1.02, we put the graphics cards in this review through their paces to see how they fared with a promising new title. Here, all graphics settings within the game were set to their maximum values, but with soft shadows disabled (soft shadows and anti-aliasing do not currently work together). Benchmark runs were then completed at resolutions of 1280x960 and 1600x1200, with and without anti-aliasing and anisotropic filtering enabled.



F.E.A.R. did not perform as well as we had expected with CrossFire, but we suspect there is an issue with the beta drivers ATI had us use for testing, because this was another game that repeatedly crashed on us. We also encountered a bug that caused CrossFire to be spontaneously disabled in the Catalyst Control Center during our F.E.A.R. testing. We tried two sets of drivers here (5.12 beta and 5.13 beta) and ran numerous test loops, and this is the performance data the test system presented to us. As you can see, the SLI systems had a marked performance advantage when no additional pixel processing was used, but their leads diminished somewhat once anti-aliasing and anisotropic filtering were enabled. The Radeon X1800 XTs were actually able to nudge past the 256MB GeForce 7800 GTXs at 1600x1200 when anti-aliasing and anisotropic filtering were used.

FarCry v1.33

Performance Comparisons with FarCry v1.33
Details: http://www.farcry.ubi.com/

If you've been on top of the gaming scene for some time, you probably know that FarCry was one of the most visually impressive games to be released on the PC this past year. Courtesy of its proprietary engine, dubbed "CryEngine" by its developers, FarCry's game-play is enhanced by Polybump mapping, advanced environment physics, destructible terrain, dynamic lighting, motion-captured animation, and surround sound. Before titles such as Half-Life 2 and Doom 3 hit the scene, FarCry gave us a taste of what was to come in next-generation 3D Gaming on the PC. We benchmarked the graphics cards in this review with a custom-recorded demo run taken in the "Catacombs" area checkpoint, at various resolutions without anti-aliasing or anisotropic filtering enabled, and then with 4X AA and 16X aniso enabled concurrently.



FarCry proved to be one of CrossFire's strong points. The Radeon X1800 XT CrossFire configuration ran right alongside the dual-512MB GeForce 7800 GTX SLI configuration in every test, and surpassed NVIDIA's best in a couple of tests as well. The same basically held true for the Radeon X1800 XL CrossFire rig, which performed on par with a pair of 256MB GeForce 7800 GTX cards running in SLI mode. Although FarCry still looks good with its huge outdoor environments and lush graphics, it's no longer one of the more demanding benchmarks in our test suite.

Half Life 2

Performance Comparisons with Half-Life 2
Details: http://www.half-life2.com/

Thanks to the dedication of hardcore PC gamers and a huge mod community, the original Half-Life became one of the most successful first person shooters of all time. So, when Valve announced Half-Life 2 was close to completion in mid-2003, gamers the world over sat in eager anticipation. Unfortunately, thanks to a compromised internal network, the theft of a portion of the game's source code, and a tumultuous relationship with the game's distributor, Vivendi Universal, we all had to wait until November 2004 to get our hands on this classic. We benchmarked Half-Life 2 with a long, custom-recorded timedemo in the "Canals" map that takes us through both outdoor and indoor environments. These tests were run at resolutions of 1,280 x 1,024 and 1,600 x 1,200, without any anti-aliasing or anisotropic filtering and then with 4X anti-aliasing and 16X anisotropic filtering enabled concurrently.



Like FarCry, Half Life 2 was another game that ran very well on CrossFire, but the GeForce cards rocked here too. If we want to get nit-picky, the X1800 XT CrossFire rig was the top performer, followed by the 512MB GeForce 7800 GTX SLI rig, and then the Radeon X1800 XL CrossFire system. Only a few frames per second separated the top finishers in this test, though, and all configurations posted frame rates well into triple-digit territory regardless of resolution or level of pixel processing. Any high-end single card is enough for some high-resolution Half-Life 2 action, let alone a pair of cards.

Doom 3 v1.3

Performance Comparisons with Doom 3
Details: http://www.doom3.com/

id Software's games have long been pushing the limits of 3D graphics. Quake, Quake 2, and Quake 3 were all instrumental in the success of 3D accelerators on the PC. Now, many years later, with virtually every new desktop computer shipping with some sort of 3D accelerator, id is at it again with the visually stunning Doom 3. Like most of id's previous titles, Doom 3 is an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows. We ran this batch of Doom 3 single player benchmarks using a custom demo with the game set to its "High-Quality" mode, at resolutions of 1,280 x 1,024 and 1,600 x 1,200 without anti-aliasing enabled and then again with 4X AA and 8X aniso enabled simultaneously.



Although ATI's recent driver updates have improved OpenGL performance dramatically, their cards still fall prey to NVIDIA in our custom Doom 3 benchmark. Both GeForce 7800 GTX SLI configurations, whether equipped with 256MB or faster-clocked 512MB cards, handily outpaced both CrossFire rigs by significant margins. If we focus on performance when anti-aliasing and anisotropic filtering were enabled, the GeForce 7800 SLI systems outperformed either CrossFire configuration by margins ranging from about 20 FPS all the way up to about 50 FPS. Without any additional pixel processing, these margins are even bigger. So, while the 90+ FPS put up by the pair of Radeon X1800 XTs running in CrossFire mode is nothing to sneeze at, it's just not enough to challenge NVIDIA's best within the Doom 3 environment.

Quake 4

Performance Comparisons with Quake 4
Details: http://www.quake4game.com/

id Software, in conjunction with developer Raven, recently released the latest addition to the wildly popular Quake franchise, Quake 4. Quake 4 is based upon an updated and slightly modified version of the Doom 3 engine, and as such, performance characteristics between the two titles are very similar. Like Doom 3, Quake 4 is an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows, but unlike Doom 3, Quake 4 features some outdoor environments as well. We ran these Quake 4 benchmarks using a custom demo with the game set to its "High-Quality" mode, at resolutions of 1,280 x 1,024 and 1,600 x 1,200, without anti-aliasing enabled and then again with 4X AA and 8X aniso enabled simultaneously.



Our custom Quake 4 benchmark results essentially mirror those of Doom 3. A single Radeon X1800 XT is actually able to overtake a single 256MB GeForce 7800 GTX in some test configurations, but all of the tests belong to NVIDIA when testing with dual-card configurations. A pair of 256MB GeForce 7800 GTX cards running in SLI mode is only slightly faster than X1800 XT CrossFire at 1600x1200 when AA and aniso are enabled concurrently, but in every other test the 256MB GTXs finish well ahead. And then there's the 512MB GeForce 7800 GTX SLI configuration, which simply crushes everything else in this test.

CrossFire AA: Image Quality

We've already explained how ATI's 'Super-AA' modes work in our initial CrossFire Technology Preview and X850 XT CrossFire piece, but we thought we'd give you all a quick refresher here as well. Essentially, Super-AA works by blending the frames rendered on each card together in the CrossFire Edition (Master) card's compositing engine. The end result is an effective increase in the number of samples used to anti-alias the scene.

New ATi Super AA Modes
Not to be confused with full Super Sample AA

The new Super-AA modes offered by CrossFire are 8X AA, 10X AA, 12X AA, and 14X AA. 8X Super-AA is the result of a blend of 4X multi-sample AA being used on each card, with each one using a different sample pattern. 12X Super-AA is very similar, but in this mode ATI's 6X multi-sample AA algorithm is used on each card, before they're blended together in the compositing engine. 10X AA and 14X AA, however, are somewhat different. In these modes, an additional level of anti-aliasing is applied to the image, in the form of 2X super-sample AA. Unlike MSAA, which only works on polygon edges, Super-Sample AA can reduce the appearance of jaggies in an entire scene, even on transparent textures like those used to simulate vegetation or a chain-link fence, for example. 10X Super-AA is basically 8X AA plus 2X super-sample AA, and 14X Super-AA is 12X AA plus 2X super-sample AA.
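The arithmetic behind those mode names is simple enough to capture in a few lines. This sketch just encodes the sample-count bookkeeping described above; the function name is ours, not ATI's:

```python
# Hypothetical sketch of how Super-AA effective sample counts combine.
# Mode numbers follow ATI's scheme: two cards each render N-sample MSAA
# with offset sample patterns, and the compositing engine blends them.

def super_aa_samples(msaa_per_card, supersample=False):
    """Effective sample count for a CrossFire Super-AA mode."""
    effective = msaa_per_card * 2   # two cards, different sparse patterns
    if supersample:
        effective += 2              # 10X/14X modes add 2X super-sampling
    return effective

print(super_aa_samples(4))                    # 8X Super-AA (4X + 4X)
print(super_aa_samples(4, supersample=True))  # 10X (8X + 2X SSAA)
print(super_aa_samples(6))                    # 12X Super-AA (6X + 6X)
print(super_aa_samples(6, supersample=True))  # 14X (12X + 2X SSAA)
```

The key design point is that only the 10X and 14X modes involve super-sampling, which is why they are the ones that also smooth transparent textures rather than just polygon edges.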

An additional benefit to the new compositing engine used on the X1800 XT CrossFire Edition Master card is that it can perform the blend operation at full-speed, for increased performance.

Half Life 2 Image Quality Analysis: Anti-Aliasing

We spent some time analyzing the CrossFire X1800 XT's in-game image quality versus the best NVIDIA currently has to offer, a GeForce 7800 GTX SLI configuration. We used the "background 2" map in Half-Life 2 to get a feel for how each configuration's anti-aliasing algorithms affected the scene.

NVIDIA GeForce 7800 GTX SLI AA Screenshots

GeForce 7800 GTX SLI
1280x1024 - No AA

GeForce 7800 GTX SLI
1280x1024 - 4x AA

GeForce 7800 GTX SLI
1280x1024 - SLI 8X AA

GeForce 7800 GTX SLI
1280x1024 - SLI 16x AA

ATI Radeon X1800 XT CrossFire AA Screenshots

Radeon X1800 XT CrossFire
1280x1024 - 8x AA

Radeon X1800 XT CrossFire
1280x1024 - 10x AA

Radeon X1800 XT CrossFire
1280x1024 - 12x AA

Radeon X1800 XT CrossFire
1280x1024 - 14x AA

We've included "No AA" and "4x AA" shots using a pair of GeForce 7800 GTX cards above, but these images are strictly here to maintain symmetry on the page. :)  The "real" comparison here is between NVIDIA's SLI8X and SLI16X modes versus ATI's 8X through 14X Super-AA.

If you direct your attention to the water tower and crane in the background of the images, and the grass and tree in the foreground on the left, the impact anti-aliasing has on image quality is readily apparent. What you'll see as you browse through the images is that both platforms produce very good output, and it's difficult to say with any conviction that one is better than the other. Our favorites of the bunch are ATI's 10X and 14X modes, however, as they do the best job with micro-detail and also clean up the jaggies in the grass. There is an obvious issue within ATI's drivers, though, that causes the edge of the water to render incorrectly at higher AA levels. This should be fixed in a future driver release.

CrossFire AA: Performance

We didn't perform a comprehensive test routine to assess the performance of all of ATI's CrossFire Super-AA modes, but we did run a couple of tests to get a general idea of how the various modes perform. ATI claimed the new compositing engine on the X1800 XT Master card offers higher performance than the engine used on the X850 XT, though, so we wanted to see this for ourselves.

CrossFire AA Performance: Half Life 2 & FarCry
Upping the Number of Samples



As you can see, ATI's claims of a faster compositing engine appear to be true. As expected, performance is much lower than with standard AA when using CrossFire, because each card is rendering the same frame as opposed to alternating frames, but performance remained quite good in both games we tested. With either Half-Life 2 or FarCry running at 1280x1024, with all in-game graphical options set to their respective maximums and 16x anisotropic filtering enabled, the X1800 XT CrossFire system was able to maintain very playable framerates.

Power Consumption & Noise


Total System Power Consumption, Acoustics & Temperatures
It's All About the Watts and Decibels

We have a few final data points we'd like to cover before bringing this article to a close. Throughout all of our benchmarking, we monitored how much power our ATI based CrossFire test system was consuming using a power meter, and also took some notes regarding its noise output. Our goal was to give you all an idea as to how much power each configuration used and to explain how loud the configurations were under load. Please keep in mind that we were testing total system power consumption here, not just the power being drawn by the video cards alone.

While idling, either CrossFire configuration drew only marginally more power than a single-card configuration. Under load, however, power consumption goes way up, as expected. A pair of 512MB Radeon X1800 XTs running in CrossFire mode consumed nearly 370W of power, while a Radeon X1800 XL CrossFire configuration peaked at just over 340W. Clearly, running two Radeon X1800s in CrossFire mode requires a significant amount of power, so make sure your PSU is up to snuff should you be in the market for X1800 CrossFire.
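For readers wondering what "up to snuff" means in practice, here's a quick, hypothetical sizing sketch based on our measured wall draw. The 30% headroom factor is our own conservative rule of thumb, not an ATI recommendation:

```python
# Hypothetical PSU sizing sketch based on measured wall-socket draw.
# The 1.3x headroom factor is our own assumption, padding for transient
# load spikes, capacitor aging, and running the supply below its limit.

def recommended_psu_watts(measured_draw_w, headroom=1.3):
    """Suggest a minimum PSU rating for a measured total system draw."""
    return round(measured_draw_w * headroom)

print(recommended_psu_watts(370))  # X1800 XT CrossFire under load
print(recommended_psu_watts(340))  # X1800 XL CrossFire under load
```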

We should also talk a bit about the noise associated with running a pair of X1800 XTs in a single system. When we first set up our test machine and powered it up, it was clearly the loudest system that had ever graced the lab. Upon initial startup, the fans on both X1800s rotated at their maximum speeds, which resulted in a lot of noise. Once the drivers were installed, however, the fans on both cards spun down dramatically and the test system became relatively quiet. Throughout the rest of our testing, the fans never spun up to maximum again. To put it simply, except for the initial shock of hearing two X1800 XTs running at full-bore when we first turned on the machine, our X1800 XT CrossFire test system was relatively quiet, and we would not consider excessive noise an issue during normal use.

Our Summary & Conclusion

Performance Summary: Both CrossFire configurations performed very well in the majority of games / benchmarks we tested. Half Life 2, FarCry, Splinter Cell and 3DMark05 all scaled dramatically on CrossFire, and ATI's and NVIDIA's respective flagship configurations traded victories in these benchmarks. 256MB and 512MB GeForce 7800 GTX SLI configurations had big performance advantages in OpenGL games, like Doom 3 and Quake 4, but overall ATI had a good showing. Clearly though, a 512MB GeForce 7800 GTX SLI configuration remains the most powerful gaming setup available today.

Radeon X1800 XT CrossFire Edition Master cards should be available from a handful of on-line retailers over the next few days, at an MSRP of $599. We have already received word that HIS has begun shipping cards, and ATI was only waiting on shipments of the custom dongle before getting product out to retailers, so there is a chance consumers will be able to get their hands on an X1800 XT Master card in time for Christmas.

But is X1800 XT CrossFire worth the expense? Well, that all depends on your perspective, and your wallet's girth, of course. If we look strictly at price, a pair of X1800 XTs, provided Master cards actually end up being available at or below MSRP, will be up to $300 "cheaper" than a pair of 512MB GeForce 7800 GTX cards. Although NVIDIA launched those cards with an MSRP of $649, you can't find them for less than $750, and that's if you can find them at all. Plus, ATI's Catalyst 5.13 drivers give the X1K series of cards an advantage in video playback quality by exposing more of Avivo's capabilities. So, with very good gaming performance, better video performance, and an overall lower price, X1800 XT CrossFire is clearly a solid alternative to a pair of 512MB GeForce 7800 GTXs.

A pair of 256MB GeForce 7800 GTXs, however, can be had for $200-$300 less than X1800 XT CrossFire, which probably makes that configuration the best value at the high end. You would miss out on the performance benefits of the larger 512MB frame buffers with the 256MB GTXs, though, so there is definitely a lot to consider if you've got the funds for a high-end dual-graphics configuration. We wouldn't recommend purchasing an X1800 XT Master card for use in conjunction with a Radeon X1800 XL, however. Dropping $600 on a much higher-clocked, more powerful 512MB Master card, only to disable half of its memory, doesn't strike us as a great idea.

In the end, X1800 XT CrossFire proved to be a very high performing platform that offered excellent image quality at smooth frame rates. We are a bit concerned about the instability we experienced during some of our testing, but we suspect these issues will be ironed out with future driver releases. We also suspect ATI will be able to wring more performance out of X1800 XT CrossFire configurations with updated drivers, as the company's software engineers get more familiar with the intricacies of the hardware, so the future certainly looks promising for ATI.

Pros:
• X1800 XT CrossFire is Fast
• Super-AA Performance
• Excellent Image Quality
• Less Expensive than dual-512MB GeForce 7800 GTXs

Cons:
• We Experienced Some Instability
• CrossFire Dongle
• X1800 XL CrossFire Price/Performance


Content Property of HotHardware.com