ATi All In Wonder Radeon 9600 XT
Splinter Cell & UT2003
Splinter Cell's version 1.2 patch includes three pre-recorded demos and incorporates a previously unavailable benchmarking tool. The demos included with the patch are somewhat CPU-limited, however, so we opted for the custom Oil Rig demo created by the folks at Beyond 3D to test this game. Beyond 3D's demo removes two CPU-intensive routines while increasing dependence on pixel shader performance. Shaders are used to render the realistic-looking ocean water surrounding the oil rig in the demo, as well as to simulate a night-vision effect for a brief period. Also note that anti-aliasing doesn't work with Splinter Cell, so no AA scores are listed in the graphs below.
The GeForce FX 5700 Ultra couldn't handle Splinter Cell's shaders quite as well as the Radeons did, so it fell to the bottom of the pack here, although NVIDIA's performance in this test has improved over time. Once again, the All-In-Wonder 9600 XT's faster core and memory clock speeds gave it a small performance advantage over the Radeon 9600 XT, in the neighborhood of 5% at both resolutions.
Epic's Unreal Tournament series has consistently been one of the most popular first-person shooter franchises, and not coincidentally it is also one of the most commonly used video card benchmarks. We continued our DirectX benchmarking with a fully patched, retail version of Unreal Tournament 2003. When benchmarking with UT2003, we use a utility that ensures all of the cards are tested with exactly the same in-game settings and "High-Quality" graphical options. We ran the UT2003 benchmarks at resolutions of 1024x768 and 1600x1200 without anti-aliasing, and then again with 4X AA enabled. We kept anisotropic filtering disabled in UT2003 because NVIDIA and ATi don't perform the same type of trilinear filtering when aniso and trilinear are enabled together in this game.
Without any anti-aliasing enabled, all of the cards we tested performed similarly in Unreal Tournament 2003 at both resolutions. The same held true when we enabled 4X AA at 1024x768, with only a few frames per second separating the three cards. At 1600x1200 with 4X AA enabled, though, the GeForce FX 5700 Ultra's sizable memory bandwidth advantage (15.2GB/s vs. 10.4GB/s) helped it pull about 32% ahead of the Radeons.
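As a quick sanity check on those figures, the arithmetic can be sketched in a few lines of Python. The only inputs are the two bandwidth numbers quoted above; the 128-bit memory bus used to back out effective memory clocks is our assumption, not something stated in this review. It shows the 5700 Ultra carrying roughly a 46% raw bandwidth edge, of which about 32% materializes as a frame rate lead at 1600x1200 with 4X AA:

```python
# Back-of-the-envelope check of the memory bandwidth gap cited above.
FX5700_ULTRA_BW = 15.2  # GB/s, GeForce FX 5700 Ultra (from the review)
AIW_9600XT_BW = 10.4    # GB/s, All-In-Wonder 9600 XT (from the review)

# Relative bandwidth advantage of the 5700 Ultra
advantage = (FX5700_ULTRA_BW / AIW_9600XT_BW - 1) * 100
print(f"Bandwidth advantage: {advantage:.0f}%")  # prints "Bandwidth advantage: 46%"

# bandwidth (GB/s) ~= effective memory clock (MHz) * bus width (bytes) / 1000,
# so dividing back out recovers each card's effective memory clock.
BUS_BYTES = 128 // 8  # assuming a 128-bit memory bus on both cards
for name, bw in [("FX 5700 Ultra", FX5700_ULTRA_BW),
                 ("AIW 9600 XT", AIW_9600XT_BW)]:
    print(f"{name}: ~{bw / BUS_BYTES * 1000:.0f} MHz effective")
```

That memory interfaces rarely scale into frame rate one-for-one is exactly what the numbers above show: extra bandwidth only pays off fully when the workload, here 4X AA at high resolution, is bandwidth-bound.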