Sapphire Ultimate X1600 Pro and X1600 XT
Date: Aug 17, 2006
Section: Graphics/Sound
Author: Matt Beauvais
Introduction

 

Selecting the right video card when you're on a budget can be difficult. The major deciding factor for many gamers out there is price. As mouthwatering as it is to think about getting a top-notch SLI or CrossFire setup, it's a luxury that most simply cannot afford. Fear not though, as both Nvidia and ATI have some quality entry-level and mid-range graphics cards in their respective line-ups. Today we have two such cards with us, Sapphire's Ultimate X1600 Pro and X1600 XT. Aimed towards consumers looking for a silent video card at a budget price, Sapphire's Ultimate X1600s will appeal to those looking to build up an HTPC or a PC for casual gaming.

Both cards are based on ATI's RV530 GPU. The X1600 Pro and X1600 XT each have 12 pixel shader processors, 5 vertex shaders, 4 texture units, and 4 render backends (ROPs). Both cards support DirectX 9.0c and OpenGL 2.0, can render cutting-edge features such as HDR lighting with AA, and are CrossFire ready. The only real differences between the Pro and XT are clock speeds.

The ATI Radeon X1600 GPU
Features & Specifications
Features - ATI Radeon X1600
. 157 million transistors on 90nm fabrication process
. Twelve pixel shader processors
. Five vertex shader processors
. 128-bit 4-channel DDR/DDR2/GDDR3/GDDR4 memory interface
. Native PCI Express x16 bus interface
. Dynamic Voltage Control

Ring Bus Memory Controller
. 256-bit internal ring bus for memory reads
. Programmable intelligent arbitration logic
. Fully associative texture, color, and Z/stencil cache designs
. Hierarchical Z-buffer with Early Z test
. Lossless Z Compression (up to 48:1)
. Fast Z-Buffer Clear
. Z/stencil cache optimized for real-time shadow rendering

Ultra-Threaded Shader Engine
. Support for Microsoft DirectX 9.0 Shader Model 3.0 programmable vertex and pixel shaders in hardware
. Full speed 128-bit floating point processing for all shader operations
. Dedicated branch execution units for high performance dynamic branching and flow control
. Dedicated texture address units for improved efficiency
. 3Dc+ texture compression
   - High quality 4:1 compression for normal maps and two-channel data formats
   - High quality 2:1 compression for luminance maps and single-channel data formats
. Multiple Render Target (MRT) support
. Render to vertex buffer support
. Complete feature set also supported in OpenGL 2.0

Advanced Image Quality Features
. 64-bit floating point HDR rendering supported throughout the pipeline
   - Includes support for blending and multi-sample anti-aliasing
. 32-bit integer HDR (10:10:10:2) format supported throughout the pipeline
   - Includes support for blending and multi-sample anti-aliasing
. 2x/4x/6x Anti-Aliasing modes
   - Multi-sample algorithm with gamma correction, programmable sparse sample patterns, and centroid sampling
   - New Adaptive Anti-Aliasing feature with Performance and Quality modes
   - Temporal Anti-Aliasing mode
   - Lossless Color Compression (up to 6:1) at all resolutions, including widescreen HDTV resolutions
. 2x/4x/8x/16x Anisotropic Filtering modes
   - Up to 128-tap texture filtering
   - Adaptive algorithm with Performance and Quality options
. High resolution texture support (up to 4k x 4k)

Avivo Video and Display Engine
. High performance programmable video processor
   - Accelerated MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264 decoding (including DVD/HD-DVD/Blu-ray playback), encoding & transcoding
   - DXVA support
   - De-blocking and noise reduction filtering
   - Motion compensation, IDCT, DCT and color space conversion
   - Vector adaptive per-pixel de-interlacing
   - 3:2 pulldown (frame rate conversion)
. Seamless integration of pixel shaders with video in real time
. HDR tone mapping acceleration
   - Maps any input format to 10 bit per channel output
. Flexible display support
   - Dual integrated dual-link DVI transmitters
   - DVI 1.0 / HDMI compliant and HDCP ready
   - Dual integrated 10 bit per channel 400 MHz DACs
   - 16 bit per channel floating point HDR and 10 bit per channel DVI output
   - Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color)
   - Complete, independent color controls and video overlays for each display
   - High quality pre- and post-scaling engines, with underscan support for all outputs
   - Content-adaptive de-flicker filtering for interlaced displays
   - Xilleon™ TV encoder for high quality analog output
   - YPrPb component output for direct drive of HDTV displays
   - Spatial/temporal dithering enables 10-bit color quality on 8-bit and 6-bit displays
   - Fast, glitch-free mode switching
   - VGA mode support on all outputs
. Compatible with ATI TV/Video encoder products, including Theater 550


 


Taking a Closer Look

   

A closer look at the cards
Identical Twins?

Sapphire X1600 Pro

Sapphire X1600 XT

At first glance, your eyes will be drawn to the custom cooler Sapphire has designed for these cards. The large aluminum heatsink covers both sides of the PCB and uses heatpipe technology to enhance cooling. Thanks to the heatpipes and the cooler's large surface area, Sapphire was able to use a low-RPM fan, so noise won't be a problem for most people. The Sapphire X1600 Pro uses GDDR2 memory, while the X1600 XT uses GDDR3 memory to achieve much higher memory speeds. Both types of RAM run considerably cooler than standard DDR. The only advantage of Sapphire's Ultimate X1600 series over Sapphire's other X1600 cards is the heatsink design, which provides better cooling and less noise.

The X1600 Pro and X1600 XT and their respective packages are nearly identical, the only differences being the type of memory used and the clock speeds of the two cards. Sapphire's X1600 Pro runs at 500MHz/800MHz (core/memory), while the X1600 XT runs at 578MHz/1.38GHz. That's quite a big difference in memory speed, so the performance comparison should be interesting once the benchmarks start. The cards are outfitted with 1x VGA, 1x DVI, and 1x composite video output.
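To put those clock speeds in perspective, peak memory bandwidth and pixel fill rate are simple arithmetic on the numbers above. This is just a back-of-the-envelope sketch on our part; the 128-bit memory bus and 4 ROPs come from ATI's spec sheet earlier in the article.

```python
# Back-of-the-envelope peak figures for the two cards.
# Bus width (128-bit) and ROP count (4) are from ATI's RV530 spec sheet;
# clock speeds are Sapphire's rated speeds quoted above.

def mem_bandwidth_gbs(effective_mhz, bus_bits=128):
    """Peak memory bandwidth in GB/s from the effective (DDR) memory clock."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

def pixel_fillrate_gpix(core_mhz, rops=4):
    """Peak pixel fill rate in Gpixels/s (one pixel per ROP per clock)."""
    return core_mhz * 1e6 * rops / 1e9

print(mem_bandwidth_gbs(800))    # X1600 Pro: 12.8 GB/s
print(mem_bandwidth_gbs(1380))   # X1600 XT:  22.08 GB/s
print(pixel_fillrate_gpix(500))  # X1600 Pro: 2.0 Gpix/s
print(pixel_fillrate_gpix(578))  # X1600 XT:  2.312 Gpix/s
```

The XT's roughly 70% bandwidth advantage is worth keeping in mind once AA and AF enter the picture later in the benchmarks.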

 

Sapphire includes a DVI to VGA adapter with each of their products, so those of you with dual VGA monitors will be all set. Inside the box you'll find a CD containing all the necessary drivers to get your card up and running. Sapphire has also included an S-Video cable, a composite video cable, an S-Video to composite video adapter, an S-Video to component output adapter, and a DVI to VGA adapter.

Cool and Quiet
The best of both worlds

    

Sapphire's new heatsink design is effective and aesthetically pleasing. The large heatsink on the front of the card makes direct contact with the GPU and has no fan. Heat is transferred through the pipes to the back of the card, where it is dispersed with the help of a small, quiet fan. The sheer size of the cooler adds some weight to the card, but not enough to damage it if you were to accidentally knock over your case. We'd also have no fears installing this card in a ventilated HTPC enclosure; it really seems to be a well-made heatsink.

Test System and 3DMark06

 

HOW WE CONFIGURED THE TEST SYSTEMS: We tested our cards on a DFI LanParty UT CFX3200-DR motherboard, based on ATI's CrossFire X3200 chipset, powered by an AMD Athlon 64 3200+ (Venice core) processor and 1GB of low-latency Mushkin Redline RAM. The first thing we did when configuring the test system was enter the BIOS and load the "High Performance Defaults." The hard drive was then formatted, and Windows XP Professional with SP2 was installed. When the installation was complete, we installed the latest chipset drivers available, installed all of the other necessary drivers for the rest of our components, and removed Windows Messenger from the system. Auto-Updating and System Restore were also disabled, the hard drive was defragmented, and a 768MB permanent page file was created on the same partition as the Windows installation. Lastly, we set Windows XP's Visual Effects to "best performance," installed all of the benchmarking software, and ran the tests. We tested at the two most popular resolutions according to a recent poll of our readers, using a mid-range gaming system.

The HotHardware Test System
AMD Athlon 64 Powered

Hardware Used:
Processor - AMD Athlon 64 3200+ (2GHz, Venice Core)
Motherboard - DFI LanParty UT CFX3200-DR (CrossFire X3200 Chipset)
Video Cards - Sapphire X1600 XT, Sapphire X1600 Pro, GeForce 6600 GT 256MB
Memory - 1024MB Mushkin PC3200 RAM, CAS 2
Audio - Integrated on board
Hard Drive - Western Digital "Caviar" 80GB - ATA 100

Relevant Software:
Operating System - Windows XP Professional SP2
DirectX - 9.0c
Video Drivers - ATI Catalyst 6.6 / Nvidia Forceware 91.31

Benchmarks Used:
Synthetic (DX) - 3DMark06 v1.0.2
DirectX - FarCry v1.33
DirectX - F.E.A.R.
DirectX - Half Life 2: Lost Coast*
OpenGL - Prey*
OpenGL - Quake 4*

* - Custom Test (HH Exclusive demo)

Performance Comparisons with 3DMark06 v1.0.2
Details: http://www.futuremark.com/products/3dmark06/

3DMark06
Futuremark recently launched a brand-new version of their popular benchmark, 3DMark06. The new version of the benchmark is updated in a number of ways, and now includes not only Shader Model 2.0 tests, but Shader Model 3.0 and HDR tests as well. Some of the assets from 3DMark05 have been re-used, but the scenes are now rendered with much more geometric detail and the shader complexity is vastly increased as well. Max shader length in 3DMark05 was 96 instructions, while 3DMark06 ups the number of instructions to 512. 3DMark06 also employs much more lighting, and there is extensive use of soft shadows. With 3DMark06, Futuremark has also updated how the final score is tabulated. In this latest version of the benchmark, SM 2.0 and HDR / SM3.0 tests are weighted and the CPU score is factored into the final tally as well.
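For those curious how that final number is produced, our reading of Futuremark's 3DMark06 whitepaper gives roughly the following weighting. Treat the constants here (the 2.5 scale factor and the 1.7/0.3 weights) as our approximation of the published formula, not an official reference:

```python
def graphics_score(sm2_score, hdr_sm3_score):
    # Graphics sub-score: SM2.0 and HDR/SM3.0 results weighted equally
    return (sm2_score + hdr_sm3_score) / 2

def overall_3dmark06(sm2_score, hdr_sm3_score, cpu_score):
    # Weighted harmonic mean: graphics weighted ~1.7, CPU ~0.3
    gs = graphics_score(sm2_score, hdr_sm3_score)
    return 2.5 * 1.0 / ((1.7 / gs + 0.3 / cpu_score) / 2)

# Because it's a harmonic mean, a weak CPU drags the total down
# more than a simple weighted average would.
print(round(overall_3dmark06(1000, 1000, 1000)))  # 2500
```

The practical upshot is that graphics performance dominates the score, but a slow CPU still caps how high the total can go.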

3DMark06 provides us with an overall performance score based on the results from a few different tests, but also provides individual Shader Model 2.0 and 3.0 performance numbers. Both X1600 cards seem to fare well in this benchmark, with the X1600 XT on top in all categories.

Transparent
Far Cry

 

Performance Comparisons with FarCry v1.33
Details: http://www.farcry.ubi.com/

FarCry
If you've been on top of the gaming scene for some time, you probably know that FarCry was one of the most visually impressive games to be released on the PC in 2004. Courtesy of its proprietary engine, dubbed "CryEngine" by its developers, FarCry's game-play is enhanced by Polybump mapping, advanced environment physics, destructible terrain, dynamic lighting, motion-captured animation, and surround sound. Before titles such as Half-Life 2 and Doom 3 hit the scene, FarCry gave us a taste of what was to come in next-generation 3D gaming on the PC. We benchmarked the graphics cards in this article using the standard Regulator demo, at various resolutions without anti-aliasing or anisotropic filtering enabled, and then again with 4X AA and 8X aniso enabled concurrently.

Far Cry gives us our first indication of the performance we can expect from these two cards in a gaming scenario. The X1600 XT comes in slightly ahead of Nvidia's GeForce 6600 GT, though that's without any anti-aliasing or anisotropic filtering enabled. Once we raise the image quality to 4x AA and 8x AF, the X1600 XT still manages to hold the top spot. The X1600 Pro puts up a respectable score, but our levels of AA and AF are a bit too much for the card, as an average of 45FPS can feel a bit slow to some gamers.

Going to a higher resolution yields similar results. The X1600 XT remains at the top, and an average framerate of 58.37FPS will provide a decent experience for a card in the X1600 XT's price range. The 6600 GT falls behind a considerable amount when AA and AF are applied. As you can see from the X1600 Pro's numbers, high resolution gaming isn't its strong point.

Half-Life 2: Lost Coast

 

Performance Comparisons with Half-Life 2: Lost Coast
Details: http://www.half-life2.com/

Half Life 2
Thanks to the dedication of hardcore PC gamers and a huge mod community, the original Half-Life became one of the most successful first-person shooters of all time. So, when Valve announced Half-Life 2 was close to completion in mid-2003, gamers the world over sat in eager anticipation. Unfortunately, thanks to a compromised internal network, the theft of a portion of the game's source code, and a tumultuous relationship with the game's distributor, Vivendi Universal, we all had to wait to get our hands on this classic. We benchmarked Half-Life 2: Lost Coast with a long, custom-recorded timedemo that takes us through a primarily outdoor environment. These tests were run at resolutions of 1,024 x 768 and 1,280 x 1,024, first without any anti-aliasing or anisotropic filtering, and then with 4X anti-aliasing and 8X anisotropic filtering enabled concurrently.

Using the Lost Coast add-on to Half-Life 2 allows us to take advantage of the Source engine's HDR lighting, which all three cards are capable of rendering. Though ATI usually dominates Nvidia in Half-Life 2, the 6600 GT manages to gain a few FPS over the X1600 XT. Even when AA and AF are applied, the GeForce 6600 GT keeps the lead. Don't be too discouraged by the scores here, though. The Lost Coast add-on is somewhat of a technology demo from Valve made to showcase HDR lighting, so its performance demands are higher than those of Half-Life 2 or Half-Life 2: Episode One.

Raising the resolution brings all three cards to their knees. As we mentioned, the Lost Coast is a bit of a technology demo, and the actual Half-Life 2 games will perform better. We like using this technology demo when benchmarking graphics cards though, as a graphically stressful demo such as this will help remove any CPU limitation.

F.E.A.R.

   

Performance Comparisons with F.E.A.R
More Info: http://www.whatisfear.com/us/

F.E.A.R
One of the most highly anticipated titles of 2005 was Monolith's paranormal thriller F.E.A.R. A look at the minimum system requirements shows that you'll need at least a 1.7GHz Pentium 4 with 512MB of system memory and a 64MB graphics card of Radeon 9000 or GeForce4 Ti class or better to adequately run the game. Using the full retail release of the game patched to v1.02, we put the graphics cards in this review through their paces to see how they fared with a popular title. Here, all graphics settings within the game were set to their high values, but with soft shadows disabled (soft shadows and anti-aliasing currently do not work together). Benchmark runs were then completed at resolutions of 1024x768 and 1280x1024 (by editing the cfg file), first without anti-aliasing or anisotropic filtering, and then with 4x anti-aliasing and 8x anisotropic filtering.

The X1600 XT enjoys a 4FPS (average) lead over the 6600 GT here. Using AA and AF increases the lead to 9FPS, though an average framerate of 35FPS is a bit hard to game with. The Sapphire X1600 Pro is able to remain above 40FPS, and while it's not the smoothest ride, it is playable.

F.E.A.R. becomes a little difficult to play at a resolution of 1280x1024. You can forget about having a good time when AA and AF are used, as 24 FPS average is hard to enjoy.

Quake 4

   

Performance Comparisons with Quake 4
Details: http://www.quake4game.com/

Quake 4
id Software, in conjunction with developer Raven, recently released the latest addition to the wildly popular Quake franchise, Quake 4. Quake 4 is based upon an updated and slightly modified version of the Doom 3 engine, and as such the performance characteristics of the two titles are very similar. Like Doom 3, Quake 4 is an OpenGL game that uses extremely high-detailed textures and a ton of dynamic lighting and shadows, but unlike Doom 3, Quake 4 features some outdoor environments as well. We ran these Quake 4 benchmarks using a custom demo with the game set to its "High Quality" mode, at resolutions of 1,024 x 768 and 1,280 x 1,024 without anti-aliasing enabled and then again with 4X AA and 8X aniso enabled simultaneously.

Nvidia has traditionally held the upper hand when it comes to OpenGL games, and id Software's Doom 3 engine is no different. With Quake 4 being based on the Doom 3 engine, it's no surprise the 6600 GT manages an almost 13FPS lead over the X1600 XT. However, when AA and AF are applied, the playing field becomes a bit more even, and the X1600 XT even gains a few FPS on the 6600 GT. The X1600 Pro is where it's been the whole time: stuck right in the middle.

With the resolution raised, Nvidia's GeForce 6600 GT still puts out a somewhat tolerable framerate at the default image quality settings, however Quake 4 becomes a bit sluggish when run on either of our X1600 cards.

Prey

     

Performance Comparisons with Prey
Details: http://www.prey.com/

Prey
Originally announced in 1995, Prey has been a long time in the making. A series of technical issues put Prey on the back burner in 1999, only for it to be revived by 3D Realms and Human Head Studios in 2001. A unique title with its "portal technology," Prey allows players to warp to different areas of a level through the use of portals, without a drop in framerate. Prey is also known for its extensive use of in-game gravity tricks, allowing the player to walk up walls and perform other unique moves. Prey is the latest game to use id Software's Doom 3 engine, which is based on the OpenGL API.

As with Quake 4, Prey is an OpenGL title based on the Doom 3 engine. While the GeForce 6600 GT quickly claims the lead without AA or AF being used, the X1600 XT starts to shine once the graphics are turned up. The X1600 XT seems to handle higher resolutions and AA/AF better due to its increased memory bandwidth over the 6600 GT.

As we mentioned, the increased memory bandwidth really helps the X1600 XT shine. The 6600 GT's lead over the X1600 XT starts to fade even with no extra pixel processing, and once we turn up the AA and AF, the X1600 XT manages to squeeze ahead by 1FPS. Still, if you're planning on playing a lot of Doom 3 engine based titles and are shopping for a mid-range card, we'd have to recommend Nvidia because of its superior OpenGL performance.

Overclocking Results

 

Overclocking Procedure: For our overclocking adventure, we used version 0.25 Beta of ATITool. We used ATITool's artifact tester to check for visual anomalies as we slowly raised the clock speeds on the core and memory. Once we were able to pass 20 minutes of ATITool's artifact tester without a single error, we used those clock speeds to run our timedemo of Human Head's Prey. We ran Prey at a resolution of 1024x768 with no AA, and allowed the game to use its default AF setting.

Overclocking the Sapphire X1600 Pro
When it just needs to run faster

By overclocking the X1600 Pro to 596MHz core and 423MHz (846MHz DDR) memory, we managed to squeeze an extra 3.3FPS out of the card. We were rather impressed with the overclock we achieved on the core, but we feel our scores were held back a little by the modest 18MHz overclock we achieved on the memory.

Overclocking the Sapphire X1600 XT
Faster than the Pro, but not fast enough

Our Sapphire Ultimate X1600 XT was able to reach 630MHz on the core and 760MHz (1.52GHz DDR) on the memory. The X1600 XT turned out to be a decent overclocker. We were able to attain a moderate 40MHz overclock on the core, and we managed a 70MHz overclock on the memory, which isn't too shabby. In our Prey test, the overclock only amounted to a 2.3FPS increase, though results will vary depending on the game being played, the resolution, and the levels of AA and AF applied.
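As a quick sanity check on those numbers, here are the relative gains worked out from the stock and overclocked speeds quoted in this section (the percentages are our arithmetic, not Sapphire's ratings):

```python
def oc_gain_pct(stock_mhz, oc_mhz):
    """Percent clock-speed increase over stock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# X1600 Pro core: 500MHz stock -> 596MHz overclocked
print(round(oc_gain_pct(500, 596), 1))    # 19.2 (%)
# X1600 XT memory (effective DDR): 1380MHz stock -> 1520MHz overclocked
print(round(oc_gain_pct(1380, 1520), 1))  # 10.1 (%)
```

A near-20% core overclock on the Pro is impressive for a passively fronted cooler, which speaks well of Sapphire's heatsink design.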

CrossFire Performance

Performance Comparisons with Half-Life 2: Lost Coast in CrossFire Mode
Details: http://www.half-life2.com/

Multi-GPU technology has advanced to the point where we can not only mix cards from different board vendors, but also mix different cards in some configurations. While higher-end ATI products still require a master and slave card, the X1600 series and below need only the proper driver support. Catalyst 6.7 was recently released, and we decided to use it here to take advantage of any CrossFire improvements. When these two cards are paired in CrossFire, the X1600 XT runs at the speed of the X1600 Pro, so we're essentially benchmarking two X1600 Pros in CrossFire mode.

Two cards won't give you exactly double the performance, but we do see a very noticeable gain. Two X1600 Pros have no problem overtaking the GeForce 6600 GT in this test, and the CrossFire solution holds onto its lead when AA and AF are applied.

Even when gaming at 1280x1024, the two cards manage a playable framerate. Despite the power of two GPUs, though, AA and AF still drag performance down enough to make the demo a bit unplayable.

Final Thoughts and Conclusion

Sapphire has been a popular brand for some time now, in part due to their innovative products. From the lowest card in the 3D graphics food chain to their flagship products, Sapphire has something for almost any price range. While they've produced X1600 cards before, their Ultimate series boasts a much improved heatsink design. Both the X1600 Pro and X1600 XT stayed under 65°C during our testing (with an ambient temperature of 28°C), and the design is small enough that it should fit in most cases without much hassle. While there are certainly better buys out there from a price / performance perspective, the $140 EVGA e-GeForce 7600 GT for example, if you're in the market for an X1600 Pro or X1600 XT, the Sapphire Ultimate series is a good choice.

We have mixed feelings towards Sapphire's Ultimate X1600 Pro and Ultimate X1600 XT cards, however. Though the cards are moderately priced at about $100 and $140, respectively, and come with a decent selection of cables and adapters, they aren't the best performers in their respective price ranges. Overclocking results were mixed, but no one should rely solely on overclocking potential when purchasing hardware anyway; each card is unique, and may not overclock to the same levels as what we've shown you here. All things considered, though, we prefer Sapphire's Ultimate X1600 Pro and Ultimate X1600 XT to just about any other Radeon X1600 out there. With Sapphire's Ultimate cards you get the same or better performance than other X1600 Pro and X1600 XT cards, but with a superior cooling solution and a good bundle. HotHardware gives both the Sapphire Ultimate X1600 Pro and X1600 XT 7.5s on our Heat Meter.

   
 
Pros:
  • Unique cooling solution
  • Decent overclockers
  • Good bundles
  • AVIVO
  • Quiet

Cons:
  • Price / performance ratio
  • May not fit in some SFF / HTPC enclosures
 




Content Property of HotHardware.com