Gigabyte GV-RX18L256V-B Radeon X1800XL
Date: Feb 09, 2006
Section: Graphics/Sound
Author: Robert Maloney
Introduction

 

With all the praise that has been heaped upon NVIDIA for their 7800 series of graphics cards, it was bound to cause a bit of dismay for rival ATi.  For nearly two years ATi had been riding high on the success of their 9700 and 9800 series of cards, while NVIDIA had been trying to save face after a few mishaps of their own.  Then, somewhat ironically, the tables were turned.  While the 7800GT and 7800GTX were taking the industry by storm with their outstanding performance and relatively quiet operation, ATi seemed to hit pothole after pothole.

So, it was with some eager anticipation that the Radeon X1000 series was released, with hopes of putting ATi back on the map and into the hearts of their fanbase. With the low-end Radeon X1300, midrange Radeon X1600, and higher-end Radeon X1800 models announced at the time of the launch, there were certainly plenty of cards to choose from.  Today, we will be taking a look at a card based on the Radeon X1800 XL GPU, the Gigabyte GV-RX18L256V-B.

       

Gigabyte's GV-RX18L256V-B Specifications
The Crimson Empire Strikes Back!

Features - ATI Radeon X1800 XL
• 321 million transistors on 90nm fabrication process
• Ultra-threaded architecture with fast dynamic branching
• Sixteen pixel shader processors
• Eight vertex shader processors
• 256-bit 8-channel GDDR3/GDDR4 memory interface
• Native PCI Express x16 bus interface
• Dynamic Voltage Control

Ring Bus Memory Controller
• 512-bit internal ring bus for memory reads
• Programmable intelligent arbitration logic
• Fully associative texture, color, and Z/stencil cache designs
• Hierarchical Z-buffer with Early Z test
• Lossless Z Compression (up to 48:1)
• Fast Z-Buffer Clear
• Z/stencil cache optimized for real-time shadow rendering
• Optimized for performance at high display resolutions, including widescreen HDTV resolutions

Avivo Video and Display Engine
• High performance programmable video processor
  o Accelerated MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264 decoding (including DVD/HD-DVD/Blu-ray playback), encoding & transcoding
  o DXVA support
  o De-blocking and noise reduction filtering
  o Motion compensation, IDCT, DCT and color space conversion
  o Vector adaptive per-pixel de-interlacing
  o 3:2 pulldown (frame rate conversion)
• Seamless integration of pixel shaders with video in real time
• HDR tone mapping acceleration
  o Maps any input format to 10 bit per channel output
• Flexible display support
  o Dual integrated dual-link DVI transmitters
  o DVI 1.0 / HDMI compliant and HDCP ready
  o Dual integrated 10 bit per channel 400 MHz DACs
  o 16 bit per channel floating point HDR and 10 bit per channel DVI output
  o Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color)
  o Complete, independent color controls and video overlays for each display
  o High quality pre- and post-scaling engines, with underscan support for all outputs
  o Content-adaptive de-flicker filtering for interlaced displays
  o Xilleon™ TV encoder for high quality analog output
  o YPrPb component output for direct drive of HDTV displays
  o Spatial/temporal dithering enables 10-bit color quality on 8-bit and 6-bit displays
  o Fast, glitch-free mode switching
  o VGA mode support on all outputs
• Compatible with ATI TV/Video encoder products, including Theater 550

Ultra-Threaded Shader Engine
• Support for Microsoft DirectX 9.0 Shader Model 3.0 programmable vertex and pixel shaders in hardware
• Full speed 128-bit floating point processing for all shader operations
• Up to 512 simultaneous pixel threads
• Dedicated branch execution units for high performance dynamic branching and flow control
• Dedicated texture address units for improved efficiency
• 3Dc+ texture compression
  o High quality 4:1 compression for normal maps and two-channel data formats
  o High quality 2:1 compression for luminance maps and single-channel data formats
• Multiple Render Target (MRT) support
• Render to vertex buffer support
• Complete feature set also supported in OpenGL 2.0

Advanced Image Quality Features
• 64-bit floating point HDR rendering supported throughout the pipeline
  o Includes support for blending and multi-sample anti-aliasing
• 32-bit integer HDR (10:10:10:2) format supported throughout the pipeline
  o Includes support for blending and multi-sample anti-aliasing
• 2x/4x/6x Anti-Aliasing modes
  o Multi-sample algorithm with gamma correction, programmable sparse sample patterns, and centroid sampling
  o New Adaptive Anti-Aliasing feature with Performance and Quality modes
  o Temporal Anti-Aliasing mode
  o Lossless Color Compression (up to 6:1) at all resolutions, including widescreen HDTV resolutions
• 2x/4x/8x/16x Anisotropic Filtering modes
  o Up to 128-tap texture filtering
  o Adaptive algorithm with Performance and Quality options
• High resolution texture support (up to 4k x 4k)

Software Bundle

Counter-Strike: Condition Zero
Xpand Rally
Power Director 3
Gigabyte Driver CD with PowerDVD 6

Adapter/Cable Bundle
Composite and S-Video cables
HDTV-out cable
DVI to VGA adapter
Power splitter cord

 

 


The Radeon X1800XL's VPU is composed of roughly 321 million transistors, and features ATI's self-proclaimed "Ultra-threaded architecture" with fast dynamic branching. The VPU has 16 pixel shader processors, 8 vertex shader processors, and is equipped with a 256-bit 8-channel memory interface. The Gigabyte GV-RX18L256V-B uses the X1800XL, with its core clocked at the default 500MHz and its memory clocked at just under 1GHz. At these clock speeds, a large single-slot cooler is sufficient to keep core and memory temperatures in check. On the surface, the specifications seem much like an X850, but the X1800XL sports SM 3.0 support, the new Avivo engine, and a more advanced memory controller.  Even though the X1800s have fewer pixel shader pipelines than NVIDIA's GeForce 7800 GT and GTX, their higher clock speeds help keep fillrate competitive.  Today we will compare the RX18L256V-B against an X850XT and a 7800GT to see which one comes out on top.
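As a rough, back-of-the-envelope illustration (assuming peak fillrate is roughly pixel pipelines x core clock, and ignoring ROP counts and real-world efficiency): the X1800XL works out to roughly 16 x 500MHz, or about 8.0 Gpixels/s, while a 20-pipe 7800GT at 400MHz lands in the same neighborhood at about 8.0 Gpixels/s by the same simplified math. Likewise, the 256-bit memory interface running at just under 1GHz effective gives roughly 32 bytes x 1.0GT/s, or about 32GB/s of peak memory bandwidth.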

 

A closer look at the hardware and software

 

The Gigabyte GV-RX18L256V-B Close-up
Staying the course...

As we mentioned with the Gigabyte GV-NX78T256V-B about a month back, Gigabyte seems to stay the course when it comes to designing their cards, opting not to get too creative other than some flashy graphics on the cooler.  That is, unless they go completely overboard and design a card like the GV-3D1-68GT.  As seen below, the card is a single-slot solution with a relatively slim profile.  An oversized heatsink covers nearly the entire face of the card and cools all of its major components.  Underneath the aluminum and copper heatsink are the R520 GPU and eight Samsung GDDR3 memory chips, totaling 256MB, placed in a neat array.  We also located the ATi Rage Theater chip hidden beneath it, which controls the VIVO functions of the card.

       

       

Placed towards one end of the card, a small fan creates the airflow necessary to keep temperatures under control.  Under load, however, the fan must spin very rapidly, and the noise level seemed louder than what we experienced with Gigabyte's 7800GT model. On the other end of the card, a single strip of aluminum passively cools the FETs in the VRM.  The 6-pin power connector, a mainstay of today's power-hungry graphics adapters, is placed in the upper corner.  New builders or upgraders should always remember to check that their power supply unit not only has enough wattage to power their system, but also has open connectors for one or two video cards (when used in a CrossFire setup).

       

The heatsink is attached using eight screws; four on the outer edges and four more used in conjunction with a small backplate.  On the bracket end, we found the standard layout for most modern cards: two DVI outputs with an S-Video port nested in between.  Finally, we took the two Gigabyte cards and laid them out next to each other.  Although similar in overall appearance, the Radeon X1800XL version was just a hair longer than the 7800GT model.

Gigabyte's Software Bundle
More of the same?

    

Not much has changed in the way of included materials either.  Cyberlink's Power Director 3 and PowerDVD 6 make up the entire list of included utilities, save for a set of display drivers and DirectX 9 runtime libraries.  Gigabyte also throws in a couple of games, namely Xpand Rally and the popular Counter-Strike: Condition Zero from Valve Software.  While the software selection was relatively weak overall, the included paraphernalia was not.  There appears to be a full assortment of cables, including Composite and S-Video cables, an HDTV-out cable, a DVI to VGA adapter, as well as a power splitter.

One new addition that comes from ATi rather than Gigabyte is the latest release of the Catalyst drivers.  While we prefer the simplicity of NVIDIA's ForceWare drivers for their GeForce cards, there's simply no denying the coolness factor of actually getting to see immediate results when applying various settings in the CCC (Catalyst Control Center).  In the past, users simply chose the graphics levels from the control panels and basically took it on faith that their graphics were indeed being tweaked.  With the preview mode in the Catalyst drivers, though, the screen updates with the choices made, giving an exact representation of what type of pixel processing is being performed. This feature has been around for quite some time, but we feel it's worth mentioning again for those who may not be familiar with ATI's more recent drivers. The real-time preview could certainly help less savvy users learn how each specific setting affects image quality.

 

Test Setup and 3DMark05 Comparisons

 

HOW WE CONFIGURED THE TEST SYSTEM: We tested the Gigabyte GV-RX18L256V-B on an ASUS P5WD2 i955X chipset-based motherboard, powered by an Intel Pentium 4 550 processor and 1GB of low-latency Corsair XMS2 DDR2 memory. The first thing we did when configuring this test system was enter the BIOS and load the "High Performance Defaults."  The hard drive was then formatted and Windows XP Professional with SP2 was installed. When the installation was complete, we installed the latest Intel chipset drivers, installed all of the other necessary drivers for the rest of our components, and removed Windows Messenger from the system.  Auto-Updating and System Restore were then disabled, the hard drive was defragmented, and a 1536MB permanent page file was created on the same partition as the Windows installation. Lastly, we set Windows' Visual Effects to "best performance," installed all of the benchmarking software, and ran the tests.

The HotHardware Test System
The Everyday man's system

Hardware Used:
Processor - Intel Pentium 4 550 (3.4GHz)
Motherboard - ASUS P5WD2 Premium (Intel i955X chipset)
Video Cards - Gigabyte GV-RX18L256V-B, Gigabyte GeForce 7800GT, ATI Radeon X850XT PE
Memory - 1024MB Corsair XMS2 DDR2
Audio - Integrated
Hard Drive - Seagate Barracuda V (7,200RPM, SATA)

Relevant Software:
Operating System - Windows XP Professional SP2
Chipset Drivers - Intel INF Drivers v7.22.1006
DirectX - DirectX 9.0c
Video Drivers - NVIDIA ForceWare v81.98, ATI Catalyst v5.13

Benchmarks Used:
Synthetic (DX) - 3DMark05 v1.2.0
DirectX - Halo v1.06
DirectX - Splinter Cell: Chaos Theory v1.04
DirectX - FarCry v1.33*
DirectX - Half-Life 2*
OpenGL - Doom 3 v1.3*
OpenGL - Chronicles of Riddick v1.1*

* - Custom Test (HH Exclusive demo)

Performance Comparisons with 3DMark05 v1.2.0
Details: http://www.futuremark.com/products/3dmark05/

3DMark05
3DMark05 is the latest installment in a line of synthetic 3D graphics benchmarks from Futuremark Corp that date back to late 1998.  The benchmark itself isn't based on a playable game engine, but it does boast fairly impressive DX9-driven visuals and lighting effects.  It's also a versatile DirectX 9 benchmarking tool that we feel is relevant in terms of presenting our readers with a full performance profile for 3D graphics and host processors. It's not a metric that you can exclusively gauge real-world performance on, but rather a complement to our entire suite of application-based testing.  We ran 3DMark05's default test (1,024 x 768) on all of the cards and configurations we tested.


[Benchmark graph: Gigabyte GV-NX78T256V-B = GeForce 7800 GT, Gigabyte GV-RX18L256V-B = Radeon X1800 XL]

The Gigabyte RX18L256V-B gets off to a fast start, eclipsing the performance of the 7800GT-based NX78T256V-B by about 5%.  It seems that with the latest Catalyst drivers, performance has been vastly improved over older versions; we had originally seen minimal differences between the two cards in past reviews.  The X1800XL also picks off the previous flagship from ATi, the X850XT, outgunning it by over 10% in 3DMark05.

 

Halo 1.06 Performance

Halo 1.06 Performance Tests
All Patched & Ready To Go!

Halo
For many gamers, the release of Halo on the PC marked the end of a long wait, because it was originally released as an Xbox exclusive a few years back. No additional patches or tweaks are needed to benchmark with Halo, as Gearbox has included all of the necessary information in its README file. The Halo benchmark runs through four of the cut-scenes from the game, after which the average frame rate is recorded. We patched the game using the latest v1.06 patch and ran this benchmark twice, once at 1280 x 1024 and then again at 1600 x 1200. Anti-aliasing doesn't work properly with Halo, so all of the tests below were run with anti-aliasing disabled.


[Benchmark graphs: Gigabyte GV-NX78T256V-B = GeForce 7800 GT, Gigabyte GV-RX18L256V-B = Radeon X1800 XL]

Synthetic benchmarks be damned: Halo has the GeForce 7800GT-based NX78T256V-B on top, with the two Radeons trailing behind.  In the lower resolution testing, the RX18L256V-B actually comes in a frame behind the X850XT.  However, at 1600x1200, it appears that the X1800XL's more efficient design allows it to keep pace with the 7800GT, while the X850XT slips more than ten frames back.

Splinter Cell: Chaos Theory

 

Performance Comparisons with Splinter Cell: Chaos Theory v1.04
Details: http://www.splintercell3.com/us/

SC: Chaos Theory
Splinter Cell: Chaos Theory uses a heavily modified version of Epic's Unreal Engine, enhanced with a slew of DX9-class shaders, dynamic lighting, and mapping effects. This game produces some gorgeous imagery with its very immersive, albeit dark environments. We should note that the game engine has a shader model 3.0 code path that allows the GeForce 6 & 7 Series of cards to really shine, but it wasn't until the latest v1.04 patch was released that a shader model 2.0 path was added to take advantage of ATI's x8x0 hardware. For the test results listed below, we turned off HDR rendering and parallax mapping to somewhat level the playing field and benchmarked the game at resolutions of 1280 x 1024 and 1600 x 1200, both with and without anti-aliasing and anisotropic filtering.


[Benchmark graphs: Gigabyte GV-NX78T256V-B = GeForce 7800 GT, Gigabyte GV-RX18L256V-B = Radeon X1800 XL]

Splinter Cell: Chaos Theory follows the pattern originally seen with Halo; the NX78T256V-B maintains a lead over both Radeon-based cards at both resolutions, with or without any additional pixel processing.  The difference between the cards is much more pronounced with AA enabled: whereas the 7800GT lost but a few frames, the X850XT and X1800XL both dropped off in the range of 20-25%.

Far Cry 1.33 Comparisons

 

Performance Comparisons with FarCry v1.33
Details: http://www.farcry.ubi.com/

FarCry
FarCry is one of the most visually impressive games to be released on the PC platform to date, courtesy of its proprietary engine, dubbed "CryEngine" by its developers. FarCry's game-play is enhanced by graphics with Polybump mapping, advanced environment physics, destructible terrain, dynamic lighting, motion-captured animation, and surround sound. The game is rife with lush, brightly lit outdoor environments, and high-poly models that are capable of taxing most of today's high-end 3D graphics hardware. We benchmarked the cards in this review with a custom-recorded demo run taken in the "Catacombs" area checkpoint, at various resolutions without anti-aliasing or anisotropic filtering enabled, and then with 4X AA and 16X aniso enabled concurrently.

 


[Benchmark graphs: Gigabyte GV-NX78T256V-B = GeForce 7800 GT, Gigabyte GV-RX18L256V-B = Radeon X1800 XL]

The latest patched version of Far Cry led us to some interesting observations.  At both 1280x1024 and 1600x1200, the GeForce 7800GT-based card was the fastest.  Second place was reserved for the X850XT, with the RX18L256V-B coming in a close third at the lower res, but a disappointing ten frames back at 1600x1200.  Crank up the settings, however, and the order reverses exactly: the X850XT moved to the front, followed by the X1800XL, with the NX78T256V-B in last place.

Head-to-head with Doom 3

 

Performance Comparisons with Doom 3
Details: http://www.doom3.com/

Doom 3
Unlike the vast majority of titles currently shipping on the PC platform, which are based on Microsoft's DirectX API, id Software's Doom 3 is powered by an OpenGL-based game engine. The Doom 3 engine is capable of producing extremely realistic looking visuals through the use of highly detailed textures and a unified lighting and shadowing system with complex animations and scripting that generates real-time, fully dynamic per-pixel lighting and stencil shadowing. We ran this batch of single-player Doom 3 benchmarks using one of our own custom recorded demos with the game set to its "High-Quality" mode, at resolutions of 1280 x 1024 and 1600 x 1200 without anti-aliasing enabled, and again with 4X AA and 16X aniso enabled simultaneously.


[Benchmark graphs: Gigabyte GV-NX78T256V-B = GeForce 7800 GT, Gigabyte GV-RX18L256V-B = Radeon X1800 XL]

Watching the 7800GT simply outclass the ATi cards in Doom 3 is not so much of a surprise; the size of the overall delta, however, is.  We're not simply talking about a different ballpark, we're talking about a whole other sport.  On average, the two Radeon cards were 50-75% slower than their NVIDIA counterpart.  As we've seen in previous testing, the X850XT typically puts up better numbers than the newer X1800XL, for about the same price.

Breaking out of Butcher Bay

 

Performance Comparisons with Chronicles of Riddick: Escape From Butcher Bay
Details: http://www.riddickgame.com/

Chronicles of Riddick
Starbreeze Studios is responsible for creating the surprisingly impressive Chronicles of Riddick: Escape From Butcher Bay. Riddick is a first person shooter that's powered by the company's proprietary Starbreeze Engine. This engine is currently being used for all projects at Starbreeze and is geared toward indoor or semi-indoor game environments, with per light-source light-mapping and high-quality shading with normal mapping. According to Starbreeze Studios, the engine has been under development for over seven years and is currently being used on the PC, PS2, Xbox, and GameCube platforms, and it is being ported to next-generation platforms as well. Chronicles of Riddick: Escape From Butcher Bay has impressive visuals, a mature storyline, and has proven to be a tough challenge.

 


[Benchmark graphs: Gigabyte GV-NX78T256V-B = GeForce 7800 GT, Gigabyte GV-RX18L256V-B = Radeon X1800 XL]

The results of the testing with Chronicles of Riddick closely mirror the Doom 3 graphs, but only in the basic tests.  At both resolutions, the two Radeon cards are simply no match for the 7800GT.  Unlike the role reversal we had seen earlier, however, the GeForce card maintains sizeable leads when anti-aliasing and anisotropic filtering are applied.  Also dissimilar to the Doom 3 scores, the RX18L256V-B manages to beat out the X850XT, albeit by relatively small margins.

 

Half-Life 2 Benchmarks

 

Performance Comparisons with Half-Life 2
Details: http://www.half-life2.com/

Half Life 2
Half-Life 2 is a game that needs no introduction. The underlying technology behind HL2, however, may not be as well known. The Source Engine powering Half-Life 2 is responsible for the game's realistic visuals, with support for shader model 2.0 shaders, bump mapping, and cube and environment mapping. Half-Life 2 also makes use of dynamic lights, vertex lighting and light maps, and water with refraction and Fresnel effects, among a host of other rendering features. We benchmarked Half-Life 2 with a long, custom-recorded timedemo taken in the "Canals" map that takes us through both outdoor and indoor environments. These tests were run at resolutions of 1280 x 1024 and 1600 x 1200, both with and without 4X anti-aliasing and 16X anisotropic filtering enabled concurrently.


[Benchmark graphs: Gigabyte GV-NX78T256V-B = GeForce 7800 GT, Gigabyte GV-RX18L256V-B = Radeon X1800 XL]

Half-Life 2, for all of its graphical beauty, runs very well on almost all current-generation graphics cards, so we won't see the disparity that is prevalent in the other benchmarks.  In fact, without any additional pixel processing, the three cards are separated by three frames at the most, with the X1800XL "winning" one contest and the 7800GT taking the other.  Once we turn up the graphics settings, though, there is a bit of a shake-up.  At both resolutions, the X850XT led the way, followed by the X1800XL, and finally the 7800GT.

 

Overclocking Results

 

Overclocking the GV-RX18L256V-B
Xtra speed with the X1800XL

    

Gigabyte doesn't offer any specific tools to overclock the GV-RX18L256V-B, so we downloaded and installed the latest version of ATiTool.  For a free piece of software, this is a great tool for finding out everything you need to know about your graphics card.  Switching over to the overclocking screen, we can change the core and memory speeds while simultaneously checking for artifacts.  There are also options to automatically find the maximum speeds, but we wanted that fun for ourselves.  We moved the sliders for each clock speed a few MHz at a time until we ran into a few system lockups or driver crashes.  After some further tweaking, we arrived at a completely artifact-free environment.  The final results were a core clock of 577MHz and a memory clock of 621MHz (1242MHz effective).  To see what kinds of gains these higher speeds afforded us, we re-ran two of our benchmarks and compared the new results to the original scores.
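For a quick sense of scale (a rough sketch, assuming the stock clocks of 500MHz on the core and roughly 500MHz, or 1GHz effective, on the memory): 577MHz over 500MHz works out to roughly a 15% core overclock, while 621MHz over 500MHz amounts to roughly a 24% bump on the memory.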

 

 

 

By raising the core and memory speeds, we were able to push past the X850XT's clock speeds.  In doing so, we increased our 3DMark05 score by 8%, and our Doom 3 score by almost 20%.  It's a nice jump in performance, especially when it's essentially accomplished for free.  Unfortunately, as evidenced by the Doom 3 results, it still might not be enough to overcome the 7800GT in many of our tests.

Performance Summary and Conclusion

 

Performance Summary: The performance of the Gigabyte GV-RX18L256V-B was good, but not stellar.  While it was competitive in most of the benchmarks, it never really established itself as the top card, often falling behind not only NVIDIA's GeForce 7800GT, but sometimes even the older X850XT.

There's really nothing terribly special about the RX18L256V-B.  It does the job well from a performance standpoint, but doesn't shine in any one area.  The hardware is standard fare, basically copied from ATi's reference design.  The accompanying bundle is also a bit bland for our tastes with just the basics covered, and only one game that's worth taking a look at, although even then it's almost two years old.  The one bright spot might be the full set of cables that are included, as it means less to purchase down the road should users want to connect their PC to different types of displays.

What's left is a package that doesn't really impress.  The current average price of about $400 simply isn't worth the investment when other cards, like Gigabyte's own GeForce 7800 GT, can be had for less.  Perhaps when the price comes down somewhat this might seem like a good choice, but for now we'll keep our eyes and our money elsewhere.  We're giving the Gigabyte GV-RX18L256V-B a 7 on the HotHardware Heat Meter.  The GPU has lots of nice features, but there are better values out there.

+ Good overall performance
+ Overclocks well
+ Single-slot solution
- Priced higher than other models
- Fan becomes noisy under load
- No standout features


 



Content Property of HotHardware.com