
Radeon HD 3870 X2 Round-Up: Asus, HIS




A few short weeks ago, AMD unleashed the dual-GPU powered ATI Radeon HD 3870 X2, which marked the company's re-entry into the ultra high-end desktop graphics card space that had been dominated by NVIDIA since the introduction of the GeForce 8800 series.  As we mentioned in our launch coverage, the Radeon HD 3870 X2 utilizes a pair of RV670 graphics processors working together on a single PCB, for what is essentially a single-card CrossFire configuration.
 
AMD's reference design called for two RV670 GPUs clocked at 825MHz with 1GB (512MB per GPU) of 900MHz frame buffer memory.  Leading up to the launch, however, representatives from AMD mentioned that some of the company's board partners were likely to release X2 cards that differed from the reference design.  With that in mind, we were eager to get our hands on some retail-ready Radeon HD 3870 X2 cards to see what AMD's board partners truly had in store.

In this article, we'll be taking a look at three Radeon HD 3870 X2 cards, two from Asus and one from HIS.  Two of the cards are much like AMD's reference design, but the third is hardly recognizable as a Radeon HD 3870 X2 – at least not at first glance.


ATI Radeon HD 3870 X2 Reference Design

ATI Radeon HD 3870 X2
Features & Specifications

666 million transistors on 55nm fabrication process (x2)

256-bit 8-channel GDDR3/GDDR4 memory interface (x2)

Ring Bus Memory Controller

  • Fully distributed design with 512-bit internal ring bus for memory reads and writes
  • Optimized for high performance HDR (High Dynamic Range) rendering at high display resolutions

Unified Superscalar Shader Architecture

  • 320 stream processing units (x2)
    • Dynamic load balancing and resource allocation for vertex, geometry, and pixel shaders
    • Common instruction set and texture unit access supported for all types of shaders
    • Dedicated branch execution units and texture address processors
       
  • 128-bit floating point precision for all operations
  • Command processor for reduced CPU overhead
  • Shader instruction and constant caches
  • Up to 80 texture fetches per clock cycle
  • Up to 128 textures per pixel
  • Fully associative multi-level texture cache design
  • DXTC and 3Dc+ texture compression
  • High resolution texture support (up to 8192 x 8192)
  • Fully associative texture Z/stencil cache designs
  • Double-sided hierarchical Z/stencil buffer
  • Early Z test, Re-Z, Z Range optimization, and Fast Z Clear
  • Lossless Z & stencil compression (up to 128:1)
  • Lossless color compression (up to 8:1)
  • 8 render targets (MRTs) with anti-aliasing support
  • Physics processing support

Full support for Microsoft DirectX 10 / 10.1 

  • Shader Model 4.0
  • Geometry Shaders
  • Stream Output
  • Integer and Bitwise Operations
  • Alpha to Coverage
  • Constant Buffers
  • State Objects
  • Texture Arrays

Dynamic Geometry Acceleration

  • High performance vertex cache
  • Programmable tessellation unit
  • Accelerated geometry shader path for geometry amplification
  • Memory read/write cache for improved stream output performance

Anti-aliasing features

  • Multi-sample anti-aliasing (up to 8 samples per pixel)
  • Up to 24x Custom Filter Anti-Aliasing (CFAA) for improved quality
  • Adaptive super-sampling and multi-sampling
  • Temporal anti-aliasing
  • Gamma correct
  • Super AA (CrossFire configurations only)
  • All anti-aliasing features compatible with HDR rendering

CrossFire Multi-GPU Technology

  • Scale up rendering performance and image quality with 2 or more GPUs
  • Integrated compositing engine
  • High performance dual channel interconnect

Texture filtering features

  • 2x/4x/8x/16x high quality adaptive anisotropic filtering modes (up to 128 taps per pixel)
  • 128-bit floating point HDR texture filtering
  • Bicubic filtering
  • sRGB filtering (gamma/degamma)
  • Percentage Closer Filtering (PCF)
  • Depth & stencil texture (DST) format support
  • Shared exponent HDR (RGBE 9:9:9:5) texture format support

ATI Avivo HD Video and Display Platform

  • Two independent display controllers
    • Drive two displays simultaneously with independent resolutions, refresh rates, color controls and video overlays for each display
    • Full 30-bit display processing
    • Programmable piecewise linear gamma correction, color correction, and color space conversion
    • Spatial/temporal dithering provides 30-bit color quality on 24-bit and 18-bit displays
    • High quality pre- and post-scaling engines, with underscan support for all display outputs
    • Content-adaptive de-flicker filtering for interlaced displays
    • Fast, glitch-free mode switching
    • Hardware cursor
       
  • Two integrated dual-link DVI display outputs
    • Each supports 18-, 24-, and 30-bit digital displays at all resolutions up to 1920x1200 (single-link DVI) or 2560x1600 (dual-link DVI)
    • Each includes a dual-link HDCP encoder with on-chip key storage for high resolution playback of protected content
       
  • Two integrated 400 MHz 30-bit RAMDACs
    • Each supports analog displays connected by VGA at all resolutions up to 2048x1536
       
  • HDMI output support
    • Supports all display resolutions up to 1920x1080
    • Integrated HD audio controller with multi-channel (5.1) AC3 support, enabling a plug-and-play cable-less audio solution
       
  • Integrated Xilleon HDTV encoder
    • Provides high quality analog TV output (component / S-video / composite)
    • Supports SDTV and HDTV resolutions
    • Underscan and overscan compensation
       
  • HD decode for H.264/AVC, VC-1, DivX and MPEG-2 video formats
    • Flawless DVD, HD DVD, and Blu-Ray playback
    • Motion compensation and IDCT (Inverse Discrete Cosine Transformation)
       
  • HD video processing
    • Advanced vector adaptive per-pixel de-interlacing
    • De-blocking and noise reduction filtering
    • Edge enhancement
    • Inverse telecine (2:2 and 3:2 pull-down correction)
    • Bad edit correction
    • High fidelity gamma correction, color correction, color space conversion, and scaling
  • MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264/AVC encoding and transcoding
  • Seamless integration of pixel shaders with video in real time
  • VGA mode support on all display outputs

PCI Express 2.0 x16 bus interface

OpenGL 2.0 support

 

 



Above are the complete specifications and feature set of the Radeon HD 3870 X2.  Because we've already covered the underlying technology employed in the RV670 GPUs powering the X2, and have already detailed the reference card's features and performance, we won't rehash those details here.  If you're unfamiliar with the Radeon HD 3870 X2, however, and would like more details, we suggest checking out a couple of our recent articles.  In our Radeon HD 3870 and 3850 coverage, we discuss the RV670 GPU in depth.  And in our Radeon HD 3870 X2 launch article, we cover the card's main features and inner workings.  If you check out those two pieces, they'll get you up to speed.
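
One quick back-of-the-envelope note on the numbers above: with 320 stream processors per GPU and the reference 825MHz core clock, each RV670 is good for roughly 320 x 2 FLOPs x 825MHz, or about 528 GFLOPS of peak shader throughput (counting one multiply-add as two floating point operations per stream processor per clock), so the two GPUs on the X2 combine for just over 1 TFLOPS on paper.  How much of that shows up in games depends, of course, on how well the drivers' CrossFire profiles scale each title across both GPUs.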

I was really hoping to see the HIS Turbo model. Rumored to use DDR4. The Asus one looks good, but I am not crazy about the cooler, which doesn't tunnel the heat outside of the case.


I do have one question for you: why is it that every single review of these cards, no matter who does it, no matter which site, never bothers to explore/test this feature:

 HD video processing:

MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264/AVC encoding and transcoding

Any reason why you chose not to test this feature? Because if it works as advertised, it definitely gives the end user a big incentive to choose this card over something from NVIDIA.


They used to do those tests. But I think that with processors as powerful as they are today, it's not quite as important a feature as it once was.


FlyinBrian:

They used to do those tests. But I think that with processors as powerful as they are today, it's not quite as important a feature as it once was.

 

Indeed... I tested the HD capability of an 8600 GT for a rig I built for someone. The 1080p movie by itself, on a Q6600 @ 3GHz, ran flawlessly without the assistance of the GPU, but the video lagged during one segment because a certain scene needed a lot of processing on screen. CPU usage shot up and it just couldn't handle it. I renamed the file to .MP4 (it was an H.264 file in an MKV container), popped it into PowerDVD, which supports HD and GPU decoding, and CPU usage was around 1-4 percent. So yeah, there really is no need anymore. The low end can handle it easily, so we know the high-end cards with UVD would do it without worry.
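
For anyone who wants to repeat that kind of comparison, the measurement itself is straightforward: play the same clip once with hardware acceleration disabled in the player and once with it enabled, and log overall CPU load during each run. Below is a minimal sketch, assuming Python with the third-party psutil package installed and the player already running in another window; the sample length and interval are arbitrary placeholders.

    # Sketch of the before/after CPU-usage check described in the comment above.
    # Assumes the clip is already playing in a separate player window and that
    # psutil is installed (pip install psutil).
    import psutil

    SAMPLE_SECONDS = 60    # how long to watch CPU load
    INTERVAL = 1.0         # seconds between samples

    samples = []
    for _ in range(int(SAMPLE_SECONDS / INTERVAL)):
        # cpu_percent(interval=...) blocks for the interval and returns
        # system-wide CPU utilization over that window
        samples.append(psutil.cpu_percent(interval=INTERVAL))

    print("average CPU: %.1f%%" % (sum(samples) / len(samples)))
    print("peak CPU:    %.1f%%" % max(samples))

Run it once during software decode and once with GPU-assisted decode enabled in the player; the gap between the two averages is the offload being described above.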


I just got around to the article and thought it was great.  The only choice I thought was poor was using a 680i based board for the testing with a quad core chip.  The 680i chipset was released before the quad core was a gleam in most people's eyes.  The only reason it even recognizes a quad core chip is on account of BIOS updates.  At some point, and I think that point was reached with the QX6850, the 680i cannot be modified enough with mere BIOS changes to keep up with the tremendous differences between the Extreme quad and previous generations of chips.  I know that when I switched from an X6800 to a QX6850 using my Striker Extreme, without overclocking, I started to see far too many BSODs.  After much research I decided to purchase an X38 based board because it seems to be about the only chipset actually designed for and built around the quad core chips.  I purchased the Gigabyte GA-EX38-DQ6, traded my 8800 GTX for an Asus 3870 X2 1GB (thus the reason for my writing) and have never been happier.  Without any overclocking whatsoever, I tested the system the day I got it put together, and my 3DMark06 score had, due to a combination of the X38 vs. 680i and the 3870 X2 vs. 8800 GTX, jumped from 12,800 to 16,889.  I am right now looking at how I can reconfigure my hard drives to fit another Asus in my case to effectively run four GPUs in CrossFire, assuming the drivers are available.  For years I have bought nothing but NVIDIA cards.  I even bought the 7950 GX2 (same concept), even though I think to this day they still have not gotten the driver right for it.  While I may have had a few ATI cards over the years since my PCjr, since NVIDIA started cranking out cards I have been a fan.  The X38 chipset has forced me to take a fresh look at ATI and so far I like what I see!  Hopefully this competition will hold, as competition in the high end graphics card market is good for us consumers.  Once again, thanks for a well-written review, and I can't wait to see how this card overclocks; just waiting for my cooling solution to get in.

Gigabyte X38 Motherboard (soon to be CrossFire Quad)

One Asus HD 3870 X2 1GB

Gigabyte X38 Motherboard (GA-EX38-DQ6) 

Intel Quad Core Extreme QX6850 

Coolit Freezone II Refrigerated Water Cooling

Thermaltake Tsunami Case

1,000W Modular PSU

4GB DDR2

1 Mad Dog DVD +-RW

1 High Speed CD

2 x 300gb Diamond HDD


Actually, quads had been out for some time when the 680i boards were released. The only issue with quads has to do with their overclocking abilities on certain boards. There is ZERO difference between the 65nm regular and Extreme quads other than multipliers (and the unlocked Extremes). 45nm quads are not supported on most 680i boards, but that is a board issue, not a chipset issue.

Your BSOD issue is an issue with your overclock and settings; most 680i boards are unstable at stock settings, and you have to manually find a stability point by adjusting your voltages.

Also, the enter key on your keyboard is there for a reason.


Actually, I believe the first retail quad core from Intel came out in late 2006.  I recall the first quad core came out just about a week and a half after I purchased a new X6800 dual core.  I distinctly remember this because I timed my purchase to get a better deal on the X6800.  I also remember installing the X6800 in my old Asus Striker Extreme motherboard, based on the 680i chipset.  Therefore, the 680i was around at least a while before the Intel quad cores were released to market, not to mention the time between manufacturing, marketing, and final release.  You can do a Google search to confirm this timeline; I did.

Also, I have a right to an opinion, and it is mine that the X38 is better able to handle the features of the newer quad core chips than the 680i, simply because, like the new NVIDIA 790i chipset (but possibly not the 780i, which seems to be a minor tweak of the 680i), the X38 was built and designed around newer quad core specifications; this is a fact if everything I have read on the matter is to be believed.  Therefore, it is my opinion that a newer quad core will be more stable in a board with the X38 or 790i chipset than in one trying to get by with the 680i chipset and a number of radical BIOS updates.

As for your other comments about overclocking being the only issue with quads, I have been building, writing code for, and overclocking computers on and off since 1979.  For the last 10 years I have built every computer I have owned, as well as the computers for my kids.  Therefore I am somewhat familiar with computers, the methods of overclocking them, and the numerous reasons for BSODs.

The BSODs I was getting after no change other than installing a quad core (before I even had a chance to overclock a single setting, no voltage change whatsoever, not even anything as simple as a multiplier change) led me to the conclusion that something about the Striker or the 680i chipset could not properly handle the quad core.

I tried voltage changes in the hope that I could force the board to become more stable; it just made matters worse, to the point where some profiles would have me racking up 10 BSODs a day.  With stock settings and the latest BIOS, which was supposed to solve all compatibility issues with quad cores, I was still getting at least one or two a day, and as you know, one a year is one too many.

The performance of the CPU in my new X38 based board seems to bear my conclusion out.  I simply do not think you can keep breathing new life into a chipset forever with BIOS updates; there comes a time when you have to call it quits, thus the 790i.

Please do not take offense again at my comments and opinions.  And I am sorry if I do not use the "enter" key enough, but I do not like unnecessarily long posts.
