R680 Has Landed: ATI Radeon HD 3870 X2


When AMD released the Radeon HD 2900 XT in May of last year, we were left with a bitter taste in our mouths.  Not because the product was all that bad, but because it seemed AMD was content to leave NVIDIA uncontested at the high end of the 3D graphics card market.  If you remember, when the Radeon HD 2900 XT launched, AMD targeted the mid-range GeForce 8800 GTS, not NVIDIA’s high-end 8800 GTX or Ultra.  It seemed as if AMD just didn’t want to be in the fight for 3D graphics supremacy any longer, and enthusiasts like yourselves, who seek nothing but the highest-performing products, were left with only a single option.

Thankfully things have changed in the last few months.  After the R600 shipped, AMD’s engineers went right back to work.  And in November AMD released the Radeon HD 3800 series of products, which were based on an updated GPU design that performed much like the R600, but with more moderate power and thermal characteristics.  The new GPU, formerly codenamed RV670, still wasn’t quite as fast as NVIDIA’s higher-end offerings, but its low power profile gave AMD the ability to engineer the product we’ll be showcasing here today, the Radeon HD 3870 X2.

The Radeon HD 3870 X2 was codenamed R680 throughout its development.  Although that codename implies the card is powered by a new GPU, it is not.  The Radeon HD 3870 X2 is instead powered by a pair of RV670 GPUs linked together on a single PCB by a PCI Express fan-out switch.  In essence, the Radeon HD 3870 X2 is “CrossFire on a card”.  Take a look...
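
For readers who like to poke at hardware from the software side, the sketch below shows one way to see what a board like this looks like to Windows. This is our own illustrative C++ snippet built on the standard DXGI adapter-enumeration API, not anything from AMD; note that with CrossFire active, the driver typically presents the X2 to applications as a single logical adapter.

    #include <dxgi.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main()
    {
        // Create a DXGI factory and walk the adapters the OS exposes.
        IDXGIFactory *factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void **)&factory)))
            return 1;

        IDXGIAdapter *adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            // Description is a wide string; DedicatedVideoMemory is in bytes.
            printf("Adapter %u: %ls (%zu MB)\n",
                   i, desc.Description, desc.DedicatedVideoMemory >> 20);
            adapter->Release();
        }
        factory->Release();
        return 0;
    }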



ATI Radeon HD 3870 X2
Features & Specifications

666 million transistors on 55nm fabrication process (x2)

256-bit, 8-channel GDDR3/GDDR4 memory interface (x2)

Ring Bus Memory Controller

  • Fully distributed design with 512-bit internal ring bus for memory reads and writes
  • Optimized for high performance HDR (High Dynamic Range) rendering at high display resolutions

Unified Superscalar Shader Architecture

  • 320 stream processing units (x2)
    • Dynamic load balancing and resource allocation for vertex, geometry, and pixel shaders
    • Common instruction set and texture unit access supported for all types of shaders
    • Dedicated branch execution units and texture address processors
       
  • 128-bit floating point precision for all operations
  • Command processor for reduced CPU overhead
  • Shader instruction and constant caches
  • Up to 80 texture fetches per clock cycle
  • Up to 128 textures per pixel
  • Fully associative multi-level texture cache design
  • DXTC and 3Dc+ texture compression
  • High resolution texture support (up to 8192 x 8192)
  • Fully associative texture Z/stencil cache designs
  • Double-sided hierarchical Z/stencil buffer
  • Early Z test, Re-Z, Z Range optimization, and Fast Z Clear
  • Lossless Z & stencil compression (up to 128:1)
  • Lossless color compression (up to 8:1)
  • 8 render targets (MRTs) with anti-aliasing support
  • Physics processing support

Full support for Microsoft DirectX 10 / 10.1 

  • Shader Model 4.0
  • Geometry Shaders
  • Stream Output
  • Integer and Bitwise Operations
  • Alpha to Coverage
  • Constant Buffers
  • State Objects
  • Texture Arrays

Dynamic Geometry Acceleration

  • High performance vertex cache
  • Programmable tessellation unit
  • Accelerated geometry shader path for geometry amplification
  • Memory read/write cache for improved stream output performance

Anti-aliasing features

  • Multi-sample anti-aliasing (up to 8 samples per pixel)
  • Up to 24x Custom Filter Anti-Aliasing (CFAA) for improved quality
  • Adaptive super-sampling and multi-sampling
  • Temporal anti-aliasing
  • Gamma-correct anti-aliasing
  • Super AA (CrossFire configurations only)
  • All anti-aliasing features compatible with HDR rendering
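
The multi-sample modes listed above are capabilities the driver reports at runtime, so an application can query for them rather than hard-coding a sample count. As a rough illustration (our own sketch, not AMD's code), here is how a Direct3D 10 program can ask which MSAA sample counts are available for a common back-buffer format:

    #include <d3d10.h>
    #include <cstdio>
    #pragma comment(lib, "d3d10.lib")

    int main()
    {
        ID3D10Device *device = nullptr;
        if (FAILED(D3D10CreateDevice(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                     nullptr, 0, D3D10_SDK_VERSION, &device)))
            return 1;

        // A non-zero quality-level count means the sample count is supported.
        for (UINT samples = 2; samples <= 8; samples *= 2)
        {
            UINT quality = 0;
            device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM,
                                                  samples, &quality);
            printf("%ux MSAA: %s\n", samples,
                   quality > 0 ? "supported" : "not supported");
        }
        device->Release();
        return 0;
    }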

CrossFire Multi-GPU Technology

  • Scale up rendering performance and image quality with 2 or more GPUs
  • Integrated compositing engine
  • High performance dual channel interconnect

Texture filtering features

  • 2x/4x/8x/16x high quality adaptive anisotropic filtering modes (up to 128 taps per pixel)
  • 128-bit floating point HDR texture filtering
  • Bicubic filtering
  • sRGB filtering (gamma/degamma)
  • Percentage Closer Filtering (PCF)
  • Depth & stencil texture (DST) format support
  • Shared exponent HDR (RGBE 9:9:9:5) texture format support
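
To put the anisotropic modes above in context: applications request a specific level of anisotropy when they create a sampler state, and the hardware's adaptive logic decides per pixel how many taps are actually needed. A short hypothetical Direct3D 10 fragment requesting the maximum 16x mode might look like this (the device pointer is assumed to come from D3D10CreateDevice, as in the earlier sketch):

    #include <d3d10.h>

    // Request the maximum anisotropy level the spec sheet advertises (16x).
    ID3D10SamplerState *Create16xAnisoSampler(ID3D10Device *device)
    {
        D3D10_SAMPLER_DESC sd = {};
        sd.Filter         = D3D10_FILTER_ANISOTROPIC;
        sd.MaxAnisotropy  = 16;   // 2, 4, 8, and 16 are all valid requests
        sd.AddressU = sd.AddressV = sd.AddressW = D3D10_TEXTURE_ADDRESS_WRAP;
        sd.ComparisonFunc = D3D10_COMPARISON_NEVER;
        sd.MinLOD         = 0.0f;
        sd.MaxLOD         = D3D10_FLOAT32_MAX;

        ID3D10SamplerState *sampler = nullptr;
        if (FAILED(device->CreateSamplerState(&sd, &sampler)))
            return nullptr;
        return sampler;
    }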

ATI Avivo HD Video and Display Platform

  • Two independent display controllers
    • Drive two displays simultaneously with independent resolutions, refresh rates, color controls and video overlays for each display
    • Full 30-bit display processing
    • Programmable piecewise linear gamma correction, color correction, and color space conversion
    • Spatial/temporal dithering provides 30-bit color quality on 24-bit and 18-bit displays
    • High quality pre- and post-scaling engines, with underscan support for all display outputs
    • Content-adaptive de-flicker filtering for interlaced displays
    • Fast, glitch-free mode switching
    • Hardware cursor
       
  • Two integrated dual-link DVI display outputs
    • Each supports 18-, 24-, and 30-bit digital displays at all resolutions up to 1920x1200 (single-link DVI) or 2560x1600 (dual-link DVI)
    • Each includes a dual-link HDCP encoder with on-chip key storage for high resolution playback of protected content
       
  • Two integrated 400 MHz 30-bit RAMDACs
    • Each supports analog displays connected by VGA at all resolutions up to 2048x1536
       
  • HDMI output support
    • Supports all display resolutions up to 1920x1080
    • Integrated HD audio controller with multi-channel (5.1) AC3 support, enabling a plug-and-play cable-less audio solution
       
  • Integrated Xilleon HDTV encoder
    • Provides high quality analog TV output (component / S-video / composite)
    • Supports SDTV and HDTV resolutions
    • Underscan and overscan compensation
       
  • HD decode for H.264/AVC, VC-1, DivX and MPEG-2 video formats
    • Flawless DVD, HD DVD, and Blu-ray playback
    • Motion compensation and IDCT (Inverse Discrete Cosine Transformation)
       
  • HD video processing
    • Advanced vector adaptive per-pixel de-interlacing
    • De-blocking and noise reduction filtering
    • Edge enhancement
    • Inverse telecine (2:2 and 3:2 pull-down correction)
    • Bad edit correction
    • High fidelity gamma correction, color correction, color space conversion, and scaling
  • MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264/AVC encoding and transcoding
  • Seamless integration of pixel shaders with video in real time
  • VGA mode support on all display outputs

PCI Express 2.0 x16 bus interface

OpenGL 2.0 support
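
Two of the line items above translate directly into bandwidth figures. PCI Express 2.0 runs each lane at 5 GT/s with 8b/10b encoding, and if we assume the X2's GDDR3 runs at an effective 1.8 GT/s per pin (a typical clock for this card, not stated in the table above), the arithmetic works out as follows:

    \[ \text{PCIe 2.0 x16:}\quad 16 \times 5\ \text{GT/s} \times \tfrac{8}{10} \times \tfrac{1}{8\ \text{bits/byte}} = 8\ \text{GB/s per direction} \]
    \[ \text{Memory, per GPU:}\quad \tfrac{256\ \text{bits}}{8\ \text{bits/byte}} \times 1.8\ \text{GT/s} = 57.6\ \text{GB/s} \]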

ATI Really Wanted To Protect This Baby


To get more familiar with AMD's new ATI Radeon HD 3870 X2, the company's previous GPU architectures, and their key features, we recommend you read a few recent articles we've posted here at HotHardware. The Radeon HD 3870 X2's pair of RV670 GPUs is derived from the R600, and as such the two designs have a number of key features in common that we've already covered in much greater detail than we will here today.

If you haven't already done so, we recommend scanning through our Radeon HD 3800 and 2900 series coverage, our CrossFire Multi-GPU technology preview, and our Radeon X1950 Pro with Native CrossFire article. In those four pieces, we cover a large number of the features offered by the new Radeon HD 3870 X2 and explain many of the features of DirectX 10 and 10.1. We recommend reading these articles because there is quite a bit of background information in them that will make it easier to fully digest what we're going to showcase here today.


Very nice. If the price is right, I might just pick this one up.


I appreciate the HH review. Between Anandtech and Kyle over at HardOCP flaming about which review method is better, it makes me all the more appreciative of the HH reviews. I think Kyle has a point in some ways about canned benchmarks, but there's an easy workaround: create your own timedemos, like you guys did with Quake Wars. A subjective review is just too dependent on the person rather than the actual data, and you can't repeat it exactly across systems.


Pretty terrible review. How about doing a proper comparison of the cards? It was good showing the performance of the 3800 series and its CrossFire mode against the X2 card... but why show only the NVIDIA single-card scores? It'd be a better comparison to put the SLI scores in there. Take the 8800 GTX and the 8800 GT 512MB in SLI mode as a comparison... as I highly doubt that the X2 comes close to either one in scores.


sniperdoc:

Pretty terrible review. How about doing a proper comparison of the cards? It was good showing the performance of the 3800 series and its CrossFire mode against the X2 card... but why show only the NVIDIA single-card scores? It'd be a better comparison to put the SLI scores in there. Take the 8800 GTX and the 8800 GT 512MB in SLI mode as a comparison... as I highly doubt that the X2 comes close to either one in scores.

 

QFT

What I don't get is why you would use two different boards/chipsets when it was completely unnecessary. This is not good testing methodology, and it seems to hint that you performed SLI benchmarks as well but then simply left those results out, since AMD apparently rubs you guys the right way.

Yeah, that's right: GeForce cards will run on CrossFire boards, believe it or not.

Also, perhaps I overlooked this, but I didn't see anywhere whether or not a CrossFire board is required to run this card.

I guess there's a good reason I've never heard of hothardware.com before today. Wow, your forums suck too... Flash-based? Bleh...
 


Hello, general assumptions... give me a break.  They tested a single card against another single card. They're one of the few sites that included standard CrossFire numbers as well.  Why would they include SLI when that is way outside the price of this X2?  And on top of that, you can see it in relation to the GTX, which is basically the same price. Next time you post here, make sure you read the review first, because you make it sound like they're being biased, which you'd know isn't true if you took the time to read.

They used two boards because those are the boards with "official" support for their respective technologies, SLI and CrossFire. The majority of people are not gonna buy a CrossFire Intel board and stick a GTX in it, or vice versa. When you fellas grow up, y'all come back now.


FlyinBrian:

They used two boards because those are the boards with "official" support for their respective technologies, SLI and CrossFire. The majority of people are not gonna buy a CrossFire Intel board and stick a GTX in it, or vice versa. When you fellas grow up, y'all come back now.

 

But they didn't do SLI benchmarks, so using two boards was a mistake. All it did was add another unnecessary variable into the equation. Your reasoning lacks reason, Brian. FAIL.

The fact is that 680i boards are a rip-off compared to P35 boards, and the P35s also overclock a bit better. Hence, I and many others who've done their research are running 8800-series cards on CrossFire-capable boards, because that's where the best bang per buck is at the moment. Yeah, guess what: PCI-E is a standard, so the cards are interchangeable until you want to run CrossFire or SLI.

Way to be a fanboy, though.

-phresh
 


I am really too old to be arguing about something as trivial as a PC component... However, I will explain it for you once again. If you're going to build around an NVIDIA GTX, would it not make sense to use an nForce board so that one day you can put in another GTX? They didn't test SLI because there was no point: given the performance versus a single GTX, you can clearly see that SLI would have won, but then you are far outside the price/performance range of the single 3870 X2 solution.  I once again recommend that you read the review and not just flip through the graphs.


To phresh: This forum is awesome and I respect it, so I'm going to refrain from starting a flame war. But if you think it sucks, why create an account here and post twice? If your primary reason for joining this forum was to slam it, then there's no need for a third post!

BTW: It's common knowledge that any single card will run on any chipset... duh.


Phresh,

Thanks for your input. We always value a new perspective here, of course. However, you're sorely uninformed, or at the very least you're not thinking things through carefully.

To your point on testing platforms, let me ask you: what's the objective of a test setup for benchmarking? I'll answer for you. In a direct comparison, it's supposed to evaluate what each product can do under its best performance conditions. Ask NVIDIA marketing what their recommended test platform is and they'll tell you it's an NVIDIA chipset, and that's FACT. Ask AMD and they'll say it's an Intel chipset, even though that sort of goes against their grain.

Now, consider that, along with the fact that this is a GPU REVIEW. The subtle variances between motherboard chipsets when you're testing at high-resolution, high-AA, high-image-quality settings are totally meaningless. The tests are GPU-bound, no ifs, ands, or buts about it. Drop in the same CPU and you'll be looking at a 1-2% variance at most.

I do see your point about testing SLI versus this card. A pair of 8800 GTs offers a comparable price point versus one of these cards. However, you have no idea how much testing we had to do, or on what timeline; there were three new driver drops in a week, actually. So, would we have liked to show an 8800 GT SLI setup? Yes, but there just weren't enough hours in the day before launch.

Regardless, we appreciate your input, but it sure would be nice if the accusations were kept in check and we could all have an intelligent conversation about it...


Davo:
The subtle variances between motherboard chipsets when you're testing at high-resolution, high-AA, high-image-quality settings are totally meaningless. The tests are GPU-bound, no ifs, ands, or buts about it.
 

Given that, why don't we test them on completely different boards for the *** of it, then! It'll be fun! We'll have to reload Windows and everything!

 

You guys are awesome at missing the mark. 

 
