
R680 Has Landed: ATI Radeon HD 3870 X2

At first glance, the Radeon HD 3870 X2 looks much like AMD’s previous high-end GPU offerings.  The card features ATI’s signature red PCB with an up-close-and-personal look into Ruby’s eyes emblazoned on the fan shroud.  Flip the card over, however, and it becomes abundantly clear that the Radeon HD 3870 X2 has a lot going on under its cooler.

AMD's Reference ATI Radeon HD 3870 X2 Card

As we’ve already mentioned, the Radeon HD 3870 X2 is powered by a pair of RV670 GPUs, the same chips used on the Radeon HD 3870 and 3850.  The two GPUs are linked together on the PCB through a PCI Express fan-out switch from PLX, which takes the sixteen PCI Express lanes coming from the PEG slot and distributes them to both GPUs.  We should note, however, that although the RV670 GPU has a native PCI Express 2.0 interface, the on-board switch is only PCI Express 1.1 compliant.  Also note that the 3870 X2 has only a single CrossFire edge connector along the top of its PCB; the other link is already utilized on the PCB itself.  And although the Radeon HD 3870 X2 is equipped with a CrossFire connector, drivers that allow end users to link two of these cards together for quad-GPU CrossFireX are not yet available, though they are coming.
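
To the host system, this arrangement enumerates as an ordinary PCI Express bridge with two graphics devices sitting behind it. As a rough, hypothetical illustration of how that topology can be inspected, assuming a Linux system with the pciutils package installed, a short Python script like the one below would list the bridge and the two GPUs (the exact device strings will vary from system to system):

    import subprocess

    # Print the PCI device tree; with a Radeon HD 3870 X2 installed, the two
    # RV670 GPUs should appear behind a single PLX PCI Express bridge.
    # Requires the 'lspci' utility from pciutils (Linux only).
    tree = subprocess.run(["lspci", "-t"], capture_output=True, text=True)
    print(tree.stdout)

    # Filter the flat device list down to bridges and VGA controllers so the
    # one-bridge, two-GPU fan-out is easier to spot.
    flat = subprocess.run(["lspci"], capture_output=True, text=True)
    for line in flat.stdout.splitlines():
        if "PCI bridge" in line or "VGA compatible controller" in line:
            print(line)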

Each of the GPUs on the Radeon HD 3870 X2 has its own 512MB frame buffer, for a total of 1GB of on-board memory.  As you can see, the entire assembly is quite large and results in a 10.5” PCB, a little longer than a standard ATX motherboard is deep.  The cooler used on the card is much like previous offerings, but there are some noteworthy elements to the design.  Each of the GPUs gets its own all-copper heatsink, but the surrounding heatplate and the heatsinks for the switch and RAM are made of aluminum.  AMD went with a hybrid aluminum-and-copper cooler to keep the card’s weight down, but it is still quite heavy.

Like the other members of the Radeon HD 3800 series, AMD’s reference ATI Radeon HD 3870 X2 has a pair of dual-link DVI outputs and an HD video output.  One of the DVI outputs can be converted to an HDMI output with audio using an included adapter.  And because the Radeon HD 3870 X2 has dual GPUs, the hardware is capable of supporting four DVI outputs.

Radeon HD 3870 X2 vs. Radeon HD 3870: Specifications Breakdown

If you look at the breakdown above, the Radeon HD 3870 X2 isn’t simply two Radeon HD 3870s fused together on a single PCB.  The GPUs on the Radeon HD 3870 X2 will be clocked at a minimum of 825MHz, up from the standard 3870’s 775MHz.  The X2’s frame buffer memory, however, will be clocked lower: 1.8GHz effective versus 2.25GHz.  What this means is that in applications that are limited by shader performance and fillrate, the X2 should be faster than a pair of Radeon HD 3870 cards running in CrossFire mode.  Conversely, in applications that are memory bandwidth bound, the dual-card CrossFire configuration should be somewhat faster.
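
To put rough numbers to that trade-off: each RV670 drives a 256-bit memory interface, so peak memory bandwidth scales directly with the effective memory clock. The quick back-of-the-envelope calculation below is a sketch, assuming the clocks quoted above and RV670's 256-bit bus, and shows what each configuration gives up and gains:

    # Peak per-GPU memory bandwidth = effective memory clock x bus width.
    # Assumes the clocks quoted above and RV670's 256-bit memory interface.
    bus_width_bytes = 256 // 8  # a 256-bit bus moves 32 bytes per transfer

    x2_bw = 1.8e9 * bus_width_bytes / 1e9       # 3870 X2 at 1.8GHz effective
    hd3870_bw = 2.25e9 * bus_width_bytes / 1e9  # stock 3870 at 2.25GHz effective
    print(f"3870 X2 per-GPU bandwidth: {x2_bw:.1f} GB/s")      # 57.6 GB/s
    print(f"3870 per-GPU bandwidth:    {hd3870_bw:.1f} GB/s")  # 72.0 GB/s

    # The X2's higher core clock is what pays off in shader- and fillrate-bound apps.
    print(f"X2 core clock uplift over a stock 3870: {825 / 775 - 1:.1%}")  # ~6.5%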


Very nice. If the price is right, I might just pick this one up.


I appreciate the HH review. With AnandTech and Kyle over at HardOCP flaming about which review method is better, it makes me all the more appreciative of the HH reviews. I think Kyle has a point in some ways about canned benchmarks, but I think it's an easy workaround to create your own timedemos, like you guys did with Quake Wars. The subjective review is just too dependent on the person and not the actual data, and you can't repeat it exactly across systems.


Pretty terrible review. How about doing a proper comparison of the cards? It was good showing the performance of the 3800 series and CrossFire mode against the X2 card... but why only show the NVIDIA single-card scores? It'd be a better comparison to put the SLI scores in there. Take the 8800GTX and the 8800GT 512MB in SLI mode as a comparison... as I highly doubt that the X2 comes close to either one in scores.


sniperdoc:

Pretty terrible review. How about doing a proper comparison of the cards? It was good showing the performance of the 3800 series and CrossFire mode against the X2 card... but why only show the NVIDIA single-card scores? It'd be a better comparison to put the SLI scores in there. Take the 8800GTX and the 8800GT 512MB in SLI mode as a comparison... as I highly doubt that the X2 comes close to either one in scores.

 

QFT

What I don't get is why you would use two different boards/chipsets when it was completely unnecessary. This is not good testing methodology, and it seems to hint that you performed SLI benchmarks as well but then simply left those results out, since AMD apparently rubs you guys the right way.

Yeah, that's right, GeForce cards will run on CrossFire boards, believe it or not.

Also, perhaps I overlooked this, but I didn't see anywhere whether or not a CrossFire board is required to run this card.

I guess there's a good reason I've never heard of hothardware.com before today. Wow, your forums suck too... Flash-based? Bleh...
 


Hello, general assumptions... give me a break. They tested a single card against another single card. They're one of the few sites that included standard CrossFire numbers as well. Why would they include SLI when that is way out of the price range of this X2? And on top of that, you can see it in relation to the GTX, which is basically the same price. Next time you post here, make sure you read the review first, because you make it sound like they're being biased, which you'd know isn't true if you took the time to read it.

They used two boards because those are the boards with "official" support for their respective technologies, SLI and CrossFire. The majority of people are not gonna buy a CrossFire Intel board and stick a GTX in it, and vice versa. When you fellas grow up, y'all come back now.


FlyinBrian:

They used two boards because those are the boards with "official" support for their respective technologies, SLI and CrossFire. The majority of people are not gonna buy a CrossFire Intel board and stick a GTX in it, and vice versa. When you fellas grow up, y'all come back now.

 

But they didn't do SLI benchmarks, so using two boards was a mistake. All it did was add another unnecessary variable into the equation. Your reasoning lacks reason, Brian. FAIL.

The fact is that 680i boards are a rip-off compared to P35 boards, and the P35s also overclock a bit better. Hence, many of us who've done our research are running 8800-series cards on CrossFire-capable boards, because that's where the best bang per buck is at the moment. Yeah, guess what, PCI-E is a standard, so the cards are interchangeable until you want to run CrossFire or SLI.

Way to be a fanboy though.

-phresh
 


I am really too old to be arguing about something as trivial as a PC component... However, I will explain it for you once again. If you're going to build around an NVIDIA GTX, would it not make sense to use an nForce board so that one day you can put in another GTX? They didn't test SLI because there was no point; given the performance versus a single GTX, you can clearly see that SLI would have won, but then you are far outside the price/performance of the single 3870 X2 solution. I once again recommend you read the review and not just flip through the graphs.


To phresh: This forum is awesome and I respect it, so I'm gonna refrain from starting a flame war, but if you think it sucks, why create an account here and post twice? If your primary reason for joining this forum was to slam it, then there's no need for a third post!

BTW: It's common knowledge that any single card will run on any chipset... duh.


Phresh,

Thanks for your input. We always value a new perspective here, of course. However, you're sorely uninformed, or at least, you're not thinking through things carefully.

To your point on testing platforms: let me ask you, what's the objective of a test setup for benchmarking? I'll answer for you. In a direct comparison, it's supposed to evaluate what each product can do under its best performance conditions. Ask NVIDIA Marketing what their recommended test platform is, and they'll tell you it's an NVIDIA chipset, and that's FACT. Ask AMD and they'll say it's an Intel chipset, even though that sort of goes against their grain.

Now, consider that along with the fact that this is a GPU REVIEW. The subtle variances between motherboard chipsets when you're testing at high resolutions, high AA settings, and high image-quality settings are totally meaningless. The tests are GPU-bound, no ifs, ands, or buts about it. Drop in the same CPU and you'll be looking at a 1-2% variance at most.

I do see your point about testing SLI versus the card. A pair of 8800GTs offers a comparable price point versus one of these cards. However, you have no idea how much testing we had to do, or under what timelines; there were three new driver drops in a week, actually. So, would we have liked to show an 8800GT SLI setup? Yes, but there just weren't enough hours in the day before launch.

Regardless, we appreciate your input, but it sure would be nice if the accusations were kept in check and we could all have an intelligent conversation about it...


Davo:
The subtle variances between motherboard chipsets when you're testing at high resolutions, high AA settings, and high image-quality settings are totally meaningless. The tests are GPU-bound, no ifs, ands, or buts about it.
 

Given that, why don't we test them on completely different boards for the *** of it, then! It'll be fun! We'll have to reload Windows and everything!

 

You guys are awesome at missing the mark. 

 
