Can Legacy Dual-Core CPUs Drive Modern GPUs?

This post has 36 Replies | 2 Followers

Top 10 Contributor
Posts 25,886
Points 1,173,720
Joined: Sep 2007
Forums Administrator
News Posted: Fri, Feb 8 2013 12:36 PM

A few weeks back, we examined whether a new GPU like the GeForce GTX 660 could breathe new life into an older quad-core gaming system built in mid-2008. The answer, we concluded, was definitely yes -- but many of you asked us to reconsider the question, this time using a dual-core Core 2 Duo.

We've done so. And just for fun, we've added overclocked CPU results into the mix as well.

Can Legacy Dual-Core CPUs Drive Modern GPUs?

Top 150 Contributor
Posts 481
Points 3,825
Joined: Jun 2012

I was just thinking about putting an old Q6600 into my nephew's PC to replace his E8750; now I can confirm it will be a nice upgrade :)

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Fri, Feb 8 2013 7:45 PM

Definitely. And if you're up for overclocking, the results are even better.

Top 25 Contributor
Posts 3,466
Points 46,975
Joined: Nov 2005
Location: Metropolis
Forums Administrator
Moderator

That's a very good article there, Joel - very enlightening. Thanks for that!


Top 10 Contributor
Posts 8,577
Points 103,230
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
Membership Administrator
Moderator

Good article, Joel. I guess this puts some of the comments on the original piece to rest.


Top 100 Contributor
Posts 983
Points 9,085
Joined: Mar 2012
Location: LA, CA
sevags replied on Fri, Feb 8 2013 10:05 PM

Nice... I thought the first article was interesting but just less practical for most; this piece answers questions that pertain to more of us, and the results are still good! I figured the quad core would have fared much better than a dual core of that era, but here are the results! My current PC (yes, I call it a PC because I run Win7) is a 2.66GHz dual-core iMac, so I can't even upgrade the video card, lol -- but hey, it was free!

Top 10 Contributor
Posts 8,577
Points 103,230
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
Membership Administrator
Moderator
realneil replied on Fri, Feb 8 2013 10:22 PM

PC sounds like the right terminology to me. And a free iMac is a damn good thing too. I miss the one I had.


Top 100 Contributor
Posts 983
Points 9,085
Joined: Mar 2012
Location: LA, CA
sevags replied on Sat, Feb 9 2013 1:05 AM

A good friend of mine recently upgraded to the new one, and this one had been dropped -- the corner is dented and the glass is broken, but other than that it's perfect; he just saw it as a damaged hand-me-down. It's a 24", so it's great for YouTube viewing with friends in the garage. It runs Dota 2 just fine, but that's all I've thrown at its GeForce 9400M =\

Not Ranked
Posts 13
Points 110
Joined: Feb 2013

The era of dual cores ended at the beginning of 2010 with the release of BFBC2, which was extremely core-heavy. It ate dual cores for breakfast, lunch, and dinner... and it showed that a dual core, no matter how fast, just didn't cut it anymore.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Sat, Feb 9 2013 1:34 PM

Inquisius,

Given that Intel's cheapest quad-core is $179, I'm not sure I'd agree with this sentiment.

Not Ranked
Posts 13
Points 110
Joined: Feb 2013

What does the price of a chip have to do with its performance? If you'd kept up with the times, maybe you wouldn't even have bothered to write this article, because it was known that dual cores DIED once 2010 came in:

http://www.techpowerup.com/reviews/Intel/Core_i5_661/15.html

What's more, the fact that you're suggesting people buy second-hand Q6600s is purely absurd. Sure, the Q6600 does better than dual cores in the majority of games nowadays, but what's the point in investing money in such an old, inefficient, dead platform when you can get a Sandy or Ivy combo from Microcenter for not much more?

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Sat, Feb 9 2013 8:49 PM

"What does the price of a chip has to do with it's performance?"

Uh. Everything? Chips are priced by their performance. Now, then, and always. Faster chips cost more. Period.

I am recommending the Q6600 as an upgrade for people who have old gaming systems and limited funds. Would I recommend people buy a new Q6600-based rig today? Obviously not. But for $50-$70? That's a great chip.

If I had an older Core 2 Duo CPU and didn't have the $450 I'd need to spend to buy new desktop hardware but could manage $270 for GPU + CPU, then that's a good upgrade. If you want to argue otherwise, you'll need to show me how you can build a better game system for the same amount of money. Keep in mind, you need to cover the cost of motherboard, CPU, RAM, and GPU.
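
To put rough numbers on that budget math, here's a quick back-of-envelope sketch in Python (the dollar amounts are the round figures from this thread, not live prices):

# Back-of-envelope upgrade math. Dollar amounts are the round figures
# quoted in this thread (assumptions, not live quotes).
used_q6600 = 60        # midpoint of the $50-$70 used quad-core range
gtx_660 = 210          # remainder of the ~$270 CPU + GPU budget
drop_in_total = used_q6600 + gtx_660

full_rebuild = 450     # new motherboard + CPU + RAM + GPU

print(f"Drop-in upgrade: ${drop_in_total}")                  # $270
print(f"Full rebuild:    ${full_rebuild}")                   # $450
print(f"Left in pocket:  ${full_rebuild - drop_in_total}")   # $180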

Not Ranked
Posts 13
Points 110
Joined: Feb 2013

My point was that you were trying to say dual cores are NOT dead because the cheapest quad is still $175, which is nonsense. If you want to game properly, you need a quad in this day and age, period, and no $50-75 chip will get you the same performance.

Your recommendation of the Q6600 is total BS at best, since it has horrible single-threaded performance. It's no more than a pair of E6600 dualies coupled together... and is a TERRIBLE buy in this day and age. Yes, EVEN at $50-75. Nobody in their right mind should go ahead and milk their old-ass machine with a Q6600 when they can save that money towards a new machine. There's no point in spending ANY money on a Q6600 in 2013. None at all. Period. Full stop.

You don't need $450 to upgrade a C2D machine. You can do it for half as much and still have a way better machine.

And what does covering the cost of a GPU have to do with the CPU? You're keeping the GPU in both cases; I thought this article was all about the CPU upgrade and NOT the GPU upgrade... you make little sense, if any at all. D'oh well.

And no, while I can't show you how to build a machine as "cheap" and well-performing (since you're buying old *** second-hand), I can show you how to spend your money wisely towards an entirely new system.

Not Ranked
Posts 2
Points 25
Joined: Feb 2013

I would just like to say thanks for this article. It seems like with every new generation of GPUs, the websites update the entire platform for the review, which I guess makes sense for the enthusiasts out there, but it leaves those of us who don't upgrade all the time in the dark.

I remember a long, long time ago somebody did one of these with an Athlon 64 and a 7000-series Nvidia GPU...

I just jumped on a deal from Microcenter for a Core i3 and a motherboard for $140, so it doesn't really apply to me anymore, but it's still nice to know.

I had a Core 2 Duo E4500 (so that's like 5 years?) and a little while ago I swapped my 9800GTX+ for a GTX 460. Fact is, most games out there don't interest me, so it never made sense for me to be plunking down $200-300 every year for a computer part. The biggest reason for even getting the i3 was to see how much of a difference it would make in FSX (the game I play the most). And I got that THQ bundle for $6 a few weeks ago; trying to play Metro 2033 was a bit ridiculous on my system (sometimes the FPS dropped into the single digits on mid-high settings).

I'm still waiting for BF3 to drop from $40, and then maybe I'll splurge on a GTX 660 Ti (yeah, I'm not like most gamers; spending more than $30 on a game is just ridiculous to me, as is more than $300 on video cards -- I'll never get it).

As far as the next generation of consoles, I'm not falling for that one again. I waited for the PS3 to be $400 and the dang thing still broke on me, twice!! It's been over a year and I haven't bothered to get it fixed. I figure by the time I do, I'll have a big pile of cheap good games to get through :)

Not Ranked
Posts 2
Points 25
Joined: Feb 2013

Yeah, I wouldn't recommend it either, but it's still nice to know you can upgrade for $50 -- a simple drop-in, no need to reinstall Windows (or replace the motherboard) -- especially in these economic times. Heck, I know there are some out there who still didn't want to upgrade from Windows XP, hehe, and I think I now understand those people a bit more.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Mon, Feb 11 2013 10:41 AM

Inquisius,

Yeah, you're just wrong. Thanks for playing. 

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Mon, Feb 11 2013 3:12 PM

Wow, and how exactly am I wrong? Care to explain? Playing? Oh wow. Am I talking to a brick wall here?

So let me get this straight: what you're saying is that if someone had $450 in their pocket and a C2D dualie, they should go with a second-hand Q6600 + a 670 or a 7950? That is, as I tried to point out before, flat-out silly, pointless, and absurd. Why? Because the majority of the gains that would come from a 670/7950 WILL be lost; the card will be so severely bottlenecked there wouldn't even be any point in making such a combination. I mean, a Q6600 with a 7950? Seriously? Do you comprehend the nonsense you're diving into?

A Q6600 will bottleneck a 580, let alone a 670... yes, EVEN when the application is completely multi-threaded, there WILL be massive gains from a faster chip over a Q6600. So what you should do in such a case is first upgrade the platform, then get a 7870 or something, which would turn in just as good performance, if not better.

The point here is that your point is just pointless and the fact that you're so keen on it is pathetic.

It is what it is. A Q6600 IS a POOR and DEAD investment in this day and age. Stop beating this OLD, DEAD horse so we can move on already.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Mon, Feb 11 2013 5:56 PM

Inquisius,

You're carrying on both sides of this conversation for me; I feel no particular need to participate. :P

"So let me get this straight, what you're saying is that if someone had $450 in their pocket and a C2D dualie, they should go with a second hand Q6600 + a 670 or a 7950? "

Nope. Not what I said, at any juncture.

*I* said: "The combined cost of a used Q6600 and a GeForce GTX 660 should still come in below $300 while delivering far better performance than any bottom-end desktop you might assemble for that price tag."

You have consistently missed the point. This article is written to explore the question of whether you can get a decent upgrade out of a new GPU (and possibly new CPU) if you have an old dual-core. It implicitly assumes that you have limited funds to spend.

That last sentence is vital to the story.

Yes, $500 will buy you a better system than $300 will. But that's not the question. It's not the point. If you only have $300 to spend, period, the fact that a better system can be had for $500 is irrelevant.

If we consider the situation stretched over two years, and the person in question intends to use the GTX 660 both as an upgrade now and, next year, as the GPU in a new rig, then the "cost" of upgrading the old system is $50-$70. Again, that's a useful data point for someone with limited funds to spend.

There's also the fact that a quick CPU and GPU upgrade is typically faster than a full system swap-out -- you might be able to nurse a Windows install across chipsets when going from the Core 2 era to Core i7, but I wouldn't recommend it. A third interested reader might want to know how much of a benefit they'd see if they upgraded an older secondary system for occasional LAN games, a child, or a loved one.

The problem I have with your increasingly frenetic argument is that you set up and knock down straw men in quick succession. The point of this story isn't to advocate everyone spend $300 instead of $500. I don't claim that a Q6600 is just as good as a Core i3 / i5 -- I don't compare against those processors at all.

This story, and the one that preceded it, answer the same question: "What's the benefit of combining an older CPU and modern GPU?"

I stand by my results, as tested. I stand by my conclusion, as drawn. For certain enthusiasts with limited funds, this upgrade makes sense -- especially over two years. If I had an E6850 and was buying a GTX 660 no matter what, I'd pay $60 for a 12-month upgrade of 30-35% to my gaming frame rates, particularly if I planned to keep the system thereafter as a second rig.

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Tue, Feb 12 2013 9:52 AM

You talked about "covering the cost of the GPU", which INCLUDES buying a new GPU AND CPU. In such a case, buying EVEN a 570 together with a Q6600 is, again, flat-out pointless and dumb.

What you should do in such a case is save up money towards an ENTIRELY NEW box and do it that way.

Suggesting anyone get a 660 WITH a Q6600 is so ludacris and moronic I'm out of words.

It's at least better than the people who still suggest those with 280s etc. go SLI, or the people who still suggest VelociRaptors...

-sigh.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Tue, Feb 12 2013 11:15 AM

Inquisius,

*ludicrous.

"What you should do under such case is to save up money towards an ENTIRE NEW box and do it that way."

Why? I'd gladly pay $70 for better game performance for 12-18 months. Data shows that doing so improves frame rates by as much as 40%. Battlefield 3 on the Q6600 at 3GHz was noticeably, obviously faster than BF3 on the Q6600 at 2.4GHz, which was faster than the E6850 at 3GHz.

If I upgrade my CPU this year and then flip that chip for $20 next year, I'll have paid $30-$50 for a significant performance boost. That's an easy decision.
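
A minimal sketch of that math, assuming the $20 resale estimate holds:

# Net cost of the interim CPU upgrade if the chip is resold later.
# The $20 resale value is the estimate from the post above.
RESALE = 20
for buy_price in (50, 70):                 # quoted Q6600 price range
    net = buy_price - RESALE
    print(f"Bought at ${buy_price}, flipped at ${RESALE}: "
          f"net cost ${net} for an up-to-40% frame rate boost")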

You are certainly welcome to your opinion on the relative benefit of upgrading. We have different views on this topic.

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Tue, Feb 12 2013 11:50 AM

IDK what I should even address, actually: whether you're too keen on the Q6600 suggestion, or that you didn't even get that I was making a pun by using the word "Ludacris" instead of "ludicrous"...

Data shows that a Q6600 offers 40% better performance over an E6850? And how much difference would THAT data show if you put that Q6600 at 2.4 head-to-head with a 3770 at 3.5? Especially with a card as fast as a heavily OC'ed reference 7950 with a Volterra VRM, clocked to *** fast speeds higher than a 7970? In that case, the Ivy chip would blow the dog *** out of the Q6600, so much that I wouldn't be surprised if you saw a 100-200% gain in a CPU-intensive area of BF3...

Your view on this topic is honestly laughable. You're suggesting people pair up 670s/7950s with Q6600s... and that's just because games are more threaded nowadays. A few years ago that wouldn't even have been the case. That chip has God-awful single- and dual-threaded performance in this day and age.

Unless you could OC the *** out of the Q6600 to hell and back (say, over 4GHz), it is, I repeat, NOT a worthwhile chip, period.

I recently upgraded a 3.6 QX9650 machine to a 3770K (570 SLI) and have seen a nice, significant boost in the games I play, even with the Ivy chip at stock.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Tue, Feb 12 2013 12:59 PM

"And how much difference does THAT data would show if you put that Q6600 at 2.4 head-to-head with a 3770 at 3.5?"

I haven't tested it. But let's assume it's twice as fast as the Q6600 at 2.4GHz: 78 FPS.

The Q6600 is $50-$70. The 3770K is $319. You've just spent 4.5x as much money for a 2x frame rate gain. Not very good from an efficiency standpoint.
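
A quick sanity check of that arithmetic (remember, the 78 FPS figure is an assumed doubling, not a measured result):

# Price vs. frame-rate scaling for the hypothetical matchup above.
q6600_price, q6600_fps = 70, 39     # $50-$70 chip; 39 fps baseline
i7_price, i7_fps = 319, 78          # Core i7-3770K; *assumed* 2x fps

print(f"Money spent:   {i7_price / q6600_price:.1f}x")   # ~4.6x
print(f"Frames gained: {i7_fps / q6600_fps:.1f}x")       # 2.0x
print(f"FPS per dollar: Q6600 {q6600_fps / q6600_price:.2f} "
      f"vs. 3770K {i7_fps / i7_price:.2f}")              # 0.56 vs. 0.24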

" You're suggesting people to pair up 670's/7950's with Q6600's"

Nope. I'm suggesting people pair a GTX 660 with an older chip when that older chip is available for very little money and they've got an older dual core. I've previously stated that scaling with a Radeon 7950 should be similar to a GTX 660 card, but noted that I haven't tested this configuration.

"I recently upgraded a 3.6 QX9650 machine with a 3770K (570 SLi) and have seen a nice and significant boost in the games I play, even with the Ivy chip at stock."

Sure. I have no problem believing that you might see a 20-25% frame rate increase from that jump -- possibly even more, depending on the particulars of the game.

"Worthwhile" depends on what you own, what you want to do with it, and what you can afford to replace it with.

We're talking in circles and you refuse to acknowledge that not everyone has the same amount of money to drop on an upgrade or the same desires when considering options. Have a pleasant day.

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Tue, Feb 12 2013 1:25 PM

Then it WOULD be worthwhile. How is an average 38 FPS even acceptable at all?

And no, the 7950 is MUCH faster than the 660. It even beats the 670 significantly once both are OC'ed heavily. They aren't even in the same class to begin with.

One more thing: very, very few people still run a Conroe in their primary machine. Most on S775 are on Wolfdale, so getting a Q6600 wouldn't be a wise choice at all. There's little to no point in swapping an E8400 for a Q6600.

The thing is, like I said before, you don't need hundreds of dollars to upgrade a C2D machine. Microcenter currently has a deal that offers an FM2 board for FREE with an FM2 chip...

http://www.techpowerup.com/forums/showthread.php?t=179944

Now look at that. The A8-5400K is a mere $60. So for $60, you can have a brand-new barebones setup. Just add some $10-20 low-profile DDR3 on top of it and you're good to go. And how much better would that setup be than an old Q6600 setup...? You tell me.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Tue, Feb 12 2013 4:00 PM

There is no A8-5400K.

There's an A6-5400K (dual-core) with weak single-threaded performance, and an A8-5600K for $109 at Newegg. Given Bulldozer / Piledriver's weak IPC, it's not clear that this is a benefit. Might be. I'd have to benchmark it. Remember, Phenom II offered roughly equal performance to Core 2 Duo, clock for clock -- but Piledriver's IPC is 10-15% slower than Llano's, clock for clock.

The only way to answer this question would be to build a new set of systems for comparison -- and I have other projects that need doing. You might be correct -- a cheap AMD rig might outperform an old Core 2 Duo and would still be cheaper than a new Core i5/i7 system. Teasing out the accuracy of that claim is impossible without a new round of tests.

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Tue, Feb 12 2013 4:34 PM

BD has weak-ass IPC but excellent OC potential and multi-threaded performance, and Piledriver doesn't have "weak" IPC at all, as its IPC is as good as Nehalem's. It's a benefit, both from IPC and especially from clock speeds. The FM1 chip has a much higher clock speed and OCs much further than the Q6600. GHz-wise, the Q6600 is so slow it's laughable, LOL -- you know, 2.4GHz. It has sick OC'ing potential, though that is NOT to be expected of all Q6600s, so I wouldn't bet on it, especially given that you need a beastly board to push it.

The AM2+ Phenom IIs with the DDR2 controller, such as the X4 940, offered similar performance to the Q6600. The AM3 Phenoms with increased IPC and better cache/clock speeds, such as the 965 BE, outperform Kentsfields like the Q6600.

The Bulldozer-based A8-5600K would easily beat the Q6600. It has significantly better IPC, faster clock speeds, and more OC'ing potential.

Best thing is, you'd end up with a newer platform that is more efficient. The Q6600 is a power hog once OC'ed heavily -- it's basically a room heater. Not to mention the old boards that people on E6600s etc. have, with liquid caps and all that old, dislikeable ***.

Anand actually has those chips in their chart, and yeah the FM1 setup is indeed faster:

http://www.anandtech.com/bench/Product/676?vs=53

Looking at the Cinebench chart, which is the closest thing to a game in that comparison, the FM1 setup ends up being significantly faster.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Tue, Feb 12 2013 5:12 PM

"Piledriver doesn't have 'weak' IPC at all, as its IPC is as good as Nehalem's."

Piledriver's IPC is on par with Bulldozer's. See here:

http://www.extremetech.com/computing/138394-amds-fx-8350-analyzed-does-piledriver-deliver-where-bulldozer-fell-short/2

The performance data in that article was measured with Turbo Mode turned *off* for all processors in order to take a closer look at IPC and efficiency.

Piledriver's improved performance in general single-threaded benchmarks (when Turbo Mode is enabled) comes from two improvements:

1) The chip spends more time in Turbo mode than Bulldozer did. BD ran closer to its envelope and had a harder time actually utilizing Turbo mode.

2) Higher clock speeds.

Once you normalize against clock speed and deactivate Turbo Mode, Piledriver's improvements are on the order of 5-7%. It's not as strong a chip as Nehalem. The Thuban-based Phenom II hits a single-threaded score of 1.0x at 3.3GHz with its Turbo Mode off. The Nehalem-based Core i7-970 scored ~1.2 in the CB 11.5 test (at 3.46GHz).

The FX-8350, with Turbo Mode enabled, only scores a 1.1. And that's at a 4.2GHz clock speed.

If the Core i7-970 is pulling a 1.2 at 3.46GHz while the FX-8350 does 1.10 at 4.2GHz, it indicates that Nehalem has a significant IPC advantage over AMD's Piledriver.

Also:

The Anandtech benchmarks do indeed show a stock A8-5600K beating a Q6600. Your point is taken. While the A8-5600K would still be more expensive than $50-$70 for a used Q6600, it would be substantially less expensive than upgrading to an Intel Core i7 quad-core + new mobo + RAM.

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Tue, Feb 12 2013 6:28 PM

Well, thanks for all that. I didn't really keep up with AMD tech enough to know about those, as I only checked out one or two FX-8350 reviews, BUT it still has slightly improved IPC over Bulldozer. I'd say it's somewhere between Nehalem and Yorkfield, which is somewhat respectable.

Anyway, I personally couldn't care less about all that, TBH. This Z77 Extreme9 + 3770K is so good I'm treating it like my baby. The Extreme9 has a maximum-OC button in its UEFI which just OCs the *** out of the CPU to 4.8 with a single click... and the setup is so much more fluid, stable, and efficient than that old, fat, hot-ass QX9650, which barely did 3.6 after a good amount of tuning and tweaking...

Anyway, what I was saying from the beginning is that the Q6600 really can't be the way to go when setups like that FM1 one can be had. Hell, even second-hand *** like Phenom II hardware from eBay would easily be better than a Q6600. The problem with the Q6600 is that most of the people on chips as old as the E6600 have old boards and platforms that can now be considered "shitty", so it'd honestly, IMHO, just be better to upgrade the whole thing when you can do it for so cheap:

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=7683952&Sku=A79-1055T

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Tue, Feb 12 2013 6:35 PM

Also, that is NOT Nehalem. Your comparison is false, since you're comparing a 6-core, 12-thread Gulftown to Piledriver. It's easily a superior chip. The i7-920 is a Nehalem and has worse single-threaded performance, fewer cores, and less OC potential.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Tue, Feb 12 2013 6:50 PM

You really ought to deepen your understanding of these topics if you want to debate them cogently.

Westmere's benefits over Nehalem were:

Hardware AES instructions.

Improved virtualization performance.

A greater number of cores (6, instead of 4).

Higher clock speed.

Outside of these specific areas, Westmere's IPC was *not* significantly higher than Nehalem's. Benefits were in the 3-4% range, tops.

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Tue, Feb 12 2013 7:04 PM

I didn't say Westmere's IPC was significantly higher than Nehalem's. I know it was only about that much better, as it didn't offer way better single-threaded performance when it came down to real-world usage.

What I was saying is that you were comparing Cinebench scores of Gulftown to Nehalem, which is somewhere Gulftown definitely WOULD, by whatever percentage, come out on top of Nehalem.

And I know that from first-hand experience, since I had an E5620 on an X58 UD9 (which unfortunately shorted out and died the day I booted it up), and it was generally barely any faster than an i7-920. It was mostly give or take.

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Wed, Feb 13 2013 11:58 AM

It doesn't matter, in this instance. Let me show you why, using Anandtech's database.

http://www.anandtech.com/bench/CPU/344

Remember, we're looking for IPC, so clock speeds matter. The simplest way to get a comparable metric is to divide the score by the clock speed. The trick here is that we have to use Turbo Speeds. This figure gives us a steady comparison between clock speed and Cinebench 11.5 performance.

Core i7-3770K = 0.000425.

Core i7-2600K = 0.00040.

Core i3-3220 = 0.000415 (IVB part, slightly more efficient than the Core i7-2600K).

Core i7-990X = 0.000346 (Westmere)

An overclocked Core i7-920 at 3.8GHz turns in a Cinebench 11.5 score of 1.39. You can check that here:

http://hothardware.com/Reviews/Paradigm-SHIFT-MainGears-Unique-Enclosure-GTX-480-SLI-Powerhouse-Reviewed/?page=9

Core i7-920 OC'D = 0.000365

Core i7-990X OC'D = 0.000345

Why is overclocked Nehalem slightly faster than overclocked Westmere in single-threaded performance? Let's call that test errata. But we've got a steady pattern here. Intel's single-threaded efficiency in CB11.5 has improved 23% between Westmere and Ivy Bridge.

Now let's compare AMD. We're back to Anandtech's numbers.

FX-8350 (Piledriver) = 0.000261

FX-8150 (Bulldozer) = 0.000261

A10-5800K = 0.000257

A8-3850 = 0.000306

X6-1100T = 0.000308

Phenom II 975 BE = 0.000307

And there you have it. Once we normalize for clock speed and compare performance-per-MHz, you see that 1) the FX-8350 isn't any more efficient than the FX-8150 in single-threaded tests, and 2) AMD is so far behind Intel in single-threaded efficiency that it obviates the advantage of overclocking.
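
For anyone who wants to rerun the normalization, here's a minimal Python version built from the per-MHz figures quoted above:

# Cinebench 11.5 single-thread points per MHz (score / Turbo clock),
# using the figures quoted above from Anandtech's database.
per_mhz = {
    "Core i7-3770K (Ivy Bridge)":   0.000425,
    "Core i7-2600K (Sandy Bridge)": 0.000400,
    "Core i7-990X (Westmere)":      0.000346,
    "FX-8350 (Piledriver)":         0.000261,
    "FX-8150 (Bulldozer)":          0.000261,
    "Phenom II X6 1100T (Thuban)":  0.000308,
}

ivb = per_mhz["Core i7-3770K (Ivy Bridge)"]
wsm = per_mhz["Core i7-990X (Westmere)"]
pd = per_mhz["FX-8350 (Piledriver)"]
bd = per_mhz["FX-8150 (Bulldozer)"]

print(f"Westmere -> Ivy Bridge: {ivb / wsm - 1:+.0%} per clock")   # ~+23%
print(f"Piledriver vs. Bulldozer: {pd / bd - 1:+.0%}")             # +0%
print(f"Piledriver vs. Ivy Bridge: {pd / ivb:.0%} per clock")      # ~61%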

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Wed, Feb 13 2013 2:44 PM

And how does it not matter? You do realize the 970 has a stock clock of 3.2 against the 2.66 of the 920, right? And that it can easily pull 4.5 whereas the 920 barely does 4.2? And that it also has slightly better per-MHz performance?

That comparison you did there is absurd. There's no way for an OC'ed i7-920 to beat a 990X. The 990 can easily do around 5GHz, so there's no *** way for a 920 to beat it. The test either errored out, as you said, or was maybe even MADE UP to shill that pre-built machine that got into the test.

More likely it's a difference between test setups that made the difference. You're comparing things from different sites... mind you, the test benches have to be EXACTLY the same for the comparison to be 100% accurate.

I admit that I didn't know Piledriver's IPC was about the same as Bulldozer's and that it relies heavily on Turbo speeds. And that's because I personally didn't care about any of that.

Regardless, an OC'ed Westmere will come out on top of an OC'ed Nehalem 90% of the time. The other 10% is the very rare case where the 920 does 4.5-5GHz. And yeah, AMD's in-game performance doesn't look too good either. Even when things are threaded, the FX-8150 still used to get spanked by a QX9770...

Top 100 Contributor
Posts 1,072
Points 11,625
Joined: Jul 2009
Joel H replied on Wed, Feb 13 2013 3:05 PM

You don't seem to understand what a single-threaded test *is*.

Single-thread. One core.

If Westmere and Nehalem have the same architecture as far as Cinebench 11.5 is concerned (and they do), then Westmere and Nehalem will perform identically once clock speed variation is taken into account. It doesn't matter that Westmere has 6 cores and Nehalem has 4. We're testing ONE core.

Also? I wrote that review. Implying that I made up the benchmark results is insulting and inaccurate.

Finally, Cinebench 11.5 is widely quoted because it's a simple test. A Core i7-3770K will perform the same in Cinebench 11.5 regardless of RAM loadout, RAM speed, or QPI linkages.

The data I've just shown you compares performance in Cinebench as a function of clock speed. The stock clocks are irrelevant.

Look at the Core i7-3770K vs. the Core i3-3220. The Core i7-3770K scores 0.000425 points per MHz. The Core i3-3220 scores 0.000415 points per MHz. That difference (2.4%) is either margin of error or due to different cache sizes. The point is that the scaling is *consistent.*

And *because* scaling is consistent, we can get a sense for the relative performance of other architectures. That's how we see that the IPC gap between AMD chips and Nehalem is still quite wide.

Not Ranked
Posts 13
Points 110
Joined: Feb 2013
Inquisius replied on Wed, Feb 13 2013 4:57 PM

What I was saying is, Gulftown/Westmere is EASILY a better chip than Nehalem when you put it through its paces in real-world conditions. Gulftown will even beat Sandy Bridge when a game scales across all 6 threads very well. Nehalem isn't as good a chip as Westmere is, and that's a fact.

Now, what I was implying is that when you put the FX-8350 through real-world gaming benches, it WILL be up there with an i7-920 as far as performance is concerned -- the reason being Turbo speeds, IPC, whatever. I'm talking about the end result.

But yes, you're correct that AMD's single-threaded, or even multi-threaded, performance isn't anywhere near as good as Intel's, and nowhere did I dispute that. I agree, because even in some completely threaded apps the FX-8150 still loses to a Sandy Bridge chip. You don't need to repeat such basic things over and over as if I'm a moron.

Top 150 Contributor
Posts 756
Points 7,645
Joined: Nov 2012
Location: Dallas, Tx
Dorkstar replied on Thu, Feb 14 2013 11:53 AM

I wrote a crap ton and decided to summarize.

I think we can all agree that once you enter a scenario such as the one in reference, the cost-per-performance ratio is the most important thing -- or, if you plan on never playing another game, getting the maximum level of performance in that particular game is. Yes, you may hit a bottleneck, but any budget PC is going to have a bottleneck somewhere. What's important is gauging your level of satisfaction after installing your "new" hardware, and this article shows that you could indeed reach a satisfactory level of gaming by simply upgrading to a new GPU.

Not Ranked
Posts 18
Points 105
Joined: Apr 2013

Those are some solid numbers I can get behind... though I'm still sitting on a 6-core, so I'm not too far behind.

Not Ranked
Posts 1
Points 5
Joined: May 2013
mouacyk replied on Fri, May 10 2013 4:31 PM

This is valuable information for those of us making the most of our systems who don't upgrade so often. My Q6600 system has been running since 2008 and will be upgraded when Haswell arrives.

Overclocked to 3.52GHz, I too have noticed significant performance increases going from a GTX 460 to a GTX 570 and a GTX 660. At 99% GPU load and a 39% to 47% OC on the CPU, these were my results in The Witcher 2 on high settings at 1080p:

3.33GHz CPU -- GTX 460 (763MHz): 25fps

3.33GHz CPU -- GTX 570 (732MHz): 44fps

3.52GHz CPU -- GTX 570 (950MHz): 49fps

3.52GHz CPU -- GTX 660 (1124MHz): 44fps
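
In percentage terms, a quick sketch over those frame rates:

# Frame-rate scaling relative to the GTX 460 baseline, from the
# results listed above (same Q6600 system, The Witcher 2, 1080p high).
results = [
    ("GTX 460 @ 763MHz,  CPU 3.33GHz", 25),
    ("GTX 570 @ 732MHz,  CPU 3.33GHz", 44),
    ("GTX 570 @ 950MHz,  CPU 3.52GHz", 49),
    ("GTX 660 @ 1124MHz, CPU 3.52GHz", 44),
]
baseline = results[0][1]
for config, fps in results:
    print(f"{config}: {fps} fps ({fps / baseline - 1:+.0%} vs. GTX 460)")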

Currently I'm only getting 6060 in 3DMark11 on the GTX 660, but once Haswell is released I expect up to another 33% increase in performance from better overall system throughput, based on other people's results with i7-3770Ks. My RAM is 2x4GB DDR3 running at 1174MHz with 5-6-6-14-2T timings on an ASRock P35 board. Frankly, I'm impressed at how gameable the system still is at 1080p on high settings in BF3, Crysis 3, and Borderlands 2.

I would like to add that I got the GTX 570 for $100 (from a friend) and the GTX 660 (ASUS DC2O) for $120 (online deals), so it was definitely worth throwing them into an older system while I wait for Haswell.
