Can A New GPU Rejuvenate A 5 Year Old Gaming PC?

This post has 46 Replies | 4 Followers

Top 10 Contributor
Posts 25,818
Points 1,166,545
Joined: Sep 2007
ForumsAdministrator
News Posted: Thu, Jan 24 2013 1:21 PM

New video card launches from AMD and NVIDIA are almost always reviewed on hardware that's less than 12 months old. That's not an arbitrary decision -- it helps reviewers make certain that GPU performance isn't held back by older CPUs, and it can be particularly important when evaluating the impact of new interfaces or bus designs. The downside of this policy is that it leaves a gap in product coverage: gamers with older systems rarely learn whether a new graphics card will be a meaningful upgrade for their aging machines. That matters more than ever now that the desktop replacement cycle has slowed.

Here, we're going to compare the performance impact of upgrading the graphics card on an older system that doesn't have access to any of the substantial performance gains Intel introduced with Nehalem in late 2008. Our upgrade card of choice is an EVGA GeForce GTX 660 SuperClocked with 2GB of RAM:

[Image: EVGA GeForce GTX 660 SuperClocked graphics card]

  • | Post Points: 470
Top 25 Contributor
Posts 3,542
Points 54,455
Joined: Jul 2004
Location: United States, Massachusetts
ForumsAdministrator
MembershipAdministrator
Dave_HH replied on Thu, Jan 24 2013 2:19 PM

Kudos to Joel for a great story angle. I've been wondering this myself as GPUs have become more and more efficient at offloading the rendering pipeline. Good stuff.

Editor In Chief
http://hothardware.com


  • | Post Points: 5
Not Ranked
Posts 1
Points 35
Joined: Jan 2013
Tiaech replied on Thu, Jan 24 2013 2:24 PM

The only thing I dislike about this test is that you used a quad-core rather than the dual-core most people were sitting on 5 years ago. The test would have been better with a dual-core.

  • | Post Points: 35
Not Ranked
Posts 1
Points 5
Joined: Jan 2013

Is the premise of this article accurate, considering that this is an old PC but it was close to top of the line when it came out? The only reason I say this is that anyone with a gaming PC using parts that old might not be in the same boat motherboard-, CPU-, or even RAM-wise.

  • | Post Points: 5
Top 100 Contributor
Posts 1,070
Points 11,570
Joined: Jul 2009
Joel H replied on Thu, Jan 24 2013 3:05 PM

"Is the primes of this article accurate considering this is a old pc but it was like top of the line for when it came out?"

It wasn't.

If we date from the youngest component (the GTX 260), the motherboard was new -- but the X48, X38, and P35 were equally fast in single-GPU configurations. The X38 and X48 had an advantage over the P35 in multi-GPU tests because the P35 ran dual GPUs in an x16 / x4 configuration, with the second slot hanging off the south bridge.

The Q6600 launched on November 2, 2006. The GTX 260 was released in June, 2008. By the time the GPU was new, the CPU had been on the market nearly two years and there were faster 45nm chips available. Furthermore, the GTX 260 was slower than both the GTX 280 and AMD's Radeon HD 4800 family of cards, which launched in summer, 2008. It was Nvidia's second-fastest card, but it wasn't the second-fastest card on the market. 

As for the overall system build, this was, in fact, my system for several years. It's a balanced upper-end rig that wasn't top of the line when new, but still offered excellent performance.

  • | Post Points: 5
Not Ranked
Posts 1
Points 35
Joined: Jan 2013
EShirou replied on Thu, Jan 24 2013 4:21 PM

Yep, they should have used a dual-core CPU and a PCI-E 1.0 or 1.1 board, not 2.0 :/

  • | Post Points: 35
Top 500 Contributor
Posts 266
Points 2,975
Joined: Sep 2009
Location: Port Orchard, WA

It was a good article about upgrading to a new video card. I already upgraded to a new HIS Radeon 7870 GHz Edition about a month and a half ago. It's currently in my tower, which is a bit over a year old. No problems with it.

  • | Post Points: 5
Top 10 Contributor
Posts 8,572
Points 103,130
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
MembershipAdministrator
Moderator
realneil replied on Thu, Jan 24 2013 5:06 PM

I think that the story stands on its own. Getting a new GPU will help your performance. If you want to nit-pick the system that Joel used in this test, consider that it's what he had at the time. If you want to see results from other base systems, please run your own tests and post the results.

The GTX-660 card performed well, and would be an improvement over many of the cards we use. I would love to have a pair of them for SLI use.

Thanks for the comparison Joel.

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

  • | Post Points: 20
Top 100 Contributor
Posts 1,070
Points 11,570
Joined: Jul 2009
Joel H replied on Thu, Jan 24 2013 5:11 PM

No single system can account for all possible variables. It's true that PCIe 1.1 vs. 2.0 could make a difference. So could dual vs quad-core. Expanding the test to cover a dual-core would've doubled the amount of work, if I even had one handy (and I don't). 

But if you've got a dual-core C2D system, a used quad-core Q6600 can be had on eBay for $55. The 45nm Q9550 (2.83GHz) is available for $100.

 

  • | Post Points: 5
Top 100 Contributor
Posts 983
Points 9,085
Joined: Mar 2012
Location: LA, CA
sevags replied on Thu, Jan 24 2013 5:20 PM

Well, it still should have been a dual-core CPU... It's true that no two systems are the same, but since not many had quad-cores 5 years ago, a more standard baseline could have been used. Upgrading a 5-year-old computer that had bleeding-edge tech at the time is a little unfair; it's like giving the GTX 660 the best possible chance to perform well, while on a dual-core the performance gains might be more negligible. Also, with top hardware like that, justifying a mobo/CPU/RAM upgrade is less likely, whereas someone with a standard system might see higher gains going that route over a new video card.

Overall anything that gives old hardware new life I'm all for!

  • | Post Points: 20
Top 500 Contributor
Posts 208
Points 2,180
Joined: Feb 2010

Regardless of the specific components, I think this article shows that, yes, a GPU upgrade can breathe new life into an aging system. Well done.

  • | Post Points: 5
Top 100 Contributor
Posts 1,070
Points 11,570
Joined: Jul 2009
Joel H replied on Thu, Jan 24 2013 6:02 PM

Sevags,

I'll be happy to write you a custom story for my usual article fee and the dual core CPU. :) 

I put this story together using components I had on hand, with equipment I had purchased for personal use. I promise you, I didn't build my rig in 2008 so that in 2013, I could write an article painting the GTX 660 in the best possible light. :P 

  • | Post Points: 20
Not Ranked
Posts 1
Points 5
Joined: Jan 2013
DanFrench replied on Thu, Jan 24 2013 6:15 PM

This was my experience with almost identical hardware: a P35 mobo in my case, and 8GB of RAM, but the same Q6600 and various quality SSDs over the years. I moved from the original 8800GT to a 460ti and then a 660ti, and noticed big improvements in each case.

That Q6600 was a beast, especially overclocked 150%... I love science.

  • | Post Points: 5
Not Ranked
Posts 1
Points 20
Joined: Jan 2013

It is a great article :) Thank you.

I'm an owner of a Q9300@3GHz, which is about the same as an overclocked Q6600.

There's 8GB of 1100MHz DDR2, an Intel 330 SSD, and a GTX570.

And that's supposed to drive gaming on a 24" 1920x1200 monitor.

One of my favorite games is WoT, which runs on a pretty badly written engine that hammers your system.

One of my friends uses a GTX2xx card, but coupled with an i7 (second gen).

We both have to lower gfx quality to almost minimum to get into the 60fps range (ensures no drops below 35fps), and we both get the same fps.

From what I can tell, the next GPU upgrade will require a whole system upgrade, since the CPU will struggle to feed the GPU.

  • | Post Points: 20
Top 10 Contributor
Posts 8,572
Points 103,130
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
MembershipAdministrator
Moderator
realneil replied on Thu, Jan 24 2013 6:44 PM

I don't know if many current video cards will saturate existing PCI-E 2.0 or 3.0 bandwidth yet (even in pairs).

I have two GTX-570s running SLI in my 2600K PC and they don't do it.

I have two Radeon R7770s running Crossfire in my FX-4170 and they don't use all of my PCI-E bandwidth either.

I do know that I went from a GTX-285 2GB card to a GTX-570 1.25GB card and saw a huge difference in game play when I did the swap.

The GTX-285 was a fine card but it didn't do DX11.

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

  • | Post Points: 5
Not Ranked
Posts 1
Points 5
Joined: Jan 2013

Actually, this Christmas I acquired an old Dell XPS 600 that was made in 2006, with an NVIDIA 7900 GS card, 512 MB of RAM, a Pentium D (3.4 GHz) processor, and no operating system (well, it didn't boot after POSTing). So I bought two GTX 550 Tis and 4 GB of DDR2 value RAM. After experiencing first hand what happens when you put a couple of new GPUs in a PC, I can honestly say it does breathe new life into an old, outdated system.

  • | Post Points: 5
Top 100 Contributor
Posts 1,085
Points 10,995
Joined: Jun 2010
Location: Pennsylvania
CDeeter replied on Thu, Jan 24 2013 6:48 PM

"Sebags,

I'll be happy to write you a custom story for my usual article fee and the dual core CPU. :) " LOL

Seriously though, I'm kinda surprised that you were able to get the 660 to run on that motherboard, but then again, since you used DDR2 memory I guess the board prolly supports PCIe 2.0.

I have an old HP a1017c system that came out when PCIe was new. I have a Zotac Geforce 7200GS in it and tried to get a Sapphire 5570 to work to no avail.

  • | Post Points: 5
Top 100 Contributor
Posts 1,085
Points 10,995
Joined: Jun 2010
Location: Pennsylvania
CDeeter replied on Thu, Jan 24 2013 6:54 PM

I don't think he could have used a PCIe 1.0 or 1.1 motherboard. A 2.0 card won't work in those slots.

  • | Post Points: 20
Not Ranked
Posts 1
Points 20
Joined: Jan 2013

Meh, the EVGA 680 I got through NewEgg overheated after a minute in any game more demanding than Mass Effect 3. Now the ASUS 680 I got to replace it; that's quality.

  • | Post Points: 20
Top 100 Contributor
Posts 1,070
Points 11,570
Joined: Jul 2009
Joel H replied on Thu, Jan 24 2013 7:36 PM

Realneil,

AMD and NV have candidly said that modern GPUs don't stress PCIe 3.0 vs. 2.0. If they did, the vendors themselves would have told us. I can pick up differences in synthetic benchmarks designed to measure PCIe bus bandwidth, but I mainly used those for confirming that things were working as they ought. 

 

Cameron, 

If you're on a Prescott-era Pentium D, you're losing huge frame rates. What we've said about CPU upgrades possibly not being a major factor doesn't apply to any system using a Prescott-derived Pentium 4. Those chips were miserable for gaming and the benefits of switching to something newer could be 30-50%. That's even comparing a Core 2 Duo to a Pentium-D.  

If you've got a pair of 550s, I guarantee your CPU *will* hold you back materially in a lot of games. 

Comrade, 

I haven't checked an EVGA GTX 680, but most early cards from all vendors follow NV's reference designs closely. If you had a GPU overheating, it's possible that there was a problem with the card or the card's BIOS rather than the vendor. 

 

  • | Post Points: 20
Not Ranked
Posts 1
Points 35
Joined: Jan 2013

"By CDeeter on Jan 24, 2013

I don't think he could have used a PCI 1.0 or 1.1 motherboard. A 2.0 card won't work in those slots.

"

Actually, you would be quite wrong. 2.0 cards work in 1.0 and 1.1 slots because of backwards compatibility. The main changes between 1.0, 1.1, 2.0, and 3.0 are throughput-based, meaning higher transfer rates between card and motherboard.

I, for instance, had an 8800gtx, which is a 2.0 card, on a system with a 1.1 PCIe port, and it worked just fine.
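
For context on the thread's PCIe back-and-forth, the per-lane throughput differences between generations can be tallied quickly. A minimal Python sketch (the bandwidth figures are the nominal per-direction numbers from the PCIe specs, rounded):

```python
# Nominal transfer rates (GT/s) and line-encoding efficiency per PCIe generation.
# Gens 1.x and 2.0 use 8b/10b encoding; 3.0 moved to 128b/130b.
GT_PER_S = {"1.1": 2.5, "2.0": 5.0, "3.0": 8.0}
EFFICIENCY = {"1.1": 8 / 10, "2.0": 8 / 10, "3.0": 128 / 130}

def lane_bandwidth_mbs(gen: str) -> float:
    """Usable MB/s per lane, per direction: GT/s x encoding efficiency / 8 bits."""
    return GT_PER_S[gen] * EFFICIENCY[gen] * 1000 / 8

for gen in ("1.1", "2.0", "3.0"):
    per_lane = lane_bandwidth_mbs(gen)
    print(f"PCIe {gen}: {per_lane:.0f} MB/s per lane, {per_lane * 16 / 1000:.1f} GB/s at x16")
```

An x16 slot thus roughly doubles in usable bandwidth from gen 1.1 to 2.0 (about 4 GB/s to 8 GB/s per direction), which is part of why the slot generation rarely bottlenecks a single mid-range card.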

  • | Post Points: 35
Top 500 Contributor
Posts 161
Points 1,390
Joined: Jan 2010
Location: Texas City
detnight replied on Thu, Jan 24 2013 8:46 PM

Thanks for the report. I was looking at getting a new CPU, since mine has always been the weak point of my system. Plus, I was looking at a new video card to replace my GTX285, and at replacing the 8 gigs of RAM with 16. And of course using these as hand-me-downs for my wife's computer. Great report, just what I needed.

Asus M4A79T deluxe,AMD Athlon II X4 620,PNY GTX285 1gb,Patriot Viper DDR3 1333 8gb,Creative Sound Blaster X-Fi Titanium Fatal1ty Pro,Diamond ATI TV Wonder HD 650 Comb,Plextor DVD-CD burner,Patriot PS-100 SSD 32gb ,WD 500gb 32mg,Antec Twelve Hundred Full Tower,Antec True Power Quattro 1000watts,Toshiba 40in 1080p for a monitor

  • | Post Points: 5
Top 100 Contributor
Posts 983
Points 9,085
Joined: Mar 2012
Location: LA, CA
sevags replied on Thu, Jan 24 2013 11:24 PM

Joel H; yeahhhh, no thank you. No need to get defensive either; I didn't knock your article, but the premise of "breathing new life into an aging system" isn't exactly correct in this situation. We all know adding a new video card will add "new life" to ANY computer of any age as long as it fits... yes, any, and we all knew that already, including you. So this article was really about finding out how much performance is really gained. But many of us don't have top-end gear from 5 years ago, so this article is really pointless in determining whether someone should upgrade an old PC's video card or how much performance would be gained. All it does is prove new hardware can help old hardware perform better. Old news.

That's how you get defensive.

  • | Post Points: 5
Top 100 Contributor
Posts 1,070
Points 11,570
Joined: Jul 2009
Joel H replied on Fri, Jan 25 2013 12:15 AM

"But many of us don't have top end gear from 5 years ago."

I talked to a few system vendors who confirmed that among enthusiasts, the Q6600 was the most popular CPU of this time period. You didn't own one. Lots of other people didn't, either. No problem.

"All it does is prove new hardware can help old hardware perform better."

The question of any investigative article of this nature is "How much?"

I'm amused at your declaration that this isn't news because I myself had no idea what the results would look like. Several things about these outcomes surprised me: 

The GTX 260 was faster than I expected in modern titles.
The GTX 660 gave a larger FPS increase than I expected it would. Even tossing out Civ V, the GTX 660 upgrade improved framerates by over 50%. 

Power efficiency was also better than anticipated. 
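
For readers wondering how an uplift figure like "over 50%" is derived, it's just the relative change in average framerate. A quick sketch with made-up numbers (not the article's actual results):

```python
def percent_uplift(old_fps: float, new_fps: float) -> float:
    """Percentage framerate improvement of new_fps over old_fps."""
    return (new_fps - old_fps) / old_fps * 100

# Hypothetical example: an old card averaging 30 fps, a new one averaging 48 fps.
print(f"{percent_uplift(30, 48):.0f}% faster")  # roughly a 60% uplift
```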

 

 

  • | Post Points: 5
Not Ranked
Posts 1
Points 20
Joined: Jan 2013
JohnWalsh replied on Fri, Jan 25 2013 12:40 AM

When I upgraded from my old GTX260 216 (on the Intel 775 platform with a Q9550 and a Gigabyte EP-45ud3), I was surprised to see that BFBC2 only picked up about 20 fps on low settings. The amazing thing was when you cranked up the settings: the GTX 260 on the highest settings would struggle to maintain 30-35 FPS and would often dip into the 20s. The new GTX560ti would never go below 45 fps on the highest settings. And the heat difference was pretty amazing as well.

One thing I notice about reviews like this: they always run on high or even ultra instead of showing a frame rate that is playable on both cards. NVIDIA says my system will run on ultra settings. I'm sorry, but an average of 40 fps IS NOT PLAYABLE. The frequent dips down into the twenties and the associated lag make targeting very difficult.
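
John's point generalizes: an average framerate hides the dips that actually determine playability. Two runs with the same mean can feel completely different, as this small sketch (with invented per-second FPS samples) shows:

```python
def fps_stats(fps_samples):
    """Return (average fps, minimum fps) for a list of per-second FPS samples."""
    return sum(fps_samples) / len(fps_samples), min(fps_samples)

smooth = [40, 41, 39, 40, 40]   # steady run
spiky = [55, 58, 22, 60, 5]     # same 40 fps average, but with heavy dips
for name, run in (("smooth", smooth), ("spiky", spiky)):
    avg, low = fps_stats(run)
    print(f"{name}: avg {avg:.0f} fps, min {low} fps")
```

Both runs average 40 fps, but only the first would feel playable; reporting minimums (or frame-time percentiles) alongside averages captures this.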

I just upgraded from a Q9550 and a Gigabyte EP-45ud3r motherboard with 8 gigs of RAM to an i5 3570K and an ASRock Extreme4 motherboard with 8 gigs of RAM. Both systems had Win8, an SSD, and an EVGA GTX560ti GPU.

I saw only a minor increase in overall speed playing BF3, but I did notice a much better ability to run with higher settings. I like to limit my frame rate because I believe the game plays better if the frame rate is stable. I was running a mix of low and medium settings and was able to maintain a stable 60 fps frame rate. Now I can run a mix of medium and high settings and get a very consistent 60fps.
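
The frame-rate cap described above works because it fixes the per-frame time budget; the arithmetic is trivial but worth seeing (a sketch):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to render each frame at a given fps cap."""
    return 1000.0 / target_fps

for cap in (30, 60, 120):
    print(f"{cap} fps cap -> {frame_budget_ms(cap):.1f} ms per frame")
```

Capping at 60 fps gives every frame about 16.7 ms; as long as the GPU always finishes within that budget, frame pacing stays even, which is why a capped 60 can feel smoother than an uncapped average of 70.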

I hope later this summer I can get a better gpu.

  • | Post Points: 20
Not Ranked
Posts 1
Points 5
Joined: Jan 2013

Still waiting for the time when MMOs are taken into account in these kinds of reviews. Most MMOs are CPU- and RAM-heavy and not so much GPU-bound. I play EQ2 and SWTOR, and I recently upgraded my mobo, CPU, and RAM, and the improvement is amazing. I don't think a new GPU alone would have made the same difference, because in EQ2 my RAM usage goes up to 5 GB within seconds and I only had 4GB; that was my major bottleneck.

  • | Post Points: 5
Top 150 Contributor
Posts 619
Points 5,260
Joined: Dec 2011

Dude... my PC from five years ago, which is still in service, had PCIe 2.0. Built at the end of 2007 using the QX6850. That said, most cards struggle to even max out PCIe 1.1 bandwidth. There are many, many posts on this topic -- Linus from LinusTechTips, AnandTech, HardOCP, and many more!

  • | Post Points: 5
Top 150 Contributor
Posts 619
Points 5,260
Joined: Dec 2011

Great article. This is a long-known fact in hardware forums... before you upgrade your GPU, make sure your PSU can handle it! A lot of dual-cores from the Q6600 era only started showing their age last year (as per my non-scientific poll of friends going "my computer is slow, fix it").

  • | Post Points: 5
Top 150 Contributor
Posts 653
Points 5,925
Joined: May 2008
Location: Stockholm
mhenriday replied on Fri, Jan 25 2013 6:23 AM

Have to agree with Dave, realneil, Paul, et al.; this was indeed a really worthwhile initiative on Joel's part! Nice to know that a GPU upgrade does, in practice, make so great a difference even when the other components aren't the absolute latest and greatest. Relevant for me: a little more than a year ago I upgraded my main box with a GA-990FXA-UD3 motherboard, a Phenom II X4 955 processor, an FSP650-80EGN PSU, and 16 GB of RAM, but retained an ancient GeForce 7900 GTX GPU. I've been looking for a good deal on a GTX 660, but now that AMD has released a driver that seems to work decently with Linux, I'm thinking of widening my search. Good to learn that such an upgrade would have significant (positive) effects!...

Henri

Not Ranked
Posts 1
Points 5
Joined: Jan 2013
phd24 replied on Fri, Jan 25 2013 7:20 AM

Really interesting article, Joel.

I'm in a pretty similar situation to your test rig -- Q6600, 4GB DDR2, GeForce GTS 250, MSI P6N nForce 650 SLI.

However, it's probably the MB that is the limiting factor in my rig. I'm about to spend a bit of cash on an upgrade, and am all set to buy a new i5 3570K, a Z77 MB (pr. Gigabyte Z77-D3H), and 8GB DDR3. However, your article makes me pause for thought. I think I'll probably go with the CPU & MB, as gaming comes second to image editing, but I'm still torn...

  • | Post Points: 5
Not Ranked
Posts 1
Points 5
Joined: Jan 2013
eyce9000 replied on Fri, Jan 25 2013 9:53 AM

Excellent article Joel.

I think this would merit a follow-up comparing a system with a modern CPU/motherboard but the same 660 graphics card, to see what sort of effect that would produce.

I am sitting on a 6-year-old gaming system with an E6600, and I upgraded it last year with a GTX 560. It handles most games well, but I am curious what sort of performance benefit I would see for the considerable money necessary to replace the motherboard, CPU, and RAM with a new Intel i5/i7 system.

  • | Post Points: 5
Not Ranked
Posts 1
Points 5
Joined: Jan 2013
wtburnette replied on Fri, Jan 25 2013 10:26 AM

I thought the article was well done. My system 5 years ago was similar, except the video card was slower (8800GTS 512MB) and I had 4GB of RAM instead of 3GB. Same rocking Q6600, though. I didn't notice -- was your processor overclocked? Two things I was curious about: first, would overclocking the processor to 3GHz or 3.2GHz help much with the framerates, and what about adding another 1GB of RAM? Another question that comes to mind: how badly did the SSD skew the results? If you upgraded only the video card in a system from back then, which would almost certainly have had a standard HDD, I wonder how much slower it would be? In effect, you have to add the price of the SSD into the equation, but it's still a worthy upgrade for an older system at ~$300.

  • | Post Points: 5
Top 100 Contributor
Posts 1,070
Points 11,570
Joined: Jul 2009
Joel H replied on Fri, Jan 25 2013 10:49 AM

"One thing I notice about reviews like this they always run on high or even ultra instead of show a frame rate that is playable on both cards."

 

John, 

When I put this review together, I decided on a few things. 

1)  I wanted to use DX10/DX11.  The GTX 260 was NV's third generation of GPU to offer DX10 support. Dropping back to DX9 would've improved the card's frame rates, but would've also meant more screenshotting in an attempt to capture the visual differences. BF3 doesn't even have a DX9 mode for PCs, so this would've tossed in more variables. 

2)  Civ V was the only game that wasn't playable. Borderlands and BF3 were choppy in places, but I extensively playtested both. 

I picked settings that *pushed* the GTX 260, but didn't overwhelm it. That was purposeful. 

  • | Post Points: 5
Top 10 Contributor
Posts 8,572
Points 103,130
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
MembershipAdministrator
Moderator
realneil replied on Fri, Jan 25 2013 10:55 AM

Joel H:
Realneil, AMD and NV have candidly said that modern GPUs don't stress PCIe 3.0 vs. 2.0. If they did, the vendors themselves would have told us. I can pick up differences in synthetic benchmarks designed to measure PCIe bus bandwidth, but I mainly used those for confirming that things were working as they ought.

Agreed; this is why adding a better video card nets you a better experience almost every time -- if your PSU can handle the extra power draw that some of these new cards have. (Yes, I'm aware that the newest GPU designs use less power overall, but I also know that a top-notch PSU makes a huge improvement too.)

PCIe 2.0 bandwidth is wide enough and fast enough to give you great performance in any decently coded game if you have the GPU to handle it. That's why I agree that a GPU upgrade is such a great way to improve your gaming experience, and it's my first recommended upgrade for others most of the time.

But if they have the cash for a whole new platform, that's how I steer them.

 

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

  • | Post Points: 5
Not Ranked
Posts 1
Points 5
Joined: Jan 2013
HiBob replied on Fri, Jan 25 2013 12:40 PM

I resemble that remark ...

E8500 (Core 2 Duo, 3.16), HD4850, 4GB RAM, G41 motherboard (@EShirou: PCI-E 1.0a), 7200 RPM storage, Windows XP 32-bit. Pretty much a second-class machine compared to the one used in the article (depending on whether an app utilizes more than one core).

Borderlands 2 was completely playable at 1080p with the eye candy turned way down. The only problem was a bug that would hang the application at certain spots unless I restricted the CPU to 1 core.

I upgraded from the 4850 to a 7850 last month (which runs fine in the 1.0a slot; I didn't even have to upgrade the BIOS), mainly because it came free with Far Cry 3 and the 4850 is now restricted to legacy drivers/DX10. Much faster now, but then I don't really play many titles that are CPU-intensive.

  • | Post Points: 5
Top 100 Contributor
Posts 1,085
Points 10,995
Joined: Jun 2010
Location: Pennsylvania
CDeeter replied on Fri, Jan 25 2013 6:31 PM

Hmm, I wonder what the issue is for me, then, that I couldn't go from a 7200GS to an HD 5570.

  • | Post Points: 5
Top 100 Contributor
Posts 1,085
Points 10,995
Joined: Jun 2010
Location: Pennsylvania
CDeeter replied on Fri, Jan 25 2013 6:42 PM

Well, this article sure got a nice amount of attention. Keep 'em coming, Joel!

  • | Post Points: 5
Not Ranked
Posts 1
Points 5
Joined: May 2012
setiroN replied on Sat, Jan 26 2013 8:47 AM

I've recently upgraded an E5200 (@3.8GHz) on a P35 board with 4GB RAM, adding an HD7850 that was brought to 1200/5200.

It plays every game at quite high settings and a 1920x1200 resolution. Thanks to MLAA/FXAA, the biggest issue with lower end machines has been largely dealt with.

Easy times for gaming indeed.

  • | Post Points: 5