Can A New GPU Rejuvenate A 5 Year Old Gaming PC? - HotHardware

New video card launches from AMD and NVIDIA are almost always reviewed on hardware that's less than 12 months old. That's not an arbitrary decision -- it helps reviewers make certain that GPU performance isn't held back by older CPUs, and it can be particularly important when evaluating the impact of new interfaces or bus designs. The downside of this policy is that it leaves a gap in product coverage: gamers with older systems are often left wondering whether a new graphics card will be a meaningful upgrade for their aging rigs. That's particularly important now that the desktop replacement cycle has slowed.

Here, we're going to compare the performance impact of upgrading the graphics card on an older system that doesn't have access to any of the substantial performance gains Intel introduced with Nehalem in late 2008. Our upgrade card of choice is an EVGA GeForce GTX 660 SuperClocked with 2GB of RAM:


Kudos to Joel for a great story angle. I've been wondering this myself as GPUs have become more and more efficient at offloading the rendering pipeline. Good stuff.


The only thing I dislike about this test is that you used a quad-core rather than the dual-core most people were sitting on five years ago. The test would have been better run with a dual-core.


Yep, they should have used a dual-core CPU and a PCIe 1.0 or 1.1 board, not 2.0 :/


I don't think he could have used a PCIe 1.0 or 1.1 motherboard. A 2.0 card won't work in those slots.


"I don't think he could have used a PCIe 1.0 or 1.1 motherboard. A 2.0 card won't work in those slots." -- CDeeter, Jan 24, 2013

Actually, you would be quite wrong. PCIe 2.0 cards work in 1.0 and 1.1 slots because of backwards compatibility. The main changes between 1.0, 1.1, 2.0, and 3.0 are throughput-related, meaning higher transfer rates between the card and the motherboard.

I, for instance, ran an 8800 GTX, which is a 2.0 card, in a system with a PCIe 1.1 slot, and it worked just fine.
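For what it's worth, the throughput differences between generations are easy to quantify. A minimal sketch, using the per-lane transfer rates and encoding schemes from the PCIe specs (1.x and 2.0 use 8b/10b encoding with 20% overhead; 3.0 switched to 128b/130b):

```python
# Approximate usable PCIe bandwidth per generation.
# Transfer rate (GT/s) and encoding efficiency per the PCIe specs:
# 1.x and 2.0 use 8b/10b encoding (20% overhead); 3.0 uses 128b/130b (~1.5%).
GENERATIONS = {
    "1.x": (2.5, 8 / 10),
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
}

def bandwidth_gbps(gen, lanes=16):
    """Usable one-way bandwidth in GB/s for a generation and lane count."""
    gt_per_s, efficiency = GENERATIONS[gen]
    # Each transfer moves one bit per lane; divide by 8 to get bytes.
    return gt_per_s * efficiency * lanes / 8

for gen in GENERATIONS:
    print(f"PCIe {gen} x16: {bandwidth_gbps(gen):.1f} GB/s")
# PCIe 1.x x16 -> 4.0 GB/s, 2.0 x16 -> 8.0 GB/s, 3.0 x16 -> ~15.8 GB/s
```

Each generation roughly doubles usable bandwidth per lane, which is why a 2.0 card in a 1.1 slot simply negotiates down to the slower rate rather than failing outright.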


Hmm, I wonder what the issue is for me, then, that I couldn't go from a 7200 GS to an HD 5570.


Dude... my PC from five years ago, which is still in service, had PCIe 2.0. It was built at the end of 2007 using the QX6850. That said, most cards struggle to even max out PCIe 1.1 bandwidth. There are many, many posts on this topic -- Linus from TechTips, AnandTech, HardOCP, and many more!


I resemble that remark ...

E8500 (Core 2 Duo, 3.16GHz), HD 4850, 4GB RAM, G41 motherboard (@Eshirou: PCIe 1.0a), 7200 RPM storage, Windows XP 32-bit. Pretty much a second-class machine compared to the one used in the article (depending on whether an app utilizes more than one core).

Borderlands 2 was completely playable at 1080p with the eye candy turned way down. The only problem was a bug that would hang the game at certain spots unless I restricted the CPU to one core.

I upgraded from the 4850 to a 7850 last month (which runs fine in the 1.0a slot; I didn't even have to upgrade the BIOS), mainly because it came free with Far Cry 3 and because the 4850 is now restricted to legacy drivers/DX10. Much faster now, but then I don't really play many CPU-intensive titles.


Is the premise of this article accurate, considering that this is an old PC, but it was top of the line when it came out? The only reason I ask is that anyone with a gaming PC using parts that old might not be in the same boat motherboard-, CPU-, or even RAM-wise.


"Is the premise of this article accurate, considering that this is an old PC, but it was top of the line when it came out?"

It wasn't.

If we date from the youngest component (the GTX 260), the motherboard was new -- but the X48, X38, and P35 were equally fast in single-GPU configurations. The X38 and X48 had an advantage over the P35 in multi-GPU tests because the P35 ran dual GPUs in an x16 / x4 configuration, with the second slot hanging off the south bridge.

The Q6600 launched on November 2, 2006. The GTX 260 was released in June 2008. By the time the GPU was new, the CPU had been on the market for nearly two years and faster 45nm chips were available. Furthermore, the GTX 260 was slower than both the GTX 280 and AMD's Radeon HD 4800 family of cards, which launched in the summer of 2008. It was Nvidia's second-fastest card, but it wasn't the second-fastest card on the market.

As for the overall system build, this was, in fact, my system for several years. It's a balanced upper-end rig that wasn't top of the line when new, but still offered excellent performance.


It was a good article about upgrading to a new video card. I already upgraded to a new HIS Radeon 7870 GHz Edition about a month and a half ago. It's currently in my tower, which is a bit over a year old. No problems with it.


I think that the story stands on its own. Getting a new GPU will help your performance. If you want to nit-pick the system Joel used in this test, consider that it's what he had at the time. If you want to see results from other base systems, please run them and post your results.

The GTX 660 performed well and would be an improvement over many of the cards we use. I would love to have a pair of them for SLI.

Thanks for the comparison Joel.


No single system can account for all possible variables. It's true that PCIe 1.1 vs. 2.0 could make a difference. So could dual- vs. quad-core. Expanding the test to cover a dual-core would've doubled the amount of work, if I'd even had one handy (and I don't).

But if you've got a dual-core C2D system, a used quad-core Q6600 can be had on eBay for $55. The 45nm Q9550 (2.83GHz) is available for $100.

 


Well, it still should have been a dual-core CPU... It's true that not many had quad-cores five years ago, and though no two systems are the same, a more standard baseline could have been used. Upgrading a five-year-old computer built with bleeding-edge tech at the time is a little unfair; it's like giving the GTX 660 the best possible chance to perform well, while on a dual-core the performance gains might be more negligible. Also, with top hardware like that, justifying a mobo/CPU/RAM upgrade is less likely, but someone with a more typical system might see higher gains going that route over a new video card.

Overall anything that gives old hardware new life I'm all for!


Sebags, 

I'll be happy to write you a custom story for my usual article fee and the dual core CPU. :) 

I put this story together using components I had on hand, with equipment I had purchased for personal use. I promise you, I didn't build my rig in 2008 so that in 2013, I could write an article painting the GTX 660 in the best possible light. :P 


"Sebags,

I'll be happy to write you a custom story for my usual article fee and the dual core CPU. :) " LOL

Seriously though, I'm kind of surprised that you were able to get the 660 to run on that motherboard, but then again, since you used DDR2 memory, I guess the board probably supports PCIe 2.0.

I have an old HP a1017c system that came out when PCIe was new. It has a Zotac GeForce 7200 GS in it, and I tried to get a Sapphire 5570 to work, to no avail.


Regardless of the specific components, I think this article shows that, yes, a GPU upgrade can breathe new life into an aging system. Well done.


This was my experience with almost identical hardware: a P35 mobo in my case, and 8GB of RAM, but the same Q6600 and various quality SSDs over the years. I moved from the original 8800 GT to a 460 Ti to a 660 Ti and noticed big improvements each time.

That Q6600 was a beast, especially overclocked to 150% of stock... I love science.
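For context, Core 2-era overclocking boiled down to raising the front-side bus: core clock = FSB x multiplier. A quick sanity check of that 150% figure, assuming a common FSB overclock (the Q6600 shipped at a 266MHz FSB with a 9x multiplier; the 400MHz target here is my illustrative assumption, not a figure from the comment):

```python
def core_clock_mhz(fsb_mhz, multiplier):
    """Core 2-era effective core clock: front-side bus times CPU multiplier."""
    return fsb_mhz * multiplier

stock = core_clock_mhz(266.67, 9)   # Q6600 stock: ~2400 MHz (2.4GHz)
oc = core_clock_mhz(400, 9)         # pushing the FSB to 400MHz: 3600 MHz
print(f"Overclock is {oc / stock:.0%} of stock")
```

So a 2.4GHz Q6600 running at 3.6GHz is exactly the "150%" figure mentioned above.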


It is a great article :) Thank you.

I'm an owner of a Q9300 @ 3GHz, which is about the same as an overclocked Q6600. There's 8GB of 1100MHz DDR2, an Intel 330 SSD, and a GTX 570, and that's supposed to drive gaming on a 24" 1920x1200 monitor.

One of my favorite games is WoT, which runs on a pretty poorly written engine that punishes your system. A friend of mine uses a GTX 2xx card, but coupled with a second-gen i7. We both have to lower graphics quality to almost minimum to get into the 60fps range (which ensures no drops below 35fps), and we both get the same fps.

From what I can tell, the next GPU upgrade will require a whole-system upgrade, since the CPU will struggle to feed the GPU.


I don't know if many current video cards saturate existing PCIe 2.0 or 3.0 bandwidth yet (even in pairs).

I have two GTX 570s running SLI in my 2600K PC, and they don't do it. I have two Radeon HD 7770s running Crossfire in my FX-4170 box, and they don't use all of my PCIe bandwidth either.

I do know that I went from a GTX 285 2GB card to a GTX 570 1.25GB card and saw a huge difference in gameplay when I made the swap. The GTX 285 was a fine card, but it didn't do DX11.


Actually, this Christmas I acquired an old Dell XPS 600 made in 2006, with an Nvidia 7900 GS card, 512MB of RAM, a Pentium D (3.4GHz) processor, and no operating system (well, it didn't boot after POSTing). So I bought two GTX 550 Tis and 4GB of DDR2 value RAM. After experiencing first-hand what happens when you put a couple of new GPUs in a PC, I can honestly say it does breathe new life into an old, outdated system.


Meh, the EVGA 680 I got through Newegg overheated after a minute in any game more demanding than Mass Effect 3. Now, the ASUS 680 I got to replace it -- that's quality.


Reilneil, 

AMD and NV have candidly said that modern GPUs don't stress PCIe 3.0 vs. 2.0. If they did, the vendors themselves would have told us. I can pick up differences in synthetic benchmarks designed to measure PCIe bus bandwidth, but I mainly used those to confirm that things were working as they ought.

 

Cameron, 

If you're on a Prescott-era Pentium D, you're losing huge amounts of performance. What we've said about CPU upgrades possibly not being a major factor doesn't apply to any system using a Prescott-derived Pentium 4. Those chips were miserable for gaming, and the benefit of switching to something newer could be 30-50% -- even comparing a Core 2 Duo to a Pentium D.

If you've got a pair of 550s, I guarantee your CPU *will* hold you back materially in a lot of games. 

Comrade, 

I haven't checked an EVGA GTX 680, but most early cards from all vendors follow NV's reference designs closely. If you had a GPU overheating, it's possible that there was a problem with the card or the card's BIOS rather than the vendor. 

 


"AMD and NV have candidly said that modern GPUs don't stress PCIe 3.0 vs. 2.0. If they did, the vendors themselves would have told us. I can pick up differences in synthetic benchmarks designed to measure PCIe bus bandwidth, but I mainly used those for confirming that things were working as they ought." -- Joel H

Agreed, and this is why adding a better video card nets you a better experience almost every time -- provided your PSU can handle the extra power draw that some of these new cards have. (Yes, I'm aware that the newest GPU designs use less power overall, but I also know that a top-notch PSU makes a huge difference too.)

PCIe 2.0 bandwidth is wide enough and fast enough to give you great performance in any decently coded game if you have the GPU to handle it. That's why I agree that a GPU upgrade is such a great way to improve your gaming experience, and why it's my first recommended upgrade for others most of the time.

But if they have the cash for a whole new platform, that's how I steer them.

 
