NVIDIA GeForce GTS 250 Mainstream GPU


Top 10 Contributor
Posts 26,099
Points 1,183,605
Joined: Sep 2007
ForumsAdministrator
News Posted: Mon, Mar 2 2009 11:25 PM

NVIDIA has gotten a lot of mileage out of its G92 GPU architecture. Starting with the GeForce 8800 GT, which featured a 65nm variant of the G92, the architecture carried on through the GeForce 9800 GTX+, which used an updated version manufactured on a more advanced 55nm process. The G92 has been featured on no fewer than seven different GeForce-branded desktop graphics cards, not to mention the slew of mobile GeForces based on the G92 that are also in production.

Although it has been around for quite some time now, NVIDIA is launching yet another graphics card based on the G92 today, the GeForce GTS 250. The name suggests something new, but the card is not all that different from previous GeForce 9800 series products. What NVIDIA is doing today is using the GeForce GTS 250 launch to align more of its product stack with the nomenclature introduced with the GeForce GTX series, and to introduce a value-priced, G92-based graphics card with a 1GB frame buffer. Click the link below and take a look...

NVIDIA GeForce GTS 250 Mainstream GPU




Top 10 Contributor
Posts 6,181
Points 90,135
Joined: Aug 2003
Location: United States, Virginia
Moderator

I have a hard time seeing this as anything other than NVIDIA trying to sell more old cards. Ever try to explain to someone that an 8800 GT and a 9800 GT are the same thing? I have. It's annoying.

Top 50 Contributor
Posts 2,747
Points 42,815
Joined: Sep 2006
Location: United States, California

Fo sheezy... marketing bs ftl! lol

Not Ranked
Posts 9
Points 135
Joined: Feb 2009
Location: MI, USA

It seems like the only place where the 250 was better was at high resolutions with AA enabled, bringing the frame buffer into the equation... though not at playable fps.

I wonder if disabling AA at those resolutions would make them playable, and if so, with the frame buffer taken out of the equation, whether there would be any difference in fps.

Either way, I understand why NVIDIA would re-package a G92 ($$$), but it's still rather shady to completely rename it.

Main Rig: Q9700@3.15ghz under a Scythe NinjaCu, Asus P5BDeluxeWifiAP, 8gb DDR2800, OCZ, BFG GTX260+ Maxcore, RaptorX150, WD Black 1TB, BFG GS650, Vista Ultimate 64-bit

HTPC: e7400, Intel DG33TL, 3GBram, Seagate 1TB, WDGreen 1TB, ATi4650, Logitech DiNovo Mini, Vista HP 32bit

Netbook: Acer Aspire One, Atom, 1.5gb ram, 120gb HD, Vista HP 32bit

Top 50 Contributor
Posts 2,359
Points 48,655
Joined: Apr 2000
Location: United States, Connecticut
ForumsAdministrator
MembershipAdministrator
Marco C replied on Tue, Mar 3 2009 11:27 AM

Don't forget, we test at the highest image quality settings to better show each card's strengths and weaknesses. Simply lowering the AA and aniso levels and dropping a few in-game image quality settings will boost performance considerably.

Marco Chiappetta
Managing Editor @ HotHardware.com

Follow Marco on Twitter

Not Ranked
Posts 9
Points 135
Joined: Feb 2009
Location: MI, USA
Modred189 replied on Tue, Mar 3 2009 12:00 PM

Exactly. My point is that in, say, Far Cry 2 at 2560x1600, all that AA (not so much the AF) is killing the 9800 GTX+ due to the smaller frame buffer. However, the game isn't playable at those settings on it or the GTX 260 (13-25 fps). So while I understand why you tested that way, and I agree it should be done, the point is that it's unplayable.

I agree that dropping the AA will boost fps significantly, but because doing so removes the GTS 250's hardware advantage from the equation (the extra frame buffer), I wonder if there would be much of a difference between the three cards. My hunch is no.

This is supported at 1920x1200, where the GTX 260 and 9800 GTX+ are very close. That resolution with AA enabled does not fill the 512MB frame buffer, so the otherwise-equivalent GPUs perform roughly equally.
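
Just to put some rough numbers on that: below is a back-of-envelope sketch (purely illustrative, counting only plain 32-bit color and depth surfaces and ignoring textures, geometry, and driver overhead, which is where most of the memory actually goes) of how quickly the AA surfaces alone grow with resolution and sample count.

// Back-of-envelope estimate of render-surface memory vs. resolution and MSAA.
// Very rough: ignores textures, geometry, and driver overhead, which usually
// dominate; it only shows how quickly the multisampled surfaces themselves grow.
#include <stdio.h>

static double surface_mb(int width, int height, int msaa_samples)
{
    const double color   = (double)width * height * 4.0 * msaa_samples; // 32-bit color samples
    const double depth   = (double)width * height * 4.0 * msaa_samples; // 24/8 depth-stencil samples
    const double resolve = (double)width * height * 4.0;                // resolved back buffer
    return (color + depth + resolve) / (1024.0 * 1024.0);
}

int main(void)
{
    printf("1920x1200, 4xAA: ~%.0f MB\n", surface_mb(1920, 1200, 4)); // ~79 MB
    printf("2560x1600, 4xAA: ~%.0f MB\n", surface_mb(2560, 1600, 4)); // ~141 MB
    printf("2560x1600, 8xAA: ~%.0f MB\n", surface_mb(2560, 1600, 8)); // ~266 MB
    return 0;
}

Once you add Far Cry 2's textures and render targets on top, those AA surfaces are the difference between squeezing into 512MB and spilling over, which lines up with the 1GB card only pulling ahead at 2560x1600 with AA.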

To me this says: these cards are not made for resolutions over 1920x1200 with any eye candy on, and at that resolution and below, they are nearly the same card in terms of performance.

And I assume the GTS 250 will be replacing the 9800 GTX+ at retail, not selling alongside it?

 

Great review, though. I was thinking of getting a 6800 for a PhysX card, and was debating waiting for the 250 due to its smaller size, but this leads me to think the 9800 is the way to go.

Main Rig: Q9700@3.15ghz under a Scythe NinjaCu, Asus P5BDeluxeWifiAP, 8gb DDR2800, OCZ, BFG GTX260+ Maxcore, RaptorX150, WD Black 1TB, BFG GS650, Vista Ultimate 64-bit

HTPC: e7400, Intel DG33TL, 3GBram, Seagate 1TB, WDGreen 1TB, ATi4650, Logitech DiNovo Mini, Vista HP 32bit

Netbook: Acer Aspire One, Atom, 1.5gb ram, 120gb HD, Vista HP 32bit

Top 75 Contributor
Posts 1,248
Points 11,580
Joined: Jan 2005
Location: Florida

Sure, it is pretty cheap, but it is still in the same price range as the card it is a 'clone' of.  CUDA and PhysX are pretty much worthless.  The 1GB of memory is pretty much worthless too, since the areas where it helps are still unplayable anyway.

NVIDIA is going to sell a bunch of these, and I mean a bunch, to little Joe Average, who thinks he is getting a version of the latest and greatest instead of renamed and repackaged two-year-old tech.  I feel really sorry for the poor guy with an 8800 GTS who buys one of these thinking, "Well, it just came out and is in the 200 family, so it has to be faster than my current card."  And all of you *KNOW* this will happen.  Hell, some shops (out of greed or clueless salespeople) will even push that exact upgrade on the public.

Rebranding old products as if they were members of the current generation is wrong, since it really only serves to take advantage of the less informed public.

*EDIT*
The reason I said CUDA and PhysX are pretty much worthless is that they do not provide anything to Joe Average or even Joe Enthusiast.  As far as GPU computing goes, CUDA is a more robust and widely adopted solution than ATI Stream. CUDA has made more inroads in the consumer space, and especially in the HPC space, than Stream. There aren't many differences in the programming model, but CUDA for C does have some advantages over Brook+.  However, ATI opens up its ISA down to the metal (alongside a virtual ISA), while NVIDIA only offers a virtual ISA.

The key is adoption: technology is only of value if the end user has a use for it. Neither CUDA nor Stream really provides anything of note to end users.  OpenCL will most likely close the gap between NVIDIA and ATI and should put them both on the same playing field.  This is actually pretty important for NVIDIA, since they need some means to keep sales up with Intel entering the high-end GPU market soon, and working with a common standard is easier than trying to force your own standard on one or two other companies.  A common standard will also, hopefully, have close to the same effect as DirectX: it will increase the use of the technology among end users, since it will run (though with varying performance) no matter whose hardware they have.
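
For anyone curious what "CUDA for C" actually looks like, here is a minimal, hypothetical sketch (not from the review, and with error checking stripped out) that scales an array on the GPU. The point is that it is ordinary C with a handful of extensions, which is a big part of why it has seen more uptake than Brook+.

// Minimal CUDA for C sketch: scale an array on the GPU.
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

// __global__ marks a kernel that runs on the GPU, one thread per element.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main(void)
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i)
        host[i] = (float)i;

    float *dev;
    cudaMalloc((void **)&dev, bytes);                      // allocate GPU memory
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);  // copy input to the card

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);         // launch 4096 blocks of 256 threads
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);  // copy the result back

    printf("host[100] = %.1f\n", host[100]);               // expect 200.0
    cudaFree(dev);
    free(host);
    return 0;
}

The equivalent OpenCL kernel reads almost the same, which is why a common standard shouldn't be a huge leap for either vendor.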

Smooth Creations LANShark "Blue Flame" + ASUS G73JH-A2 + ASUS EeePC S101H

"I frag therefore I am!"

Not Ranked
Posts 5
Points 55
Joined: Feb 2009
Jacen replied on Wed, Mar 4 2009 1:42 AM

This has GeForce 4 MX written all over it.  Sell old tech that looks like new tech, profit.  The underpants gnomes would be proud.  It is interesting to see this compared to the 4850 everywhere, since clearly everyone started their reviews before the 4870 price cuts, which now put it at the GTS 250's level.  Most of these reviews were outdated the moment they were published.  HotHardware was nice enough to include the 4870 1GB in their review, so it is painfully obvious that it is the better deal after the price drops.
