NVIDIA GeForce GTS 250 Mainstream GPU - HotHardware


FarCry 2
DirectX Gaming Performance



Like the original, FarCry 2 is one of the more visually impressive games to be released on the PC to date.  Courtesy of the Dunia engine developed by Ubisoft, FarCry 2's gameplay is enhanced by advanced environmental physics, destructible terrain, high-resolution textures, complex shaders, realistic dynamic lighting, and motion-captured animations.  We benchmarked the graphics cards in this article with a fully patched version of FarCry 2, using one of the built-in demo runs recorded in the "Ranch" map.  The test results shown here were run at various resolutions with 4X anti-aliasing enabled and no anisotropic filtering.

FarCry 2 clearly shows the benefit of the GTS 250's 1GB frame buffer versus the 512MB frame buffers of the Radeon HD 4850 and 9800 GTX+.  At the lower resolutions, all of the mainstream cards perform somewhat similarly, with the GTS 250 finishing in front of the similarly priced cards (the 4870 and GTX 260 were another story).  With the resolution cranked up to 2560x1600, however, the 1GB GTS 250 significantly outperforms the 512MB 9800 GTX+ and Radeon HD 4850.



I have a hard time feeling like this is anything other than Nvidia trying to sell more old cards. Ever try to explain to someone that an 8800 GT and a 9800 GT are the same thing? I have. It's annoying.


Fo sheezy... marketing bs ftl! lol


It seems like the only place where the 250 was better was at high resolutions with AA enabled, bringing the frame buffer into the equation... though not at playable fps.

I wonder if disabling AA at those resolutions would be playable, and if so, by removing the frame buffer from the equation, would there be a difference in fps?

Either way, I understand why Nvidia would re-package a G92 ($$$), but still, it's rather shady to completely rename it.


Don't forget, we test at the highest image quality settings to better show each card's strengths and weaknesses. Simply lowering the AA and aniso levels and dropping a few in-game image quality settings will boost performance considerably.


Exactly. My point is that in, say, FarCry 2 at 2560x1600, all that AA (not so much the AF) is killing the 9800 GTX+ due to its smaller frame buffer. However, neither it nor the GTX 260 makes the game playable at those settings (13-25 fps). So while I understand WHY you tested at those settings, and I agree it should be done, the point is that it's unplayable.

I agree that dropping the AA will boost fps significantly, but because doing so removes the GTS 250's hardware advantage (the extra frame buffer) from the equation, I wonder if there would be much of a difference between the three cards. My hunch is no.

This is supported at the 1920x1200 resolution, where the GTS 250 and 9800 GTX+ are very close. That resolution with AA enabled does not fill the 512MB frame buffer, and so the equivalent GPUs perform roughly equally.

To me this says: these cards are not made for resolutions over 1920x1200 with any eye candy on, and at that resolution and below, they are nearly the same card in terms of performance.

And I assume the GTS 250 will be replacing the 9800 GTX+ at retail, not selling alongside it?

 

Great review, though. I was thinking of getting a 6800 for a PhysX card, and was debating waiting for the 250 due to its smaller size, but this leads me to think the 9800 is the way to go.


Sure, it is pretty cheap, but it's still in the same price range as the card it is a 'clone' of.  CUDA and PhysX are pretty much worthless.  The 1GB of memory is pretty much worthless too, since the areas where it helps are still unplayable anyway.

Nvidia is going to sell a bunch of these, and I mean a bunch, to little Joe Average, who thinks he is getting a version of the latest and greatest instead of renamed and repackaged two-year-old tech.  I feel really sorry for the poor guy with an 8800 GTS who buys one of these thinking, "Well, it just came out and is in the 200 family, so it has to be faster than my current card."  And all of you *KNOW* this will happen.  Hell, some shops (out of greed or clueless salespeople) will even push that exact upgrade on the public.

Rebranding/renaming old products as if they were members of the current generation is wrong, since it really only serves to take advantage of the less informed public.

*EDIT*
The reason I said CUDA and PhysX are pretty much worthless is that they do not provide anything to Joe Average or even Joe Enthusiast.  As far as GPU computing goes, CUDA is a more robust and widely adopted solution than ATI Stream. CUDA has made more inroads in the consumer space, and especially in the HPC space, than Stream. There aren't many differences in the programming model, but CUDA for C does have some advantages over Brook+.  However, ATI opens up its ISA down to the metal (alongside a virtual ISA), while NVIDIA only offers a virtual ISA.

The key is adoption: technology is only of value if the end user has a use for it. Neither CUDA nor Stream really provides anything of note to end users.  OpenCL will most likely close the gap between NVIDIA and ATI and should put them both on the same playing field.  This is actually pretty important for Nvidia, since they need some means to keep sales up with Intel entering the high-end GPU market soon, and working with a common standard is easier than trying to force your own standard onto one or two other companies.  Also, a common standard will hopefully have close to the same effect as DirectX; that is, it will increase the usefulness of the technology to end users, since it will run (though with varying performance) no matter whose hardware they have.
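To make the "CUDA for C" point above concrete: a CUDA kernel is ordinary C plus a built-in thread index, which is much of why it reads more naturally than Brook+'s stream syntax. A minimal, illustrative vector-add sketch (the names here are mine, not from the article or either vendor's SDK samples):

```cuda
#include <stdio.h>
#include <cuda_runtime.h>

// One GPU thread per array element: plain C, plus built-in index variables.
__global__ void vec_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                      // guard the last, partially filled block
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1024;
    const size_t bytes = n * sizeof(float);
    float h_a[1024], h_b[1024], h_c[1024];
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Explicit device allocations and copies -- the classic CUDA pattern.
    float *d_a, *d_b, *d_c;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    vec_add<<<blocks, threads>>>(d_a, d_b, d_c, n);
    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f\n", h_c[0]);  // 3.0 if the kernel ran

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    return 0;
}
```

Brook+ expresses the same operation as a stream kernel over implicit input/output streams, so the extra control CUDA gives you (explicit block/thread geometry and memory movement) is both its advantage and its learning curve.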


This has GeForce 4 MX written all over it: sell old tech looking like new tech, profit.  The underpants gnomes would be proud.  It is interesting to see this compared to the 4850 everywhere, as clearly everyone started their reviews before the 4870 price cuts, which now put it at the GTS 250's level.  So most of these reviews were outdated the moment they were published.  HotHardware was nice enough to include it in their review, so it is painfully obvious that the 4870 1GB is the better deal after the price drops.
