Gigabyte GV-N250OC-1GI GeForce GTS 250
Date: Jul 29, 2009
Section:Graphics/Sound
Author: Robert Maloney
Introduction and Specifications

 

A few months back, NVIDIA "released" the GeForce GTS 250 priced at just around $150.  The reason behind the quotations in the previous statement is that the GTS 250 is not exactly something new.  While a lot of NVIDIA's R&D dollars have gone into higher-end products such as the GTX 285 and GTX 295, to satisfy their mid-level needs they turned to an old stand-by: rebranding a previously higher-end card.  In the case of the GeForce GTS 250, we're looking at essentially the same specifications we saw with the GeForce 9800 GTX+, with a 1GB frame buffer being the only major difference (and even that isn't a meaningful differentiating factor, as 9800 GTX+ cards with 1GB of onboard memory have been around for quite some time).

Still, it's not all as bad as it might seem, for the 9800 GTX+ was a powerful card in its prime, which means you're getting that same power for much less than the 9800 GTX+ retailed for just over a year ago.  Additionally, with the rebranding, NVIDIA hopes to eliminate some of the confusion that comes from having different product lines on the market.  For example, one might expect a "9800" series card to outperform a "285" series card, even though we as enthusiasts know better.  Renumbering the cards, as well as placing the GT / GTS / GTX marker before the model number, should give a better indication of where a card's performance falls relative to other NVIDIA offerings.

So, then, what's a manufacturer to do with their current and future product listings?  To remain competitive, and to avoid having products lingering in warehouses, companies need to adjust to these changes quickly.  Gigabyte went down parallel paths with the GTS 250.  First, they took the GeForce 9800 GTX+ GPUs already in their possession, flashed their BIOSes, updated the cooler, and rechristened them as the GV-N250ZL-1GI.  Newer shipments that were already marked as GTS 250s were used to create the card we're looking at today, the GV-N250OC-1GI.  Although "OC" is part of the product name, this card does not actually come overclocked past the default specifications laid out by NVIDIA.  Instead, it appears that the OC refers to the differences in speeds between the 'ZL' model and the 'OC' model, with the ZL clocking in at 740 / 1850 / 2000 MHz and the OC at 738 / 1836 / 2200 MHz. The biggest variance is in the memory clocks, although a recent BIOS update for the OC version brings the core and shader clock speeds in line with the ZL.  We also became aware, however, of a revision to the GV-N250OC-1GI that comes with higher core and shader clock speeds, bringing the GPU up to 765 MHz and the shaders up to 1912 MHz.  For now, we'll focus on the original version's specifications, listed below:


Gigabyte GV-N250OC-1GI GeForce GTS 250

NVIDIA GeForce GTS 250
Specifications and Features


As we mentioned in the introduction, the majority of these specifications match those of the GeForce 9800 GTX+.  We can even keep the memory buffer numbers intact, as not only were there 512 MB and 1 GB variants of the 9800 GTX+, but we have seen the same amounts of memory installed on GTS 250s as well.  Since we're not covering anything new in terms of the underlying technology behind the GeForce GTS 250 and the G92 GPU it's based on, we suggest revisiting a few of the following HotHardware articles regarding NVIDIA's previous and current GPU architectures:

Our GeForce 8800 GTX launch article goes in depth on NVIDIA's previous generation G80 GPU architecture, which is fundamentally very similar to the G92, save for its wider memory interface, and explains NVIDIA's CUDA GPGPU technology. Also, our GeForce 8800 GT and 8800 GTS 512MB articles cover much of the technology employed in NVIDIA's G92 GPU architecture.  From there, we invite you to take a look at the GTX 260 and GTX 280 launch piece, in which we discuss the GT200 GPU, and the differences and similarities between the two GPUs. We will be including a GTX 260 Core 216 in our benchmarks as well to show the performance deltas between these two sub-$200 graphics cards.

A Closer look at the GV-N250OC-1GI

 

Gigabyte ships their GTS 250 in an unostentatious package - small and light, with black coloring and a menacing robot on the front.  Cleverly integrated into the artwork is a graphical representation of Gigabyte's UDV technology, short for Ultra Durable VGA.  Similar to what they have done with their Ultra Durable line of motherboards, UDV consists of 2 oz. copper layers in the PCB, Japanese solid capacitors, ferrite core chokes, and lower RDS(on) MOSFETs, all of which are designed to reduce temperatures and increase power efficiency, which should also result in higher overclock speeds.

GV-N250OC-1GI Package   GV-N250OC-1GI Bundle

Checking inside the package, we found hardly any surprises: just the GV-N250OC-1GI itself, along with a DVI-to-HDMI adapter, an S/PDIF audio cable, a 6-pin power cable, the user's manual, and a drivers/utility disc.  Gamer HUD Lite is the proprietary software found on that CD, and it allows the user not only to monitor and adjust the voltage to the GPU, but also to tweak the core, shader, and memory clock speeds.  And, although installing or replacing a graphics card is a relatively simple process, Gigabyte's manual does a decent job of explaining not only how to properly seat the card, but also which cables need to be installed where, along with a briefing on how to get the most out of the card using NVIDIA's Control Panel.

Gigabyte GV-N250OC-1GI Front View   Gigabyte GV-N250OC-1GI Rear View 

The final piece of UDV is the inclusion of "Tier 1" memory from either Samsung or Hynix.  In this case, we found 1 GB of Hynix H5RS1H23MFR-N2C GDDR3 rated for a frequency of 1200 MHz.  On the box, the memory is noted as being "overclocked", although this is the speed at which the GTS 250's (and 9800 GTX+'s) memory is currently rated, while the GPU itself remains untouched.  Thus, it appears that a good deal of the "OC" in the product name is left to the user after the fact, aided in part by the Zalman cooler placed onboard.

Zalman Cooler   Zalman VF-1050   Zalman VF-1050 Heatpipes

The Hynix memory is left exposed, but should receive some cooling from the airflow of the fan embedded within the Zalman VF1050 cooler - a copper/aluminum hybrid heatsink that covers nearly the entire length of the card.  Four nickel-plated heatpipes conduct heat away from the aluminum base and run through the copper fins - two towards the front of the mechanism and two placed a little further back.  Although the Zalman VF1050 is dimensionally massive, the overall weight of the card is not excessive by any means.  MOSFETs and chokes are spaced evenly at the far end of the card and, like the memory, do not receive any direct cooling.  Although a close cousin of the 9800 GTX+, the GTS 250 is noticeably shorter and more compact, and only requires a single 6-pin power connection.

Hynix H5RS1H23MFR-N2C GDDR3   Power Regulation on the GV-N250OC-1GI   GV-N250OC-1GI Ports

Gigabyte has made a habit of capping the open ports and connections, such as the SLI bridge, which we assume is to protect them during shipping or storage.  While we can't remember a time when we had a problem in this regard, we applaud them for at least thinking of the consumer first, and cost (no matter how minor it might be) second.  There is also only a single SLI connector along the top of the card, quickly dashing any thoughts of an inexpensive 3-Way SLI setup, especially in light of the news that 9800 GTX+ and GTS 250 cards can be linked together.  Output options include native HDMI, VGA (D-Sub), and DVI-I, as well as an S/PDIF audio connection for routing sound to an HDMI TV.  These options make the GV-N250OC-1GI a good choice not only for mainstream gaming PCs, but for higher-end HTPC boxes as well.

Test System and 3DMark Vantage

 

HOW WE CONFIGURED THE TEST SYSTEMS: We tested all of the graphics cards used in this article on an Asus Rampage II Extreme motherboard powered by a Core i7 920 quad-core processor and 3GB of Qimonda DDR3.  The first thing we did when configuring the test system was enter the BIOS and set all values to their "optimized" or "high performance" default settings.  We also used "CPU Level Up" in the BIOS to automatically overclock our system to the same level as a Core i7 965 Extreme, in order to provide as much computational power as we could to our suite of graphics cards. Finally, we manually configured the memory timings and disabled any integrated peripherals that wouldn't be put to use.  The hard drive was formatted, and Windows Vista Ultimate was installed and updated to Service Pack 1. With the operating system installed, we applied the latest hotfixes, along with the necessary drivers and benchmark applications.

HotHardware's Test Systems
Intel Powered


Hardware Used:
Core i7 920 (overclocked to 3.2 GHz)

Asus Rampage II Extreme
(Intel X58 Express chipset)

Radeon HD 4870 1GB
Radeon HD 4850
GeForce GTX 260 Core 216 OC
GeForce 9800 GTX+
GeForce 9800 GT
GeForce 8800 GTS 512
Gigabyte GV-N250OC-1GI GeForce GTS 250

3GB Qimonda DDR3-1066 C7
(3 X 1GB, Triple Channel DDR3)

Integrated Audio
Integrated Network

Seagate Barracuda 7200.11
(7,200RPM - SATAII - 750GB)


Relevant Software:

Windows Vista Ultimate SP1
DirectX November 2008 Redist

NVIDIA Forceware v185.63
ATI Catalyst v9.5

Benchmarks Used:
3DMark Vantage v1.0.1
High Preset - 1680x1050, 2x AA/8x Aniso
Crysis v1.21*
Very High Quality Settings - No AA/No Aniso
Enemy Territory: Quake Wars v1.5*
High Quality Settings - 4x AA/16x Aniso
Far Cry 2 v1.02
Very High Quality Settings - 4x AA
Left 4 Dead*
High Quality Settings - 4x AA /16x Aniso
Tom Clancy's H.A.W.X. (DX10/DX10.1)
High Quality Settings - 4x AA

* - Custom Benchmark

Futuremark 3DMark Vantage
Synthetic DirectX Gaming


3DMark Vantage

The latest version of Futuremark's synthetic 3D gaming benchmark, 3DMark Vantage, is specifically bound to Windows Vista-based systems because it uses advanced visual technologies that are only available with DirectX 10, which isn't available on previous versions of Windows.  3DMark Vantage isn't simply a port of 3DMark06 to DirectX 10, though.  With this latest version of the benchmark, Futuremark has incorporated two new graphics tests, two new CPU tests, and several new feature tests, in addition to support for the latest PC hardware.  We tested the graphics cards here with 3DMark Vantage's High preset option, which uses a resolution of 1680x1050, with 2x anti-aliasing and 8x anisotropic filtering.






Gigabyte's version of the GTS 250 lands at about the middle of the pack, actually falling slightly behind the GeForce 9800 GTX+ that preceded it.  Considering that its current price of around $130 is less than half that of the 9800 GTX+ at launch, that doesn't sound too bad; however, much has changed in the graphics world since then.  For instance, the two Radeons in our comparison group, the HD 4850 and HD 4870, both outperform the GTS 250, yet sell at similar, if not lower, price points.

Enemy Territory: Quake Wars

 

Enemy Territory: Quake Wars
OpenGL Gaming Performance


Enemy Territory:
Quake Wars

Enemy Territory: Quake Wars is based on a radically enhanced version of id's Doom 3 engine and is viewed by many as Battlefield 2 meets the Strogg, and then some.  In fact, we'd venture to say that id took EA's team-based warfare genre up a notch or two.  ET: Quake Wars also marks the introduction of John Carmack's "Megatexture" technology, which employs large environment and terrain textures that cover vast areas of maps without the need to repeat and tile many smaller textures.  The beauty of Megatexture technology is that each unit takes up a maximum of only 8MB of frame buffer memory.  Add to that HDR-like bloom lighting and leading-edge shadowing effects, and Enemy Territory: Quake Wars looks great, plays well, and works high-end graphics cards vigorously.  The game was tested with all of its in-game options set to their maximum values, with soft particles enabled, in addition to 4x anti-aliasing and 16x anisotropic filtering.

 

The GTS 250 gets to flex a little muscle in Enemy Territory: Quake Wars, pushing past the Radeon HD 4850, but only at the lowest resolution.  Once we raise the resolution to 1680x1050, and finally on to 1920x1080, we find that the HD 4850 is able to overcome not only the GTS 250, but the 9800 GTX+ as well.  Price drops on the HD 4870 position it directly against the GTS 250, although the performance advantage clearly appears to be in the Radeon's favor.

Crysis

 

Crysis v1.21
DirectX 10 Gaming Performance


Crysis

If you're at all into enthusiast computing, the highly anticipated single-player FPS smash hit Crysis should require no introduction. Crytek's game engine produces some stunning visuals that are easily the most impressive real-time 3D renderings we've seen on the PC to date.  The engine employs some of the latest techniques in 3D rendering, like parallax occlusion mapping, subsurface scattering, motion blur, and depth-of-field effects, as well as some of the most impressive use of shader technology we've seen yet.  In short, for those of you who want to skip the technical jib-jab, Crysis is a beast of a game.  We ran the full game, patched to v1.21, with all of its visual options set to 'Very High' to put a significant load on the graphics cards being tested.  A custom demo recorded on the Ice level was used throughout testing.

 

Crysis had the older 9800 GTX+ and newer GTS 250 finishing within fractions of a frame of each other, with the HD 4850 thrown into the same mix as well.  None of these come close to the performance level of the GTX 260 Core 216 or the Radeon HD 4870, though.  Extra memory bandwidth, especially from the GDDR5 used on the HD 4870, keeps those cards afloat even under this strenuous testing.  Performance deltas between these two cards and the GTS 250 (and similar cards) are 25% or more at all resolutions.

Far Cry 2

 

Far Cry 2
DirectX Gaming Performance


Far Cry 2

Like the original, Far Cry 2 is one of the more visually impressive games to be released on the PC to date.  Courtesy of the Dunia game engine developed by Ubisoft, Far Cry 2's gameplay is enhanced by advanced environment physics, destructible terrain, high-resolution textures, complex shaders, realistic dynamic lighting, and motion-captured animations.  We benchmarked the graphics cards in this article with a fully patched version of Far Cry 2, using one of the built-in demo runs recorded in the "Ranch" map.  The test results shown here were run at various resolutions with 4x AA enabled and no anisotropic filtering.

 

Far Cry 2 clearly shows the benefit of the 1 GB frame buffer on Gigabyte's GTS 250, versus the 512 MB frame buffers of the Radeon HD 4850 and 9800 GTX+.  At the lower resolution, all of the mainstream cards perform somewhat similarly, with the GTS 250 actually placing a close second behind the 9800 GTX+. With the resolution cranked up to 1920x1080, however, the 1 GB GTS 250 significantly outperforms both the 512 MB 9800 GTX+ and the Radeon HD 4850.

Left 4 Dead

 

Left 4 Dead
DirectX Gaming Performance


Left 4 Dead

Left 4 Dead is a co-operative, survival-horror, first-person shooter that was developed by Turtle Rock Studios, which was purchased by Valve part-way into development. Like Half-Life 2, the game uses the Source engine; however, the visuals in L4D are far superior to anything seen in the Half-Life universe to date. The game pits four Survivors of an apocalyptic pandemic against hordes of aggressive zombies. We tested the game at three different resolutions with 4x anti-aliasing and 16x anisotropic filtering enabled and all in-game graphical options set to their maximum values.

 

The Radeon HD 4850 slipped past the GTS 250 and 9800 GTX+ by just a couple of frames per second in our custom Left 4 Dead benchmark, besting our NVIDIA duo by a slight margin at all three resolutions.  Once again though, the Radeon HD 4870 and the GeForce GTX 260, specifically, offer much more performance for the extra $20 to $60 investment, respectively.

H.A.W.X.

 

Tom Clancy's H.A.W.X.
DirectX Gaming Performance


Tom Clancy's H.A.W.X.

Tom Clancy's H.A.W.X. is an aerial warfare video game that takes place during the time of Tom Clancy's Ghost Recon Advanced Warfighter.  Players have the opportunity to take the throttle of over 50 famous aircraft in both solo and 4-player co-op missions, flying them over real-world locations and cities in photo-realistic environments created with the best commercial satellite data, provided by GeoEye.  We used the built-in performance test at three resolutions with all quality settings set to High, using the DX10-based engine for the GeForce cards and the DX10.1 libraries for the Radeons.

As we reported in our GTX 275 / HD 4890 round-up, H.A.W.X. is a title that fully utilizes the DX 10.1 support available on the Radeons.  In doing so, the performance increases on those cards virtually leave the GeForce cards in their wake.  The best example in this case is the GTS 250 vs. Radeon HD 4850 comparison.  Whereas up until now these two cards have mostly been neck and neck, in H.A.W.X. we find the HD 4850 performing nearly 40% faster in each benchmark run.

Overclocking Results

 

Overclocking The Gigabyte GV-N250OC-1GI
Putting the Pedal To The Metal

Although the Gigabyte GV-N250OC-1GI doesn't come overclocked out of the box, it does come with a handy utility called Gamer HUD Lite that allows you to change the GPU, shader, and memory clock speeds.  The full version of the application also allows a user to modify GPU voltage in order to achieve even higher overclocks, but when we installed a version downloaded from Gigabyte's website, the Lite interface was always displayed.  Thus, we had to "settle" for whatever overclocked speeds we could hit using the card's Ultra Durable features and Zalman's cooler.  Gamer HUD Lite provides real-time temperature readings as well, which we used to see whether high core temperatures would become a limiting factor.

Gigabyte Gamer HUD Lite

Default speeds of 738 / 1836 / 1100 MHz for the GPU, shader, and memory clocks are displayed after launching the application; to modify these speeds, we needed to change the 2D/3D Auto-optimized setting from 'Enable' to 'Disable'.  After that, it's a relatively simple process of raising one slider a few notches at a time until reaching some point of instability: graphical anomalies, application crashes, or other inconsistencies.  When all was said and done, we had managed to get as high as an 820 MHz clock on the GPU, 2100 MHz on the shaders, and 1185 MHz on the memory.  Temperatures, recorded at idle, were 10 degrees C higher than originally reported at default clock speeds.
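For reference, those final stable speeds work out to roughly 11% core, 14% shader, and 8% memory headroom over the card's defaults. A quick sketch of the arithmetic, using only the clock values from our results above:

```python
# Percentage headroom gained over the GV-N250OC-1GI's default clocks,
# using the stable speeds we reached with Gamer HUD Lite.

defaults = {"core": 738, "shader": 1836, "memory": 1100}   # MHz
achieved = {"core": 820, "shader": 2100, "memory": 1185}   # MHz

def headroom_pct(stock_mhz, oc_mhz):
    """Overclock gain expressed as a percentage of the stock clock."""
    return 100.0 * (oc_mhz - stock_mhz) / stock_mhz

for domain in defaults:
    gain = headroom_pct(defaults[domain], achieved[domain])
    print(f"{domain:>6}: {defaults[domain]} -> {achieved[domain]} MHz (+{gain:.1f}%)")
```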

   GPU-Z Default Specs 

   GPU-Z Overclocked Specs 

Gigabyte GV-N250OC-1GI
Default Clock Speeds

Gigabyte GV-N250OC-1GI
Overclocked Speeds

A quick check with TechPowerUp's GPU-Z reported a 1.3 GPixel/s increase in pixel fillrate and a boost of 5.3 GTexel/s in texture fillrate - roughly 11% faster than stock.  Memory bandwidth also increased by 5.4 GB/s, or about 8% more bandwidth.  These increases aren't necessarily enough to move the GTS 250 into GTX 260 or HD 4870 territory, but they should help distinguish the GV-N250OC-1GI from competing GTS 250s, or HD 4850 cards for that matter.
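The math behind GPU-Z's figures is straightforward. As a sanity check, here's a small Python sketch that derives roughly the same deltas from our clock speeds; the 16 ROPs, 64 texture units, and 256-bit GDDR3 bus are the GTS 250's standard G92 configuration, and small differences from GPU-Z's readout come down to rounding:

```python
# Rough derivation of GPU-Z's throughput figures for the GTS 250 (G92b):
# 16 ROPs, 64 texture units, 256-bit GDDR3 bus (double data rate).

ROPS, TMUS, BUS_BITS = 16, 64, 256

def pixel_fillrate(core_mhz):
    """GPixel/s: pixels written per second across all ROPs."""
    return core_mhz * ROPS / 1000.0

def texture_fillrate(core_mhz):
    """GTexel/s: texels sampled per second across all texture units."""
    return core_mhz * TMUS / 1000.0

def mem_bandwidth(mem_mhz):
    """GB/s: bus width in bytes times the effective (doubled) data rate."""
    return (BUS_BITS / 8) * (2 * mem_mhz) / 1000.0

stock = (738, 1100)   # core MHz, memory MHz (actual, not effective)
oc    = (820, 1185)   # our best stable overclock

print(f"Pixel fillrate:   +{pixel_fillrate(oc[0]) - pixel_fillrate(stock[0]):.1f} GPixel/s")
print(f"Texture fillrate: +{texture_fillrate(oc[0]) - texture_fillrate(stock[0]):.1f} GTexel/s")
print(f"Memory bandwidth: +{mem_bandwidth(oc[1]) - mem_bandwidth(stock[1]):.1f} GB/s")
```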

Power Consumption

 

We'd like to cover a few final data points before bringing this article to a close. Throughout all of our benchmarking and testing, we monitored how much power our test systems were consuming using a power meter. Our goal was to give you an idea as to how much power each configuration used while idling and under a heavy workload. Please keep in mind that we were testing total system power consumption at the outlet here, not just the power being drawn by the graphics cards alone.

Total System Power Consumption
Tested at the Outlet

 

Even when it comes down to power consumption, we find that the GTS 250 and HD 4850 are joined at the hip.  At idle, the GTS 250 pulls 2 Watts or so less than the HD 4850, making it the lowest consumer of the entire group.  Under load, the tables are turned slightly, with the HD 4850 becoming the most power-friendly card we tested.

 

Since we're on the subject of power consumption, which is related to heat, we should also talk for a moment about the Zalman cooler used on Gigabyte's GTS 250.  As shown in the graph above, the temperatures we recorded were by far the lowest of any of the cards we used.  Throw in the near silent operation of the Zalman VF1050 and we've got a win-win situation for the GV-N250OC-1GI.

Performance Summary and Conclusion


Performance Summary:
Overall, the Gigabyte GV-N250OC-1GI placed nearly in the middle of the pack of our mid-priced assortment of graphics cards.  It wasn't really a challenger to the higher-priced GTX 260 Core 216 or Radeon HD 4870, but it easily surpassed the older GeForce 9800 GT and 8800 GTS 512.  Compared directly with a GeForce 9800 GTX+, however, things get really tight.  Although the two cards are, for the most part, equals on the battlefield, the older 9800 GTX+ consistently posted small performance gains over the GTS 250 in our benchmarks.  The only instance where the GTS 250 really did outgun the 9800 GTX+ was in Far Cry 2, where the card's 1GB frame buffer kept frame rates much higher than those of its counterpart.  Comparisons with the Radeon HD 4850 were also mixed. Depending on the game engine, each card had its minor victories, with the notable exception of H.A.W.X., where the DX 10.1 compatible part from ATI took the lead.

 

At $150, Gigabyte's GV-N250OC-1GI is a decent value.  You're basically getting a rebranded GeForce 9800 GTX+ at a fraction of its original cost, with a few tweaks thrown into the mix as well.  2 oz. copper layers in the PCB and top-tier electronics keep the card running cool and stable, prolonging its lifespan.  The PCB layout has been reworked as well, with the GTS 250 shipping on a slightly smaller 9 inch card and requiring only a single 6-pin power connector.  Gigabyte has also gone ahead and used another one of Zalman's OEM heatsink / fan creations to maintain cool temperatures while keeping noise output to a minimum.  

All of these positives can't hide the fact, however, that the GTS 250 isn't necessarily the best buy out there.  It's quite possible to find not only the similarly performing Radeon HD 4850 for less, but, thanks to recent price drops, the much more powerful Radeon HD 4870 retailing for almost the same price as the GTS 250.  Those looking for an even larger boost in performance might be best served by saving up a few more dollars and getting the GTX 260 Core 216 instead.  At the end of the day, it really comes down to what serves your needs best.  Those looking for a true performance beast may want to look elsewhere, but if lower power consumption, temperatures, and noise are what you're after, then the GV-N250OC-1GI might fit the bill.


     
Pros:
  • Good performance vs. price ratio
  • 1GB Memory Frame Buffer
  • Zalman VF-1050 Heatsink/Fan
  • Potential for high overclocks
  • Low Power Consumption

 

Cons:
  • No DX 10.1 support
  • Older GPU technology
  • Single SLI connector
  • Other similarly priced cards outperform the GTS 250

 



Content Property of HotHardware.com