
NVIDIA's GeForce FX 5700 Ultra
NVIDIA's immediate answer to the 9600 XT

By Robert Maloney
October 23rd,  2003

What a difference a week can make.  Just last week, when the NDA lifted on the ATi Radeon 9600 XT, we had nothing but praise for the performance ATi was offering with its new mainstream solution.  Our tests compared the 9600 XT to the previous model, the 9600 Pro, and to a card from NVIDIA's camp, the GeForce FX 5600.  A quick recap of that review: the Radeon 9600 XT won all of the benchmarks, easily capturing the lead in the mainstream video card business, at least for the time being.  We told you, however, to keep your eyes out for a response from Santa Clara.

Today, NVIDIA comes at you with a double-pronged launch, releasing a new card for the hard-core gamer, the GeForce FX 5950 Ultra, and one for the rest of us, the GeForce FX 5700 Ultra.  The 5700 Ultra, previously known as the NV36, has just about everything one could hope for in a mainstream video solution, including the latest high-speed DDR2 memory, AGP 8X, and NVIDIA's CineFX 2.0 and Intellisample technologies.  Building on the strengths of the GeForce FX 5600 Ultra (NV31), NVIDIA has found some more headroom in its 0.13 micron GPUs and has raised the clock speed from 400MHz on the 5600 Ultra all the way up to 475MHz.  The rest of the architecture remains relatively unchanged, although 128MB of DDR2 memory has been added to the board, something one wouldn't normally expect to find on a "value" card.  With specs similar to the ATi Radeon 9600 XT's, we expect the battle between these two mainstream cards to be fast and furious, so let's take a quick peek at the card itself and then move on to the benchmarks.

Specifications & Features of the 128MB NVIDIA GeForce FX 5700 Ultra
The NV36 is finally revealed


NVIDIA GeForce FX 5700 Ultra GPU
  • CineFX 2.0 Engine
  • Intellisample Technology HCT
  • High-Precision Graphics
  • nView Multi-display Technology
  • Digital Vibrance Control (DVC)
  • Unified Driver Architecture (UDA)
  • AGP 8X including Fast Writes and sideband addressing
  • 0.13 Micron Process Technology from IBM
  • Copper vias and wiring
  • 400MHz RAMDACs

  • Graphics Core: 256-bit
  • Engine clock: 475 MHz
  • Memory Interface: 128-bit DDR2
  • Memory Bandwidth: 14.4GB/sec
  • Fill Rate: 1.9 billion pixels/sec
  • Vertices/sec: 356 million
  • Memory Clock: 450MHz
  • Memory Data Rate: 900MHz
  • Maximum Memory: 256MB
  • Pixels per Clock: 4
  • Textures per Rendering Pass: 16
  • Thermal solution: Silent, single-slot heatsink/fan
  • Architected for Cg
  • Microsoft® DirectX®9.0 Optimizations and Support
  • New 64-phase Video Scaler
  • OpenGL®1.4 Optimizations and Support
  • Video Mixing Renderer (VMR)
  • High-performance, high-precision 3D rendering engine
  • On-board DVI support up to 1600x1200 resolution
  • On-board TV-out support up to 1024x768 resolution
  • Integrated Full Hardware MPEG-2 Decoder
  • Vivid NTSC/PAL TV-out support with flicker filter
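
The headline throughput numbers in the table are simple products of the clocks, which we can sanity-check with a few lines of arithmetic (a quick Python sketch; all figures come from the spec list above):

```python
# Sanity-check the spec sheet: fill rate and memory bandwidth both
# follow directly from the clock speeds and bus width listed above.

core_clock_hz = 475e6      # engine clock: 475MHz
pixels_per_clock = 4       # pixel pipelines
fill_rate = core_clock_hz * pixels_per_clock
print(f"Fill rate: {fill_rate / 1e9:.1f} billion pixels/sec")    # 1.9

mem_data_rate_hz = 900e6   # DDR2 effective data rate (450MHz x 2)
bus_width_bits = 128       # memory interface width
bandwidth_bytes = mem_data_rate_hz * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_bytes / 1e9:.1f} GB/sec")   # 14.4
```

Both results match the table's 1.9 billion pixels/sec and 14.4GB/sec figures.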


The sample we received came by way of eVGA, one of NVIDIA's product launch partners.  Although the card is somewhat longer than other recent entries, it has a lean, mean look to it.  When we say lean, we mean that the 5700 Ultra uses a single-slot thermal solution, but more on that later.  As eVGA points out on the box, a power connection is necessary, in this case a 4-pin MOLEX connector.  Most modern power supplies have multiple power connections, so this shouldn't be an issue.  As is standard, the ports on the bracket consist of DVI-out, 15-pin VGA, and S-Video out.  Incidentally, the 0.13 micron die process for the GPU is handled this time around by IBM, in East Fishkill, NY, not too far from our testing labs.


The GPU and DDR2 RAM run at fairly high speeds and generate a bit of heat, but they are cooled by a one-piece, slim-line heatsink/fan manufactured by AVC.  Unlike the larger coolers found on the 5900 and 5950 Ultra cards, this solution requires only one slot, and it is relatively quiet.  We also found an oversized heatsink over the memory chips on the back of the card, along with the NVIDIA logo on a separate tab attached by screws from the front.  eVGA has even adorned the front of the fan with one of NVIDIA's demo-dwellers, Vulcan.  Other manufacturers' models will invariably use other cooling solutions; however, there is little room for personalizing the card, since NVIDIA keeps strict control over the manufacturing process.  The heatsink is seated on top of the RAM and GPU with copious amounts of thermal paste, quite evident in our photo.


eVGA's packaging of the e-GeForce FX 5700 Ultra consists of a vibrant box that boldly points out the major selling points of the card, although the actual core and memory speeds are curiously absent.  The bundled software and accessories are somewhat less thrilling.  There was an S-Video cable and a DVI-to-VGA converter, handy for setting up dual monitors.  The User's Guide was scant on information, offering only the basics for setting up the card without further explanation of its capabilities.  The Installation CD contained a set of Detonator drivers, which can be installed automatically using eVGA's ADM software, short for Automated Driver Management.  ADM first checks the system chipset and will install the correct AGP GART driver if needed.  It then checks for pre-existing video drivers and removes them before installing the newer ones, taking away any of the complexity a new user might run into when installing an upgrade.  The other CDs included NVDVD 2.0, a demo disc, and a full version of Ghost Recon.  Unfortunately, Ghost Recon was released in 2001 and really won't whet gamers' appetites the way ATi's inclusion of Half-Life 2 with the 9600 XT and 9800 XT does.


GeForce FX 5700 Ultra Driver Control Panels
NVIDIA's ForceWare 52.16 Drivers

Since we were going to test two new cards today, we figured we would give users a quick look at the latest release of NVIDIA's drivers, version 52.16, now dubbed ForceWare.  With each revision, NVIDIA attempts to simplify the configuration options while increasing quality and performance.  While others may say otherwise, we find NVIDIA's take on the control panels more straightforward than the panels found in ATi's Catalyst drivers.  Rather than flipping through tab after tab and trying to remember which ones have been clicked as they shift position, NVIDIA's drivers use a pop-up submenu that one can work through from top to bottom to alter all of the settings for the GeForce FX.  NVIDIA's drivers are also well known for their UDA, or Unified Driver Architecture, meaning that only one driver is needed for all NVIDIA GPUs.

Driver panel screenshots: Change Resolutions, Color Correction, Temperature Settings, 2D Standard Clock, and 3D Performance Clock

The Change Resolutions tab is just that; it allows the user to set the screen resolution, color quality, and refresh rate all from one tab.  While this doesn't offer any major benefit over setting them directly from Display Properties, one can define custom display configurations in the lower half of the screen and then choose only from those that have been added.  Color and gamma levels are modified on the next tab, Color Correction, and the Digital Vibrance controls allow the user to adjust the color digitally to compensate for ambient lighting conditions.  The latest drivers also read the temperature from a thermal diode in the GPU, which can be seen under Temperature Settings.  Should a defined temperature threshold be met, a warning can alert the user to possible danger.

One new addition to the ForceWare 52.16 drivers is the improved nView multi-display wizard.  Users can divide the monitor into separate regions and quickly reposition and resize application windows, and more robust profiles let you customize driver settings for specific gaming environments.  Also noteworthy were the two speeds the graphics core was clocked at, which we were able to see by applying the Coolbits registry hack.  In 2D (sitting at the desktop), the 5700 Ultra clocks in at 300MHz, running cooler and putting less wear and tear on the GPU.  Start up a 3D application and the 5700 ramps up to 475MHz.  The memory stays clocked at 906MHz, however.  We could also use these "unlocked" tabs to overclock the GPU and memory by moving the sliders to higher frequencies.
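
For reference, the Coolbits hack is a one-line registry change.  The key path and value below are as commonly documented for Detonator/ForceWare drivers of this era; treat them as an assumption, verify against your driver version, and back up the registry before editing:

```
Windows Registry Editor Version 5.00

; Commonly documented Coolbits tweak -- unlocks the clock-frequency
; sliders in the driver panel.  Path and value are assumptions; verify
; against your driver version before applying.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```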

Quality & Performance



Unlike ATi's Catalyst drivers, which require that anti-aliasing and anisotropic filtering be set separately for Direct3D and OpenGL, a single slider on the Quality and Performance panel controls both with NVIDIA's cards.  Each can be left application-controlled or set as high as 8 samples.  Gone from past driver versions is the checkbox for texture sharpening.  Further optimizations for Direct3D and OpenGL games can be made on the individual tabs for each, such as defining the mipmap detail level in Direct3D or disabling vertical sync in OpenGL.

Screenshots With The GeForce 5700 Ultra
Sure it can put up the numbers, but what about the quality?

Before we get to the benchmarks, we wanted to compare screenshots from the Radeon 9600 XT and the GeForce FX 5700 Ultra, two video cards geared toward mainstream users.  In this day and age, it doesn't make much sense to invest in a high-powered graphics card unless you plan on playing your games at higher resolutions and optimizing the display by enabling AA or anisotropic filtering.  Faster frame rates are great, especially during multiplayer romps online, but image quality should never be sacrificed to obtain them.  We took two games that hit retail shelves recently and compared how the two cards rendered identical locations.

Max Payne 2 screenshots at 1280x1024x32: NVIDIA GeForce FX 5700 Ultra vs. ATi Radeon 9600 XT, each shown at no AA, 4X AA, 6X AA, and 4X AA + 8X AF
Max Payne 2 is short on gameplay, but long on graphics.  We originally attempted to take these screenshots at 1600x1200, but when we applied 4 samples of anti-aliasing, we were quickly alerted that there wasn't enough memory on either card to draw the scene (each card has 128MB, while at least 140MB was required).  So we settled on 1280x1024 with 32-bit color and all graphical settings set to "High".  We compared the two sets of screenshots side by side at no AA, 4XAA, 6XAA, and finally 4XAA plus 8X anisotropic filtering.  We would have to say that the quality of the two cards was very close, and the "jaggies" that were so prevalent in the original pics (check the window shades for the best example) cleaned up so nicely that at 6XAA we saw almost straight lines on both cards.
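
As a rough sanity check on that 140MB figure, a back-of-the-envelope framebuffer estimate is easy to sketch (this assumes a naive layout with 4 samples per pixel for the multisampled color and depth buffers; real driver allocation differs, so treat it as a ballpark only):

```python
# Back-of-the-envelope framebuffer estimate for 1600x1200x32 with 4X AA.
# Assumes 4 bytes/pixel for color and 4 for depth/stencil, with the
# multisampled color and depth buffers holding 4 samples per pixel.
# Real driver allocation differs; this is only a ballpark sketch.

w, h, bytes_per_px, samples = 1600, 1200, 4, 4
front      = w * h * bytes_per_px            # displayed front buffer
msaa_color = w * h * bytes_per_px * samples  # multisampled back buffer
msaa_depth = w * h * bytes_per_px * samples  # multisampled depth/stencil
total_mb = (front + msaa_color + msaa_depth) / (1024 ** 2)
print(f"~{total_mb:.0f} MB for buffers alone")   # ~66 MB
```

Add textures, geometry, and other allocations for a detail-heavy scene on top of that, and blowing past a 128MB card toward the reported ~140MB requirement is quite plausible.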

TRON 2.0 screenshots at 1600x1200x32: NVIDIA GeForce FX 5700 Ultra vs. ATi Radeon 9600 XT, including 4X AA + 8X AF

As we mentioned in the 9600 XT review, TRON 2.0 is one of the games that carries the tagline, "NVIDIA, the way it's meant to be played".  The program started in 2002 and has picked up steam, with NVIDIA expecting over 100 titles to carry the line by the end of 2003, including games from big names like Electronic Arts, Microsoft Game Studios, and Activision.  Understandably, this has caused a minor uproar, since gamers argue that the way a game looks should be the same for all players, regardless of which video card they own.  One only needs to look back a few years to see that this is actually nothing new, as games used to carry tags touting "3dfx optimized", "3DNOW! ready", and the like.  Getting back to the present, we checked out screenshots of TRON 2.0 from both cards.  We said it before, and we will say it again: as far as we are concerned, the TRON-like glow looks the same on both cards, so we don't know what all the fuss is about.  What we did notice, however, is how the quality has improved since our last look at this title.  At 4XAA and 6XAA, the GeForce card has definitely tightened up the edges, coming darn close to the Radeon 9600 XT, which we feel still holds a slight edge.


Benchmarks With Unreal Tournament 2003
DX8 Performance In The Mainstream

Epic's Unreal Tournament 2003 has consistently been one of the most popular shooters, and it is no coincidence that it is also one of the most-used benchmarks for video card testing.  There are many ways to test the demo version, one of which is to use a "Flyby", which plays back a recorded tour of one of the levels.  Here in the labs, we use a custom INI file that maximizes the graphical settings and then displays the average frame rate for three strenuous resolutions.  We chose the 1024x768x32 and 1600x1200x32 scores for our reports, with and without anti-aliasing.

At the lower resolution, the GeForce FX 5700 Ultra posted the highest overall frame rate at 114.45 frames per second, followed by the 5600 Ultra and then the Radeon 9600 XT.  When applying 4 or 6 samples of anti-aliasing, however, the two GeForce FX cards both dropped behind the 9600 XT by a wide margin.  At 1600x1200, we saw a similar pattern.  The 5700 Ultra beat out the Radeon by almost 2 frames per second, and the 5600 Ultra by close to 26.  Even more impressively, when 4XAA was applied, all of the cards took a hit, but the 5700 Ultra still managed to stay around 45 FPS while the other two cards dropped into the mid-twenties.  A clear-cut victory for the 5700 Ultra.

Head-to-Head Performance With Splinter Cell
Stealth Pixel Shading Redefined

Splinter Cell's version 1.2 patch includes three demos in addition to a benchmarking feature, but the built-in test appears to be more CPU-limited than VPU-limited.  So, in order to come up with more meaningful results, we used the Oil Rig demo created by the folks at Beyond 3D.  This demo removes two CPU-intensive routines while increasing the dependence on pixel shader performance.  Shaders are used to render the realistic-looking ocean water surrounding the oil rig, as well as to simulate a night-vision display.  As we've mentioned in the past, anti-aliasing doesn't work with Splinter Cell (at least in the current version), so we do not have any AA scores listed in the graphs below.

At both resolutions, we saw the same placement of the cards, although the delta between the two GeForce cards was more pronounced at 1600x1200.  It is well known that DX9 pixel and vertex shader performance has been a problem for NVIDIA, and the early word on the ForceWare 52.16 drivers was that these issues would be addressed.  So far in our testing with Splinter Cell, we have seen some needed improvement.  The 5700 Ultra came in second place, making a large improvement over the 5600 Ultra thanks to its higher core speed and DDR2 memory.



Performance Comparisons With Final Fantasy XI v2.0
Chocobos finally make it to the big screen

Final Fantasy is a title better known to console users, but it appears that Squaresoft is ready to make the jump to the PC with an MMORPG version of the classic.  The demo comes bundled with a benchmark, which displays a score every time it completes a full cycle.  Although it was meant more for checking a system's readiness to play the game, we found that we could get a meaningful test by setting it to high-res and dividing the score (rendered frames) by the time elapsed to get the frame rate.

We weren't sure how this benchmark was going to go, since it's hard to tell just by watching the score accumulate how the card is faring. The numbers zipped by in some parts, and it seemed that some of the graphics popped into focus instead of being drawn in slowly.  At any rate, we took the final score and divided it by the time it took to run the test to get the final frame rate.  The GeForce FX 5700 Ultra was the leader, followed by the 9600 XT and then the 5600. 
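
The conversion itself is trivial, but for clarity, this is the calculation we applied to the benchmark's output (the score and elapsed time below are hypothetical placeholders, not our actual results):

```python
# The Final Fantasy XI benchmark reports a cumulative score (rendered
# frames) rather than a frame rate, so we derive the average FPS
# ourselves.  The sample numbers are placeholders for illustration.

def fps_from_score(score_frames, elapsed_seconds):
    """Average frame rate from the benchmark's cumulative frame count."""
    return score_frames / elapsed_seconds

print(fps_from_score(5400, 120.0))   # 45.0 FPS for a hypothetical run
```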

Benchmarks / Comparison With Gun Metal
With gameplay like this, we can't wait to play the real thing

Yeti Studios has also released a benchmark based on a real game.  According to Yeti, it is designed to push the latest DX9 cards to the limit, using pixel and vertex shaders to display a rapidly changing battlefield.  To help achieve this, anti-aliasing and anisotropic filtering are built into the engine and cannot be disabled.  As such, the average frame rates are lower than one would expect from the actual game, so don't be discouraged by the scores in the graphs below.  The benchmark was run twice; once at 1024x768x32, and again at 1280x1024x32.

When it came to Gun Metal, at least, we had to take back our earlier statements about NVIDIA and DX9 shaders.  This benchmark was utterly dominated by the 5700 Ultra, and we had to check our results since the margin of difference was so great.  We're talking a performance difference of more than 35% over the Radeon 9600 XT, and 45% on top of the 5600 Ultra.


Performance Comparisons With Novalogic's Comanche 4
The Performance Battlefield

I guess you could call Comanche 4 the old man on the block, as it is probably the oldest of the tests in our current suite of benchmarks.  Comanche 4 uses DX8-class pixel and vertex shaders to produce some of the realistic visuals used throughout the game.  Unlike the previous tests, this benchmark is heavily influenced by CPU and system memory performance, especially at lower resolutions.  However, when the resolution is raised and AA and anisotropic filtering are enabled, the current crop of video cards tends to slow down quite a bit.

As we've seen and reported in the past, the Comanche 4 demo is much more of a CPU-bound test than any of our other benchmarks.  The GeForce FX 5700 Ultra and ATi Radeon 9600 XT both put up some impressive numbers, locking horns without AA and then again with 4XAA.  The first cracks in the foundation appeared when 6XAA was applied, as the FX 5700 Ultra fell behind the 9600 XT by seven frames per second.  It still held a big lead over the 5600 Ultra, though.  At 1600x1200, we were quite impressed by the 5700 Ultra's performance.  One only needs to see that the 5700 Ultra with 4XAA applied managed to beat the 5600 Ultra with no AA to realize that NVIDIA has something special here.

Benchmarks / Comparison With Wolfenstein: Enemy Territory
New Game, Better Effects, Old Engine

Wolfenstein: Enemy Territory can also be included in the "older" crowd, although its focus is on OpenGL testing (and yes, that means Quake 3 Arena is officially gone).  Wolfenstein: ET is a free standalone multiplayer game that is based on the original Return to Castle Wolfenstein. It uses a modified Quake3 core yet exhibits plenty of CPU scaling and platform variation, which also makes it a good benchmarking tool.  We used the built-in timedemo benchmark, which plays back a recorded multiplayer session.

The GeForce FX 5700 Ultra loved the Wolfenstein demo, so much so that it easily placed first in each and every test we ran.  It outclassed the other two cards by over 10 frames per second at almost every level at 1024x768, and by even greater margins at 1600x1200.  Actually, both of the GeForce FX cards took to this test, with even the 5600 Ultra routinely beating the 9600 XT.  The only glitch was when we applied 6XAA, as both GeForce cards put up numbers akin to having no AA whatsoever.  The Radeon 9600 XT did the same at 4XAA + AF, but only at 1600x1200, which explains the paucity of scores in the latter graph.


Head-to-Head Performance With X2 - The Threat
A Hot New Space-Sim

X2 - The Threat is a game due out in November that probably hasn't received too much press...that is, until now.  A demo version can be downloaded from http://www.egosoft.com/, and it is simply a gorgeous space simulation.  The benchmark, enabled by a checkbox within the demo, takes quite a while to complete.  Egosoft promises that X2 will provide gamers with the best gaming experience, with sound and graphics to match, and based on what we saw in the demo, we would have to agree.  The same resolutions and optimizations were used with this benchmark as well.

In our final test, we saw a little more of the same, with the 5700 Ultra in the lead in all but one test, followed closely (at least at 1024x768) by the Radeon 9600 XT, and then the 5600 Ultra.  The only sore point once again was the application of 6XAA, where the 5700 Ultra lost nearly half of its frame rate compared to 4XAA.  In direct comparison, the Radeon lost only about a third of its frames when doing the same.  However, at 1600x1200, the tables were turned somewhat, with the GeForce FX 5700 Ultra besting the 9600 XT by a wide margin, even with anti-aliasing and anisotropic filtering applied.

Overclocking With The NVIDIA GeForce FX 5700 Ultra
Because we just couldn't leave well enough alone

Much like we saw with the 9600 XT, the 0.13 micron process used on both cards provided some headroom to play with when overclocking.  The 5700 Ultra didn't climb quite as far, however: we topped out at 537MHz for the core and 1030MHz for the memory, roughly a 13% increase in both the core and memory clocks.  To see how we would fare in gameplay, we ran another Gun Metal test at 1280x1024, with the default 2XAA setting.

We managed to increase our original score by a modest two and a half frames, which, as you might guess, came out to a 13% increase in the frame rate.  Not exactly something to write home about, but it was still nice to see there was some room to play with.  It should also be noted that the 5600 Ultra is clocked at 400MHz, so the 5700 Ultra already has a 75MHz advantage to begin with.  Add in the overclocking and you're looking at 137MHz over the 5600 Ultra.  A quick check of the scores has the overclocked 5700 Ultra nearly doubling the output of the 5600 Ultra.
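
For those keeping score, the clock gains work out like so (a quick sketch; the clock figures come from our overclocking run above, with 906MHz as the stock effective memory rate we measured):

```python
# Overclocking gains on the 5700 Ultra.  The frame rate rose by roughly
# the same ratio as the clocks, suggesting the test was bound by the
# card rather than the CPU at these settings.

stock_core, oc_core = 475, 537     # core clock, MHz
stock_mem, oc_mem = 906, 1030      # effective memory clock, MHz

core_gain = (oc_core - stock_core) / stock_core * 100
mem_gain = (oc_mem - stock_mem) / stock_mem * 100
print(f"core +{core_gain:.0f}%, memory +{mem_gain:.0f}%")   # core +13%, memory +14%
```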


Once again we see the two biggest names in the graphics industry trading blows.  One week it's ATi taking the lead with the impressive Radeon 9600 XT; this week we find NVIDIA reclaiming its position as a mainstream leader.  Although the mighty GeForce FX 5950 Ultra also emerges from the shadows today, its price makes it a somewhat prohibitive purchase, best left for those with large budgets.  The bulk of sales always has been, and always will be, in the mainstream market, and NVIDIA has released a superb entry into this burgeoning field.  It's almost a shame to label the 5700 Ultra a mainstream unit, since its performance was top-notch in all of our benchmarks, and it comes with some solid features like a 475MHz 0.13 micron GPU and 128MB of DDR2 memory.  The new ForceWare drivers have taken a full step toward correcting some of the shader problems we saw with GeForce FX cards using the Detonator drivers of the past.  Until the switch to PCI Express, expected in the first half of 2004, this just may be the card for current owners of GeForce4 Ti 4200 or FX 5200/5600 cards to get in order to play newer DX9 titles like Half-Life 2 or Doom 3.

NVIDIA releases the GeForce FX 5700 Ultra just in time for Halloween, with no tricks, just a lot of treats, and we're giving it a 9.5 on the HotHardware Heat Meter, and an Editor's Choice Award.


Content Property of HotHardware.com