NVIDIA Unleashes Quadro 6000 and 5000 Series GPUs

Power Consumption and Operating Temperatures
How low can you go?

We'd like to cover a few final data points before bringing this article to a close. Throughout all of our benchmarking and testing, we monitored how much power our test system was consuming using a power meter. Our goal was to give you an idea of how much power each configuration used while idling and under a heavy workload. Please keep in mind that we were measuring total system power consumption at the outlet here, not just the power being drawn by the graphics cards alone.



With the power-hungry reputation of NVIDIA's Fermi architecture, it comes as no surprise to see the Quadro 6000 pulling significantly more juice from the wall socket. Although it was relatively tame at idle, the test system required 446W under load with the 6000 videocard installed, 12% more than with the FirePro V8800. The Quadro 5000 proved to be greener in its ways, requiring only 385W at load, or about 14% less power than the 6000 graphics card.
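
For readers who want to sanity-check those percentages, here is a quick back-of-the-envelope calculation. Note that the V8800's wattage below is implied by the 12% figure rather than quoted directly on this page; this is just a sketch of the arithmetic, not additional data from our power meter.

    #include <stdio.h>

    int main(void)
    {
        /* Total system draw at the wall under full load, from the results above */
        const double quadro_6000_w = 446.0;
        const double quadro_5000_w = 385.0;

        /* Implied by the "12% more than the FirePro V8800" comparison */
        double v8800_w = quadro_6000_w / 1.12;

        printf("Implied FirePro V8800 load draw: ~%.0f W\n", v8800_w); /* ~398 W */
        printf("Quadro 5000 vs. 6000: %.1f%% less power\n",
               100.0 * (quadro_6000_w - quadro_5000_w) / quadro_6000_w); /* ~13.7, i.e. "about 14%" */
        return 0;
    }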



Much like their gaming cousin, the GTX 480, our Quadro workstation videocards run hot. Fully loaded, both cards max out at 88 degrees Celsius. That's just three degrees shy of the GTX 480's load temperature. In comparison, the V8800 operated 14 degrees cooler at full load.

The Quadro graphics cards feature dual-slot cooling solutions that provide a peaceful working environment under normal conditions. Fortunately, we did not experience any irritating fan noise from the cards throughout most of the benchmarks. Fan noise increased slightly for a few minutes during SPEC testing, but remained easily within our comfort range. The Quadro 6000 became noisy only during temperature testing at full load. If you plan on operating your videocard at full load for long periods of time, consider choosing the Quadro 5000, which remained relatively quiet under all conditions.


I never thought another Quadro card would see the light of day. I'm amazed that it failed some tests, but I'm sure future drivers will help improve those scores.


Not too bad as video cards go, but they still draw more power and heat up much more than the ATI cards. They seem to trade blows in different programs, yet the conclusion states they are substantially faster than ATI's cards. I wouldn't go as far as to say that, though. They are a big improvement over previous-generation cards, but ATI is already what, 6 months ahead of Nvidia?

If there were a power (as in electricity) to performance ratio, the ATI cards would most likely win that.


I sorta disagree. While ATI is ahead of the curve, NVIDIA seems to have the best cards performance-wise. If I were Pixar, I would want to use the most powerful rendering solutions known to man.

But you are right that they draw a lot of power; that seems to be the biggest flaw of the GF100 chip.


I was going to say they finally have an answer to the FirePros? Yet not being able to use three monitors? Most of us would like to have at least two plus our Wacoms. I guess the next question would be: does it support more if you have a second card, or does the porting still limit you to two?

I understand the Cinebench marks look better for the ATI. Yet when it comes to frame rate testing in the viewport, anything around 40 is good enough for a smooth workflow, unless you really need to see scientific fluid dynamics live on screen? The CATIA and EnSight levels prove this point very well! Now I know why NVIDIA has a better reputation for being well integrated with many DCC apps!

"ATI's FirePro cards have been out for months so we aren't surprised to see more mature drivers from them."

I am not so sure about that! I have had the V8800 for a while, and they have not released a new driver since the last and only update on 4/25/10?! They even told me how lucky I am to have good driver support, yet didn't answer any of my concerns about driver updates. I would now say that when it comes to CGI, Nvidia has better performance when it comes to rendering. An animated scene with high-resolution textures, motion blur, and photometric lighting is better handled by a company that has been at the forefront of that industry for the past twenty years.

I agree that yes... the cost really offsets productivity! Nvidia has a better reputation for workstation support and knowledgeable integration for 3D graphics. Now I am finding that out the hard way! If you have an awesome high-performance system to begin with, then you won't notice many troubles when switching to a higher workstation card, except within the viewports. Yet, like they do in these reviews, you will have to start off with a fresh build when switching to something like the Quadros. I have always supported ATI, and now I know why I am not as productive as I want to be!! In the studio I usually have an Nvidia solution, and ATI at home. Since I am usually using scenes already modeled, lit, and textured, I have never really noticed much of a difference until render time. I always chalked that difference up to not having a render farm. Now I know better!!

The PhysX and Cuda features are something that should drive professionals to these cards. It would also be interesting to see how these get integrated with a Tesla c1060? I guess the best way to sum the two companies up?! ATI develops for the end entertainment, down towards the development of content... only as a side note! Nvidia develops for the developer, knowing the end result is to make the best product for entertainment!

So if y'all are going to just toss this test system into the test-bin toybox, maybe pitch it my way :) Or I could trade you a FirePro V8800 for one of these Quadros? I would gladly swap the four-monitor support for smoother integrated performance :)


The new Quadros support two monitors per card (just like the GTX 4-series).


You can download the new FirePro driver 8.762 from ATI right now. Choose 'FirePro Beta' from the drop-down menu.

What problems are you having with the V8800?


Thanks Raid, I'll give it a try.

The 8800 has stability problems in Max with rendering issues. It also doesn't like it when you load the DirectX driver; it only seems stable if I use OpenGL. At times the wireframe gets about four pixels larger, like it is being drawn in crayon. When I load the DX driver, two of the viewports go blank. The global marker for any selected object is still there, yet nothing else. I can still switch any of the working viewports to any view and they still work; just two are MIA.

In Maya, the blend shapes sometimes work and most other times don't. It gets frustrating when you move the slider and think the blend doesn't actually work! Also, any sub-object selection usually does not update while you are moving it or making any kind of adjustment. In both programs, the material mapping tends not to update when changing UVs; that only seems to work half of the time. I end up having to zoom in and out until it decides to refresh the view, then it catches up and adjusts the map.


To animatortom

At SIGGRAPH I saw many demos by Autodesk of both Maya and 3ds Max 2011 in the AMD booth using Eyefinity with 3 displays. They demoed every day with no issues. Also, Max really only supports DX (they quit adding features to the OpenGL code 4 or 5 years ago), so it was all DX, and it worked flawlessly for several days of demos.

So I wonder if you were using an old driver?


"I saw many demos ....several days of demos."

That is the key word... DEMO!

It is one thing to set up a demo system with demo files. When run through properly set-up files, it can be made to look like it works well. In practical application, it is a whole other matter. If they had a completely tricked-out system with twin Xeons and 128GB of RAM with dual 8800s, then yes, it would look like it rocks. They haven't had to do much to OpenGL because its development is essentially complete, and the basics of 3D work well after years of development.

DX is a new direction for DCC. The thing with Max is that it requires a ton of RAM and a powerful CPU. If you have a six-core with around 64GB of RAM, then even the FirePro V4800 alone would rock! The others are overkill unless you have a strong system to begin with. Maya relies more on the GPU, so it benefits as long as the card has more than 1GB of memory. With a crappy system, the 8800 is fairly useless in Max; in Maya it is almost three times as fast.

So if they are demoing it, I can guarantee they have it running on a completely tricked-out system inside a simple-looking case, just for show.


@ animatortom

I would suggest you try Linux, for example Fedora or Debian. There aren't any problems with ATI's OpenGL drivers for Maya on Linux. Cheers, g.


@Animatortom
"I would now say that when it comes to CGI, Nvidia has better performance when it comes to rendering... is better handled by a company that has been at the forefront of that industry for the past twenty years."

nV hasn't been at the forefront that long; they filled the gaps Matrox and 3DLabs left behind 10 years ago. If anything, ATi's been in it longer; nV just did a better job of capitalizing on the failures of others (including 3dfx).

"The PhysX and Cuda features are something that should drive professionals to these cards."

What exactly is the benefit of PhysX here... or anywhere?

With OpenCL and DirectCompute, both of those features will start to become marginalized. Proprietary is fine if you can develop a compelling reason for it, but even Adobe has said they are going with OpenCL, not CUDA, for their next build to get the more global CPU & GPU benefits of OpenCL.

CUDA, like Brook, was a nice bridge solution, but its time has come and gone, just like Cg, on which CUDA was based.
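
To illustrate the portability point, here is a minimal sketch of the same trivial kernel written once for CUDA (NVIDIA-only) and once in OpenCL C (compiled at runtime by whichever vendor's driver is installed). This is a generic example for comparison's sake, not code from the review or from either vendor's SDK:

    /* CUDA version: compiles only with nvcc and runs only on NVIDIA GPUs */
    __global__ void scale_cuda(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= factor;
    }

    /* OpenCL C version: identical logic, but built at runtime by the
       installed OpenCL driver, so the same source runs on a Quadro,
       a FirePro, or even a multi-core CPU */
    __kernel void scale_cl(__global float *data, float factor, int n)
    {
        int i = get_global_id(0);
        if (i < n)
            data[i] *= factor;
    }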

"I guess the best way to sum the two companies up?! ATI develops for the end entertainment, down towards the development of content... only as a side note! Nvidia develops for the developer, knowing the end result is to make the best product for entertainment!"

That would be a way to sum it up, but not the 'best way'; it's a little myopic. If anything, this review shows that ATi makes better hardware but is held back by its much, MUCH weaker software/driver/dev-relations teams. AMD was supposed to improve that, but they have done little beyond a tiny improvement on Linux, and at best they only equal nV.

*Dang, I hate the quoting tool here. Even after editing, remnants remain.*
