Nvidia's Response to the Power and Heat Issues... Denial?


Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009
gibbersome Posted: Thu, Apr 1 2010 3:09 AM

Last Friday we celebrated the US launch of our newest GPUs, the GeForce GTX 480 and GTX 470 at PAX East, in Boston. We picked PAX East as the venue because it’s a show for gamers, by gamers. Having built these cards with passionate PC gamers in mind, it was really the only option.

Since launch, we’ve been getting great feedback from you on all that the GTX 480/470 has to offer. With it, you can “crank up” your next gen PC games. From advanced tessellation engines, 480 compute cores, to new ray tracing technologies and of course 3D Vision Surround support. (You may have seen shots of our presentation at PAX that projected across three giant screens – each 80’ in diameter - with 3D stereo demos of Battlefield: Bad Company 2, World of Warcraft, and Metro 2033. It was spectacular. Video of our keynote is below.)

We wanted to let you know that we’ve also heard your concerns about GTX 480 with respect to power and heat. When you build a high performance GPU like the GTX 480 it will consume a lot of power to enable the performance and features I listed above. It was a tradeoff for us, but we wanted it to be fast. The chip is designed to run at high temperature so there is no effect on quality or longevity. We think the tradeoff is right.

The GF100 architecture is great and we think the right one for the next generation of gaming. The GTX 480 is the performance leader with the GTX 470 being a great combination of performance and price.

As always, we hope that you enjoy our new products and let us know what you think. We built them for you.

http://blogs.nvidia.com/ntersect/2010/03/gtx-480-a-passion-for-the-future-of-pc-gaming.html

 

What do you guys think?

Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Thu, Apr 1 2010 3:36 PM

What is there to think, other than that this was expected? Nvidia had to get their PR guys to send something out to smooth over all the negative reaction to their new cards.

A lot of the concerns were about the heat output. Nvidia countered by saying that the cards give you performance and that the heat won't affect the life or stability of the product. I haven't read enough reviews of the new card, but from what I have read, no one mentioned anything about instability from the heat.

 

I wonder if Nvidia is hiring. I could have written that.

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
Top 500 Contributor
Posts 136
Points 1,890
Joined: Mar 2010
sp12 replied on Thu, Apr 1 2010 4:16 PM

I'm honestly kinda shocked at that response. To me it seems outright silly to deny the heat issue, though I understand the heat was unavoidable given the shader count they lost.

Top 500 Contributor
Posts 141
Points 1,710
Joined: Mar 2010
Location: Iowa State University

Ahaha... well, he's probably denying it because admitting it would be bad PR for the company, and they can't afford that with only two cards in the segment of the market they need to be in... but yeah, it's kinda hard to deny it in a blog :P

Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Thu, Apr 1 2010 6:02 PM

How is he denying the heat issue? Nowhere did he say that there was no heat. He just puts it in a different perspective: if you want the power, you have to deal with the heat.

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
Top 500 Contributor
Posts 136
Points 1,890
Joined: Mar 2010
sp12 replied on Thu, Apr 1 2010 6:52 PM

But it's absurd, given that ~90% of the performance could be had with significantly less heat. It's a flawed statement.

And really, the performance was initially supposed to come from ~32 additional shader cores, but those were disabled, so they had to OC the card to compensate.
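Rough back-of-the-envelope math on why that's plausible, assuming the usual dynamic-power relation P ≈ C·V²·f and a purely hypothetical 10% clock cut paired with a ~7% voltage drop (illustrative numbers, not measured GF100 figures):

# Sketch of how dynamic power scales with clock and voltage (P ~ C * V^2 * f).
# The frequency/voltage scalings below are illustrative assumptions,
# not measured GTX 480 values.

def relative_dynamic_power(freq_scale, volt_scale):
    """Power relative to stock for a given frequency and voltage scaling."""
    return freq_scale * volt_scale ** 2

stock = relative_dynamic_power(1.00, 1.00)    # baseline
reduced = relative_dynamic_power(0.90, 0.93)  # ~10% lower clock, ~7% lower voltage

print(f"Performance (roughly clock-bound): {0.90:.0%} of stock")
print(f"Dynamic power: {reduced / stock:.0%} of stock")
# -> about 90% of the performance for roughly 78% of the dynamic power,
#    i.e. most of the speed with noticeably less heat to dump.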

Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Thu, Apr 1 2010 9:31 PM

sp12:

But it's absurd, given that ~90% of the performance could be had with significantly less heat. It's a flawed statement.

And really, the performance was initially supposed to come from ~32 additional shader cores, but those were disabled, so they had to OC the card to compensate.

 

But if they decreased the power draw to reduce the heat, it would perform below the 5870. That would be an even worse showing in my opinion. You can't charge a premium for something that performs worse. They probably need to get as much money as they can out of these new cards because they spent so much making this card (R&D for the new architecture and stuff).

 

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
Top 50 Contributor
Posts 3,236
Points 37,910
Joined: Mar 2010
AKwyn replied on Thu, Apr 1 2010 10:26 PM

RyuGTX:

sp12:

But it's absurd, given that ~90% of the performance could be had with significantly less heat. It's a flawed statement.

And really, the performance was initially supposed to come from ~32 additional shader cores, but those were disabled, so they had to OC the card to compensate.

 

But if they decreased the power draw to reduce the heat, it would perform below the 5870. That would be an even worse showing in my opinion. You can't charge a premium for something that performs worse. They probably need to get as much money as they can out of these new cards because they spent so much making this card (R&D for the new architecture and stuff).

 

I think if they decreased it a tiny bit, it could still beat out the ATI card by 5-10 FPS and still run cooler.

But still, they don't say anything about efficiency. They should at least explain why the card doesn't beat the ATI Radeon HD 5870 by a huge margin except in certain games.
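To put a number on the efficiency point, here's a rough perf-per-watt comparison. The TDPs are the published board figures (250 W for the GTX 480, 188 W for the HD 5870); the ~10% performance edge is just the ballpark implied by the FPS deltas quoted in this thread, not a proper benchmark average:

# Rough performance-per-watt comparison; relative performance is an assumption.
GTX480_TDP_W = 250
HD5870_TDP_W = 188
RELATIVE_PERF_GTX480 = 1.10   # assumed ~10% faster than the HD 5870 overall

gtx480_perf_per_watt = RELATIVE_PERF_GTX480 / GTX480_TDP_W
hd5870_perf_per_watt = 1.0 / HD5870_TDP_W

print(f"GTX 480 perf/W vs HD 5870: {gtx480_perf_per_watt / hd5870_perf_per_watt:.0%}")
# -> roughly 83%: a ~10% performance lead bought with ~33% more board power.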

 

"The future starts with you; now start posting more!"

Top 10 Contributor
Posts 6,372
Points 80,290
Joined: Nov 2004
Location: United States, Arizona
Moderator

First off, the Nile is a river... lol... j/k

I think this is the first iteration of the chip, and I believe this architecture has a much longer life span ahead of it than ATI's current cards. However, as with the first gen of anything, it is hot and power hungry. I think a die shrink will help with the heat and power consumption.

"Never trust a computer you can't throw out a window."

2700K

Z77 GIGABYTE G1.SNIPER

GIGABYTE GTX670

G.Skill Ripjaws X 16gb PC2133

Antec P280

Corsair H100

Asus Blu-ray burner

Seasonic X650 PSU

Patriot Pyro 128gb SSD

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

Der Meister:

First off, the Nile is a river... lol... j/k

I think this is the first iteration of the chip, and I believe this architecture has a much longer life span ahead of it than ATI's current cards. However, as with the first gen of anything, it is hot and power hungry. I think a die shrink will help with the heat and power consumption.

 

Still though, they basically ignored the myriad reviews flagging the high power and temperature numbers. With decent cooling, the heat won't matter, but the power requirements will have to come down a lot if we're to see dual-GPU or mobile versions of the GF100.

Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Fri, Apr 2 2010 4:32 PM

TaylorKarras:

RyuGTX:

sp12:

But it's absurd, given that ~90% of the performance could be had with significantly less heat. It's a flawed statement.

And really, the performance was initially supposed to come from ~32 additional shader cores, but those were disabled, so they had to OC the card to compensate.

 

But if they decreased the power draw to reduce the heat, it would perform below the 5870. That would be an even worse showing in my opinion. You can't charge a premium for something that performs worse. They probably need to get as much money as they can out of these new cards because they spent so much making this card (R&D for the new architecture and stuff).

 

I think if they decreased it a tiny bit, it could still beat out the ATI card by 5-10 FPS and still run cooler.

But still, they don't say anything about efficiency. They should at least explain why the card doesn't beat the ATI Radeon HD 5870 by a huge margin except in certain games.

 

Well... the 5970 is a dual chip card.

Let's say they downclocked it a bit and it runs a little cooler. Just theorizing, because I don't have the card to see how fast it can run at different voltages. But let's say it still performs 5-10 FPS better and runs cooler. That number would be an average, and in some tests it would lose to the 5870. Besides that, asking $100 over the 5870 for a card that's only 5-10 FPS better? That is even more ridiculous in my opinion. I think most people are concerned first with performance and/or price (bang for the buck), and then maybe heat as a second priority. If this card performed a lot better in current generation games, I would gladly buy it and deal with the heat. I would either slap on an aftermarket cooler or wait for manufacturers to do it themselves.

 

By the way, Der Meister brings up a good point. I keep seeing him mention it, and I stand by him on it. From what we have seen of the few DX11 tests/benchmarks, this card's architecture is really geared toward DX11 features like tessellation. Maybe this card just came too soon...

 

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
Top 500 Contributor
Posts 136
Points 1,890
Joined: Mar 2010
sp12 replied on Fri, Apr 2 2010 6:15 PM

Who mentioned the 5970?

I'm under the assumption that they clocked it high enough that it CAN beat the 5870 despite the lost shaders. Regardless, it's only 5-15 FPS better at most right now.

I agree with Nvidia's choice in that most people prefer performance to heat, but the amount of heat produced to get this performance really does limit Nvidia's options for upward expansion without a respin or yield improvement.

Power is another issue this card faces; it would never work in a notebook in its current state.
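Some quick arithmetic on the notebook point. The 250 W TDP is Nvidia's published spec for the GTX 480; the mobile power budget below is an assumed ballpark for a high-end gaming notebook GPU, not a spec:

# How far power would have to fall to fit a notebook; the mobile budget is an assumption.
GTX480_TDP_W = 250           # published desktop board power
NOTEBOOK_GPU_BUDGET_W = 100  # assumed upper bound for a high-end mobile GPU

reduction_needed = 1 - NOTEBOOK_GPU_BUDGET_W / GTX480_TDP_W
print(f"Power would need to drop by ~{reduction_needed:.0%} to fit that budget.")
# -> ~60%: far more than clock/voltage tweaks alone buy you, hence the
#    usual answers are a die shrink or a heavily cut-down chip.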

Fermi wasn't ACTUALLY meant for graphics cards; it was initially a compute chip for Nvidia's Tesla line. I'm sure Nvidia's engineers are already working on their next gen, with the Fermi cards just being a profit stopgap.

ATI is also working on their next gen; it's rumored to have a 3-4x more powerful tessellation engine, so I'm not sure how long Nvidia's DX11 advantage will last.

Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Sat, Apr 3 2010 6:49 PM

I'm pretty sure they won't release a notebook version for a while. Or maybe that is why they jumped to the 400 series, leaving the 300 series for notebooks only.

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
Top 500 Contributor
Posts 136
Points 1,890
Joined: Mar 2010
sp12 replied on Sun, Apr 4 2010 8:41 PM

Actually the 300 series consists of rebranded 200 series cards for OEMs.
