Getting Physical: AMD, NVIDIA Trade Shots Over Hardware Physics

News Posted: Tue, Mar 16 2010 4:44 PM
When it comes to hardware-accelerated PhysX and the future of GPGPU computing, AMD and NVIDIA are the modern-day descendants of the Hatfields and McCoys. Both companies attended GDC last week, where a completely predictable war broke out over PhysX, physics, developer payoffs, and gamer interest in PhysX (or the lack thereof).

The brouhaha kicked off with comments from the senior manager of developer relations at AMD, Richard Huddy, who said: "What I’ve seen with physics, or PhysX rather, is that Nvidia create a marketing deal with a title, and then as part of that marketing deal, they have the right to go in and implement PhysX in the game...I’m not aware of any GPU-accelerated PhysX code which is there because the games developer wanted it with the exception of the Unreal stuff. I don’t know of any games company that’s actually said ‘you know what, I really want GPU-accelerated PhysX, I’d like to tie myself to Nvidia and that sounds like a great plan.'"

This rather irritated Team Green, who hit back with some comments of their own. In an interview with PCGamesHardware, the director of product management for PhysX, Nadeem Mohammed, said: "we do not pay developers to select PhysX instead of other physics solution. Once PhysX is selected...we will work closely with them [the developer] to provide whatever engineering and technical assistance to make the PC version as good as it can be—and hopefully that includes pushing the edge on special PhysX effects which may require GPU acceleration for best performance."


In Batman: Arkham Asylum, PhysX is used to create the fog and shadows above (along with some GPU cloth)

From here the situation declines rapidly. AMD claims that PhysX is proprietary, deliberately hobbled to lower CPU performance, and that NVIDIA's claims that gamers actually want PhysX are spurious. According to Team Red, NVIDIA is propping up PhysX through marketing deals and bribes; the company points to NVIDIA's decision to disable PhysX if a non-NVIDIA GPU is present and the squabble over antialiasing in Batman: Arkham Asylum as proof that NVIDIA can't be trusted to maintain an "open" standard. NVIDIA's response? Bullsh... err, "Nonsense." We don't have room to address the complete scope of the disagreement, but let's examine whether or not PhysX counts as proprietary.



Without PhysX, the game looks more like this. Detail levels are just as good, but the ambient fog and cloth are missing.

This Section Brought To You By Ragdoll Physics

PhysX is free in the sense that anyone can download the developer tools, but it's not an open standard, despite NVIDIA's attempts to portray it as such. An open standard (like OpenGL) is typically maintained by a neutral third party or jointly controlled by several competitors. NVIDIA has made comments implying that it would love to see ATI implement PhysX support in-driver, but AMD's experience with standards controlled by its principal competitor (Intel) has probably left it deeply leery of such partnerships. It's all too easy to imagine NVIDIA agreeing to share its current PhysX implementation, only to turn around and announce a new, NVIDIA-GPU-required version of PhysX, hypothetically dubbed PhysX 2.0.

Unless NVIDIA agrees to share power or turns the standard over to a neutral third party, AMD (probably) won't come to the table. Instead of a unified hardware physics platform, we'll likely have to deal with two competing solutions. NVIDIA has the advantage of having started much earlier, but a true open standard from AMD could gather developer support quickly.

NVIDIA pays lip service to the idea of sharing PhysX with AMD, but the company's decision last fall to disable PhysX whenever an AMD GPU is running as the primary display speaks much louder than any nice talk about sharing. It's never, ever a good idea to punish your customers by taking away product functionality they actually paid for. The company claims compatibility issues and QA testing were the problem; if so, the proper fix is to plaster the option with WARNING stickers and a "Use at Your Own Risk" disclaimer, not to remove it.

Until the dust settles, gamers can look forward to fractured standard support, forced incompatibility, and a whole lot of finger-pointing. We at Hot Hardware can't predict which hardware acceleration standard will win, but as gamers, we all lose.

One last note: the popularity of PhysX changes depending on how you tally it. If we count games that use software PhysX, then NVIDIA's claim that over 240 titles use PhysX is probably accurate. If we go by hardware-accelerated PhysX, there are just 15 games on NVIDIA's own list, and title quality is hit-and-miss. Even if we assume there are 10 games that should be on that list and aren't, hardware PhysX support clearly hasn't taken off yet.
rapid1 replied on Tue, Mar 16 2010 5:02 PM

PhysX would be cool if the implementation was across the board and could be a tool used in any game. Then the programmers could choose whether they wanted to use it, and gamers could choose whether its support was important enough to make them buy the title. Other than that, all I see is a cat fight. As for tying it to specific hardware, which is currently unavailable and will be for the near future (seeing as Nvidia's new devices have not been available, although talked about a lot, for the last year), they're hurting themselves. If it were an open spec that ran on anything, and Nvidia did it better, that would earn them business. As it currently stands it really does not do much for anyone, except for a few extra objects flying around in the air on a game that would not really be missed (and 10 games total, at that).

OS:Win 7 Ultimate 64-bit
MB:ASUS Z87C
CPU:Intel(R) Core(TM) i7 4770K
GPU:Geforce GTX 770 4GB
Mem:Kingston 16384MB RAM
Inspector replied on Tue, Mar 16 2010 6:10 PM

I hate that damn fog (in FPS games) :P lol

When the dust does settle and they can HOPEFULLY get along, earth will become a better place (the gamers' world, at least)

ClemSnide replied on Tue, Mar 16 2010 6:19 PM

Wow. These guys make the flame wars between Windows and Mac users look like water balloon fights (with completely accurate trajectories, of course). I guess AMD is still glowing from its victory of sorts over Intel--the language looks similar.

I have nothing against physics, or PhysX as all the kids are calling it these days; I just wish that it'd be used in more games, and not in the traditional way. For example, an old saying in Everquest (and other MMORPGs) is "the weakest among us can carry twenty anvils; the strongest among us cannot carry twenty-one feathers." (Substitute the numbers and objects based on current stack sizes in your favorite game.) Weights and measures in games, not just MMO games, are piss-poor gosh-awful. People now expect to be able to carry a rocket launcher, two assault guns, a brace of shotguns, a Civil War-era cannon, and the odd fusion gun or two without keeling over from the weight.

Oh, I know it's a gameplay issue, that people don't want to go around picking things up and putting them down. We can presume that this is because most gamers are males, who never pick things up around the house. But I really do wish for some reality injected into my fantasy.
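
For what it's worth, the mechanic I'm wishing for is trivial to express--here's a minimal, hypothetical carry-weight check, with every item and number invented for illustration:

```cpp
#include <cstdio>
#include <vector>

struct Item { const char* name; float weight_kg; };

int main() {
    const float carry_limit_kg = 40.0f;  // hypothetical strength-based cap
    std::vector<Item> inventory = {
        {"rocket launcher", 12.0f}, {"assault gun", 4.0f},
        {"assault gun", 4.0f},      {"shotgun", 3.5f},
        {"Civil War cannon", 500.0f}  // the part games usually ignore
    };

    float total = 0;
    for (const Item& it : inventory) total += it.weight_kg;

    // A weight-aware game would refuse the pickup instead of shrugging.
    std::printf("carrying %.1f kg (limit %.1f): %s\n", total, carry_limit_kg,
                total > carry_limit_kg ? "over-encumbered" : "fine");
}
```

The code is the easy part; as I said, the reason games skip it is gameplay, not difficulty.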


"I didn't cry when Bambi's mother was shot... but I cried when HAL was turned off."

JoelB replied on Tue, Mar 16 2010 6:37 PM

Something else to think about: Why would I want to tax my GPU even further, when I've got a quad core machine that isn't doing much to begin with? How about throwing two cores at physics, the other two at general game code/networking/AI, and leave the graphics for the GPU? It's a bit of a shame to be wasting all this CPU power while the GPUs continue to be pushed to the limit.
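
Here's a rough sketch of the split I mean, assuming a hypothetical game loop (the function names are stand-ins, not any real engine's API): physics and game logic each get their own worker threads while the main thread feeds the GPU.

```cpp
#include <atomic>
#include <thread>

std::atomic<bool> running{true};

// Hypothetical subsystems -- stand-ins, not a real engine's API.
void step_physics()    { /* integrate rigid bodies, cloth, particles */ }
void step_game_logic() { /* AI, networking, scripting */ }
void render_frame()    { /* issue draw calls; the GPU does the heavy lifting */ }

int main() {
    // Physics gets its own thread (a core or two's worth of work),
    // game logic gets another, and the main thread feeds the GPU.
    std::thread physics([] {
        while (running)
            step_physics();  // a real loop would use a fixed 60 Hz timestep
    });
    std::thread logic([] {
        while (running)
            step_game_logic();
    });

    for (int frame = 0; frame < 600; ++frame)
        render_frame();

    running = false;  // signal the workers to stop, then wait for them
    physics.join();
    logic.join();
}
```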

Zestia replied on Tue, Mar 16 2010 10:31 PM

Absolutely correct, Joel--on both points.


"Weights and measures in games, not just MMO games, are piss-poor gosh-awful. People now expect to be able to carry a rocket launcher, two assault guns, a brace of shotguns, a Civil War-era cannon, and the odd fusion gun or two without keeling over from the weight."

It's also ironic that Halo, a game that got rid of the dozen-weapons syndrome, completely defied physics in other ways: grenade jumps? Changing direction in midair? But yes, I would love to see more realistic physics and inventories in games--unlike Doom 3, for example. Where does he keep the BFG, in his trousers? :P

"Why would I want to tax my GPU even further, when I've got a quad core machine that isn't doing much to begin with? How about throwing two cores at physics, the other two at general game code/networking/AI, and leave the graphics for the GPU?"

Excellent point, Joel, especially since the LGA 1156 chips have a PCIe-controlling "northbridge" built in: this gets rid of latencies to the point where your idea is feasible. To be honest, though, the programmers already thought of it: RTS games with more complex AI are more CPU-intensive. And on a related note, GPUs are also being used to aid the CPU in parallel processing, since that's what they're better at.

ClemSnide replied on Wed, Mar 17 2010 2:26 AM

>Where does he keep the BFG, in his trousers? :P

Check out the comic book Sam & Max: Freelance Police. Every so often, Max (the clothesless rabbit, er, lagomorph) will pull out a huge firearm. Sam (a McGruff-like trenchcoated, anthropomorphic dog) will ask "Where'd you pull that gun out from, little buddy?" Max will reply "None of your damn business, Sam."


"I didn't cry when Bambi's mother was shot... but I cried when HAL was turned off."


Inspector:

I hate that damn fog (in FPS games) :P lol

When the dust does settle and they can HOPEFULLY get along, earth will become a better place (the gamers' world, at least)

And the fog will still be there. But why would you want them to get along? When they're going at it, lawyers duking it out and all, at least you know they aren't in some kind of anti-trust collusion. Competition is GOOD!

 

 SPAM-posters beware! ®

Joel H replied on Wed, Mar 17 2010 2:35 PM

Joel,

Physics calculations are a much better fit for the GPU than the CPU. Such calculations are performed independently (and simultaneously) on a huge number of widgets. If you want to run calculations on the intersection and movement of independent particles, you want the widest parallel engine you can get your hands on.

Let's look at some practical examples. Assume a GeForce card with 240 SPs running at 650MHz. When the program needs physics calculations done, it allocates between 32 and 64 SPs to the task. (Theoretically it may not need this many; I honestly don't know.) Between 13.3 percent and 26.7 percent of the GPU's pixel-crunching power has just been re-allocated. Does your frame rate drop? Possibly--but massive explosions that send tons of particles flying around are exactly the sorts of scenarios that *already* cause frame rate dips in modern games, so you don't notice anything out of the ordinary.
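
For the skeptical, the arithmetic on that hypothetical 240-SP card:

\[
\frac{32}{240} \approx 13.3\%, \qquad \frac{64}{240} \approx 26.7\%
\]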

Now, let's assume that a modern, complex x86 core is capable of handling the physics work it takes 4 SPs to do. Since our hypothetical core runs at 3.2GHz, it can handle the equivalent workload of 19.7 SPs per core. Thus we can dedicate two cores' worth of performance to hardware physics calculations and see similar performance, right?
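
Again, the numbers work out, assuming (and it's a simplification) that performance scales linearly with clock speed:

\[
4~\text{SPs} \times \frac{3.2~\text{GHz}}{0.65~\text{GHz}} \approx 19.7~\text{SP-equivalents per core}
\]

Two cores is then roughly 39.4 SP-equivalents, squarely inside the 32-64 SP range above.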

Not so much. Remember that Windows is constantly handling a variety of background tasks and shifting programs between cores to balance power consumption, CPU load, and responsiveness. It also takes time to spin threads out to the cores for physics calculations and then fold the results back into the main program. (I'm guessing the GPU avoids this problem thanks to a thread scheduler built into the hardware.) You also need to remember that interacting particles could end up calculated on two separate cores: CPU1 knows what Particle X will do, CPU2 knows what Particle Y will do, but neither knows what happens when Particle X meets Particle Y until they talk to CPU0.
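
Here's a minimal sketch of that coordination problem, assuming a toy one-dimensional particle model (every name and number is invented for illustration): each thread can integrate its own partition independently, but pairs that straddle the boundary can only be resolved after both threads join.

```cpp
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

struct Particle { float x, v; };

// Advance one partition of the particle list; touches no other partition.
void integrate(std::vector<Particle>& p, std::size_t begin, std::size_t end,
               float dt) {
    for (std::size_t i = begin; i < end; ++i)
        p[i].x += p[i].v * dt;
}

int main() {
    // Particles 1 and 2 sit on opposite sides of the partition boundary.
    std::vector<Particle> particles = {
        {-5.0f, 0.0f}, {0.45f, 0.5f}, {0.55f, -0.5f}, {5.0f, 0.0f}};
    const float dt = 0.1f;
    const std::size_t mid = particles.size() / 2;

    // "CPU1" and "CPU2" each advance their own half independently.
    std::thread t1([&] { integrate(particles, 0, mid, dt); });
    std::thread t2([&] { integrate(particles, mid, particles.size(), dt); });
    t1.join();
    t2.join();

    // The catch: a pair straddling the boundary (Particle X on CPU1,
    // Particle Y on CPU2) can only be resolved after both threads finish --
    // the serial "talk to CPU0" step.
    for (std::size_t i = 0; i < particles.size(); ++i)
        for (std::size_t j = i + 1; j < particles.size(); ++j)
            if (std::fabs(particles[i].x - particles[j].x) < 0.05f)
                std::printf("particles %zu and %zu now interact\n", i, j);
}
```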

Is it possible to do physics on the CPU? Of course--we've been doing it for decades. The fact remains, however, that physics calculations are embarrassingly parallel--and since 3D graphics are as well, it makes sense that GPUs make excellent physics engines. If you want optimal performance, you want a bunch of small, simple engines working in parallel with an overarching thread scheduler, not a few massive engines that rely on the OS for performance tuning and data updates.
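
To make "embarrassingly parallel" concrete, here's a minimal sketch (hypothetical numbers; needs a C++17 toolchain with parallel algorithm support): each particle's update touches only its own state, so the loop splits across any number of lanes--a few CPU threads here, or hundreds of GPU SPs--with zero coordination.

```cpp
#include <algorithm>
#include <execution>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

int main() {
    // 100,000 independent particles -- sizes and constants are invented.
    std::vector<Particle> particles(100000, Particle{0, 0, 0, 1, 1, 1});
    const float dt = 0.016f, g = -9.81f;

    // Each update reads and writes only its own particle, so the loop
    // parallelizes trivially -- the same shape as a one-thread-per-particle
    // GPU kernel launch.
    std::for_each(std::execution::par_unseq,
                  particles.begin(), particles.end(),
                  [=](Particle& p) {
                      p.vz += g * dt;   // gravity
                      p.x  += p.vx * dt;
                      p.y  += p.vy * dt;
                      p.z  += p.vz * dt;
                  });
}
```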
