NVIDIA Sheds Light On Lack Of PhysX CPU Optimizations


News Posted: Tue, Jul 13 2010 12:46 PM

About four months ago, we covered the latest round of shin-kicking between ATI and NVIDIA, with ATI claiming that NVIDIA purposefully crippled CPU performance when running PhysX code and coerced developers to make use of it. NVIDIA denied all such claims, particularly those that implied it used its "The Way It's Meant To Be Played" program as a bludgeon to force hardware PhysX on developers or gamers.

A new report has dug into how PhysX is executed on a standard x86 CPU; the analysis confirms some of AMD's earlier statements...


sackyhack replied on Tue, Jul 13 2010 1:37 PM

Hmm, not sure what to make of this. I know any company's goal is to do anything within the boundaries of the law (and outside w/o getting caught) to make money, but at some point public sentiment towards their practices has to factor in or they're going to bleed customers. I think Nvidia desperately needs a ***-storm committee for their ideas.

"Will disabling physX when both an ATI card and Nvidia card are present cause a ***-storm?" "Yes" "Ok nix that"

"Will developing physX in an archaic manner just to avoid poeple using it w/o our hardware cause a ***-storm when they find out about it?" "Yes" "Damn, back to the drawing board!"

Joel H replied on Tue, Jul 13 2010 3:23 PM

Sackyhack: Keep in mind that while we can't guarantee NVIDIA didn't purposely avoid some optimizations, there's truth to the basic claim that only so much optimization and updating can be done with a given code base. There comes a point when your programmers are spending more time figuring out how to kludge new features into old software than they are actually building the new features themselves.

NVIDIA's statements do make a certain amount of sense, but we'll have to wait and see how developers use the upcoming 3.0 SDK in order to make a better guess at whether or not the company is avoiding x86 optimizations deliberately.

AKwyn replied on Tue, Jul 13 2010 7:19 PM

I just don't see why they couldn't make it work on x86 CPUs. If they're promoting CUDA and PhysX as standards, then why don't they treat them as such? Why do they tie it to NVIDIA hardware and then force everybody to buy their products? I swear corporations will do anything to make a quick buck, even if it means delaying CPU PhysX for years. CPUs are advanced enough to take advantage of multiple threads, so even CPUs with hyper-threading should be able to use those threads to offer performance similar to GPU PhysX.

I also don't see why they have to rewrite the entire architecture. Games like UT3 and other popular titles that rely on PhysX might be broken by a new architecture, so how are they going to do it without breaking compatibility? Many questions remain unanswered due to NVIDIA's corporate greed.

 

"The future starts with you; now start posting more!"


TaylorKarras:

I just don't see why they couldn't make it work on x86 CPUs. If they're promoting CUDA and PhysX as standards, then why don't they treat them as such? Why do they tie it to NVIDIA hardware and then force everybody to buy their products? I swear corporations will do anything to make a quick buck, even if it means delaying CPU PhysX for years.

They're promoting CUDA and PhysX only insofar as they will drive sales of NVIDIA GPUs. Corporations exist to make a profit; all other considerations are secondary. And in this case with PhysX, it has been a lengthy road of development and marketing; there was nothing quick about this buck.

Joel H replied on Tue, Jul 13 2010 8:32 PM

Taylor,

You don't understand the situation properly. PhysX is a physics middleware engine. It runs on CPUs. It runs on GPUs. Most games in development are console games, and PhysX is executed on the CPU on both the Xbox 360 and the PS3. This is a point we keep coming back to again and again because it seems so poorly understood. PhysX is a hardware AND a software solution. When we talk about hardware PhysX, we're talking about GPU-executed PhysX. That's a very small chunk of the total PhysX base.

acarzt replied on Tue, Jul 13 2010 10:11 PM

PhysX actually works very well in at least one PC game I know of, regardless of platform.

Metro 2033. Although the game is an incredible resource hog otherwise, PhysX seems to have no additional impact on one platform over another. And there are plenty of examples of PhysX in the game as well.

So really, I think it comes down to how well it is coded into the game.

Other games like Batman: AA will slow a system to a crawl if you do not have hardware PhysX running on an Nvidia card.


MrBrownSound replied:

What the poop! Are we all talking about my new 480 card I have in my PC?! I was quite ignorant, thinking that Nvidia's main focus was improving PC graphics; I mean, it's what they do, right? Well, there's money in optimizing for consoles, seeing that that's the majority of gaming. I'm hopeful for a revamp of their drivers using x86 optimizations. Hopefully it won't take them too much time. It boggles my mind that this card could perform better, maybe not by magnitudes, but still better, and Nvidia hasn't pursued it yet. Talk about not caring. I have a case badge of you guys!!! graahh

acarzt replied on Wed, Jul 14 2010 12:51 AM

I think you missed the point MrBrownSound.

This has nothing to do with the performance of your video card.

It has to do with PhysX support across multiple platforms.

The big debate is that PhysX is only optimized for Nvidia hardware.

Joel H replied on Wed, Jul 14 2010 10:50 AM

Acarzt,

I've actually never been able to catch a difference in PhysX performance between having it on and off on any CPU in Metro 2033. Visually it does make a difference (although a subtle one). So I do agree--there appears to be some solid optimization there.

We could test it, I suppose, by turning PhysX on and off on a dual-core or even single-core CPU in M2033. That might be interesting.


I usually support AMD, but I can see right through this. OF COURSE they dug up dirt on this, since it's a competitor's product! Now, I don't like nVidia's ambiguous "promote it/tie it to hardware" stance, but optimization simply doesn't work if you're developing for multiple platforms.

To answer Taylor: I think it's not CPU-optimized because, although I agree that multiple cores are underutilized, a CPU offers at most 12 hardware threads (here I refer to the six-core, Hyper-Threaded i7-980X), while a GPU can have hundreds of cores, making it perfect for calculating hundreds of particles (water drops, bullet casings, falling pencils, whatever); see the sketch below. That's why I think they didn't focus on CPU optimization, and for another reason: going back to optimization for multiple platforms. What do they choose to support, Intel or AMD? And besides, like Joel said, nobody's stopping people from developing for software PhysX as well, where physics calculations are built into the game/simulator engine.
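For illustration, here's a rough sketch of why that parallelizes so well (my own example, not PhysX code; all names are made up). Every particle update is independent, so the same loop can be carved up across however many hardware threads exist, a dozen on a desktop CPU or hundreds of cores on a GPU:

#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Advance one contiguous range of particles; no particle depends on another.
void update_range(float* pos, const float* vel, float dt,
                  std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i)
        pos[i] += vel[i] * dt;
}

// Split the particle array across all available hardware threads.
void update_particles(float* pos, const float* vel, float dt, std::size_t n) {
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (n + workers - 1) / workers;
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(n, begin + chunk);
        if (begin < end)
            pool.emplace_back(update_range, pos, vel, dt, begin, end);
    }
    for (auto& th : pool)
        th.join();
}

The more hardware threads you can throw at it, the faster it goes, which is exactly why a many-core GPU is a natural fit for this kind of work.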

acarzt replied on Thu, Jul 15 2010 3:38 AM

That would be pretty awesome Joel! I'd be very interested in seeing those results!

Joel H replied on Thu, Jul 15 2010 1:36 PM

Nether, (nice reference to Kara, btw).

 

AMD didn't commission this study. AMD has nothing to do with this study. Kanter is a person of very long standing in the tech community--his work confirms some of the facts AMD reported, inasmuch as PhysX isn't well optimized for x86 CPU performance. AMD drew its own conclusions about why that is, Kanter drew his, and NVIDIA has its own explanation.

 

None of the parties in question--not AMD, not NVIDIA, not Kanter--are arguing that PhysX is fabulously well optimized for x86 CPUs. NVIDIA's position can be summarized simply in three parts:

1)  There are reasons why the numbers look the way they do.

2)  Kanter is using older projects that probably used fewer optimizations.

3)  The new code base NV will launch will make it easier to take advantage of CPU optimizations.


Thundermane replied:

I don't buy Nvidia's response. It is not a simple case of them not wanting to optimize PhysX for the CPU; they seem to be purposely crippling it. As I understand David Kanter's original article, there is no advantage in using x87 over SSE. SSE and its later iterations are much faster and easier to use, so why use x87?

 

Also, as Charlie Demerjian pointed out in his blog, today's compilers already default to using SSE in their output. If you want a compiler to output x87, you have to specifically tell it to. Of course, given Charlie's obvious bias against Nvidia, you need to take this with a grain of salt, but it sounds logical given that Intel and AMD deprecated x87 years ago. Perhaps those who have experience in this can shed more light.
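For anyone curious what that looks like in practice, here is a rough illustration (my example, not the PhysX build setup; the flag behavior is paraphrased from GCC's documentation, and the file name is hypothetical). The same C++ source can come out as x87 or as scalar SSE purely based on compiler switches:

// integrate.cpp -- hypothetical file, just for the example.
//
//   g++ -m32 -mfpmath=387 -c integrate.cpp          (legacy x87 FPU code)
//   g++ -m32 -msse2 -mfpmath=sse -c integrate.cpp   (scalar SSE2 instead)
//
// On x86-64 targets, SSE2 is part of the baseline ABI, so floating-point
// math comes out as SSE by default with no extra flags at all.

#include <cstddef>

// A physics-style inner loop; which instructions it compiles to (x87 vs. SSE)
// depends purely on the flags above, with no source changes required.
void integrate(float* pos, const float* vel, float dt, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}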

 

All they would need to do is recompile; according to David Kanter, Nvidia would then have spent at most several weeks testing the recompiled library. They could also still provide the x87 version as an option during setup, in case somebody really needs it.

 

Simply put, there is no reason not to use SSE. If Nvidia doesn't want to optimize PhysX on the CPU, fine, I can understand that. But to use older, slower code deliberately just so they can show PhysX is a lot faster on their hardware is downright deceitful. Just as deceitful as disabling PhysX when there is an ATI GPU present, or disabling AA in Batman when using an ATI GPU, or presenting non-functioning Fermi mockups and telling the world it's the real "puppy."

Joel H replied on Fri, Jul 16 2010 10:55 AM

Thundermane,

I think you're missing part of NVIDIA's point. Remember, PhysX is a physics engine that runs in software mode (on the CPU) on virtually every platform it supports. Not optimizing the x86 code probably was a business decision, but the SDK itself wasn't designed to take full advantage of SSE when it was written 5-6 years ago.

Do I think there's probably more NV could do with its existing SDK? Yes. Do I think there's a legitimate business reason why the company prioritized Cell and Xbox 360 development over PC development? Absolutely. It's not just the fact that the majority of games are console games; it's the nature of the beast. Consoles are much more difficult to program, and it's harder to get high performance out of them.

Making fundamental changes to the way a program multithreads or evaluates SIMD instructions is a complex process that could easily break backwards compatibility. I think we'll know a lot more once the new SDK is out.
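To give a feel for why it can be more than a recompile, here is a sketch (illustrative only, not PhysX source; names are invented) of a scalar particle update next to a hand-vectorized SSE version. The packed version wants the data stored as separate arrays, and that kind of layout change is exactly what ripples through an SDK's public interface and threatens compatibility:

#include <xmmintrin.h>  // SSE intrinsics
#include <cstddef>

// Scalar path: works on the array-of-structs layout an older API might expose.
struct Particle { float x, y, z, pad; };

void advance_scalar(Particle* p, const Particle* v, float dt, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i) {
        p[i].x += v[i].x * dt;
        p[i].y += v[i].y * dt;
        p[i].z += v[i].z * dt;
    }
}

// Packed SSE path: four particles per instruction, but only if positions and
// velocities live in separate contiguous arrays (struct-of-arrays layout).
void advance_sse(float* px, const float* vx, float dt, std::size_t n) {
    __m128 d = _mm_set1_ps(dt);
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 p = _mm_loadu_ps(px + i);
        __m128 v = _mm_loadu_ps(vx + i);
        _mm_storeu_ps(px + i, _mm_add_ps(p, _mm_mul_ps(v, d)));
    }
    for (; i < n; ++i)   // scalar tail for the leftover particles
        px[i] += vx[i] * dt;
}

That's the gap between "compile with SSE enabled" and "restructure the engine around SIMD"; the latter is where backwards compatibility gets risky.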

realneil replied on Wed, Aug 25 2010 9:59 AM

"The best way to encourage people to buy NVIDIA GPUs is to ensure that the special effects are amazing and only available to NVIDIA customers. Optimizing PhysX to run on an x86 CPU potentially dilutes the attractiveness of an NVIDIA GPU,"

______________________________________________

My belief is that the best way to get people to buy your products is to build a better product (like the awesome new GTX 460 series).

If PhysX were opened up and tweaked to take advantage of all of today's techno-advances, life would be better for us, the consumers, and much of the ill feeling against NVIDIA would go away. I realize that they bought the fledgling technology when they acquired that company (Ageia), but they closed it off to others and crippled it for their own benefit too. This was a lousy public relations move and has turned friends into foes.

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

Joel H replied on Wed, Aug 25 2010 10:13 AM

Real,

You've oversimplified the situation. First, NVIDIA has spent a great deal of money "building a better product." As this article stated, the next version of CUDA will incorporate more SIMD instructions and will generally be more efficient. At the same time, as you've noted, the company has worked towards releasing better iterations of Fermi at various price points.

We know NVIDIA has sunk a lot of money into CUDA and PhysX development. We know they've sunk a ton of money into future video card designs. Yes, NVIDIA wants developers to use CUDA and PhysX, but it's not as if the GeForce series has been crippled when it comes to running DirectCompute or OpenCL.

I'm not claiming that NVIDIA's approach to hardware PhysX adoption is the best possible one--but it's quite inaccurate to paint NVIDIA as opting for FUD as opposed to real improvements.

crowTrobot replied on Wed, Aug 25 2010 10:40 AM

I know Nvidia gets a lot of hate but c'mon, let's get real here.

1.) They own PhysX; whatever they want to do with it is their prerogative.

2.) They aren't blocking DirectCompute, and a lot of these effects can be done through that now.

3.) They aren't blocking OpenCL either, and they are part of the Khronos Group.

4.) If there is any physics engine right now that performs as well as PhysX and is as easy to implement in a game, I'd love to see it, but frankly there are none. The truth is Nvidia is the only company right now that is actually putting their money where their mouth is in terms of advancing physics. Bullet has been all but abandoned (not surprisingly); the lead guy left development and it's in limbo once again. I'm amazed at the lack of flak ATI is getting for constantly promising things and not delivering.

5.) All other physics engines are way behind PhysX in terms of technical advancement, and it's going to take a long while before they catch up, if they ever do (except for Bullet, unless ATI invests a major amount of time and resources in it in terms of developers, and not just publicists coming out every few months reminding people they are working on an open approach that never manifests into anything tangible).

realneil replied on Wed, Aug 25 2010 11:25 AM

Joel H:
 Real, You've oversimplified the situation. 

I have a simple perspective. I'm not anywhere near as knowledgeable as you about this stuff. I don't know as much about this and many other things as you do. I look at it as "what I'm experiencing when I play games." I've tried turning on PhysX with my 2GB GTX285 and my 1GB HD5850 card in both computers, to no avail. It just doesn't work. So here is something that everybody's crowing about that I can't touch with either of my several-hundred-dollar video cards. Then I go online and read about it a little (mind that I really don't understand the technical aspects entirely), and lots of supposedly knowledgeable people are bitching that NVIDIA has crippled the technology unless you shell out big money to them, and that they're trying to strong-arm developers into adopting their proprietary tech within their games. (Leaving a lot of us to wonder what it looks like.)

So I resent it. It's simple.

BTW: This doesn't mean that I won't buy from them. (the ultimate expression of financial love) I have my eye on a pair of those GTX460's for an SLI box. Price, performance and the inclusion of those technologies that I've been missing drive this decision.


Joel H replied on Wed, Aug 25 2010 12:21 PM

Real,

What you've just said is actually a lot more helpful when it comes to possibly helping you solve that problem. :) You say you have a GTX 285 (2GB) and a 1GB Radeon HD 5850. Are you using both of them in the same system? I'm assuming not--I don't think the 5850 is powerful enough to trounce the GTX 285. (Some people on Win 7 use a hacked driver that allows them to run a secondary GeForce for PhysX while simultaneously using a higher-end ATI card.)

There's no reason PhysX shouldn't be working on your GTX 285. There are several possibilities here:

1)  You've got a driver issue/bug that's keeping PhysX from executing properly.

2)  You're confused about the difference between software and hardware PhysX. The term "software PhysX" refers to PhysX calculations performed by the CPU. There are something like 150 games that use software PhysX--NVIDIA has adapted it for the PS3, Xbox 360, Wii, and iPod.

Hardware PhysX titles are much rarer. To date, Batman: Arkham Asylum, Mafia 2, Unreal Tournament 3, and Mirror's Edge have been the go-to titles for good game play and high-level PhysX support. Software PhysX titles do not take advantage of NVIDIA hardware.

This is a distinction that's not easy to clarify; I frankly wish NVIDIA had used two distinct terms for their software PhysX engine (which runs on many architectures) and the hardware PhysX execution that's particular to GeForce cards.

Here's the final caveat that makes the situation more nuanced than it looks at first glance. There are only a handful of hardware PhysX titles, there's an even smaller handful of those titles that are top-notch, and yes, NVIDIA is using a restricted, proprietary API. Look around the industry, however, and you'll see that NVIDIA is also the *only* company that's sunk real work into defining, developing, and using a hardware physics standard.

ATI periodically makes some noise in this general direction but has never released hardware physics support for any shipping title. Microsoft's DirectCompute and OpenCL are both available, but we haven't heard of any games adopting these options in hardware. That's why NV gets credit--however small their success, they remain the only company that's put a major, multi-year push behind hardware physics development.

Now, back to your situation. If you're having a problem that isn't covered by one of the two examples above, I'd be more than happy to work with you to get the GTX 285 running as it should. We can talk here, or you can drop me a PM/email.

 

crowTrobot replied on Wed, Aug 25 2010 12:35 PM

Joel H:

ATI periodically makes some noise in this general direction but has never released hardware physics support for any shipping title. Microsoft's DirectCompute and OpenCL are both available, but we haven't heard of any games adopting these options in hardware.

Lost Planet 2 uses DirectCompute for enhanced water physics and soft-body compute, although the framerate hit is much bigger than with PhysX (because there is only a single option to turn the DX11 features on or off, and it includes tessellation). Also, the water physics implementation in that game looks horrible, since it's only a shallow-water effect and not particle-based.

Joel H replied on Wed, Aug 25 2010 12:43 PM

Technically I stand corrected. From your description, however, it sounds pretty awful.

realneil replied on Wed, Aug 25 2010 1:27 PM

Joel,

The GTX285 is in the PC I'm using today. It has an i5-750, 8GB of DDR3-1600 RAM, an OCZ Agility 60GB SSD, and a pair of Seagate 750GB 7200RPM drives. This is the "little" box, so to speak.

I have UT3 and have added the PhysX mod to the game. I've updated the game completely as far as I know. When I try to run the PhysX maps, it slows to a crawl right away. (totally unplayable and not fun)

At the moment, I'm running the latest driver from NVIDIA. (I do a weekly driver search for all of the computers just to stay busy.) I have tried several.

After a couple of months of screwing around with this situation--swapping the cards around between this system, the i7-870, and the AMD X3-720 system, trading the two types of DDR3-1600 RAM that I own, formatting the drive and installing only the game on the systems--I just gave up in frustration.

I even bought a new Gigabyte GT240 1GB-GDDR5 OC edition PCI-E card to add to the system to act as the PhysX processor,.....to no avail. (it's now sitting on the shelf collecting dust) Adding another card to the PCI-e bus chops my video bandwidth from one @ X16 speeds to dual X8 speeds on these two LGA-1156 socket systems, but according to some sites out there, it should still work. It did not.

EDIT: I just tried it out again after posting this, and it played fine for about 30 seconds and then slowed down to a choppy crawl.


acarzt replied on Wed, Aug 25 2010 2:27 PM

crowTrobot:

If there is any physics engine right now that performs as well as PhysX and is as easy to implement in a game, I'd love to see it.

Havok

http://www.youtube.com/watch?v=3bKphYfUk-M

acarzt replied on Wed, Aug 25 2010 2:29 PM

realneil:

EDIT: I just tried it out again after posting this, and it played fine for about 30 seconds and then slowed down to a choppy crawl.

Properly implemented PhysX can be very efficient while still offering some very nice effects.

Metro 2033 is a perfect example. Enabling PhysX offers little to no performance hit, yet provides a noticeable difference in visuals.

realneil replied on Wed, Aug 25 2010 2:56 PM

acarzt:
Properly implemented PhysX can be very efficient while still offering some very nice effects.

So it's the UT3 game that's stuffed?


Joel H replied on Wed, Aug 25 2010 3:36 PM

RealNeil,

No. I've personally played UT3's PhysX maps with a GTX 260. Based on what you've described, it's not that PhysX doesn't work--the code is being run on your CPU rather than your GPU.

The simplest thing to do would be to try a different program. The Mafia 2 demo supports PhysX; Batman: AA has a good implementation and can be downloaded here:  http://www.nzone.com/object/nzone_batmanaa_downloads.html

The reason I'm suggesting we check a different title is that it'll help nail down whether or not the problem is in-driver (I suspect it is).

Three simple questions for you--I'm assuming you've already covered these, but it's always best to cover the bases when troubleshooting.

1)  The new NVIDIA drivers allow you to explicitly choose how PhysX is executed. Have you selected to run PhysX on the GPU? (If it says Auto, try changing it to the GPU explicitly.)

2)  In Unreal Tournament 3, have you enabled hardware PhysX using the toggle inside the game menus?

3) You say you've tried multiple driver revisions. How do you install and uninstall drivers? Describe the steps you take, please.

We'll sort this out.

Joel

realneil replied on Wed, Aug 25 2010 5:06 PM

Point #1) It was set to Auto, so I changed it to use just the GTX285. The game worked smoothly for a longer time (about a minute) but still reverted to jerkiness (a crappy frame rate). When I had the two NVIDIA cards in the system, I had it set to use the GT240 1GB card, with no decent result. It was even worse.

Point #2) I have the hardware PhysX turned on in the game and have had it on all along.

Point #3) I usually just install the latest released driver onto the computer when I download it. I do not do an uninstall first.

 

Thanks for the help


Joel H replied on Wed, Aug 25 2010 5:43 PM

I usually just install the latest released driver onto the computer when I download it. I do not do an uninstall first.

Yeah, that's almost certainly the problem, and just running NVIDIA's removal tool isn't good enough. What OS are you running? I can't guarantee that this is your issue, but I'd lay good odds--75%--that not uninstalling drivers first is the problem. Don't just run NVIDIA's own uninstallation software--you've got to do this the right way or else you'll be screwing with it for hours.

realneil replied on Wed, Aug 25 2010 6:13 PM

I have Enterprise Win-7 64Bit on all of them.

I have Revo Uninstaller on the systems too. Should I use that?


Joel H replied on Wed, Aug 25 2010 7:50 PM

I don't know anything about Revo Uninstaller; it may or may not do what we want it to. I'm going to tell you exactly what I do when I'm cleaning up and reinstalling drivers.

Step 1:  Download the driver that you're going to use.

Step 2:  Drop by Guru3D and download the latest version of Driver Sweeper. Make sure you grab the 2.xx version of the program, not the 1.xx. Install it.

Step 3:  If you have an unzipped set of NV drivers sitting in a directory on your C:\, delete them. You don't need to re-download the driver install file if you have the one you want, but delete the unpacked files.

Step 4:  Remove your current driver using NVIDIA's applet in the Add/Remove Programs panel. Also remove any additional programs listed separately, including NVIDIA PhysX.

Step 5:  Reboot and enter Safe Mode.

Step 6:  Run Driver Sweeper. Clean off all the NV drivers on your system.

Step 7:  Open a CMD window.

Step 8:  Change to C:\Windows\System32.

Step 9:  Type "del nv*.* /s"

Step 10:  Reboot.

Step 11:  If we've done all this correctly, Windows will *not* auto-reinstall the old driver set (we've killed all the places that driver set could be).

Step 12:  Run the driver install package and install everything normally.

Step 13:  Success (hopefully).

In theory, NVIDIA's uninstaller should be good enough for you to flawlessly install new drivers. Don't trust theory. This method, done correctly, eliminates any chance of version conflicts by ensuring all NV driver files are wiped prior to driver installation.

 

crowTrobot replied:

acarzt:

crowTrobot:

If there is any physics engine right now that performs as well as PhysX and is as easy to implement in a game, I'd love to see it.

 

Havok

http://www.youtube.com/watch?v=3bKphYfUk-M

 

Well, that demo actually doesn't show that it performs better than PhysX. :P

Granted, that video is 3 years old now, but those are just basic collision physics. More impressive would be particle manipulation to simulate water or gas, for example. The skeletal animation on that thing looks very basic too. Also, since Havok is owned by Intel, it's CPU-only, versus the GPU parallelization you get with PhysX.

The best example of the current state of Havok physics is probably Just Cause 2, although it doesn't have hundreds of individually reacting physics particles on screen at the same time (debris disappears after two seconds) the way something like Mafia 2 or Batman does. Even the impressive-looking water in Just Cause 2 was implemented using CUDA (not PhysX directly, but it shows the limitations of Havok when handling fluid mechanics).

 

realneil replied on Wed, Aug 25 2010 10:52 PM

Joel,

Tried all of this and it went according to plan, but it didn't help any. The PhysX problems remain the same in UT3.

D/L'd Batman,......will install in the morning to check it out. Maybe the Metro game will work correctly. We'll see about that one too.


Joel H replied on Wed, Aug 25 2010 11:08 PM

You wouldn't notice it in Metro 2033; the PhysX effects are *very* subtle. Install the patch for Batman as well to make sure you can activate PhysX from the game's control panel.  If you google for PhysX Batman Arkham Asylum I'm certain you'll find videos of areas where you can see PhysX on vs. off. It's very noticeable. With PhysX, you get ground fog, outgassing, much more debris and bats/birds, etc.

acarzt replied on Thu, Aug 26 2010 5:04 AM

Also, when you walk through leaves and papers on the ground, they move around; with PhysX off, they just sit there.

You will take a performance hit in Batman: AA with PhysX on, though, lol.

But for me it was still very playable :-)

The biggest visual difference in Batman is when you face Scarecrow. The difference is HUGE.

Joel H replied on Thu, Aug 26 2010 10:27 AM

Actually, assuming that the demo has the same benchmark as the full game, all you need to do is run it. With PhysX on vs. off you'll see huge differences.

realneil replied on Thu, Aug 26 2010 10:47 AM

Joel,

Batman seems to install properly,....but when I try to run it, this is what I get right away.

So it looks like I will not be trying Batman out anytime soon,.......sigh.

BTW: I just reinstalled MS Net Framework and Silverlight too. No change.

I'm downloading a Mafia-II demo and the Metro2033 demo right now. Will try them out.

If neither one of them plays properly, I may just back up my files, reload the OS, start fresh, and see what my results are.

_______________________________________________

Error Message:

 

"Microsoft Net Framework"

"Unhandled exception has occurred in your application"

"Value was either too large or too small for an INT32"

Exception Text **************
System.OverflowException: Value was either too large or too small for an Int32.
   at System.Number.ParseInt32(String s, NumberStyles style, NumberFormatInfo info)
   at BmLauncher.SystemInfo.GetPropertyInt(ArrayList info, String propertyName, Int32 property_index)
   at BmLauncher.SystemInfo.GetPropertyIntBest(ArrayList info, String propertyName, Boolean select_highest)
   at BmLauncher.SystemInfo..ctor()
   at BmLauncher.Form1.Initialise()
   at BmLauncher.Form1.OnLoad(Object sender, EventArgs e)
   at System.Windows.Forms.Form.OnLoad(EventArgs e)
   at System.Windows.Forms.Form.OnCreateControl()
   at System.Windows.Forms.Control.CreateControl(Boolean fIgnoreVisible)
   at System.Windows.Forms.Control.CreateControl()
   at System.Windows.Forms.Control.WmShowWindow(Message& m)
   at System.Windows.Forms.Control.WndProc(Message& m)
   at System.Windows.Forms.ScrollableControl.WndProc(Message& m)
   at System.Windows.Forms.ContainerControl.WndProc(Message& m)
   at System.Windows.Forms.Form.WmShowWindow(Message& m)
   at System.Windows.Forms.Form.WndProc(Message& m)
   at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m)
   at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
   at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)


************** Loaded Assemblies **************
mscorlib
    Assembly Version: 2.0.0.0
    Win32 Version: 2.0.50727.4952 (win7RTMGDR.050727-4900)
    CodeBase: file:///C:/Windows/Microsoft.NET/Framework/v2.0.50727/mscorlib.dll
----------------------------------------
BmLauncher
    Assembly Version: 1.0.0.0
    Win32 Version: 1.0.0.0
    CodeBase: file:///F:/Program%20Files%20(x86)/Eidos/Batman%20Arkham%20Asylum%20Demo/Binaries/BmLauncher.exe
----------------------------------------
System.Windows.Forms
    Assembly Version: 2.0.0.0
    Win32 Version: 2.0.50727.4927 (NetFXspW7.050727-4900)
    CodeBase: file:///C:/Windows/assembly/GAC_MSIL/System.Windows.Forms/2.0.0.0__b77a5c561934e089/System.Windows.Forms.dll
----------------------------------------
System
    Assembly Version: 2.0.0.0
    Win32 Version: 2.0.50727.4927 (NetFXspW7.050727-4900)
    CodeBase: file:///C:/Windows/assembly/GAC_MSIL/System/2.0.0.0__b77a5c561934e089/System.dll
----------------------------------------
System.Drawing
    Assembly Version: 2.0.0.0
    Win32 Version: 2.0.50727.4927 (NetFXspW7.050727-4900)
    CodeBase: file:///C:/Windows/assembly/GAC_MSIL/System.Drawing/2.0.0.0__b03f5f7f11d50a3a/System.Drawing.dll
----------------------------------------
System.Management
    Assembly Version: 2.0.0.0
    Win32 Version: 2.0.50727.4927 (NetFXspW7.050727-4900)
    CodeBase: file:///C:/Windows/assembly/GAC_MSIL/System.Management/2.0.0.0__b03f5f7f11d50a3a/System.Management.dll
----------------------------------------


Joel H replied on Thu, Aug 26 2010 10:54 AM

I'll dig through this and see what I can do. In the meantime, try asking Win 7 to run the game in compatibility mode. You are running the final retail of Win 7 Enterprise with all patches and updates installed?

 

EDIT:  http://forums.eidosgames.com/showthread.php?t=92273

 

Hopefully that's your solution. I found that by Googling "Value was too large or too small for an Int32" + "Batman."  Updating your .NET framework is always a good idea.

realneil replied on Thu, Aug 26 2010 11:50 AM

Joel H:
You are running the final retail of Win 7 Enterprise with all patches and updates installed?
Yes, I keep them all up to date.

Joel H:
http://forums.eidosgames.com/showthread.php?t=92273
This worked and I was able to launch the game normally. I just spent a while playing the demo; it works just fine. There is no in-game enabler for PhysX that I could see, so I'm not sure whether it's on or not. The game did look rich to me. I'll install it onto the PC with the Radeon card in it and play it there to see what the difference may be.

I appreciate the help Joel.

Neil

 


Joel H replied on Thu, Aug 26 2010 12:13 PM

It's not in-game. If I recall correctly, you navigate to the game's directory, right-click on the icon, and choose setup/configure/something-or-other.

 

EDIT:  Make sure you have installed the smaller patch that was linked on the same page as the demo.

Further Edit:  I don't know that this is true, but it's entirely possible that not having the .NET framework 3.5 installed is what caused your UT3 problems. So I'd check that again, too.
