NVIDIA Sheds Light On Lack Of PhysX CPU Optimizations

About four months ago, we covered the latest round of shin-kicking between ATI and NVIDIA, with ATI claiming that NVIDIA purposefully crippled CPU performance when running PhysX code and coerced developers to make use of it. NVIDIA denied all such claims, particularly those that implied it used its "The Way It's Meant To Be Played" program as a bludgeon to force hardware PhysX on developers or gamers.

A new report from David Kanter at Real World Technologies has dug into how PhysX is executed on a standard x86 CPU; his analysis confirms some of AMD's earlier statements. In many cases, the PhysX code that runs in a given title is both single-threaded and decidedly non-optimized. And instead of taking advantage of the SSE/SSE2 vectorization capabilities at the heart of every x86 processor sold since ~2005, PhysX calculations are done using ancient x87 instructions.

When in doubt, blame the PPU.

Before the introduction of SIMD instruction sets like SSE and SSE2, floating-point calculations on an x86 processor meant using the x87 series of instructions. Over the past 11 years, however, Intel, AMD, and VIA have all adopted SSE and SSE2, both of which allow much higher throughput than the classic x87 instruction set. Given the ubiquity of support across the PC market, it's hard to say why NVIDIA hasn't specifically mandated their use.
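To make the x87-versus-SSE gap concrete, here's a minimal sketch in C (the function names are illustrative, not anything from the PhysX SDK): the same loop written as plain scalar code, which a compiler can lower to either x87 or scalar SSE depending on its flags, and as hand-vectorized SSE intrinsics that process four floats per instruction.

```c
#include <xmmintrin.h>  /* SSE intrinsics */

/* Scalar sum of squares. On 32-bit x86, a compiler told to use x87
   (e.g. gcc -mfpmath=387) lowers each operation to the old stack-based
   x87 instructions; modern compilers default to scalar SSE instead. */
float sum_squares_scalar(const float *v, int n) {
    float acc = 0.0f;
    for (int i = 0; i < n; i++)
        acc += v[i] * v[i];
    return acc;
}

/* Hand-vectorized SSE version: four floats per multiply and add.
   For this sketch, n is assumed to be a multiple of 4. */
float sum_squares_sse(const float *v, int n) {
    __m128 acc = _mm_setzero_ps();
    for (int i = 0; i < n; i += 4) {
        __m128 x = _mm_loadu_ps(&v[i]);        /* unaligned load of 4 floats */
        acc = _mm_add_ps(acc, _mm_mul_ps(x, x));
    }
    float lanes[4];
    _mm_storeu_ps(lanes, acc);                 /* spill the 4 partial sums */
    return lanes[0] + lanes[1] + lanes[2] + lanes[3];
}
```

Both functions compute the same result; the difference is that the SSE version retires roughly a quarter as many arithmetic instructions, which is the throughput advantage Kanter's analysis says PhysX leaves on the table.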

As RWT's analysis shows, however, virtually all of the applicable uops in both Cryostasis and Soft Body Physics use x87; SSE accounts for just a tiny percentage of the whole. Toss in the fact that CPU PhysX is typically single-threaded while GPU PhysX absolutely isn't, and Kanter's data suggests that NVIDIA has consciously chosen to avoid any CPU optimizations and, in so doing, has artificially widened the gap between CPU and GPU performance. If that allegation sounds familiar, it's because we talked about it just a few weeks back, after Intel presented a whitepaper claiming that many of the test cases NVIDIA used to demonstrate huge GPU performance advantages were unfairly optimized.
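As for the single-threading point: splitting per-particle work across CPU cores is conceptually straightforward. A minimal sketch with POSIX threads (an illustration of the general technique, not PhysX SDK code) might look like this:

```c
#include <pthread.h>
#include <stddef.h>

#define N_THREADS 4

/* One thread's slice of the particle arrays. */
typedef struct {
    float *pos;
    const float *vel;
    size_t start, end;
    float dt;
} slice_t;

/* Integrate one slice: pos[i] += vel[i] * dt. */
static void *integrate_slice(void *arg) {
    slice_t *s = (slice_t *)arg;
    for (size_t i = s->start; i < s->end; i++)
        s->pos[i] += s->vel[i] * s->dt;
    return NULL;
}

/* Split one Euler integration step over N_THREADS worker threads.
   Each slice touches a disjoint range, so no locking is needed. */
void integrate_parallel(float *pos, const float *vel, size_t n, float dt) {
    pthread_t tid[N_THREADS];
    slice_t s[N_THREADS];
    size_t chunk = (n + N_THREADS - 1) / N_THREADS;

    for (int t = 0; t < N_THREADS; t++) {
        size_t start = (size_t)t * chunk;
        if (start > n) start = n;
        size_t end = start + chunk;
        if (end > n) end = n;
        s[t] = (slice_t){ pos, vel, start, end, dt };
        pthread_create(&tid[t], NULL, integrate_slice, &s[t]);
    }
    for (int t = 0; t < N_THREADS; t++)
        pthread_join(tid[t], NULL);
}
```

Real physics engines are far more complicated than this (constraint solvers and collision detection don't partition so cleanly), but embarrassingly parallel stages like particle integration are exactly where a single-threaded implementation leaves multi-core performance unused.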



sackyhack 4 years ago

Hmm, not sure what to make of this. I know any company's goal is to do anything within the boundaries of the law (and outside w/o getting caught) to make money, but at some point public sentiment towards their practices has to factor in or they're going to bleed customers. I think Nvidia desperately needs a ***-storm committee for their ideas.

"Will disabling physX when both an ATI card and Nvidia card are present cause a ***-storm?" "Yes" "Ok nix that"

"Will developing physX in an archaic manner just to avoid people using it w/o our hardware cause a ***-storm when they find out about it?" "Yes" "Damn, back to the drawing board!"

Joel H 4 years ago

Sackyhack: Keep in mind that while we can't guarantee NVIDIA didn't purposely avoid some optimizations, the basic point is that there's only a certain amount of optimization and updating that can be done with a given code base. There comes a point when your programmers are spending more time figuring out how to kludge new features into old software than they are actually building the new features themselves.

NVIDIA's statements do make a certain amount of sense, but we'll have to wait and see how developers use the upcoming 3.0 SDK in order to make a better guess at whether or not the company is avoiding x86 optimizations deliberately.

AKwyn 4 years ago

I just don't see why they couldn't just make it work on x86 CPUs. If they're promoting CUDA and PhysX as standards, then why don't they promote them as such? Why do they tie it to NVIDIA and then force everybody to buy their products? I swear corporations will do anything to gain a quick buck, even if it means delaying CPU PhysX for years. CPUs are advanced enough to take advantage of multiple threads, and therefore even CPUs with hyper-threading should be able to utilize multiple threads to offer similar performance to GPU PhysX.

I don't see why they have to rewrite the entire architecture, either. Games like UT3 and other popular titles that rely on PhysX might be broken by the new architecture, so how are they going to do it without breaking compatibility? Many questions remain unanswered due to NVIDIA's corporate greed.

jturnbull65 4 years ago

[quote user="TaylorKarras"]

I just don't see why they could just make it work on x86 CPU's? If they're promoting CUDA and PhysX as standards then why don't they promote it as such, why do they tie it to NVIDIA and then force everybody to buy their products? I swear corporations will do anything to gain a quick buck, even if it means delaying CPU PhysX for years. 
[/quote]


They're promoting CUDA and PhysX only insofar as they will drive sales of NVIDIA GPUs.  Corporations exist to make a profit; all other considerations are secondary.  And in this case with PhysX, it has been a lengthy road of development and marketing; there was nothing quick about this buck.

Joel H 4 years ago


You don't understand the situation properly. PhysX is a physics middleware engine. It runs on CPUs. It runs on GPUs. Most games in development are console games, and PhysX is executed on the CPU on both the Xbox 360 and the PS3. This is a point we keep coming back to again and again because it seems so poorly understood. PhysX is a hardware AND a software solution. When we talk about hardware PhysX, we're talking about GPU-executed PhysX. That's a very small chunk of the total PhysX base.

acarzt 4 years ago

Physx actually works very well in at least 1 PC game I know of, regardless of platform.

Metro 2033. Although the game is an incredible resource hog otherwise, PhysX seems to have no additional impact on one platform over another. And there are plenty of examples of PhysX in the game as well.

So really, I think it comes down to how well it is coded into the game.

Other games like Batman AA will slow a system to a crawl if you do not have hardware physx running on an Nvidia card.

MrBrownSound 4 years ago

What the poop! Are we all talking about my new 480 card I have in my PC?! I was quite ignorant, thinking that Nvidia's main focus was improving PC graphics; I mean, it's what they do, right? Well, there's money in optimizing for consoles, seeing that that's the majority of gaming. I'm hopeful for a revamp of their drivers using x86 optimizations; hopefully it won't take them too much time. It boggles my mind that this card could perform magnitudes better and Nvidia hasn't pursued it yet. Talk about not caring. I have a case badge of you guys!!! Graahh

acarzt 4 years ago

I think you missed the point MrBrownSound.

This has nothing to do with the performance of your video card.

It has to do with Physx support across multiple platforms.

The big debate is that Physx is only optimized for Nvidia hardware.

Joel H 4 years ago


I've actually never been able to catch a difference in PhysX performance between having it on and off in any CPU in Metro 2033. Visually it does make a difference (although a subtle one). So I do agree--there appears to be some solid optimization there.

We could test it, I suppose, by turning PhysX on and off on a dual-core or even single-core CPU in M2033. That might be interesting.

acarzt 4 years ago

That would be pretty awesome Joel! I'd be very interested in seeing those results!

Nethersprite 4 years ago

I usually support AMD, but I can see right through this. OF COURSE they dug up dirt on this, since it's a competitor's product! Now, I don't like nVidia's ambiguous "promote it/tie it to hardware" stance, but optimization simply doesn't work if you're developing for multiple platforms.

To answer Taylor: I think it's not CPU-optimized because, although I agree that multiple cores are underutilized, a CPU has at most 12 parallel hardware threads (here I refer to the six-core i7-980X) while a GPU can have hundreds of cores, making it perfect for calculating hundreds of particles (water drops, bullet casings, falling pencils, whatever). That's why I think they didn't focus on CPU optimization, and for another reason: going back to optimization for multiple platforms, what do they choose to support, Intel or AMD? And besides, like Joel said, nobody's stopping people from developing for software PhysX as well, where physics calculations are built into the game/simulator engine.

Joel H 4 years ago

Nether, (nice reference to Kara, btw).


AMD didn't commission this study. AMD has nothing to do with this study. Kanter is a person of very long standing in the tech community--his work confirms some of the facts AMD reported, insomuch as PhysX isn't optimized well for x86 CPU performance. AMD drew its own conclusions about why that is, Kanter drew his, and NVIDIA has its own explanation.


None of the parties in question--not AMD, not NVIDIA, not Kanter--are arguing that PhysX is fabulously well optimized for x86 CPUs. NVIDIA's position can be summarized simply in three parts:

1)  There are reasons why the numbers look the way they do.

2)  Kanter is using older projects that probably used fewer optimizations.

3)  The new code base NV will launch will make it easier to take advantage of CPU optimizations.

Thundermane 4 years ago

I don't buy Nvidia's response. It is not a simple case of them not wanting to optimize PhysX for the CPU, they seem to be purposely crippling it. As I understand David Kanter's original article, there is no advantage in using x87 over SSE. SSE and later iterations are much faster and easier to use, so why use x87?


Also, as Charlie Demerjian pointed out in his blog, today's compilers already default to SSE output. If you want a compiler to emit x87, you have to specifically tell it to. Of course, given Charlie's obvious bias against Nvidia, you need to take this with a grain of salt, but it sounds logical given that Intel and AMD deprecated x87 years ago. Perhaps those who have experience in this can shed more light.


All they need to do is recompile; Nvidia could then have spent at most several weeks testing the recompiled library, according to David Kanter. And they could still provide the x87 version as an option during setup, in case somebody really needs it.


Simply put, there is no reason not to use SSE. If Nvidia doesn't want to optimize PhysX on the CPU, fine, I can understand that. But to use older, slower code deliberately just so they can show PhysX running much faster on their hardware is downright deceitful. Just as deceitful as disabling PhysX when an ATI GPU is present, or disabling AA in Batman when using an ATI GPU, or presenting non-functioning Fermi mockups and telling the world it's the real "puppy."

Joel H 4 years ago


I think you're missing part of NVIDIA's point. Remember, PhysX is a physics engine that runs in software mode (on the CPU) on virtually every platform it supports. Not optimizing for x86 code probably was a business decision, but the SDK itself isn't/wasn't designed to take full advantage of SSE when it was written 5-6 years ago.

Do I think there's probably more NV could do with its existing SDK?  Yes. Do I think there's a legitimate business reason why the company prioritized Cell and XBox 360 development over PC development? Absolutely. It's not just the fact that the majority of games are console games, it's the nature of the beast. Consoles are much more difficult to program; it's harder to get high performance out of them.

Making fundamental changes to the way a program multithreads or evaluates SIMD instructions is a complex process that could easily break backwards compatibility. I think we'll know a lot more once the new SDK is out.

realneil 4 years ago

"The best way to encourage people to buy NVIDIA GPUs is to ensure that the special effects are amazing and only available to NVIDIA customers. Optimizing PhysX to run on an x86 CPU potentially dilutes the attractiveness of an NVIDIA GPU,"


My belief is that the best way to get people to buy your products is to build a better product. (like the awesome new GTX460 series)

If PhysX was opened up and tweaked to take advantage of all of today's techno-advances, life would be better for us, the consumers, and much ill-feeling against NVIDIA would go away. I realize that they bought the fledgling concept in that company, but they closed it off to others and crippled it for their own benefit too. This was a lousy public relations move and has turned friends into foes.

Joel H 4 years ago


You've oversimplified the situation. First, NVIDIA has spent a great deal of money "building a better product." As this article stated, the next version of CUDA will incorporate more SIMD instructions and will generally be more efficient. At the same time, as you've noted, the company has worked towards releasing better iterations of Fermi at various price points.

We know NVIDIA has sunk a lot of money into CUDA and PhysX development. We know they've sunk a ton of money into future video card designs. Yes, NVIDIA wants developers to use CUDA and PhysX, but it's not as if the GeForce series has been crippled when it comes to running DirectCompute or OpenCL.

I'm not claiming that NVIDIA's approach to hardware PhysX adoption is the best possible one--but it's quite inaccurate to paint NVIDIA as opting for FUD as opposed to real improvements.

crowTrobot 4 years ago

I know Nvidia gets a lot of hate but c'mon lets get real here. 

1.) They own PhysX; whatever they want to do with it is their prerogative.

2.) They aren't blocking DirectCompute and a lot of these effects can be done through that now.

3.) They aren't blocking OpenCL either and they are part of Khronos group.

4.) If there is any physics engine right now that performs as well as PhysX and is as easy to implement in a game, I'd love to see it, but frankly there are none. The truth is that Nvidia is the only company right now actually putting its money where its mouth is in terms of advancing physics. Bullet has been all but abandoned (not surprisingly); the lead guy left development and it's in limbo once again, although I'm amazed at the lack of flak ATI is getting for constantly promising things and not delivering.

5.) All other physics engines are way behind PhysX in terms of technical advancement and its going to take a long while before they all catch up if they do (except for Bullet, unless ATI invests a major amount of time and resources on it in terms of developers and not just publicists coming out every few months reminding people they are working on an Open approach that never manifests into anything tangible).

acarzt 4 years ago

[quote user="crowTrobot"]

If there are any physics engine right now that performs as well as PhysX and is as easy to implement in game as PhysX I'd love to see it.
[/quote]




crowTrobot 4 years ago

[quote user="acarzt"]

[quote user="crowTrobot"]

If there are any physics engine right now that performs as well as PhysX and is as easy to implement in game as PhysX I'd love to see it.
[/quote]

[/quote]







Well, that demo actually doesn't show that it performs better than PhysX. :P

Granted, that video is 3 years old now, but those are just basic collision physics. More impressive would be particle manipulation to simulate water or gas, for example. The skeletal animation on that thing looks very basic too. Also, since Havok is owned by Intel, it's only CPU-implemented, vs. GPU parallelization with PhysX.

The best example of the current state of Havok physics is probably Just Cause 2, although it doesn't have hundreds of individually reacting physics particles on screen at the same time (debris disappears after two seconds) like something such as Mafia 2 or Batman. Even the impressive-looking water in Just Cause 2 was implemented using CUDA (though not PhysX directly; it mostly shows the limitations of Havok when handling fluid mechanics).


realneil 4 years ago

[quote user="Joel H"] Real, You've oversimplified the situation. [/quote]

I have a simple perspective; I'm not anywhere near as knowledgeable as you are about this stuff. I look at it as 'what I'm experiencing when I play games'. I've tried turning on PhysX with my 2GB GTX285 and my 1GB HD5850 card in both computers to no avail; it just doesn't work. So here is something that everybody's crowing about that I can't touch with either of my several-hundred-dollar video cards. Then I go online and read about it a little (mind that I don't entirely understand the technical aspects) and lots of supposedly knowledgeable people are bitching that NVIDIA has crippled the technology unless you shell out big money to them, and that they're trying to strong-arm developers into adopting their proprietary tech within their games. (leaving a lot of us to wonder what it looks like)

So I resent it. It's simple.

BTW: This doesn't mean that I won't buy from them. (the ultimate expression of financial love) I have my eye on a pair of those GTX460's for an SLI box. Price, performance and the inclusion of those technologies that I've been missing drive this decision.

Joel H 4 years ago


What you've just said is actually a lot more helpful when it comes to possibly solving that problem. :) You say you have a GTX 285 (2GB) and a 1GB Radeon HD 5850. Are you using both of them in the same system? I'm assuming not--I don't think the 5850 is powerful enough to trounce the GTX 285. (Some people on Win 7 use a hacked driver to run a secondary GeForce for PhysX while simultaneously using a higher-end ATI card.)

There's no reason PhysX shouldn't be working on your GTX 285. There are several possibilities here:

1)  You've got a driver issue/bug that's keeping PhysX from executing properly.

2)  You're confused about the difference between software and hardware PhysX. The term "Software PhysX" refers to PhysX calculations performed by the CPU. There are something like 150 games that use software PhysX--NVIDIA has adapted it for the PS3, XBox, Wii, and iPod.

Hardware PhysX titles are much rarer. To date, Batman: Arkham Asylum, Mafia 2, Unreal Tournament 3, and Mirror's Edge have been the go-to titles for good game play and high-level PhysX support. Software PhysX titles do not take advantage of NVIDIA hardware.

This is a distinction that's not easy to clarify; I frankly wish NVIDIA had used two distinct terms for their software PhysX engine (which runs on many architectures) and the hardware PhysX execution that's particular to GeForce cards.

Here's the final caveat that makes the situation more nuanced than it looks at first glance. There are only a handful of hardware PhysX titles, there's an even smaller handful of those titles that are top-notch, and yes, NVIDIA is using a restricted, proprietary API. Look around the industry, however, and you'll see that NVIDIA is also the *only* company that's sunk real work into defining, developing, and using a hardware physics standard.

ATI periodically makes some noise in this general direction but has never released hardware physics support for any shipping title. Microsoft's DirectCompute and OpenCL are both available, but we haven't heard of any games adopting these options in hardware. That's why NV gets credit--however small their success, they remain the only company that's put a major, multi-year push behind hardware physics development.

Now, back to your situation. If you're having a problem that isn't covered by one of the two examples above, I'd be more than happy to work with you to get the GTX 285 running as it should. We can talk here, or you can drop me a PM/email.


crowTrobot 4 years ago

[quote user="Joel H"]

ATI periodically makes some noise in this general direction but has never released hardware physics support for any shipping title. Microsoft's DirectCompute and OpenCL are both available, but we haven't heard of any games adopting these options in hardware.[/quote]

Lost Planet 2 uses DirectCompute for enhanced water physics and soft-body computation, although the framerate hit is much larger than with PhysX (because there is only a single option to turn DX11 features on or off, and it includes tessellation). Also, the water physics implementation in that game looks horrible, since it's only a shallow-water effect rather than particle-based.

Joel H 4 years ago

Technically I stand corrected. From your description, however, it sounds pretty awful.

realneil 4 years ago


The GTX285 is in the PC I'm using today. It has an i5-750, 8GB of DDR3-1600 RAM, an OCZ Agility 60GB SSD, and a pair of Seagate 750GB 7200RPM drives. This is the "little" box, so to speak.

I have UT3 and have added the PhysX mod to the game. I've updated the game completely as far as I know. When I try to run the PhysX maps, it slows to a crawl right away. (totally unplayable and not fun)

At the moment, I'm running the latest driver from NVIDIA. (I do a weekly search for all of the computers for drivers just to stay busy) I have tried several.

After a couple of months of screwing around with this situation, swapping the cards around between this, the i7-870, and the AMD X3-720 system, trading the two types of DDR3-1600 RAM that I own, Formatting the drive and installing the game only on the systems, I just gave up in frustration.

I even bought a new Gigabyte GT240 1GB-GDDR5 OC edition PCI-E card to add to the system to act as the PhysX processor,.....to no avail. (it's now sitting on the shelf collecting dust) Adding another card to the PCI-e bus chops my video bandwidth from one @ X16 speeds to dual X8 speeds on these two LGA-1156 socket systems, but according to some sites out there, it should still work. It did not.

EDIT: I just tried it out again after posting this, and it played fine for about 30 seconds and then slowed down to a choppy crawl.

acarzt 4 years ago

[quote user="realneil"]

EDIT: I just tried it out again after posting this, and it played fine for about 30 seconds and then slowed down to a choppy crawl.
[/quote]


 Properly implemented Physx can be very efficient while still offering some very nice effects.

Metro 2033 is a perfect example. Enabling PhysX offers little to no performance hit, yet provides a noticeable difference in visuals.

realneil 4 years ago

[quote user="acarzt"]Properly implemented Physx can be very efficient while still offering some very nice effects.[/quote]

So it's the UT3 game that's stuffed?

Joel H 4 years ago


No. I've personally played UT3's PhysX maps with a GTX 260. Based on what you've described, it's not that PhysX doesn't work--the code is being run on your CPU rather than your GPU.

The simplest thing to do would be to try a different program. The Mafia 2 demo supports PhysX, Batman: AA has a good implementation and can be downloaded here:  http://www.nzone.com/object/nzone_batmanaa_downloads.html

The reason I'm suggesting we check a different title is because it'll help nail down whether or not the problem is in-driver (I suspect it is).

Three simple questions for you--I'm assuming you've done all of these, but it's always best to cover the bases when troubleshooting.

1)  The new NVIDIA drivers allow you to explicitly choose how PhysX is executed. Have you selected to run PhysX on the GPU? (If it says Auto, try changing it to the GPU explicitly).

2)  In Unreal Tournament 3, have you enabled hardware PhysX using the toggle inside the game menus?

3) You say you've tried multiple driver revisions. How do you install and uninstall drivers? Describe the steps you take, please.

We'll sort this out.


realneil 4 years ago

Point #1) was set to Auto, so I changed it to use just the GTX285. The game worked smoothly for a longer time (one minute), but still reverted to jerkiness (crappy frame rate). When I had the two NVIDIA cards in the system, I had it set to use the GT240 1GB card with no decent result; it was even worse.

Point #2) I have the hardware PhysX turned on in the game and have had it on all along.

Point #3) I usually just install the latest released driver onto the computer when I download it. I do not do an uninstall first.


Thanks for the help

Joel H 4 years ago

[quote user="realneil"] I usually just install the latest released driver onto the computer when I download it. I do not do an uninstall first. [/quote]

Yeah...that's almost certainly the problem, and just running NVIDIA's removal tool isn't good enough. What OS are you running? I can't guarantee that this is your issue, but I'd lay good odds--75%--that not uninstalling drivers first is the problem. You've got to do this the right way or else you'll be screwing with it for hours.

realneil 4 years ago

I have Enterprise Win-7 64Bit on all of them.

I have Revo Uninstaller on the systems too. Should I use that?

Joel H 4 years ago

I don't know anything about the Revo Uninstaller, it may or may not do what we want it to. I'm going to tell you exactly what I do when I'm cleaning up and reinstalling drivers.

Step 1:  Download the driver that you're going to use.

Step 2:  Drop by Guru3D and download the latest version of Driver Sweeper. Make sure you grab the 2.xx version of the program, not the 1.xx. Install it.

Step 3:  If you have an unzipped set of NV drivers sitting in a directory on your C:\, delete them. You don't need to re-download the driver install file if you have the one you want, but delete the unpacked files.

Step 4:  Remove your current driver using NVIDIA's applet in the Add/Remove Programs panel. Also remove any additional programs listed separately, including NVIDIA PhysX.

Step 5:  Reboot and enter Safe Mode.

Step 6:  Run Driver Sweeper. Clean off all the NV drivers on your system.

Step 7:  Open a CMD window.

Step 8:  Move to C:\Windows\System32.

Step 9:  Type "del /s nv*.*"

Step 10:  Reboot.

Step 11:  If we've done all this correctly, Windows will *not* auto-reinstall the old driver set (we've killed all the places that driver set could be).

Step 12:  Run the driver install package and install everything normally.

Step 13:  Success (hopefully).

In theory, NVIDIA's uninstaller should be good enough for you to flawlessly install new drivers. Don't trust theory. This method, done correctly, eliminates any chance of version conflicts by ensuring all NV driver files are wiped prior to driver installation.


realneil 4 years ago


Tried all of this and it went according to plan, but it didn't help any. The PhysX problems remain the same in UT3.

D/L'd Batman,......will install in the morning to check it out. Maybe the Metro game will work correctly. We'll see about that one too.

Joel H 4 years ago

You wouldn't notice it in Metro 2033; the PhysX effects are *very* subtle. Install the patch for Batman as well to make sure you can activate PhysX from the game's control panel.  If you google for PhysX Batman Arkham Asylum I'm certain you'll find videos of areas where you can see PhysX on vs. off. It's very noticeable. With PhysX, you get ground fog, outgassing, much more debris and bats/birds, etc.

acarzt 4 years ago

Also, when you walk through leaves and papers on the ground, they move around; with PhysX off, everything just sits there.

You will take a hit in Batman AA with physx on tho lol

But for me it was still very playable :-)

The biggest visual difference in Batman is when you face Scarecrow. The difference is HUGE.

Joel H 4 years ago

Actually, assuming that the demo has the same benchmark as the full game, all you need to do is run it. With PhysX on vs. off you'll see huge differences.

realneil 4 years ago


Batman seems to install properly,....but when I try to run it, this is what I get right away.

So It looks like I will not be trying Batman out anytime soon,.......sigh.

BTW: I just reinstalled MS Net Framework and Silverlight too. No change.

I'm downloading a Mafia-II demo and the Metro2033 demo right now. Will try them out.

If neither one of them plays properly, I may just back up my files, reload the OS, and start fresh to see what my results are.


Error Message:


"Microsoft Net Framework"

"Unhandled exception has occurred in your application"

"Value was either too large or too small for an INT32"

Exception Text **************
System.OverflowException: Value was either too large or too small for an Int32.
   at System.Number.ParseInt32(String s, NumberStyles style, NumberFormatInfo info)
   at BmLauncher.SystemInfo.GetPropertyInt(ArrayList info, String propertyName, Int32 property_index)
   at BmLauncher.SystemInfo.GetPropertyIntBest(ArrayList info, String propertyName, Boolean select_highest)
   at BmLauncher.SystemInfo..ctor()
   at BmLauncher.Form1.Initialise()
   at BmLauncher.Form1.OnLoad(Object sender, EventArgs e)
   at System.Windows.Forms.Form.OnLoad(EventArgs e)
   at System.Windows.Forms.Form.OnCreateControl()
   at System.Windows.Forms.Control.CreateControl(Boolean fIgnoreVisible)
   at System.Windows.Forms.Control.CreateControl()
   at System.Windows.Forms.Control.WmShowWindow(Message& m)
   at System.Windows.Forms.Control.WndProc(Message& m)
   at System.Windows.Forms.ScrollableControl.WndProc(Message& m)
   at System.Windows.Forms.ContainerControl.WndProc(Message& m)
   at System.Windows.Forms.Form.WmShowWindow(Message& m)
   at System.Windows.Forms.Form.WndProc(Message& m)
   at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m)
   at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
   at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)

************** Loaded Assemblies **************
    Assembly Version:
    Win32 Version: 2.0.50727.4952 (win7RTMGDR.050727-4900)
    CodeBase: file:///C:/Windows/Microsoft.NET/Framework/v2.0.50727/mscorlib.dll
    Assembly Version:
    Win32 Version:
    CodeBase: file:///F:/Program%20Files%20(x86)/Eidos/Batman%20Arkham%20Asylum%20Demo/Binaries/BmLauncher.exe
    Assembly Version:
    Win32 Version: 2.0.50727.4927 (NetFXspW7.050727-4900)
    CodeBase: file:///C:/Windows/assembly/GAC_MSIL/System.Windows.Forms/
    Assembly Version:
    Win32 Version: 2.0.50727.4927 (NetFXspW7.050727-4900)
    CodeBase: file:///C:/Windows/assembly/GAC_MSIL/System/
    Assembly Version:
    Win32 Version: 2.0.50727.4927 (NetFXspW7.050727-4900)
    CodeBase: file:///C:/Windows/assembly/GAC_MSIL/System.Drawing/
    Assembly Version:
    Win32 Version: 2.0.50727.4927 (NetFXspW7.050727-4900)
    CodeBase: file:///C:/Windows/assembly/GAC_MSIL/System.Management/

Joel H 4 years ago

I'll dig through this and see what I can do. In the meantime, try asking Win 7 to run the game in compatibility mode. You are running the final retail of Win 7 Enterprise with all patches and updates installed?


EDIT:  http://forums.eidosgames.com/showthread.php?t=92273


Hopefully that's your solution. I found that by Googling "Value was too large or too small for an Int32" + "Batman."  Updating your .NET framework is always a good idea.

realneil 4 years ago

[quote user="Joel H"] You are running the final retail of Win 7 Enterprise with all patches and updates installed? [/quote] Yes, I keep them all up to date.

[quote user="Joel H"] http://forums.eidosgames.com/showthread.php?t=92273 [/quote] This worked and I was able to launch the game normally. I just spent a while playing the demo, and it works just fine. There is no in-game enabler for PhysX that I could see, so I'm not sure whether it's on or not. The game did look rich to me. I'll install it onto the PC with the Radeon card in it and play it there to see what the difference may be.

I appreciate the help Joel.



Joel H 4 years ago

It's not in-game. If I recall correctly, you navigate to the game's directory, right-click on the icon, and choose setup/configure/something-or-other.


EDIT:  Make sure you have installed the smaller patch that was linked on the same page as the demo.

Further Edit:  I don't know that this is true, but it's entirely possible that not having the .NET framework 3.5 installed is what caused your UT3 problems. So I'd check that again, too.

realneil 4 years ago


OK, PhysX is definitely on in Batman; now I know what the heck it looks like. (cool) UT3 PhysX is still broken, though. I have it all updated and reapplied the PhysX patch just now, but it's no good. I don't have any cheats enabled, or even on the computer anywhere, so I don't know what's up with it.

As I said on another post, I may come into a little money in about 5 or 6 weeks. If it happens, I plan to get a good AM3 SLI board in a combo deal with the Phenom Hex-Core Black. I have some memory in mind that's said to be perfect for the CPU I like. Two GTX460's in SLI will round it off nicely, but I haven't decided on the ones I want yet. I have an Asetek water cooler on the shelf for it, and I'll use one of my CoolerMaster RC690 Centurion cases.

Once this one is put together I'll try it again.

Thanks again for the help and suggestions.



Joel H 4 years ago

Alright, good! For what it's worth, the PhysX in Batman is, IMO, better than what you see in UT3. Aside from recommending you look around on Google, however, I don't know what the difference would be. I thought PhysX in UT3 was only in certain maps, but the game's been patched quite a bit from where it was originally.

At any rate, I'm glad you've at least gotten to see the feature.


Ok, I have to ask. Why are you getting a good AM3 board + CPU if you've already got that insane rig of yours? A six-core Thuban isn't going to outmatch a quad-core + HT Core i7 that's been overclocked to 3.83GHz. Not unless you can squeeze a pretty hefty chunk of OCing out of the Thuban, anyway...

realneil 4 years ago

[quote user="Joel H"] *pause*  Ok, I have to ask. Why are you getting a good AM3 board + CPU if you've already got that insane rig of yours? [/quote]

Second system.

I like to have two or three nice performers here for when I have one of the kids or grandkids over. We hook 'em up together and play.

The i5-750 will go to my son in Vegas as a surprise gift, and the Phenom X3-720 may go to Ohio to one of my grandsons, also a surprise.

My wife thinks that I spoil the kids and their kids.

Guess what? I just tried the PhysX maps again and they work. How, I don't know, and I don't care either. UT3 is my favorite game, and it just got better.



crowTrobot 4 years ago

Do you have Steam? Sometimes it updates itself if it's on the games list (happened to me with Mafia 2; it was really sluggish at first run).

I also had problems with Batman, but it was because I didn't know I had to reinstall the PhysX software every time I switch cards, just like reinstalling drivers.

realneil 4 years ago

[quote user="crowTrobot"] Do you have steam? [/quote]  

Yes, I do, but I don't register any non-Steam games with them (haven't yet, anyway). What am I missing, if anything?

[quote user="crowTrobot"] I didn't know I had to reinstall physx software everytime I switch cards like I'm reinstalling drivers. [/quote]

Neither did I, but this box has always had the GTX285 in it since I bought the SSD drive and started with a fresh install of 64bit Win-7.

PhysX is working now on it and I'm thinking that the difference is in the PhysX software that I downloaded yesterday. I went to the NVIDIA webpages and got the latest, greatest version and re-installed it. (along with everything else) Now it works.

I had downloaded it from the same site before. Everyone else had no problems making it work from the get-go, but it didn't work properly for me until yesterday.

Joel H 4 years ago


Drivers are funny things. Think about it this way: the NVIDIA auto-installer is *supposed* to be the only program you need to run. It offers you a chance to uninstall NV drivers, PhysX, or everything (you always want to pick everything unless you actually have an nForce board).

For reasons generally unknown, however, this sort of uninstall often *doesn't* get all the driver cruft. That's why programs like DriverSweeper exist. Yet even after you ran DriverSweeper, I still had you do a manual delete of all the files with the nv*.* prefix. I did that because even after DriverSweeper runs, there are still some files left over. People with nForce boards might not want to do that last step.

I've had success using the standard NV uninstall + install new. I've had better success using DriverSweeper as well before installing the new drivers. And in a few cases, I've had to do the manual deletion step before the new drivers installed properly.
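That last manual step can be sketched as a dry run — the function below just lists candidates instead of deleting them, so you can eyeball the list first. The function name and the System32 path are only examples; the folders that actually collect driver leftovers vary by Windows version:

```python
import glob
import os

def find_leftover_driver_files(directory, pattern="nv*.*"):
    """Dry run: list files matching the nv*.* prefix in one directory
    without deleting anything."""
    return sorted(glob.glob(os.path.join(directory, pattern)))

# Example: print candidates for manual deletion (path is illustrative).
for path in find_leftover_driver_files(r"C:\Windows\System32"):
    print(path)
```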

Why? No idea.

New drivers don't always install new PhysX, but you should always reinstall PhysX when you remove and then re-install drivers.

realneil 4 years ago

[quote user="Joel H"]Drivers are funny things.[/quote]

Even after all of what we did, it didn't work properly until much later in the day, and that was after several reboots, too.

Who knows why?

I'm just glad it works now.
