HH: Let's move away from CPUs and talk graphics. Ghostbusters is a gorgeous DirectX 9 title, with nary a trace of DX10 support. Obviously DX9 support is most important—all three current-generation consoles use DX9-class GPUs—but is there a specific reason why Terminal Reality didn't implement DX10/10.1 for its PC release?
MR: According to Steam, 75% of the market is using DX9. We are a small company, and programming for the handful of people who have DX10 or above right now just doesn’t make sense, especially since you won’t see any visual differences. We have played with Windows 7 quite a bit – believe the hype, it is a huge improvement over Vista and a good improvement over XP. So you’ll be seeing DX11 support shortly in Infernal.
HH: There are plenty of games on the market that support DX10, but using it almost always imposes a significant performance hit. Why do you think this occurs, and will DX11 address it?
MR: It took a while for hardware manufacturers to create hardware and drivers that are optimized for DX10. If you run those games on that set of hardware, you will see a significant performance increase. (I've yet to see a DX10 game that didn't significantly lower performance compared to DX9, and that includes top-end hardware--Ed) We also see significant performance gains running Ghostbusters (which uses DX9) on Vista or Windows 7 with hardware designed for DX10.
HH: Does Ghostbusters take advantage of the hardware tessellation capability of the Xbox 360? We've yet to see a PC title that does, although that should change with DX11. What sort of graphical tricks does it allow you to pull off today, and what do we on the computer side of gaming have to look forward to?
MR: No, we don’t use the hardware tessellation unit on the 360. We just brute force high resolution meshes, such as 40,000 polygons for a Ghostbuster. It is difficult enough for artists to make a high resolution mesh bend, turn, and animate without having to deal with tessellation now.
HH: Currently, there's no way to force a video card to perform hardware anti-aliasing in Ghostbusters. A sort-of AA "trick" can be set within the game's configuration file, but the performance hit from doing so is severe. Is there any chance we'll see improved AA support in an upcoming patch? Ghostbusters is a gorgeous game, but the proton pack, in particular, would look much cleaner if AA were available.
MR: No, that is an incorrect statement. You can set Ghostbusters to do real, full high-quality anti-aliasing in the settings.ini file. You will need a higher-end video card, like a Radeon 4850 or a GeForce GTX 280, to enable it. Set the multiple from 1 to 2. Don’t go higher than 2 in each direction unless you have gigabytes of VRAM available to you. Due to how Ghostbusters renders on the PC, you would need DX11 to get MSAA working...
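(For readers wondering why the VRAM requirement climbs so quickly: a multiple of 2 in each direction quadruples the pixel count, and 4 in each direction is sixteen times the pixels. The back-of-the-envelope sketch below makes that concrete; it assumes a single render target of 4 bytes of color plus 4 bytes of depth/stencil per supersampled pixel, which understates what a real engine allocates--Ed)

```python
# Rough floor on render-target memory at an NxN supersampling multiple.
# Assumes 8 bytes per pixel (4 color + 4 depth/stencil); real engines
# keep additional buffers, so actual usage will be higher.
def supersample_vram_mb(width, height, multiple, bytes_per_pixel=8):
    pixels = (width * multiple) * (height * multiple)
    return pixels * bytes_per_pixel / (1024 ** 2)

# A 1920x1200 display at the recommended multiple of 2 per direction:
print(round(supersample_vram_mb(1920, 1200, 2)))  # ~70 MB for one color+depth target
# The same display at 4 per direction quadruples that again:
print(round(supersample_vram_mb(1920, 1200, 4)))  # ~281 MB
```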
HH: Are there any other Infernal Engine-powered titles in development from either Terminal Reality or the engine's licensees?
Joe Kreiner: Yes, lots. Can’t talk about them though. The ones we’ve announced are “The Hunt” and “The Strike” from Piranha Games, and “Cook or be Cooked” from Namco Bandai.
HH: Any chance we'll see multiplayer support patched into the PC version of the game?
JK: That’s an Atari decision. There is a significant cost involved with any feature that goes into a title, and that has to be weighed against potential benefit. I’m glad Atari chose to ship Ghostbusters on the PC. Lately, many similar-style games have ignored the platform.
HH: You Know We Have to Ask: You're sitting down with some friends in front of a 60" TV, getting ready to play Ghostbusters. Are you playing on a PC, Xbox 360, PS3, or Wii?
MR: I switch between the PS3, 360, and PC versions all the time. Each one has its respective strengths.
Thanks to Terminal Reality for the interview--and for finally bringing a Ghostbusters game to market.