The State of DirectX 10 - Image Quality & Performance

World in Conflict: Performance

Considering how good the game looks, you would naturally suspect that World in Conflict requires some serious power to run, but that isn't the case. World in Conflict's system requirements are actually pretty moderate. Requiring only a 2GHz P4-class single-core processor, 512MB of RAM (1GB for Vista) and a 128MB DX9c-compliant video card with Shader Model 3.0 support, World in Conflict should be able to run on most gaming-oriented machines made in the last couple of years. As we saw on the previous page, World in Conflict has a couple of nice DX10 image quality enhancements. So far we've reserved final judgment on whether DX10 is worth it, but now it's time to find out whether there is a performance cost associated with rendering all those soft particles, cloud shadows and volumetric lighting effects.

World in Conflict System Requirements

For our benchmarks, all graphics settings were turned up to their highest level. Anti-aliasing was turned on and set to 4X while anisotropic filtering was set to 16X. Vertical sync was manually disabled in-game as well as forced off in the graphics driver options. World in Conflict can be toggled between DX9 and DX10 rendering with an option in the video settings menu, although a game restart is required before changes take effect. Note that we used the demo version of the game for all of our tests.

Like Company of Heroes, World in Conflict has a built-in, in-game benchmark test. The test in the demo version of the game consists of a flyby over mission 3 of the single-player campaign. During the flyby, different graphical aspects of the game are demonstrated while a large battle takes place across the map. We found the results of this built-in benchmark to be a good indication of what typical in-game performance would be like.

For our tests, we ran the built-in benchmark tool five times per resolution, per video card. Benchmark runs that resulted in strange values that did not correlate with the rest of the results were attempted a second time.
We then averaged the results for each resolution to obtain our final results.
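The averaging procedure described above can be sketched in a few lines of Python. The outlier threshold, function name and sample numbers here are our own illustration of the idea (flag runs that deviate sharply from the rest, then average the remainder), not the article's exact criteria:

```python
from statistics import mean, median

def average_runs(fps_runs, tolerance=0.25):
    """Average benchmark runs, flagging outliers for a re-run.

    A run is treated as an outlier if it deviates from the median
    by more than `tolerance` (as a fraction of the median). The
    threshold is illustrative, not the article's actual rule.
    """
    med = median(fps_runs)
    outliers = [f for f in fps_runs if abs(f - med) > tolerance * med]
    kept = [f for f in fps_runs if abs(f - med) <= tolerance * med]
    return mean(kept), outliers

# Hypothetical runs for one card at one resolution: the fourth run
# clearly doesn't correlate with the rest, so it would be retried.
avg, flagged = average_runs([31.2, 30.8, 31.5, 18.4, 31.0])
```

In practice, a flagged run would simply be repeated and its replacement folded into the average for that resolution.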

World in Conflict Performance Comparison
DirectX 9 and DirectX 10 Performance Compared


With one exception, the results of all the video cards followed the same pattern: DirectX 10 performance seriously lags behind DX9 performance, although not by as much as in some of the other games we've seen. The one exception to this trend is the Radeon 2900 XT, which seemed indifferent to whether the game was running in DX9 or DX10. In fact, the 2900 XT actually performed, on average, 1 frame per second better in DX10 than in DX9. We found this behavior quite odd, since the Radeon 2600 XT did not share the 2900 XT's indifference. Still, this is good news for 2900 XT owners, who pay absolutely no penalty for enabling DX10 rendering.

The 8800 GTX remains the overall top performer. In DX9, the 8800 GTS comes in second with the 2900 XT a close third; in DX10 the two swap places. The 8800 GTX was playable at all times, and we didn't notice any slowdown, even at 1920x1200 with DX10 rendering enabled. While the 8800 GTS and the 2900 XT don't perform quite as well, often dipping below the 20FPS mark, both remained fairly playable at all resolutions in both DX9 and DX10.

Unfortunately, our two mid-range cards posted unplayable results in both DX9 and DX10 at the settings we chose. The 8600 GTS did manage an average of 18FPS in DX9 at 1280x1024, which was playable, though prone to stutters and sharp frame rate drops. We'd like to remind you that these results were obtained with the game in its highest image quality mode; it is entirely possible to play World in Conflict smoothly on both of our mid-range cards at lower video settings.
