|The Promise of DirectX 10|
DirectX 10 (DX10) has been one of the hottest topics for discussion and news coverage since the first DX10 compliant hardware appeared in the second half of last year. Touted as the biggest milestone in games development since programmable shaders were introduced with DirectX 8, nearly seven years ago, DX10 has generated a lot of buzz. Unlike the older versions of DirectX which were each built on top of the previous version, DirectX 10 is a completely new beast. With Windows Vista, Microsoft fundamentally changed the way drivers are designed, and they also completely redesigned DirectX from the ground up.
Before DirectX 10, each new version of DirectX was an incremental improvement over the previous version and it was also backwards compatible. This meant that many of the limitations of the previous versions were carried forward to each new version of DirectX. Microsoft broke this cycle by completely redesigning DirectX 10. The overhaul of both DirectX and the way drivers work in Vista is so complete that Vista actually comes with multiple versions of DirectX in order to support DX10 while still remaining backwards compatible with older software.
While DirectX 10 has seen heavy coverage both in the press and in forum discussions across the 'net, most of the discussion has been centered around the potential of DX10 since, at least initially, no one had any actual real-world, reproducible performance data. Since the first DX10 game didn't appear until June, nearly seven months after DX10 hardware first hit the shelves, no one had any idea how DX10 hardware and software would perform until rather recently. Due to its reliance on Vista's new driver model, DirectX 10 is only available for Vista and there are no plans to make a version available for Windows XP. Thanks in part to the relatively slow uptake of Vista, especially in gaming circles, developers didn't have a huge incentive to implement DirectX 10 in their games and as a result, games with DX10 support have been somewhat slow in coming.
One of the biggest issues holding back the maturation of DirectX 10 is the need for DirectX 9 support for the immediate future. It will take years for DX10 hardware to be ubiquitous and until then developers will be unwilling to alienate the section of the market that still uses DX9 hardware by releasing a DX10 exclusive game. This forces developers to compromise between DX9 and DX10 and currently the logical choice is to lean towards DX9 since most of the hardware out there today doesn't support the newer API. However, it's been nearly a year since the first DX10 hardware appeared and there are now several DX10 capable games on the market. And it looks like things are about to really pick up.
This holiday season is shaping up to be one of the most exciting for PC gamers in years, with dozens of highly anticipated PC games set to be released. Hotly anticipated titles like Crysis, Hellgate: London, Unreal Tournament 3 and the PC version of Gears of War are all due to arrive in the coming months, and they share another thing in common: they all feature DirectX 10 support. In fact, the holiday release cycle has already started and two highly anticipated DX10 games, Bioshock and World in Conflict, have already been launched. With all of these big holiday releases right around the corner, we think it's about time we looked at the current state of DirectX 10 and answered the big question: are we ready?
|Scope, Test Setup & Methodology|
As previously mentioned, we intend to explore the state of DirectX 10 in this article. That's a very tall order and one we cannot hope to fulfill unless we set out some boundaries and determine a specific scope for our examination. It would be nearly impossible, and of questionable value, to cram an evaluation of DirectX 10 from the perspective of everyone affected by its introduction -- game developers, hardware manufacturers and end consumers alike -- into a single article.
To help keep the article at a manageable scope, we are only going to explore the current state of DX10 from the perspective of the end consumer. Anything that is or should be transparent to the consumer, such as API optimizations that let developers program more efficiently, will not be covered. Note that this is not necessarily the same as exploring the current end user experience. We want to focus on examining the potential of currently available DirectX 10 hardware and software. We will focus our attention on two fronts: performance and image quality. Specifically, we are interested in examining the differences in performance between DirectX 9.0c and DirectX 10 and the image quality enhancements of DirectX 10, if any.
Furthermore, we will only explore the performance and image quality of currently available games on currently available hardware. We chose five video cards and five games that we believe will best represent the currently available selection of DirectX 10 hardware and software for use in our testing. However, we decided not to include any entry level DX10 hardware, because today's offerings are not well suited to cutting-edge gaming.
ATI vs NVIDIA: While the purpose of this article was to look at DirectX performance, some people will no doubt try to draw conclusions about which brand is better from our data so we might as well address the issue. First and foremost, we want to point out again that we made no effort to create a fair comparison between ATI and NVIDIA in our tests. The test setup for each game was chosen to try and illustrate how current generation graphics hardware will handle upcoming DX10 titles and the five video cards in our test were chosen primarily because we thought they represented the most relevant price points at the time the article was conceived.
Performance Tests: For all of our performance tests, we used a single system configuration so that the benchmark results for each video card can be directly compared in each game. Since we are much more interested in performance differences rather than absolute performance, we were not as concerned with using a configuration representative of the average gamer. This allowed us to use a high-end configuration, which meant we could enable all of the DX10 exclusive options available to each game. We chose to enable DX10 exclusive rendering options for all DX10 benchmarks rather than using an 'apples-to-apples' approach -- where DX10 rendering is enabled but DX10 exclusive options are not -- because we believe that not enabling the DX10 exclusive options would largely miss the point.
However, apples-to-apples tests are not without merit. As previously noted, one of the promises of DX10 is improved performance and efficiency compared to DX9 in certain situations. These efficiencies could result in better overall performance in an apples-to-apples situation where DX10 rendering proves to be faster than DX9 rendering. While we did not perform a full set of apples-to-apples benchmarks for every game, we kept an eye out for games that showed performance benefits under DX10 during our testing and performed additional apples-to-apples comparisons when necessary.
While our focus is on the current state of DirectX 10, we wanted to keep an eye on the future and how well the current batch of DX10 capable hardware will be able to handle upcoming titles. After all, many of the most hotly anticipated DX10 capable titles have yet to be released. In order to attempt to simulate the level of strain that future titles may put on hardware, we performed our tests with each game set to a high level of image quality with anti-aliasing and anisotropic filtering enabled wherever possible. Specific video settings and other test information for each game can be found on their respective pages.
A Cautionary Note Regarding Performance Test Results: Generally, we used the highest settings possible for each game, even if cranking a specific setting doesn't necessarily result in an appreciable image quality improvement. Usage of high video settings meant that our test system was put under a considerable amount of strain and performance numbers are expected to be on the low side. We designed our tests to illustrate the competency of currently available DX10 hardware and games and the test results may not necessarily represent a typical end user experience.
Image Quality Tests: For each of the five games in our test, we compared the image quality of DX9 rendering and DX10 rendering and attempted to find as many differences as possible. All image quality tests were performed with a GeForce 8800 GTX video card and the same test system configuration used in the performance tests. The video settings in each game were cranked up to their maximum levels and anti-aliasing as well as anisotropic filtering were enabled wherever possible.
Screenshots of each game using both DX9 and DX10 rendering were taken with Fraps at a resolution of 1920x1200. The screenshots were saved by Fraps in bmp format and later reformatted as jpegs with Adobe Photoshop CS3. The screenshots have been resized (maintaining aspect ratio) and/or cropped but not altered in any other way. While using jpeg does introduce a certain amount of compression artifacts to the images, we felt that they did not interfere with illustrating any of the image quality differences we are trying to point out in our direct-comparison images, where a single image displays a part of a scene in both DX9 and DX10 side-by-side in order to focus on a specific difference in image quality. After all, if the difference is so subtle that simple jpeg compression artifacts could prevent the difference from being noticed in a direct side-by-side comparison, how are you supposed to notice it in-game?
|Bioshock: Image Quality & Features|
Bioshock is a highly anticipated first-person shooter from 2K Games. Dubbed the spiritual successor to the highly acclaimed System Shock series, Bioshock presents a compelling blend of traditional first-person action with RPG customization options and open-ended gameplay elements, all set in a self-sufficient underwater dystopian city on the mid-Atlantic seabed in 1960. With all the praise and attention this game has garnered, and considering that the game supports DirectX 10, we couldn't afford not to check it out. We also have sufficient technical reasons to include this game in our look at the state of DirectX 10. Bioshock is one of the first games built on the Unreal 3.0 engine to be released on the PC this year, but it certainly won't be the last. There are several highly anticipated Unreal 3.0 based titles coming this holiday season and Bioshock should give us a preview of what they will offer.
The key features of this game are definitely its excellent plot and engrossing atmosphere, achieved in no small part through the game's stellar graphics. From the sea water leaking in through cracks in the walls to the excellent use of particle effects in explosions and ominous shadows, Bioshock's graphics will impress. An interesting design decision that is important to note for our image quality comparisons is that Bioshock is designed to use 4x Anisotropic Filtering and no Anti-Aliasing. The game defaults to 0x AA and 4x AF and there are no in-game adjustments for either setting. The game also will not support anti-aliasing in DX10 mode even if you attempt to force AA through the graphics driver settings. Luckily, this does not seem to negatively affect image quality and we noticed little aliasing in the game despite the lack of AA.
While Bioshock is one of the first of many Unreal 3.0 based games that will be released this year, it is certainly not the first Unreal 3.0 based game to be released on the PC. That honor belongs to Rainbow Six: Vegas which has been available on the PC since last December. While both Bioshock and Vegas are built up from the Unreal 3.0 engine, they use significantly different versions and a lot of new features have been added since Vegas was developed, most notably the addition of DirectX 10 support. Like previous Unreal engines, Unreal 3.0 is being continuously developed and improved by Epic Games and new features slowly make their way into the engine.
While it was originally designed to take advantage of shader model 3.0, Epic has since added limited DirectX 10 shader support into newer builds of the engine. However, Epic's DX10 additions have, so far, been primarily focused on performance optimizations. Bioshock's developers had different ideas on how DX10 should be implemented and they have added their own modifications to support a variety of DX10 enhancements. The new DX10 enhancements include the use of dynamic water ripples, soft edges for particles, and crisper shadow edges. These enhancements are enabled by turning on the "DX10 Detailed Surfaces" option in the video menu.
Dynamic Water Ripples
Bioshock takes place in a dilapidated underwater city on the Atlantic seabed called Rapture and, as you might expect, water is everywhere. Water plays such an important role in creating the atmosphere of Rapture that the development team included a water effect specialist whose entire job was to program and tweak the water effects in Bioshock. It's therefore not surprising that Bioshock sports some of the best water effects of any game. Everywhere you look, you can see the Atlantic trying to seep in to reclaim Rapture, and the diversity of water effects is astounding.
One of the DirectX 10 enhancements involves improved water effects, specifically better water ripples. In DX9, when something disturbs the water, ripples are created which quickly fade away. This same effect has been used in many other DX9 games and the result is relatively convincing. However, if you pay close attention to the ripples and really study them, you'll notice that the 'ripple' isn't really there; it just seems like it is thanks to clever use of 2D animated sprites. In DX10 mode, by contrast, Bioshock actually seems to render the ripple and give it a third dimension. The ripple rises slightly higher than the surrounding water, unlike in DX9 where the ripple is purely illusory. While this is a nice effect, it's very subtle and you're unlikely to notice it unless you're really paying attention.
These two screenshots illustrate the difference between ripples in DX9 and DX10. In DX9, the water surface remains completely flat and sprites are used to represent the ripple and splash of water. In DX10, the sprites are still present, but if you look closely you'll see that there is also a slight ring around the center of the splash where the water is slightly higher, representing a ripple. This effect is quite subtle and somewhat difficult to notice in-game and nearly impossible to notice in a still screenshot.
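To make the distinction concrete, a displaced ripple can be modeled as a small, decaying height field: the surface is genuinely raised and then settles back to flat, whereas a sprite-based ripple leaves the surface at height zero the whole time. The sketch below is purely our own illustration of the idea (the function name and all constants are ours, not anything from Bioshock's actual shaders):

```python
import math

def ripple_height(r, t, amplitude=0.05, wavelength=0.5,
                  speed=2.0, decay=1.5):
    """Height of a water ripple at distance r (in meters) from the
    splash point, t seconds after impact.  A flat, DX9-style sprite
    ripple would simply always return 0; here the surface is
    actually displaced and decays back toward flat over time."""
    k = 2 * math.pi / wavelength                   # spatial frequency
    envelope = amplitude * math.exp(-decay * t) / (1.0 + r)
    return envelope * math.sin(k * r - speed * t)

# Just after the splash, the surface really is displaced...
assert abs(ripple_height(0.25, 0.1)) > 0.0
# ...and it settles back to (near) flat once the ripple fades.
assert abs(ripple_height(0.25, 5.0)) < 1e-3
```

In a real renderer this displacement would be evaluated per-vertex on the water mesh; the point here is only that in DX10 mode the ripple has actual geometry, while in DX9 mode it is painted on.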
Soft Particle Edges
Soft particles is another touted DX10 feature supported by Bioshock. Soft particles refers to the way particle effects interact with geometric objects. Regular particle effects intersect with geometric objects sharply and the line of intersection can clearly be seen. This often results in particle effects appearing as if composed of several stacked 2D sprites which intersect with an object. With soft particles, the point of intersection is difficult to find and particle effects appear much more natural.
These two pairs of screenshots illustrate the advantage of soft particles. In the DX9 screenshots, you can clearly see where the particle effect intersects with the geometric object (a couch or a suitcase, depending on the image), giving the particle effect an unnatural layered 2D look. In the DX10 screenshots, the particle effects appear 3D and natural. While Bioshock supports soft particles in DX10 mode, it does not in DX9, which is a bit odd since soft particles are not exclusive to DX10.
Soft particles are much more noticeable in-game than the previously mentioned ripple effects; however, they are still rather subtle. In the heat of the action, you're unlikely to be bothered by a lack of softness in your particles.
The last and least noticeable image quality difference between DX9 and DX10 in Bioshock is the appearance of shadows. In the screenshot you can see that the shadow in DX10 has a slightly sharper edge when compared to the same shadow rendered in DX9. We didn't notice this at all in-game; had we not noticed the small paragraph about DirectX 10 enhancements in the Bioshock game manual, we probably would not have picked up on it during gameplay at all, so subtle is the effect.
Image Quality Impressions
Overall, Bioshock is a great looking game. However, the difference in image quality between DirectX 9 and DirectX 10 rendering in Bioshock is very minimal. Chances are you won't notice the difference during normal play unless you know what to look for. For the most part, Bioshock looks equally good in DX9 and DX10. When you're busy trying to survive in a dystopian society full of genetically spliced mutants, the sharpness of the shadows on the wall and how particles intersect with geometry are probably the last things on your mind.
When it comes to system requirements, Bioshock is quite a demanding game. It has the highest hardware requirements out of the five games we'll be looking at. The minimum system requirements call for nothing less than a Pentium 4 2.5GHz processor, 1GB of memory, 8GB of disk space and a DX9 compliant 128MB graphics card that supports Pixel Shader 3.0. The recommended requirements are even harsher, requiring a 7900GT-class or better video card that has 512MB of memory. While our test system is certainly well above the recommended requirements, some of our video cards are right on the edge of the recommended specs. This should give us some insight into how well the game scales with mid-range cards.
Bioshock System Requirements
For our benchmarks, all graphics settings were turned up to their highest level. Since the game doesn't support anti-aliasing in DX10 mode, we decided to exclude anti-aliasing in our DX9 tests too. We also decided to stick to the game's default behavior of using 4X anisotropic filtering. Vertical sync was manually disabled in-game as well as forced off in the graphics driver options. In Windows Vista, Bioshock's default behavior when DX10 exclusive options are disabled is to use a hybrid DX9 and DX10 rendering mode which seems to use both DX9 and DX10 code. In order to force Bioshock to render with only DX9 code like it would in Windows XP, the "-dx9" flag must be used. We inserted the flag into the Bioshock shortcut in the Windows Game Explorer for all of our DX9 benchmarks.
Bioshock does not have a built-in automated benchmark. Instead, we had to manually run through a level while recording our FPS with Fraps' benchmarking function. The benchmark numbers presented below are the average frame rates recorded by Fraps during our run. We chose the first level of the game for our testing since it presented a good mix of scripted and dynamic areas as well as in-game cutscenes. The first level also contains a good variety of areas ranging from small corridors to large rooms. The benchmark run begins when the player's bathysphere arrives in Rapture and ends when the player reaches the bathroom in the Kashmir Restaurant.
Along the path of the benchmark route are several enemies and rendered in-game cut-scenes. Since Bioshock is such a dynamic game, each run invariably ended up slightly different from the next. In light of this, each benchmark was attempted five times and the average was taken and recorded. We tried our best to perform the same exact movements during each benchmark run. Benchmark runs that resulted in strange values that did not correlate with the rest of the results were attempted a second time. This process, though painstaking, allowed us to generate relatively accurate and repeatable benchmark results.
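The averaging-and-retry process described above can be sketched as a small helper: average the per-run FPS figures and flag any run that deviates too far from the median for a second attempt. The tolerance threshold and all names below are our own illustration, not part of any actual benchmarking tool:

```python
def summarize_runs(fps_runs, outlier_tolerance=0.15):
    """Average a set of per-run average-FPS figures, flagging any run
    that deviates from the median by more than outlier_tolerance
    (as a fraction of the median) so it can be re-run."""
    runs = sorted(fps_runs)
    median = runs[len(runs) // 2]
    outliers = [f for f in fps_runs
                if abs(f - median) / median > outlier_tolerance]
    average = sum(fps_runs) / len(fps_runs)
    return average, outliers

# Five consistent runs: a clean average and nothing to redo.
avg, redo = summarize_runs([41.2, 40.8, 41.5, 40.9, 41.1])
assert not redo
# One run that stalled badly gets flagged for a second attempt.
avg, redo = summarize_runs([41.2, 40.8, 29.0, 40.9, 41.1])
assert redo == [29.0]
```

Rejecting runs against the median rather than the mean keeps a single bad run from dragging the reference point along with it, which is what makes the "strange values that did not correlate with the rest" easy to spot.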
The results we obtained from our testing gave us a heap of data which we have summarized in the tables above. We noticed NVIDIA and ATI cards each exhibited distinct behavior with the cards of each brand following a specific pattern. The three NVIDIA cards all show nearly identical performance in DX9 and DX10. While DX9 does prove to be faster most of the time, the difference is negligible. This is great news because it means NVIDIA cards are able to use and enjoy all the DX10 specific effects with essentially zero performance penalty, at least in this game.
The results of the two ATI cards tell a very different story. Both ATI cards performed significantly better in DX9 than they did in DX10. We're unsure if it's because ATI's DX10 driver support isn't up to snuff but this isn't favorable news for ATI owners wishing to take advantage of DX10 in Bioshock. The performance hit associated with using DX10 is too costly and it simply isn't worth the tiny gain in image quality. This is especially true for 2600 owners since our 2600 XT was bordering on being unplayable in DX10 mode, even at the lowest resolution.
Not too surprisingly, the GeForce 8800 GTX proved to be the overall top performer out of our five graphics cards. The Radeon 2900 XT pulls up in second place and the 8800 GTS closely follows in third. In DX9, the 2600 XT takes fourth place with the 8600 GTS stuck in last, but the two cards switch places in DX10 mode. Overall, performance in Bioshock was decent and playable, even on our two mid-range cards. We suspect that this is in part due to the lack of anti-aliasing. Video memory doesn't seem to be a critical issue with Bioshock, which is good news for lower end cards with less memory. As you'll see, not all of the games we're testing will be so forgiving.
|Call of Juarez: Image Quality & Features|
Call of Juarez is a Spaghetti Western first person shooter first released last year in Europe. Developer Techland later ported the game to the Xbox 360 in June of this year. The PC version finally made it to our side of the pond around the same time, picking up a couple of enhancements and bug fixes along the way, most notably the addition of DirectX 10 support (only available in the North American version). The game features both a story driven single player campaign and a multiplayer mode. Call of Juarez's party trick is a somewhat innovative split campaign where you alternate between playing as the protagonist and the antagonist, although things get shaken up a bit later in the game.
The game's Western theme means you'll get an opportunity to explore a setting and time largely neglected by other game developers. Luckily, Techland has built quite a stunning graphics package to really pull you into the Wild West setting. Our image quality comparison was conducted with 4X anti-aliasing and 16X anisotropic filtering. All in-game image quality settings were cranked to their maximum levels and all DX10 exclusive options were enabled for the DX10 screenshots.
Built on Techland's Chrome Engine, the North American version of Call of Juarez features excellent graphics and great use of lighting and depth of field effects. This game boasts some of the best rendered scenery we've seen to date in a game and a very impressive draw distance. In the second screenshot below, which shows a river with mountains in the distance, you can actually walk right up to the mountains, and one of the objectives actually requires you to climb to the top of one of them. There are also no loading screens mid-level, so the journey from the position seen in the screenshot to the top of the mountain is totally loading-bar free.
Call of Juarez Screenshots (DirectX 10)
Call of Juarez features, by far, the most drastic difference in image quality between its DirectX 9 and DirectX 10 modes out of any DX10 capable game currently available. The game looks so different in DirectX 10 compared to DirectX 9 that it almost seems like a completely different game. In reality, that sentiment isn't as far from the truth as you may think. In the nine months between Call of Juarez's initial European release in September of '06 and the North American release in June of '07, Techland was able to practically rebuild the game for DirectX 10.
When the DirectX 10 version of Call of Juarez is launched for the first time, a message box appears and informs you about all of the new DX10 enhancements built into this version of the game, and there are a lot of them. Some of these DX10 exclusive features can be disabled in-game using the DX10 exclusive "Enhanced Quality" option in the video settings menu. We also noticed a couple of features the pop-up message box didn't mention. Interestingly, "gun bob", where your weapons appear to slowly bob up and down, supposedly to simulate breathing, is exclusive to DX10.
Like many of the other games, DX10 mode features more noticeable use of contrast in lighting and better HDR effects. In the screenshot composition above, you can see that DX10 features greater contrast, making the light appear harsher and giving the illusion that the sun is beating down harder. The transition in contrast is also more pronounced in DX10. If you stand in a shadow in DX9, then step out into the sun, there is nearly no noticeable difference in contrast. However, in DX10, the contrast is stronger when you're standing in the sun, causing shadows to appear darker and lighted areas to appear brighter. This is more realistic, since you would notice a similar effect if you were to try the same thing on a very sunny day.
Another difference we noticed between DX9 and DX10 is that the draw distance in DX10 appeared to be even longer than what we saw in DX9. However the biggest difference in image quality actually has very little to do with DirectX 10 effects. The DirectX 10 version of the game features a completely new set of textures that are significantly more detailed than the set used for the DirectX 9 version of the game. These DX10 exclusive textures aren't just higher resolution versions of the DX9 textures either, they are actually completely different.
The first image above shows a DX9-DX10 comparison of a close-up of a mountainside. Notice that the rock texture used on the mountain is completely different and not just the same texture at different resolutions. The second image shows the same mountain textures, but from a much greater distance. From afar, the difference in the textures is much more noticeable, although some of the difference is due to a DX10 exclusive lighting effect reflecting off of the rock. These images also show another difference between DX9 and DX10: the softness of shadows. Call of Juarez's DX10 mode offers softer, more natural looking shadows.
However, not all of the textures have changed. Some of them remain the same between the two versions of the game. The image above shows one such texture. Notice that while the texture is the same, the mip-mapping used in DX10 is much improved, making the stone appear to have greater depth.
Another very noticeable difference between DX9 and DX10 in Call of Juarez is the way water is rendered. The water rendering method used in DX9 is fairly transparent and highly reflective of surrounding objects. As a result, the color of the water is dominated by the color of the river bed and objects near the river like the trees and hillside. However, in DX10, the water is less transparent and less reflective. The result is that the color of the water is far less dependent on the river bed and surrounding objects. Instead, the river mostly reflects the sky. The edge of the water in DX10 also has a smoother transition between water and land than in DX9. Neither effect is totally convincing and which one looks better comes down to personal preference.
Image Quality Impressions
Call of Juarez displays the greatest image quality difference of any DirectX 10 compatible game we have seen; however, it doesn't achieve this purely through DX10 image quality enhancements. While, from a DX9 vs. DX10 perspective, this could be considered "cheating", it is still a DX10 exclusive feature regardless of how it was achieved. However, it does raise the question of how many of the image quality improvements seen in the DX10 version of Call of Juarez could be duplicated in DX9.
Overall, we think Call of Juarez is a very good looking game in DX9 and great looking game in DX10. This is one of the few games where DX10 rendering definitely makes an appreciable difference in image quality that can be noticed during regular gameplay. While the difference is clearly present, it's hard to say if the difference is necessarily an improvement. In many cases, such as with the water effect, whether the game looks better in DX9 or DX10 comes down to personal preference. The next logical question is how much of a performance penalty, if any, will we need to endure in order to benefit from the DX10 exclusive features of Call of Juarez.
|Call of Juarez: Performance|
As we saw in our image quality examination, Call of Juarez takes the title of having the largest image quality difference between DX9 and DX10 of any currently available DX10 capable game. However, it doesn't achieve this through use of DX10 effects alone but rather a whole new set of textures and effects that are of much higher quality than those used for the DX9 mode. Some of the image quality improvements seen in the DX10 version of Call of Juarez can probably be duplicated in DX9 although it is hard to tell exactly how many.
It will be interesting to see how much performance is affected by the addition of so many new image quality enhancements to the DX10 mode of the game. Call of Juarez has the second highest system requirements out of the five games we are examining in this article. Interestingly, the game also has the lowest hard drive space requirement out of the five games, requiring only 2.4GB of free space.
Call of Juarez System Requirements
For our benchmarks, all graphics settings were turned up to their highest level. Anti-aliasing was turned on and set to 4X while anisotropic filtering was set to 16X. Vertical sync was manually disabled in-game as well as forced off in the graphics driver options. The DX9 and DX10 versions of Call of Juarez must be started using different executable flags and there is no way to switch between the two rendering modes in-game, however, some of the DX10 exclusive features can be disabled using the "Enhanced Quality" option that is available in DX10 mode.
Call of Juarez does not have a built-in automated benchmark. However, a stand-alone benchmark was released by Techland some time before the North American version of the game launched. That benchmark was designed to show off the new DX10 improvements added to the North American release, but we do not believe it presents an accurate picture of normal gameplay performance, so we will not be using it.

Instead, we decided to manually run through a level while recording our FPS with Fraps' benchmarking function. The benchmark numbers presented below are the average frame rates recorded by Fraps during our run. We chose Episode II of the game for our testing since it presents a mix of environments representative of the rest of the game, ranging from close-quarters indoor areas to huge outdoor environments. The benchmark run begins at the start of the episode, inside a small church, from which we travel outside and across the expansive level. We followed the standard path one would take to beat the level, but we also took several detours along the way to explore areas that might be missed during a normal play-through.
Along the path of the benchmark route are several enemies and a handful of short rendered in-game cut-scenes. Call of Juarez is a heavily scripted game, so each run through the level was nearly identical to the next. Despite this, for the sake of accuracy, each benchmark run was attempted five times and the average was taken and recorded. We tried our best to perform the same exact movements during each benchmark run. Benchmark runs that resulted in strange values that did not correlate with the rest of the results were attempted a second time. This process, though painstaking, allowed us to generate relatively accurate and repeatable benchmark results.
We have summarized the results we obtained from our testing in the tables above. Unlike some of the other games, with Call of Juarez the NVIDIA and ATI cards all exhibited similar behavior. All five cards performed significantly better in DX9 than they did in DX10, and we aren't all that surprised: the game's long list of DX10 enhancements really takes its toll.
Equally unsurprising, the GeForce 8800 GTX proved to be the overall top performer out of our five graphics cards. In DX9, the 8800 GTX is perfectly playable and maintains good frame rates despite the fact that all image quality settings are pushed to the max. However, in DX10, the 8800 GTX's performance drops to roughly half of its DX9 level. While the game remains playable, even at 1920x1200, the card is really feeling the stress, and this doesn't bode well for the other four cards.
The GeForce 8800 GTS comes in second while the Radeon 2900 XT takes a close third. Like the 8800 GTX, both of these cards maintain acceptable frame rates in DX9 but falter in DX10; at 1920x1200 in DX10, both exhibit noticeable frame stutter. The situation is far worse for the two mid-range cards. Both the GeForce 8600 GTS and the Radeon 2600 XT are completely unplayable in DX10 with the settings we tested. Their frame rates were so low that it was extremely difficult to complete the benchmark route, and the lowest results should be taken with a grain of salt. Make no mistake, the performance really was that bad, but because the game became so difficult to play, the margin of error grows large enough that direct comparisons between results below about 8 frames per second become meaningless. Both of the mid-range cards do a bit better in DX9 but remain difficult to play, although the 8600 GTS seemed to have the upper hand.
Remember that all of the image quality settings were maxed out in our test. We found that it was entirely possible to obtain playable frame rates on the two mid-range cards with only a small sacrifice of image quality.
Discuss This Article
|Company of Heroes: Image Quality & Features|
Company of Heroes is a WWII real-time strategy game developed by Relic Entertainment and originally released just over a year ago. Relic took the formula they had used in their highly successful Warhammer 40k series and adapted it to the much-trodden setting of World War II with great success. Company of Heroes' unique blend of excellent graphics and stellar gameplay won more than its fair share of 'Game of the Year' awards and it went on to become one of the most highly rated RTS games of all time.
Company of Heroes uses the Essence Engine, which Relic developed in-house specifically for the game, and at the time of release it featured the best graphics of any strategy title. Company of Heroes is a very impressive looking game and it supports many features and technologies previously reserved for first-person shooters. In addition to stellar graphics, the game also makes use of the Havok 3 physics engine, allowing the game to implement a more realistic physics system than previous strategy games.
Company of Heroes is the first of only two currently available real-time strategy games to implement DirectX 10 elements, although it was not always that way. Company of Heroes started out as a DirectX 9.0c title, but at the end of May, over eight months after its original release, it received DirectX 10 support in the form of a patch, bringing the game to version 1.70. A patched game automatically detects DirectX 10 compatibility and, if available, makes a couple of new options available in the video settings menu. On a DirectX 10 capable rig, the game gives the user the option of using DX10 shaders as well as increasing the Terrain Detail setting to 'Ultra'. Neither of these options is selectable on a rig that isn't DX10 capable.
Enabling these DX10 exclusive options nets the player a couple of new graphical effects. The most noticeable difference is the addition of grass in the game. In DX9, grass in Company of Heroes is represented by greenish terrain textures with the occasional bush, but in DX10, individual blades of grass can be seen where appropriate. The two images below illustrate the difference between grass in DX9 and DX10. The first image shows a zoomed-in view of a patch of grass in the game. In DX9, we see a somewhat blurry grass texture while DX10 gives us a patch of 'fuzzy' grass. You can clearly see individual blades of grass, although the effect isn't terribly convincing. However, remember that this is an RTS where you're likely to spend most of your time zoomed out to get a better view of the battlefield. The second image shows the same patch of grass from a comfortable distance similar to the distance from which you would normally view the battlefield during gameplay. From this distance and angle, the grass effect is slightly more realistic, although it's also more difficult to notice.
Another difference between DX9 and DX10 is the use of higher quality terrain textures. The difference is quite subtle and we found it difficult to notice outside of a side-by-side comparison. However, the difference is definitely there, as you can see in the second image above; it is most noticeable on the gravel road. In DX10, details in the road are better defined and sharper than in DX9.
In Company of Heroes, each level has a predefined global light source that is used for all shadow and lighting calculations. While this makes perfect sense in a day-time setting, where the global light source simply represents the sun, Company of Heroes offers several night-time levels. In a night-time setting, the global light source isn't sufficient since there is almost always more than one light source in the vicinity of any given object. When DX10 shaders are enabled, point lights gain the ability to cast shadows. The difference between how lighting is handled in DX9 and DX10 in Company of Heroes is shown in the image below. In DX9, each member of the engineer squad casts only a single shadow dictated by the global light source for that level. However, the lamp they are standing next to has no effect on lighting and shadow calculations at all. In DX10, each engineer now has a second shadow cast by the lamp.
Notice that this lighting effect isn't just limited to units; it also affects level objects. In the image above, the lamp is hanging from one side of the lamp post. In DX9, the lamp post has only one shadow, cast by the global light source, while in DX10 the lamp light casts a second shadow, so the lamp post now has two. Another effect enabled through the use of DX10 shaders is softer shadow edges. In all of the comparison images so far, the shadows in DX9 have jagged, aliased edges despite an anti-aliasing setting of 4X. In DX10, shadows have smoother edges, making them appear softer and more natural. The difference in shadow aliasing is most noticeable in the images of the grass patch, where the shadow of the flag pole is significantly smoother in DX10 than in DX9.
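Conceptually, the change is that each light now carries its own visibility (shadow) test, and a surface point simply sums the contributions that pass. The tiny sketch below is our own illustration of that idea, not Relic's actual shader code, and the intensity values are made up:

```python
def shade(lights):
    """Sum the contribution of every light whose shadow test passes.
    Each entry is (intensity, visible): `visible` is False when an
    occluder sits between that light and the surface point."""
    return sum(intensity for intensity, visible in lights if visible)

# DX9-style: only the global sun is shadow-tested; the lamp always shines.
dx9 = shade([(0.8, True), (0.5, True)])
# DX10-style: the lamp is shadow-tested too, and here it is blocked,
# so the point falls into a second, lamp-cast shadow.
dx10 = shade([(0.8, True), (0.5, False)])
```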
The last difference in image quality between DX9 and DX10 that we noticed is the addition of 'litter objects' in DX10. A litter object is any small geometric object whose only purpose is to increase image quality and realism. Litter objects in Company of Heroes usually take the form of rocks. In the image above, we can see an example of litter objects. In the DX9 image, the terrain is 'flat' while in DX10, there are several small rock-like objects on the ground. Another property of litter objects is that they are not static. Since they are geometric objects, they are interactive and can be moved. For example, an explosion could throw a rock into the air or a tank could crush it with its tracks.
Image Quality Impressions
The DirectX 10 support introduced by patch 1.70 brought several noticeable image quality enhancements to Company of Heroes, many of which add a significant amount of realism to the game. The slightly improved terrain textures are much too subtle to be noticed, however, and we didn't particularly notice the addition of point-light shadows either. We did notice the new grass effects, and we felt that litter objects did their job of fleshing out the terrain to make it appear more realistic. The DX10 enhancement we noticed most, though, was the improved shadow rendering. Something about the rough, jagged and primitive shadow edges in DirectX 9 just didn't sit well with us; they really stuck out, especially with all of the other graphical effects cranked to the max.
Despite being a strategy game, Company of Heroes has some excellent graphics. Unfortunately, since it's a strategy game, you're also less likely to notice how great everything looks. Unlike a first-person shooter, where you're smack in the thick of the action, in a strategy game you spend most of your time hovering well above the battlefield, and graphics play a much smaller role in your experience. This doesn't bode well for DirectX 10 image quality enhancements, which are quite subtle to begin with. Like the other games we have looked at so far, we rarely noticed the additional DX10 effects during gameplay.
|Company of Heroes: Performance|
Strategy games generally have low system requirements, especially compared to first-person titles from the same generation. Company of Heroes is somewhat of an exception: since it offers nearly first-person shooter class graphics, it also suffers from FPS-level system requirements. It isn't too difficult to get Company of Heroes to run at an acceptable speed since it only requires a 2.0GHz single-core processor and a DX9c compatible graphics card with 64MB of memory. However, it can be a challenge to get the game to look its best while still maintaining playable frame rates.
Company of Heroes System Requirements
For our benchmarks, all graphics settings were turned up to their highest level. Anti-aliasing was set to 4X and anisotropic filtering was set to 16X. While patch 1.70 added DirectX 10 support, it also implemented mandatory vertical sync. Starting with patch 1.70, there is no longer an option to toggle vertical sync and it is on by default. We also found that forcing vertical sync off with the display drivers is ineffective. The only way to reliably disable vertical sync for a patched version of Company of Heroes is to use the '-novsync' flag. We inserted the flag into the Company of Heroes shortcut in the Windows Game Explorer for all of our benchmarks.
In a scripted first-person shooter, the player is the primary variable that the rest of the game world revolves around. This makes it relatively easy to create a contrived in-game test that can be used as a benchmark. It's usually as simple as playing through a level several times whilst performing the exact same actions each time. In a strategy game, there are far more variables and it is much more difficult to create a consistent, repeatable test to use as a benchmark. In most strategy games, Company of Heroes included, it is nearly impossible to re-create the exact same sequence of events. Even if the player behaves exactly the same each time, the computer will often respond differently.
Luckily, Company of Heroes has a built-in automated benchmark that can be accessed from the video options menu. The benchmark strings together two in-game, fully rendered cutscenes encountered in an early level of the single-player campaign. We found the benchmark to be fairly useful for judging what performance will be like during normal gameplay, and we will be using it in this article.
The results of our benchmark show a huge disparity between our DX9 and DX10 numbers. For all five of our graphics cards, DX9 performance far exceeded DX10 performance. Our top three cards, the 8800GTX, 8800GTS and 2900XT, all performed very well in DX9 and maintained very playable frame rates. The 8600GTS and the 2600XT didn't do quite as well but remained playable in DX9 except at 1920x1200, where they both began to noticeably stutter. DX10 performance is a completely different story. While the 8800GTX and the 8800GTS maintained acceptable frame rates at all the resolutions we tested, the 2900XT had a hard time, although it remained somewhat playable. Our two mid-range cards didn't like DX10 at all and posted unplayable results at all three resolutions.
We'd like to point out that the low results posted by some of the cards aren't typical of a standard user experience. Company of Heroes can be very video memory intensive, and it's fairly easy to exceed a card's available video memory with certain setting combinations, resulting in a sharp drop in performance. Relic realized this, and in patch 1.71 they added a bar to the video options menu that indicates how much video memory is in use; if you exceed the available amount, the game prevents you from saving the chosen video settings. Due to this restriction, and since our mid-range cards only possess 256MB of video memory, we had to use version 1.70 of the game for our tests. We found that 256MB of video memory was simply too limiting, and we were not able to enable many of the game's image quality options. We'd also like to note that with all image quality settings at their highest values, even the 768MB of video memory on the 8800GTX wasn't enough in DX9, although everything just barely fit in DX10. This gives us some indication of just how memory intensive Company of Heroes can be, and it also suggests that DX10 is actually more memory efficient than DX9.
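To get a feel for why 256MB is so limiting here, consider just the render targets at our highest test setting. The numbers below are our own rough back-of-the-envelope estimate, not figures from Relic; real drivers allocate padding and additional surfaces, so treat this as a lower bound:

```python
def framebuffer_mb(width, height, msaa_samples, bytes_per_pixel=4):
    """Rough size of a multisampled color buffer plus a matching
    depth/stencil buffer (4 bytes per sample each, e.g. RGBA8 color
    and 24-bit depth + 8-bit stencil)."""
    color = width * height * msaa_samples * bytes_per_pixel
    depth = width * height * msaa_samples * 4
    return (color + depth) / (1024 * 1024)

# At 1920x1200 with 4X MSAA, the render targets alone come to
# roughly 70MB -- before a single texture has been loaded.
print(round(framebuffer_mb(1920, 1200, 4)))  # 70
```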
It seems that the DX10 image quality enhancements really hurt performance, and the cost is hard to justify. All of our results show Company of Heroes achieving less than half the average frame rate in DX10 that it manages in DX9, and the image quality improvements simply aren't noticeable enough to be worth such a steep performance hit.
Correction: We made a slight mistake in the original article. It turns out that the bar in the video options menu, added by patch 1.71, doesn't represent video memory per se; it actually represents the amount of virtual address space used by Company of Heroes. The two just happen to be essentially the same thing in Vista.
|Lost Planet: Image Quality & Features|
Lost Planet: Extreme Condition is a third-person shooter created by Capcom for the XBOX 360. Lost Planet is a fairly standard third-person console shooter featuring both on-foot and vehicle combat. The game offers a heavily scripted, story-driven single-player mode as well as a multiplayer mode. While the gameplay is not particularly original and the story takes itself a bit too seriously, Lost Planet features slick, well executed action as well as some of the best graphics on any platform. Never had we seen such blatant and frequent use of motion blur effects. Normally this would be a complaint, but the game just looks so good, you can't help but forgive it for showing off.
About five months after Lost Planet arrived on the XBOX 360, Capcom ported it to the PC. The PC release includes some exclusive content not found in the original XBOX 360 version, such as several new multiplayer maps, a movie mode that lets you view all the cutscenes back-to-back, a Resident Evil 4 style view mode, and three new characters. Possibly the most interesting -- as well as the most relevant to this article -- exclusive feature is DirectX 10 support. DirectX 10 in Lost Planet brings several new image quality enhancements like better depth of field effects, motion blur and fur shading.
Possibly because it's an XBOX 360 port, Lost Planet has the shortest list of DX10-exclusive image quality enhancements of the five games in our test. In fact, we found the PC version of Lost Planet to be a near perfect port of the original XBOX 360 version. This isn't necessarily a good thing, since very little was done to adapt the game to the unique controls and hardware challenges of the PC. Nowhere is this more apparent than in the menus, which break just about every convention for menu navigation with a keyboard. The graphics also appear to be of the same general quality as the XBOX 360 version, which isn't much of a complaint since the game looks excellent on the 360.
In total, we were only able to spot three image quality differences between DX9 and DX10 in Lost Planet. The most noticeable is the way fur is rendered. Lost Planet takes place on a bitterly cold ice planet, and just about everyone's attire has fur incorporated into it somewhere. It's also a third-person game, so you're constantly staring at your character's backside, and you're usually wearing fur-lined clothing. What that all amounts to is that you'll be seeing a lot of the fur effect, and we definitely noticed the difference between DX9 and DX10. In the two images below, you can see that the fur effect in DX9 looks spotty and prickly while the DX10 fur looks like it would be nice and soft. While this is a nice image quality improvement, we can't say it really improved our gaming experience any.
Another difference we noticed is the way shadows are rendered, or more specifically, the way shadow edges are rendered. In DX9, shadow edges are hard and unnaturally sharp and crisp. Shadows in DX9 are also aliased, regardless of the level of anti-aliasing used. In DX10, shadow edges are softer and appear slightly blurry, and we never noticed any shadow edge aliasing. However, we'd like to note that the difference is rather subtle, and you probably won't notice it unless you're looking for it and know what to look for.
The last image quality difference that we were able to notice showed up when we were comparing the screenshots we took. In DX10, contrast seems to be higher all-around, and this results in a couple of image quality advantages. The biggest advantage of higher contrast is apparent when you're outdoors, which is most of the time. Lost Planet uses a thick haze to control draw distance, and objects that are farther away tend to get lost in the haze and become difficult to see. With higher contrast, far-off objects that are still within the draw distance are more visible, making the draw distance appear longer. This can be noticed in the image below, where the mountain in the distance is hard to see in DX9 but plainly visible in DX10.
Another image quality improvement that results as a side-effect of increased contrast in DX10 is better object detail. It's constantly snowing in Lost Planet and you'll be spending a lot of time in snowy white landscapes, where higher contrast results in a slight increase in object detail. Thanks to the ever-present snowy white haze, details on objects are difficult to see unless you're standing right next to them. With increased contrast, this issue is slightly alleviated.
Lastly, the motion blur effects in DX10 are supposed to be of higher quality than in DX9, but we were unable to notice a difference. Taking screenshots to directly compare the two would also be extremely difficult, and even if we had such screenshots, they wouldn't help much: motion blur is an effect you see in motion, and unless it looks better while moving, which it doesn't appear to, the difference doesn't really matter.
Image Quality Impressions
Overall, Lost Planet boasts the smallest image quality improvement in DX10 over DX9. Whether or not DX10 is worth the trouble will be determined by the performance numbers on the next page, but we can tell you now that it certainly doesn't make much of a difference when it comes to image quality. The up-side is that Lost Planet looks every bit as good on the PC as it did on the XBOX 360 and it remains a beautiful game while in motion thanks to healthy and frequent doses of the game's excellent motion blur and depth of field effects.
|Lost Planet: Performance|
Lost Planet offers very little in the way of DirectX 10 image quality enhancements, as we saw on the previous page. Besides the odd fur effect, Lost Planet looks the same in DirectX 9 as it does in DirectX 10. We suspect that this is due to Lost Planet's console heritage. Since the game is a near exact port of the XBOX 360 version, we're not surprised that the PC version didn't receive much in the way of graphical enhancements. This would lead us to assume that performance between DX9 and DX10 in Lost Planet will also be very similar.
Lost Planet has some of the lowest system requirements of the games we are testing today, calling for only a P4 class single-core processor running at 1.5GHz, 512MB of memory and a DX9c compliant video card with 128MB of onboard video memory. This makes the game very accessible, and we'll be keeping an eye on how our mid-range cards perform.
Lost Planet System Requirements
For our benchmarks, all graphics settings were turned up to their highest level. Anti-aliasing was turned on and set to 4X while anisotropic filtering was set to 16X. Vertical sync was manually disabled in-game as well as forced off in the graphics driver options.
Lost Planet has a built-in automated benchmark that can be accessed from the game's opening menu. The benchmark has two parts, both of which take place in the game's first mission. The first part is a fly-through of the opening section of the level, starting at the spawn point and continuing until the cave entrance is reached. The benchmark then proceeds to the second part, in which the camera is fixed in the middle of the cave and slowly rotates; once two rotations have been completed, the benchmark starts over. An average frame rate is produced for each section of the benchmark. For our purposes, we have averaged the frame rates from the two sections to create the single number we use to represent each benchmark run.
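Collapsing the benchmark's two section averages into one number is simply an unweighted mean, a convention of our own since the benchmark itself reports the sections separately. As a sketch (the frame rates here are made-up):

```python
def combined_score(part1_fps, part2_fps):
    """Collapse the two section averages (the fly-through and the
    rotating cave camera) into the single figure we chart per run."""
    return (part1_fps + part2_fps) / 2

print(combined_score(48.0, 36.0))  # 42.0
```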
For the sake of accuracy, each benchmark run was attempted five times and the results were averaged. Benchmark runs that resulted in strange values that did not correlate with the rest of the results were discarded and the benchmark run was attempted a second time.
The results we obtained from our testing gave us a heap of data, which we have summarized in the tables above. Despite its relatively sparse use of DX10 image quality enhancements, Lost Planet still demonstrates a significant performance drop from DX9 to DX10. The game also wasn't as hardware friendly as its system requirements would suggest: our two mid-range cards were all but unplayable, even at 1280x1024.
The three higher-end cards all maintained playable frame rates throughout, although the Radeon 2900 XT really suffered in DirectX 10. While both 8800 cards only displayed a relatively small performance drop from DX9 to DX10, the 2900 XT's performance plummeted in DX10. The 2900 XT maintains a playable frame rate at all three resolutions in DX9, but in DX10 it had trouble running the game at a decent frame rate at anything above 1280x1024.
We found the system requirements to be quite optimistic. Several of the games we've tested so far have really punished our mid-range cards, but with only minor image quality adjustments we were able to obtain playable frame rates in them. That wasn't the case with Lost Planet. We were able to play the game with both of the mid-range cards, but only at a significant cost in image quality.
In our search for playable frame rates on the two mid-range cards, we noticed an interesting trend. When using DX10 rendering with all of the DX10-exclusive features turned off, the game sometimes performed better than it did with DX9 rendering. With all of the DX10-exclusive features disabled, the game's image quality is essentially identical to DX9. We decided to run a full set of apples-to-apples benchmarks in order to explore this phenomenon.
These three graphs represent the data we collected in our look at apples-to-apples performance in Lost Planet. With no DX10-exclusive features enabled, Lost Planet behaves completely differently than it did with all of the DX10 enhancements turned on. Interestingly, despite image quality identical to DX9, all of our cards performed much better in this mode. So much better, in fact, that all three NVIDIA cards actually ended up performing better in DX10 than in DX9 at all three resolutions. This means that by switching to DX10, NVIDIA owners can pick up a couple of "free" frames per second at no cost in image quality.
Unfortunately, this trend was not observed with the ATI cards. Both ATI cards benefit from the same sort of performance increase as the NVIDIA cards, but it isn't enough to rectify the Radeon 2900 XT's sharp performance drop in DX10, and while the Radeon 2600 XT did manage identical DX9 and DX10 performance at 1920x1200 with these settings, its frame rates at that resolution were so low that it hardly matters.
|World in Conflict: Image Quality & Features|
The last game in our look at the state of DX10 is World in Conflict. Developed by Massive Entertainment and published by Sierra, World in Conflict is an alternate history real-time tactical game set in 1989. Released for the PC and the XBOX 360 in mid-September, World in Conflict explores a "what if" scenario where the Soviet Union -- on its last legs and about to collapse -- decides to go to war in a last-ditch attempt to pull itself out of economic collapse. The game features a single-player campaign narrated by Alec Baldwin, a skirmish mode and multiplayer via Massive Entertainment's 'Massgate' system.
While World in Conflict explores an interesting plot and rather unique setting, its real claims to fame are its fast-paced tactical gameplay and stellar graphics. Company of Heroes may have been considered the best looking strategy game when it was released last year, but World in Conflict is definitely the best looking strategy game to date. WiC incorporates many advanced graphical effects and, as long as you don't zoom in all the way, it looks absolutely spectacular. Most important to us, the game also features DirectX 10 support and it incorporates a lot of DX10 features.
World in Conflict (DirectX 10)
We'd like to note that while World in Conflict appears last in this article, it was actually the first game we tested, and at the time the game had not yet been released, so all of our image quality and performance tests were conducted with the official demo. However, the demo is sufficient to give us an idea of what World in Conflict has to offer and should be an accurate representation of the final release.
World in Conflict boasts three main image quality enhancements that are exclusive to DX10 and an additional functional difference. The three primary DX10 exclusive image quality effects are the addition of soft particles, cloud shadows, and volumetric lighting. We also noticed slight differences in the way shadows are rendered. We'll get to the image quality enhancements in a moment.
On a DX10 compatible rig, World in Conflict enables enhanced dual-screen support, which is unavailable in DX9. World in Conflict has a 'Mega-map' screen that displays the entire battlefield on-screen, in real-time, in a map-like style reminiscent of a satellite image. The mega-map also displays tactical data overlaid on the terrain, such as unit types, unit positions, line-of-sight and terrain features. Commands can also be issued from the mega-map, so the game can be played from this view as well. Enhanced dual-screen support in DX10 mode allows the player to play the game on one screen while the Mega-map is exclusively displayed on the second.
World in Conflict Mega-map
Enhanced dual-screen support is a very useful and significant feature: it is the only DX10-exclusive feature that affects gameplay in any of the games we have looked at. It also gives dual-screen setups a bit more utility and may lend a dual-screen user a tangible tactical edge in the game. For really serious players who wish to compete with the best, this feature alone may be worth the transition to DX10.
Moving on to image quality differences, we'll start with soft particles. When particle effects intersect with geometry in World in Conflict's DX9 mode, the particle effect intersects in a hard line and appears to be comprised of stacked 2D planes. Just like in Bioshock, when DX10 is enabled in World in Conflict, particle effects are softer and intersection points with geometry don't result in visible lines of intersection, giving a more realistic effect overall. Additionally, soft particles in World in Conflict also exist in-game, which means that particles are affected by other in-game objects. For example, in DX9, if a helicopter flew through a plume of smoke, it would simply pass directly through the smoke effect. That same scenario plays out much more realistically in DX10, since soft particles are affected by the presence of the helicopter, resulting in the smoke being dispersed by the helicopter's passing. In the image below, you can see the difference between particle effects in DX9 and soft particles in DX10.
Soft Particles in World in Conflict
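The hard intersection lines in DX9 come from particle quads being drawn with no knowledge of how close the scene geometry behind them is. The standard soft-particle fix fades each particle fragment based on that depth gap. The sketch below is our own illustration of the general technique, not Massive's actual shader, and the fade distance is an arbitrary tuning value:

```python
def soft_particle_fade(scene_depth, particle_depth, fade_distance=0.5):
    """Classic soft-particle fade: opacity scales with the gap between
    the particle fragment and the geometry behind it, so the particle
    dissolves into the surface instead of cutting a hard line."""
    gap = scene_depth - particle_depth
    return max(0.0, min(1.0, gap / fade_distance))

# Fragment right at the geometry: fully faded out (no hard edge).
print(soft_particle_fade(10.0, 10.0))   # 0.0
# Fragment well in front of the geometry: fully opaque.
print(soft_particle_fade(10.0, 9.0))    # 1.0
```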
Another image quality enhancement available in DX10 in World in Conflict is cloud shadows. While clouds are present in both DX9 and DX10, in DX10 the clouds cast their own dynamic shadows. Because the clouds are constantly moving through the sky, the shadows they cast slowly roll across the landscape in real-time. This effect is quite detailed and very realistic: the cloud shadows are not only cast on the ground, they also roll over buildings and units. While the effect is quite subtle, it adds to the realism of the game and makes it that much more pleasant to look at.
Another image quality enhancement offered by DX10 rendering in World in Conflict is volumetric lighting. World in Conflict features the best volumetric lighting effects of any strategy game, and they easily rival the lighting effects of many first-person games. The volumetric lighting effects are quite spectacular, as you can see in the images below, and they are not limited to sunlight: although it is much harder to notice since the effect is fleeting, volumetric effects can also be found in explosions. We found volumetric lighting in World in Conflict to be quite noticeable and pleasant to look at. You can also see the effect of cloud shadows in the image below; note that the cloud shadow effect is much more pronounced in motion, as it would be in-game.
Volumetric Lighting & Cloud Shadows in World in Conflict
The last image quality difference we were able to notice is in the way shadows are rendered. When we were zoomed in enough, we noticed that, as in Bioshock, shadows in DX10 are crisper and more accurate than in DX9. In the image below, the shadow in DX9 has blurry edges while the same shadow in DX10 has sharp, crisp edges. This difference is very subtle and you're very unlikely to notice it during normal gameplay. Unlike in Bioshock, where you're right next to the shadows, in World in Conflict you're usually zoomed out, far from the ground, in order to see more of the battlefield at once.
Shadow Edges in World in Conflict
Image Quality Impressions
World in Conflict is a great looking game, and enabling DirectX 10 makes it look even more spectacular. However, it is also a very hectic and engaging game, so you're constantly busy doing something, which means you won't get many opportunities to stop and ogle the beautiful graphics. We played the demo in both DX9 and DX10, and our impression is that they looked pretty similar, even though we knew that not to be true. Despite knowing what to look for, we were simply too busy enjoying the game to notice the image quality differences between DX9 and DX10.
The most noticeable DX10 image quality enhancement is definitely volumetric lighting. Unfortunately, volumetric lighting is rarely encountered if you prefer to play the game from a traditional top-down view, where you're zoomed out and looking down at the ground. From this viewpoint, it's also more difficult to appreciate soft particles since you're zoomed out far enough not to notice how ugly particle effects can be in DX9.
However, one huge advantage of using World in Conflict with DX10 enabled is the dedicated mega-map screen with a dual-screen setup. While the necessity of having two screens makes this feature quite prohibitive, we found it to be extremely useful. The information available through the mega-map is extremely valuable and being able to see it at a glance without having to leave the main in-game view can make all the difference. Another side effect of having a dedicated mega-map screen is that you can afford to be zoomed in more on your main screen and therefore appreciate the graphics more, since you'll still be able to get quick tactical information from the mega-map.
|World in Conflict: Performance|
Considering how good the game looks, you would naturally suspect that World in Conflict would require some serious power to run, but that isn't the case. World in Conflict's system requirements are actually pretty moderate. Requiring only a 2GHz P4-class single core processor, 512MB of RAM (1GB for Vista) and a 128MB DX9c compliant video card with Shader Model 3.0 support, World in Conflict should be able to run on most gaming-oriented machines made in the last couple of years. As we saw on the previous page, World in Conflict has a couple of nice DX10 image quality enhancements. So far we've reserved final judgment on whether DX10 is worth it, but now it's time to find out if there is a performance cost associated with rendering all those soft particles, cloud shadows and volumetric lighting effects.
World in Conflict System Requirements
For our benchmarks, all graphics settings were turned up to their highest level. Anti-aliasing was turned on and set to 4X while anisotropic filtering was set to 16X. Vertical sync was manually disabled in-game as well as forced off in the graphics driver options. World in Conflict can be toggled between DX9 and DX10 rendering with an option in the video settings menu, although a game restart is required before changes take effect. Note that we used the demo version of the game for all of our tests.
Like Company of Heroes, World in Conflict has a built-in, in-game benchmark test. The test in the demo version of the game consists of a flyby over mission 3 of the single-player campaign. During the flyby, different graphical aspects of the game are demonstrated while a large battle takes place across the map. We found the results of this built-in benchmark to be a good indication of what typical in-game performance would be like.
For our tests, we ran the built-in benchmark tool five times per resolution, per video card. Benchmark runs that resulted in strange values that did not correlate with the rest of the results were attempted a second time. We then averaged the results for each resolution to obtain our final results.
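The aggregation method described above can be sketched in a few lines of Python. This is a hypothetical illustration of the approach, not the exact tool or thresholds we used; the `average_runs` name and the 15% deviation cutoff are our own assumptions for the example.

```python
from statistics import mean, median

def average_runs(fps_runs, tolerance=0.15):
    """Average a set of benchmark runs, flagging outliers.

    A run is flagged as an outlier when its FPS deviates from the
    median of all runs by more than `tolerance` (fractional). Flagged
    runs are excluded from the average; in practice they would be
    re-tested. The 15% threshold is illustrative, not a standard.
    """
    med = median(fps_runs)
    outliers = [fps for fps in fps_runs
                if abs(fps - med) / med > tolerance]
    kept = [fps for fps in fps_runs if fps not in outliers]
    return mean(kept), outliers

# Five runs at one resolution; the fifth run is suspiciously low
# and would be flagged for a re-run.
avg, flagged = average_runs([24.1, 23.8, 24.5, 24.0, 18.2])
```

Using the median as the reference point keeps a single bad run from dragging the baseline down with it, which is why a simple mean-based cutoff is less robust for small sample sizes like five runs.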
With one exception, the results of all the video cards followed the same pattern. In nearly all cases, DirectX 10 performance seriously lags behind DX9 performance, although not as much as in some of the other games we've seen. The one exception to this trend is the Radeon 2900 XT, which seemed indifferent to whether the game was running in DX9 or DX10. In fact, the 2900 XT actually performed, on average, 1 frame per second better in DX10 than in DX9. We found this behavior quite odd since the Radeon 2600 XT did not share the 2900 XT's indifference. However, this is good news for 2900 XT owners since they pay absolutely no penalty for enabling DX10 rendering.
The 8800 GTX remains the overall top performer. For DX9, the 8800 GTS comes in second with the 2900 XT in a close third, however the situation is reversed in DX10. The 8800 GTX was playable at all times and we didn't notice any slowdown, even at 1920x1200 with DX10 rendering turned on. While the 8800 GTS and the 2900 XT don't perform quite as well, often dipping below the 20FPS mark, they both remained fairly playable at all resolutions in both DX9 and DX10.
Unfortunately, our two mid-range cards didn't fare well, posting unplayable results in both DX9 and DX10 with the settings we chose. The 8600 GTS did manage to hum along at an average of 18FPS in DX9 at 1280x1024, which was borderline playable, although prone to stutters and sharp frame rate drops. We'd like to remind you again that our results were created with the game in its highest image quality mode. It is entirely possible to play World in Conflict smoothly on both of our mid-range cards when using lower video settings.
|Performance Analysis & The State of DX10|
Our tests generated a mountain of data and we've sorted through it, one game at a time. While each game tells its own story, ultimately they end up being slight variations of the same tale: the current generation of games doesn't perform very well in DirectX 10 mode. In almost every case, the games were playable in DX9 but took huge performance hits when switched to DX10 with the DX10-exclusive image quality options enabled.
Out of the five games we tested, only one, Lost Planet, displayed a definite performance improvement when using its DX10 path, and even then only when the DX10-exclusive graphics options were disabled. This shows that DX10 does have the potential to be more efficient than DX9. Unfortunately, none of the other games demonstrated this efficiency.
Our performance results also tell us two important things. First, the current generation of graphics hardware and software doesn't offer a large enough image quality improvement using a DX10 path to justify the associated framerate performance penalty. While most of the games and video cards in our testing performed quite well in DX9, they also took a huge hit after switching to DX10. Even the higher end cards were slowed considerably and had difficulty maintaining playable frame rates at higher resolutions in DX10. The second thing our results tell us is that the current generation of graphics hardware could have questionable longevity when it comes to DX10 gaming performance.
With DX10 performance in currently available games being what it is, we have to question whether or not the current batch of hardware will be able to handle upcoming games at playable frame rates in DX10-mode with reasonably high graphics settings. With over a dozen DX10 titles slated to arrive in the next six months, that could be bad news for early adopters who have already plunked down for an upgrade to DX10 in anticipation. However, it is still very likely that drivers will improve considerably and developers will get better at implementing DX10 features more efficiently into their game engines. Finally, there may even be hope that Microsoft will further improve Vista's performance as it relates to graphics and gaming, but the current situation isn't great.
The introduction of DirectX 10 will, without a doubt, have a huge impact on PC gaming. DX10 is in many ways a rebirth of DirectX rather than just an evolution of DX9. DirectX 10, as a completely new API, will free developers from the limitations that have been carried forward from one version of DirectX to another over the last decade. Since it's been rebuilt from the ground up, DX10 offers revolutionary features, like geometry shaders, and shows a lot of promise. However, retro-fitting DirectX 10 support into an existing game engine doesn't result in an instant image quality increase and better efficiency. PC game developers, who have been accustomed to the old DirectX for years, will need to adjust to the new API, and that transition hasn't been without its growing pains.
Another issue is the need for backwards compatibility with DX9, for both hardware and software. It will take some time for DX10 hardware to become the norm, and until then developers will be unwilling to alienate the section of the market that still uses DX9 hardware by releasing a DX10-exclusive game. This forces a compromise between DX9 and DX10 when it comes to image quality and performance optimizations. Currently the logical choice is to lean towards DX9, since much of the hardware out there today is still DX9 from the previous generation and current generation hardware is backwards compatible in DX9 mode. For these reasons, it's unlikely we'll see a game that lists DirectX 10 as a minimum requirement, at least on the near-term horizon.
Are We There Yet?
The DX10 exclusive effects available in the five games we looked at were usually too subtle to be noticed in the middle of heated gameplay. The only exception is Call of Juarez, which boasts greatly improved graphics in DX10. Unfortunately these image quality improvements can't entirely be attributed to DX10 since the North American version of the game -- the only version that supports DX10 -- had the benefit of a full nine months of extra development time. And much of the image quality improvements in Call of Juarez when using DX10 rendering were due to significantly improved textures rather than better rendering effects.
Our test results also suggest that currently available DX10 hardware struggles with today's DX10 enhanced gaming titles. While high-end hardware has enough power to grind out playable frame rates in DX10, mid-range hardware simply can't afford the performance hit of DX10. With currently available DX10 hardware and games, you have two choices if you want to play games at a decent frame rate: play the game in DX9 and miss out on a handful of DX10-exclusive image quality enhancements, or play the game in DX10 but be forced to lower image quality settings to offset the performance hit. In the end, it's practically the same result either way.
While the new DX10 image quality enhancements are nice, when we finally pulled our noses off the monitor, sat back and considered the overall gameplay experience, DirectX 10 enhancements just didn't amount to enough of an image quality improvement to justify the associated performance hit. However, we aren't saying you should avoid DX10 hardware or wait to upgrade. On the contrary, the current generation of graphics cards from both ATI and NVIDIA offer many tangible improvements over the previous generation, especially in the high-end of the product lines. With the possible exception of some mid-range offerings, which actually perform below last generation’s similarly priced cards, the current generation of graphics hardware has a nice leg-up in performance and features that is worth the upgrade. But if your only reason for upgrading is to get hardware support for DX10, then you might want to hold out for as long as possible to see how things play out.