
The State of DirectX 11 - Image Quality & Performance

As previously mentioned, we intend to explore the state of DirectX 11 in this article. That's a tall order, and one we cannot hope to fulfill unless we set some boundaries and define a specific scope for our examination. It would be nearly impossible, and rather frivolous, to cram an evaluation of DirectX 11 from the perspective of everyone affected by its introduction -- game developers, hardware manufacturers and end consumers alike -- into a single article.

To keep the article at a manageable scope, we are only going to explore the current state of DirectX 11 from the perspective of the end consumer. Anything that is or should be transparent to the consumer, such as API optimizations that let developers program more efficiently, will not be covered. Note that this is not necessarily the same as exploring the current end-user experience; we want to examine the potential of currently available DirectX 11 software. We will focus our attention on two fronts: performance and image quality. Specifically, we are interested in the performance differences between DirectX 9.0c and DirectX 11, and in the image quality enhancements of DirectX 11, if any.

HotHardware's Test System
System Specifications, Video Cards Used & Games Tested

AMD Phenom II X4 955
(3.20GHz Quad-Core)

AMD 785G Chipset Motherboard

2GB DDR3 1333MHz CL7

On-Board 10/100/1000 Ethernet
On-Board Audio

WD "Raptor" 150GB Hard Drive
10,000 RPM SATA

Windows Vista Ultimate
Catalyst 10.3
DirectX 9, 10, 11

Video Cards:

Diamond Radeon HD 5770 1024MB
Diamond Radeon HD 5870 1024MB

Aliens vs Predator
Battlefield: Bad Company 2
Dirt 2
Unigine Heaven Benchmark

Performance Tests:
For our performance tests, we used a single system configuration (see above). Our goal is to compare the performance of DirectX 11 rendering paths in currently available games with other rendering paths, DirectX 10 and especially DirectX 9. The April 2010
Steam hardware survey reveals that Windows XP 32-bit is still the most popular OS, used by 36.79% of the Steam users who participated in the survey, although Windows 7 64-bit is quickly catching up at 26.39%, a 2.1-point gain over the previous survey. It would seem that many users are still limited to DirectX 9 rendering paths, even though most of them have DirectX 10 or 11 hardware. We therefore focused on the comparison between DirectX 9 and 11 for most of our testing, though we also tested DirectX 10 and will report any interesting results there as well.

Since the focus of the test is on contrasting the different DirectX rendering paths, very little attention will be given to the hardware running the tests; this is not a hardware-focused article. Our only goal in hardware selection was to choose a respectable gaming system of the kind many gamers upgrading to DirectX 11 will likely have. We are interested in how these games and DirectX rendering paths perform on a real-world mid-range system.

While our goal was to use a single consistent setup for all tests, we ultimately chose to test two different video cards, since there seem to be two separate price 'sweet spots'. The first appears at around $150, represented here by the Diamond Radeon 5770. Video cards at this price point are currently an excellent value and hard to beat without shelling out significantly more, making it our value sweet spot. The other sweet spot is at the high end, around $400, represented here by the Diamond Radeon 5870. This is the edge of single-card performance; a more powerful setup requires multiple cards, which significantly bumps up the price, making this the high-end sweet spot. We expect both of these price points and video cards to be very popular and chose them to represent our hypothetical "average" PC gamer.

It's worth noting that this article was conceived before NVIDIA's DirectX 11 products were available and they are not represented here for that reason alone. You may also notice that the test system is a full AMD platform, which is largely due to convenience at the time of article conception. We'd like to remind everyone that this article is about software, and the hardware chosen is purely for the purposes of representing popular price points in order to illustrate a hypothetical "average" gamer. This article makes no further comment about hardware.

Image Quality Tests: For each of the games in our test, we compared the image quality of the game's DirectX 9 rendering path with its DirectX 11 rendering path. All image quality tests were performed with the Diamond Radeon 5870 and the same system configuration used in the performance tests. The video settings for each game were set to their highest values for all of the image quality tests.

There are cases where a game will not offer certain features, like anti-aliasing or screen-space ambient occlusion (SSAO), in its DirectX 9 rendering path. This means that with both the DirectX 9 and 11 rendering paths configured to their maximum available settings, the DirectX 11 path may be doing significantly more work, losing some performance but gaining image quality. We view this as a necessary part of comparing the different rendering paths and made no attempt to perform a direct setting-for-setting comparison.

Image quality was judged both in-game and with the aid of a large series of screenshots. Screenshots of each game were taken using the FRAPS utility in both DirectX 9 and 11 rendering modes. Studious effort went into ensuring that screenshots were taken in matching sets, so that each DirectX 9 screenshot has a DirectX 11 counterpart that is the best recreation of the DirectX 9 shot we could manage. Several dozen sets of matching screenshots were produced for each game, and only the best four per game are included in this article. The screenshots were judged on how well they illustrate the differences between DirectX 9 and 11.

The screenshots were taken at 1920x1200 resolution and saved by FRAPS as uncompressed bitmaps. The screenshot sets chosen for inclusion in the article were cropped and resized in Adobe Photoshop and saved as high-quality JPEGs; no other alterations were made to the images. While JPEG does introduce a certain amount of compression artifacts, we felt they did not interfere with illustrating the image quality differences. After all, if a difference is so subtle that simple JPEG compression artifacts could prevent it from being noticed in a direct side-by-side comparison, how are you supposed to notice it in-game?


>> the OpenGL rendering path experienced buggy behavior with frequent artifacting, which may have been fixed in Heaven v2.0.

So, I would recommend trying again with the current version. After all, the best feature of the Unigine engine is that its OpenGL support makes it cross-platform (it will run on the PS3 and Linux). It might not hurt to look at it on a newer NVIDIA card too, as I suspect any artifacts are just going to be due to ATI's drivers.


Very good, informative article; I love these kinds of reviews. Well, it seems my 8800 GTX is not that long in the tooth after all. I was really feeling left out as ATI released the "5" series, most notably the 5870, but this is good news as I am in no position to upgrade any time soon. I guess I will keep running my 8800 until a game brings it to its knees. For the record, I am running 1920x1200, and Bad Company 2 looks good fully maxed.


Thanks again for a great review, and clarification.


Great article. As an owner of a high-end DX10 rig, my concerns centre on how much of all this new lovin' will be available under DX10. I also have concerns that additional object complexity will ONLY be delivered via tessellation, so the year-on-year steady increase in underlying mesh complexity that we can all benefit from will be superseded by a tessellation approach. Worse still, that developers won't even bother to implement the standard bump-mapping approach to deliver "simulated detail"... why would they, when they can deliver that same detail via tessellation? Are DX10 owners condemned to console-port quality or worse from now on?

Although the traditional rapid development in graphics technology has dated high-end tech after just a year, the technology ceiling imposed by a console-focused market has meant that high-end DX10 hardware has been, and will continue to be, viable for quite some time... unless, that is, DX11 makes it useless because developers decide to deliver object detail purely via tessellation. Maybe the fact that the consoles cannot deliver the same tessellation will ironically save the day in this respect, and ensure that developers will not abandon the conventional route to delivering additional detail; i.e. via base-mesh improvements and bump-mapping.

But the wonders of tessellation are a double-edged sword, in that they may deny non-DX11 owners the traditional path of increasing (underlying) mesh complexity, even though their rigs may be capable of handling it.


Are DX10 owners condemned to console-port quality or worse from now on?...Maybe the fact that the consoles cannot deliver the same tessellation, will ironically save the day in this respect, and ensure that developers will not abandon the conventional route to delivering additional detail;

That's amusing, considering that the vast majority of games these days start life as console ports and come to the PC. Your comments are doubly amusing since the XBox 360's tessellator (developed by ATI and part of every Radeon 2K, 3K, and 4K part) has been used since the XBox 360 shipped. From what I've read, tessellation also suits the PS3 well; it fits nicely into the sort of calculations Cell is a monster at.

Realistically, however, I think you've exactly nothing to worry about. 3.5 years after Vista, there's still a lot of the market running XP. That's strike 1 for this "DX11 takes over" theory. Strike 2 is the fact that most games *are* console ports and all current consoles are limited to DX9. If anything, I'd expect DX11 to ramp more slowly than DX9 for that reason alone. Yes, game developers can (and will) take advantage of DX11, but they aren't going to put the same amount of work into it as they would if the PS3 and XBox 360 were all DX11 hardware. 

Strike 3 is the fact that tessellation is one of the few truly unique features of DX11. Go look at some screenshots from Metro 2033 that show the effect on vs. off. Look at the effect on frame rate. You've got nothing to worry about.


I guess you're right. I suppose tessellation won't become imperative for at least a year... especially with consoles placing a ceiling on development quality. I hope so anyway, because I have no plans to upgrade for at least another year. It will be interesting to see if the consoles hold back the uptake of tessellation on the PC, or try to implement some cut-down version themselves.

My concern about tessellation making developers lazy comes from Unigine Heaven. That is far plainer without tessellation than it needs to be. I hope that this is just to show off tessellation, and doesn't represent the future.


I find myself in agreement with Joel on this one.

From everything I read, the main benefits of DX11 are actually for the developer, but no developer of any major title is going to be writing for it solely since the card support for it is not yet ubiquitous.

We still haven't seen any real proof that tessellation gives a great deal of visual improvement *while* running at reasonable framerates in a practical game (i.e. not a demo that applies it to just one or two objects).  So I'll wait for the next generation of cards.

And my final pet peeve, which the regulars here have heard much too often: tessellation is not a feature of DirectX 11, it's a feature of the cards. Games using OpenGL on Windows can use tessellation too (and unfortunately it's just as slow). Tessellation, albeit not with the same pipeline, was actually available as an OpenGL extension first, so please don't give MS too many kudos for this "innovation," which is basically a refined API that the vendors had to do all the heavy lifting to implement.
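Whichever API exposes it, the hardware tessellator is driven by the same kind of per-patch level-of-detail decision: a D3D11 hull shader or an OpenGL tessellation control shader picks subdivision factors, typically based on distance from the camera. The sketch below is plain Python, not shader code, and its constants are hypothetical; it is only meant to illustrate the heuristic.

```python
# Illustrative sketch (not real shader code) of the distance-based LOD
# heuristic a D3D11 hull shader or OpenGL tessellation control shader
# typically computes per patch edge. The falloff constant is hypothetical.
def edge_tess_factor(edge_midpoint, camera_pos, max_factor=64.0, falloff=20.0):
    """Return a subdivision factor: high near the camera, tapering to 1 far away."""
    dist = sum((e - c) ** 2 for e, c in zip(edge_midpoint, camera_pos)) ** 0.5
    # D3D11 hardware clamps tessellation factors to the range [1, 64].
    return max(1.0, min(max_factor, max_factor * falloff / (falloff + dist)))

near = edge_tess_factor((0.0, 0.0, 5.0), (0.0, 0.0, 0.0))    # close edge: heavily subdivided
far = edge_tess_factor((0.0, 0.0, 500.0), (0.0, 0.0, 0.0))   # distant edge: barely subdivided
```

The point of the heuristic is exactly the one raised above: the detail is generated adaptively at draw time by the card, regardless of whether DirectX or OpenGL is the API in front of it.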


That's just the thing 3vi1,
Tessellation is just a neat little addition, like adding a light to make an image look nicer. When actually working to create games, it's not as if we use it in our viewports to see what the final in-game image will look like. I tried DX11 with Max, and after a few frustrating days I said FT and switched back to OpenGL, and everything is very smooth now. Developers still have to use OpenGL until it comes time to put assets into the engine; that's when things like DX11 and tessellation come into play. They do help with final renders for video as well, but I don't really notice a difference when using something like V-Ray. Most of that then comes down to your processing capabilities: on a home PC an HD image can take days; on the Death Star it could take seconds.
I look at DX11 as just another update in the progression of game engines on Microsoft OSes. By this time next year I am sure we will be arguing about the next generation of this stuff.


Mmmm. We might be arguing over the hardware. I'd be downright surprised if we're arguing over DirectX 12 in a year. It wouldn't surprise me if we don't see DX12 until the next generation of consoles and/or Windows 8.


Buggy behavior is probably the best way to describe the performance of DirectX 11 within DCC apps.

Within the developer environment, DX11 is having a hard time operating within programs. I had to switch back to OpenGL in order to get things running smoothly. On the game side, I guess that is really not an issue, but since we must still rely on OpenGL to generate the content, it is hard to gauge the performance of DX11 before it is ported to in-game platforms. Rendering with AO on within DX11 shows quite a few artifacts that just don't show up in OpenGL on the 8800.

On the gaming side it is much smoother, in part because many programmers utilize DX11 within the final product, which makes the final result very nice! I am sure most people won't notice much of a difference from DX10 unless they are using a TV larger than 40 inches at full 1080p HD. I have noticed that my 22" monitor shows no real image quality improvement when working within DX11 at 1080p, yet when I view the same images on the Cintiq with true color correction at a higher resolution, the quality is impressive!

I have tried to push the envelope with tessellation within the programs, yet it only works best when you already have the maps calculated for the final render within a game engine. Trying to work with high tessellation is still not a viable reality: you still have to work with low-poly objects, then calculate maps, then render or port. When all is working properly you can really see a noticeable difference in the Z-depth of the final images. Much like the AvP images, you can notice that the backgrounds seem a little darker in the DX11 image on the same monitor, yet when you combine it with an HD monitor/GPU it gives you a greater depth of field and really tends to look like a live-action set within a hazy atmosphere.

Tessellation is just another nice little perk that makes gameplay more realistic. There is really no more cause for concern than there was over multiple render passes, introduced a few years back; both must be utilized in the engine's processing environment. People aren't really going to notice it unless something stops working; until then they will just enjoy it :)


I guess that support for tessellation will be a moot point in a couple of years when most people have capable hardware. However, as you guys obviously have developer insight, maybe you can address my principal question directly, that is...

Will the underlying mesh detail (exposed within the game, not the development process) continue to increase as it has done over the last 14 years, or will it stagnate, because tessellation+displacement mapping is used exclusively to deliver the fine detail. Also, will developers drop the "simulated detail" techniques, such as normal mapping, for the same reason? In the long term (say 2 years) will developers continue to deliver both methods and allow game players to choose the underlying mesh detail combined with normal mapping if they so wish?

You are presenting tessellation as "just another nicety" that we can take or leave. But surely it is more than that. Surely it is a technique that the entire industry could become completely tied to. A lock-in. Maybe not while this generation of consoles is calling the tune, but certainly moving forward.

As I say, this is not so important provided that most people have capable hardware by the time any lock-in happens, but it would be nice to hear the thinking on tessellation adoption moving forward. Just another effect on top, or a paradigm shift?
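To make the question concrete, the two competing routes to detail can be sketched in a few lines. Below is a minimal pure-Python illustration (not engine code, and the height function is hypothetical) of the tessellation-plus-displacement idea: each subdivision level splits a triangle into four, and the new vertices are then pushed out by a height field standing in for a displacement map. The alternative discussed above, a denser base mesh with normal mapping, bakes this detail into the asset instead of generating it on the card.

```python
# Hedged sketch of tessellation + displacement: uniform midpoint subdivision
# (each level turns one triangle into four), then per-vertex displacement by
# a height field playing the role of a displacement map.
def midpoint(a, b):
    return tuple((a[i] + b[i]) / 2.0 for i in range(3))

def subdivide(tri, levels):
    """One triangle -> 4**levels triangles via edge-midpoint splitting."""
    if levels == 0:
        return [tri]
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    out = []
    for t in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):
        out.extend(subdivide(t, levels - 1))
    return out

def displace(tri, height):
    """Push each vertex along +Z by a height field (stand-in for a displacement map)."""
    return tuple((x, y, z + height(x, y)) for (x, y, z) in tri)

base = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
# Three levels of subdivision turn 1 triangle into 64; the hypothetical
# height function then sculpts the surface that bump-mapping would only fake.
tris = [displace(t, lambda x, y: 0.1 * x * y) for t in subdivide(base, 3)]
```

The trade-off at the heart of the lock-in worry is visible here: the 64 triangles never exist in the shipped asset, so hardware without a tessellator has nothing to fall back on unless the developer also ships the denser base mesh.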
