
The State of DirectX 11 - Image Quality & Performance

Touted as one of the biggest milestones in Windows games development in recent years, DirectX 10 ultimately didn't live up to the hype as far as end users were concerned. A large part of the disappointment was perhaps due to a misunderstanding of what DirectX 10 was supposed to be. DirectX 10 was a fundamental refresh of the entire API, a much-needed reboot of an aging programming model that removed a lot of the legacy clutter left over from more than a decade of previous DirectX development. DirectX 10 was never really about end users at all; it was about developers.

It also didn't help that DirectX 10 was tied to Windows Vista, which was unpopular with gamers from launch. On top of that, the revamped API model required developers to adjust how they programmed games, so DirectX 10 saw sluggish developer adoption. Considering the lukewarm reception DirectX 10 received a few short years ago, it isn't too surprising that many gamers are now approaching DirectX 11 with some caution.



In terms of technology, DirectX 11 isn't the API-defining, behind-the-scenes reboot that DirectX 10 was. It's an incremental update, much in the same way that Windows 7 is an update of Vista. At the end of the day, DirectX 11 is basically just DirectX 10 with a bunch of new features stuck on. However, we'd speculate that DirectX 11 should have a much larger impact on the end-user experience than DirectX 10 did, because it brings several new features to the table that make it much easier to create advanced graphical effects seldom seen in past games.

Some critics have said that Windows 7 is everything that Vista was supposed to be. Is DirectX 11 everything DirectX 10 was hyped up to be?  We couldn't let such a loaded question go unanswered, even if we did pose it ourselves. Now that DirectX 11 hardware is readily available and compatible games are on the shelves, it's high time we took stock of what DirectX 11 has to offer to the end-user. On the following pages, we're going to take a long, hard look into the state of DirectX 11 and what it means to the community as a whole.


>> the OpenGL rendering path experienced buggy behavior with frequent artifacting, which may have been fixed in Heaven v2.0.


So, I would recommend trying again with the current version. After all, the best feature of the Unigine engine is that the OpenGL support makes it cross-platform (it will run on the PS3 and Linux). It might not hurt to look at it on a newer nVidia card too, as I suspect any artifacts are just going to be due to ATI's drivers.


Very good, informative article; I love these kinds of reviews. Well, it seems my 8800 GTX is not that long in the tooth after all. I was really feeling left out as ATI released the "5" series, most notably the 5870, but this is good news as I am in no position to upgrade any time soon. I guess I will keep running my 8800 until a game brings it to its knees. For the record, I am running 1920 x 1200, and Bad Company 2 looks good fully maxed.

 

Thanks again for a great review, and clarification.


Great article. As an owner of a high-end DX10 rig, my concerns centre on how much of all this new lovin' will be available under DX10. I also worry that additional object complexity will ONLY be delivered via tessellation, so the year-on-year increase in underlying mesh complexity that we can all benefit from will be superseded by a tessellation approach. Worse still, developers might not even bother to implement the standard bump-mapping approach to deliver "simulated detail"... why would they, when they can deliver that same detail via tessellation? Are DX10 owners condemned to console-port quality or worse from now on?

Although the traditional rapid development in graphics technology has dated high-end tech after just a year, the technology ceiling imposed by a console-focused market has meant that high-end DX10 hardware has been, and will continue to be, viable for quite some time... unless, that is, DX11 makes it useless because developers decide to deliver object detail purely via tessellation. Maybe the fact that the consoles cannot deliver the same tessellation will, ironically, save the day in this respect and ensure that developers will not abandon the conventional route to delivering additional detail, i.e. base-mesh improvements and bump-mapping. But the wonders of tessellation are a double-edged sword in that they may deny non-DX11 owners the traditional path of increasing (underlying) mesh complexity, even though their rigs may be capable of handling it.


>> Are DX10 owners condemned to console-port quality or worse from now on? ... Maybe the fact that the consoles cannot deliver the same tessellation will, ironically, save the day in this respect and ensure that developers will not abandon the conventional route to delivering additional detail.

That's amusing, considering that the vast majority of games these days start life on consoles and come to the PC as ports. Your comments are doubly amusing since the XBox 360's tessellator (developed by ATI and part of every Radeon 2K, 3K, and 4K part) has been in use since the XBox 360 shipped. From what I've read, tessellation also suits the PS3 well; it fits nicely into the sort of calculations Cell is a monster at.

Realistically, however, I think you've got exactly nothing to worry about. Three and a half years after Vista, a lot of the market is still running XP. That's strike 1 for this "DX11 takes over" theory. Strike 2 is the fact that most games *are* console ports and all current consoles are limited to DX9. If anything, I'd expect DX11 to ramp more slowly than DX9 for that reason alone. Yes, game developers can (and will) take advantage of DX11, but they aren't going to put the same amount of work into it as they would if the PS3 and XBox 360 were both DX11 hardware.

Strike 3 is the fact that tessellation is one of the few truly unique features of DX11. Go look at some screenshots from Metro 2033 that show the effect on vs. off. Look at the effect on frame rate. You've got nothing to worry about.


I guess you're right. I suppose tessellation won't become imperative for at least a year... especially with consoles placing a ceiling on development quality. I hope so anyway, because I have no plans to upgrade for at least another year. It will be interesting to see if the consoles hold back the uptake of tessellation on the PC, or try to implement some cut-down version themselves.

My concern about tessellation making developers lazy comes from Unigine Heaven. That is far plainer without tessellation than it needs to be. I hope that this is just to show off tessellation, and doesn't represent the future.


I find myself in agreement with Joel on this one.

From everything I read, the main benefits of DX11 are actually for the developer, but no developer of any major title is going to write solely for it while card support is not yet ubiquitous.

We still haven't seen any real proof that tessellation gives a great deal of visual improvement *while* running at reasonable framerates in a practical game (i.e. not a demo that applies it to just one or two objects).  So I'll wait for the next generation of cards.

And my final pet peeve, which the regulars here have heard much too often: tessellation is not a feature of DirectX 11, it's a feature of the cards. Games using OpenGL on Windows can use tessellation too (and unfortunately it's just as slow). Tessellation, albeit not with the same pipeline, was actually available as an OpenGL extension first, so please don't give MS too many kudos for this "innovation" that is basically a refined API the vendors had to do all the heavy lifting to implement.
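To make that concrete, here's a rough C++ sketch of my own (not from the article, and the helper name SupportsHardwareTessellation is made up) using the stock D3D11CreateDevice call. The same D3D11 runtime will happily create a device on DX10-class cards; it's only hardware reporting feature level 11_0 that exposes the hull/domain shader stages used for tessellation:

// Minimal sketch, assuming the standard D3D11 SDK headers and an MSVC build.
#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

bool SupportsHardwareTessellation()
{
    // Ask for the highest feature level first; the runtime hands back the
    // best level the installed GPU and driver can actually provide.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,   // hull/domain shaders (tessellation) live here
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };

    ID3D11Device*        device   = nullptr;
    ID3D11DeviceContext* context  = nullptr;
    D3D_FEATURE_LEVEL    achieved = D3D_FEATURE_LEVEL_10_0;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                                   // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,                                // no software rasterizer, no flags
        requested,
        sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &achieved, &context);

    if (FAILED(hr))
        return false;                              // no usable adapter / runtime

    context->Release();
    device->Release();

    // Only feature-level 11_0 parts expose the tessellation stages,
    // even though the same API runs fine on DX10-class hardware.
    return achieved >= D3D_FEATURE_LEVEL_11_0;
}

On the OpenGL side the same question is answered by checking the driver's extension string, which is exactly the point: the capability belongs to the card and its driver, not to the API.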


That's just the thing, 3vi1.

Tessellation is just a neat little addition, like adding a light to make an image look neat. When we're actually working to create the games, it's not as if we use it within our viewports to see what the final in-game image is going to be. I tried DX11 with Max, and after a few frustrating days I said FT and switched back to OpenGL, and everything is very smooth now. Developers still have to use OpenGL until it comes to putting assets into the engine; that's when things like DX11 or tessellation come into play. They do help with the final renders for video as well, but I don't really notice a difference when using something like VRay. Most of that stuff comes down to your processing capabilities: if you have a home PC it can take days for an HD image, and if you have the Death Star it could take seconds.

I look at DX11 as just another update in the progression of game engines on MS OSes. By this time next year I am sure we will be arguing about the next generation of this stuff.


Mmmm. We might be arguing over the hardware. I'd be downright surprised if we're arguing over DirectX 12 in a year. It wouldn't surprise me if we don't see DX12 until the next generation of consoles and/or Windows 8.


Buggy behavior is probably the best way to describe the performance of DirectX 11 within DCC apps.

Within the developer environment, DX11 is having a hard time operating within programs. I had to switch back to OpenGL in order to get things running smoothly. On the game side I guess that is really not an issue, but since we must still rely on OpenGL to generate the content, it is hard to gauge the performance of DX11 before the assets are ported to in-game platforms. Rendering with AO on within DX11 shows quite a few artifacts that just don't show up in OpenGL on the 8800.

On the gaming side it is much smoother. This is in part because they have many programmers who utilize DX11 within the final product, which makes the final result very nice! I am sure most people won't notice much of a difference from DX10 unless they are using a TV that is larger than 40" and is full 1080p HD. I have noticed that my 22" monitor shows no real image-quality improvement when working within DX11 at 1080p, yet when I view the same images on the Cintiq with true color correction at a higher resolution, the quality is impressive!

I have tried to push the envelope with tessellation within the programs, yet this works best when you already have the maps calculated for the final render within a game engine. Trying to work with high tessellation is still not a viable reality: you still have to work with low-poly objects, then calculate maps, then render or port. When all is working properly you can really see a noticeable difference in the Z-depth of the final images. Much like the AvP images, you can notice that the backgrounds seem a little darker in the DX11 image on the same monitor, yet when you combine it with an HD monitor/GPU it gives you a greater depth of field and really tends to look like a live-action set within a hazy atmosphere.

Tessellation is just another nice little perk that makes gameplay more realistic. There really is no more concern here than there was for the multiple render passes introduced a few years back; both must be utilized in the engine's processing environment. People aren't really going to notice it unless something stops working. Until then they will just enjoy it :)


I guess that support for tessellation will be a moot point in a couple of years when most people have capable hardware. However, as you guys obviously have developer insight, maybe you can address my principal question directly, that is...

Will the underlying mesh detail (exposed within the game, not the development process) continue to increase as it has done over the last 14 years, or will it stagnate because tessellation + displacement mapping is used exclusively to deliver the fine detail? Also, will developers drop the "simulated detail" techniques, such as normal mapping, for the same reason? In the long term (say two years), will developers continue to deliver both methods and allow game players to choose the underlying mesh detail combined with normal mapping if they so wish?

You are presenting tessellation as "just another nicety" that we can take or leave. But surely it is more than that. Surely it is a technique that the entire industry could become completely tied to. A lock-in. Maybe not while this generation of consoles is calling the tune, but certainly moving forward.

As I say, this is not so important provided that most people have capable hardware by the time any lock-in happens, but it would be nice to hear the thinking on tessellation adoption moving forward. Just another effect on top, or a paradigm shift?
