The State of DirectX 11 - Image Quality & Performance

In terms of technology, DirectX 11 isn't the API-defining, behind-the-scenes reboot that DirectX 10 was. It's an incremental update, much in the same way that Windows 7 is an update of Vista. At the end of the day, DirectX 11 is basically just DirectX 10 with a bunch of new features stuck on. However, we'd speculate that DirectX 11 should have a much larger impact on the end-user experience than DirectX 10 did, because it brings several new features to the table that make it much easier to create advanced graphical effects seldom seen in past games.
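
To put that incremental relationship in concrete terms, the DirectX 11 API exposes "feature levels," so the same device-creation path can land on DX10-class hardware with the DX11-only pipeline stages simply absent. A minimal sketch (illustrative only, with error handling trimmed) looks something like this:

    // Minimal sketch: creating a Direct3D 11 device that can fall back
    // to DX10-class hardware via feature levels.
    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    bool CreateDeviceWithFallback(ID3D11Device** device,
                                  ID3D11DeviceContext** context,
                                  D3D_FEATURE_LEVEL* chosen)
    {
        // Ask for DX11 first, then step down to DX10-class levels.
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0,   // tessellation, compute shader 5.0
            D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0,   // a DX10 card running the DX11 API
        };

        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            requested, ARRAYSIZE(requested), D3D11_SDK_VERSION,
            device, chosen, context);

        // Below feature level 11_0 the new stages simply aren't there,
        // so a game would select its non-tessellated path here.
        return SUCCEEDED(hr);
    }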

Some critics have said that Windows 7 is everything that Vista was supposed to be. But is DirectX 11 everything DirectX 10 was hyped up to be? We couldn't let such a loaded question go unanswered, even if we did pose it ourselves. Now that DirectX 11 hardware is readily available and compatible games are on the shelves, it's high time we took stock of what DirectX 11 has to offer to the end-user...


>> the OpenGL rendering path experienced buggy behavior with frequent artifacting, which may have been fixed in Heaven v2.0.


So, I would recommend trying again with the current version. After all, the best feature of the Unigine engine is that the OpenGL support makes it cross-platform (it will run on the PS3 and Linux). It might not hurt to look at it on a newer nVidia card too, as I suspect any artifacts are just going to be due to ATI's drivers.


Very good, informative article; I love this kind of review. Well, it seems my 8800 GTX is not that long in the tooth after all. I was really feeling left out as ATI released the "5" series, most notably the 5870. But this is good news, as I am in no position to upgrade any time soon. I guess I will keep running my 8800 until a game brings it to its knees. For the record, I am running 1920 x 1200. Bad Company 2 looks good fully maxed.

 

Thanks again for a great review, and clarification.


Great article. As an owner of a high-end DX10 rig, my concerns centre on how much of all this new lovin' will be available under DX10. I also have concerns that additional object complexity will ONLY be delivered via tessellation, so the year-on-year steady increase in underlying mesh complexity that we can all benefit from will be superseded by a tessellation approach. Worse still, developers won't even bother to implement the standard bump-mapping approach to deliver "simulated detail"... why would they, when they can deliver that same detail via tessellation? Are DX10 owners condemned to console-port quality or worse from now on?

Although the traditional rapid development in graphics technology has dated high-end tech after just a year, the technology ceiling imposed by a console-focused market has meant that high-end DX10 hardware has been, and will continue to be, viable for quite some time... unless, that is, DX11 makes it useless because developers decide to deliver object detail purely via tessellation. Maybe the fact that the consoles cannot deliver the same tessellation will ironically save the day in this respect, and ensure that developers will not abandon the conventional route to delivering additional detail, i.e. via base-mesh improvements and bump-mapping. But the wonders of tessellation are a double-edged sword in that they may deny non-DX11 owners the traditional path of increasing (underlying) mesh complexity, even though their rigs may be capable of handling it.


Are DX10 owners condemned to console-port quality or worse from now on?...Maybe the fact that the consoles cannot deliver the same tessellation, will ironically save the day in this respect, and ensure that developers will not abandon the conventional route to delivering additional detail;

That's amusing, considering that the vast majority of games these days start life as console ports and come to the PC. Your comments are doubly amusing since the XBox 360's tessellator (developed by ATI and part of every Radeon 2K, 3K, and 4K part) has been used since the XBox 360 shipped. From what I've read, tessellation also suits the PS3 well; it fits nicely into the sort of calculations Cell is a monster at.

Realistically, however, I think you've exactly nothing to worry about. 3.5 years after Vista, there's still a lot of the market running XP. That's strike 1 for this "DX11 takes over" theory. Strike 2 is the fact that most games *are* console ports and all current consoles are limited to DX9. If anything, I'd expect DX11 to ramp more slowly than DX9 for that reason alone. Yes, game developers can (and will) take advantage of DX11, but they aren't going to put the same amount of work into it as they would if the PS3 and XBox 360 were all DX11 hardware. 

Strike 3 is the fact that tessellation is one of the few truly unique features of DX11. Go look at some screenshots from Metro 2033 that show the effect on vs. off. Look at the effect on frame rate. You've got nothing to worry about.


I guess you're right. I suppose tessellation won't become imperative for at least a year... especially with consoles placing a ceiling on development quality. I hope so anyway, because I have no plans to upgrade for at least another year. It will be interesting to see if the consoles hold back the uptake of tessellation on the PC, or try to implement some cut-down version themselves.

My concern about tessellation making developers lazy comes from Unigine Heaven. That is far plainer without tessellation than it needs to be. I hope that this is just to show off tessellation, and doesn't represent the future.


I find myself in agreement with Joel on this one.

From everything I read, the main benefits of DX11 are actually for the developer, but no developer of a major title is going to write solely for it, since card support is not yet ubiquitous.

We still haven't seen any real proof that tessellation gives a great deal of visual improvement *while* running at reasonable framerates in a practical game (i.e. not a demo that applies it to just one or two objects).  So I'll wait for the next generation of cards.

And my final pet peeve, which the regulars here have heard much too often: tessellation is not a feature of DirectX 11; it's a feature of the cards. Games using OpenGL on Windows can use tessellation too (and unfortunately it's just as slow). Tessellation, albeit not the same pipeline, was actually available as an OpenGL extension first, so please don't give MS too many kudos for this "innovation," which is basically a refined API that the vendors had to do all the heavy lifting to implement.
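
To put that in concrete terms, here's a rough sketch (assuming a GL context is already current, and leaning on GLEW purely for convenience; both are my assumptions, not anything from the article) of asking the card itself whether it can tessellate, with DirectX nowhere in sight:

    // Rough sketch: querying the hardware, not the DirectX runtime,
    // for tessellation support via OpenGL.
    #include <GL/glew.h>

    bool HasHardwareTessellation()
    {
        // Core in OpenGL 4.0 and exposed as an ARB extension on
        // DX11-class parts, regardless of which API a game ships with.
        return glewIsSupported("GL_ARB_tessellation_shader") == GL_TRUE;
    }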


That's just the thing, 3vi1.
 
Tessellation is just a neat little addition, like adding a light to make an image look neat. When we're actually working to create the games, it's not as if we use it in our viewports to see what the final in-game image is going to be. I tried DX11 with Max, and after a few frustrating days I said FT and switched back to OpenGL, and everything is very smooth now. Developers still have to use OpenGL right up until it comes time to put assets into the engine; that's when things like DX11 or tessellation come into play. They do help with the final renders for video as well, but I don't really notice a difference when using something like VRay. Most of that then comes down to your processing capabilities: if you have a home PC it can take days for an HD image; if you have the Death Star it could take seconds.
 
I look at DX11 as just an update in the progression of game engines on MS OSes. By this time next year I'm sure we will be arguing about the next generation of this stuff.


Mmmm. We might be arguing over the hardware. I'd be downright surprised if we're arguing over DirectX 12 in a year. It wouldn't surprise me if we don't see DX12 until the next generation of consoles and/or Windows 8.


Buggy behavior is probably the best way to describe the performance of DirectX 11 within DCC apps.

Within the developer environment, DX11 is having a hard time operating within programs. I had to switch back to OpenGL in order to get things running smoothly. On the game side I guess that is really not an issue, but since we must still rely on OpenGL to generate the content, it is hard to gauge the performance of DX11 before it is ported to in-game platforms. Rendering with AO on within DX11 shows quite a few artifacts that just don't show up in OpenGL on the 8800.

On the gaming side it is much smoother. This is in part because they have many programmers who utilize DX11 in the final product, which makes the final result very nice! I am sure most people won't notice much of a difference from DX10 unless they are using a TV that is larger than 40" and is full 1080p HD. I have noticed that my 22" monitor shows no real noticeable image-quality improvement when working within DX11 at 1080p, yet when I view the same images on the Cintiq with true color correction at a higher resolution, the quality is impressive!

I have tried to push the envelope with tessellation within the programs, yet it only works best when you have the maps already calculated for the final render within a game engine. Trying to work with high tessellation is still not a viable reality: you still have to work with low-poly objects, then calculate maps, then render or port. When all is working properly you can really see a noticeable difference in the Z-depth of the final images. Much like the AvP images, you can notice that the backgrounds seem a little darker on the DX11 image on the same monitor, yet when you combine it with an HD monitor/GPU, it gives you a greater depth of field and really tends to look like a live-action set within a hazy atmosphere.

Tessellation is just another nice little perk that makes gameplay more realistic. There is really no more cause for concern here than there was over the multiple render pass, introduced a few years back; both must be utilized in the engine's processing environment. People aren't really going to notice it unless something stops working. Until then they will just enjoy it :)


I guess that support for tessellation will be a moot point in a couple of years when most people have capable hardware. However, as you guys obviously have developer insight, maybe you can address my principal question directly, that is...

Will the underlying mesh detail (exposed within the game, not the development process) continue to increase as it has done over the last 14 years, or will it stagnate because tessellation + displacement mapping is used exclusively to deliver the fine detail? Also, will developers drop the "simulated detail" techniques, such as normal mapping, for the same reason? In the long term (say two years), will developers continue to deliver both methods and allow game players to choose the underlying mesh detail combined with normal mapping if they so wish?

You are presenting tessellation as "just another nicety" that we can take or leave. But surely it is more than that. Surely it is a technique that the entire industry could become completely tied to. A lock-in. Maybe not while this generation of consoles is calling the tune, but certainly moving forward.

As I say, this is not so important provided that most people have capable hardware by the time any lock-in happens, but it would be nice to hear the thinking on tessellation adoption moving forward. Just another effect on top, or a paradigm shift?


Monkfish,

Check this out for Tech Report's coverage of Unigine and some of its flaws:  http://www.techreport.com/articles.x/18682/10  It directly speaks to why Unigine deserves to be considered more of a tessellation technical demo than any sort of representative 3D benchmark.

Let me address your larger point in three sections. You say, "Surely it [tessellation] is a technique that the entire industry could become completely tied to."

Section 1: We've Been Here Before:

Sure. That's true. The question isn't if, it's when. Ten years ago, when 3dfx still existed, fixed-function hardware T&L (a new DirectX 7 feature) was a brand-new technology and bump mapping in 3D games was just starting to appear. Then came DX8, the GF3, and programmable shaders. Then support for Shader Model 2.0b vs. SM3. Then DX10/10.1, etc., etc.

Each time new technologies become available, they pass through a period of gradual adoption until they become ubiquitous. This is why ATI and NV build new low-end chips every product cycle instead of recycling older generations. Eventually, all the solutions out there, from integrated to extreme, support a technology (even if such support isn't practically useful). What was exotic has become universal. If you want a CPU analogy, look at SSE/SSE2. Once upon a time, those were Intel-only technologies. Now, every developer on the planet can assume that 95%+ of the x86 processors in use incorporate them.

Still, you referred to this DX11 situation as a new case, so let me talk about that specifically:

Section 2:  Why A Synthetic 3D Benchmark Is a Lousy Gaming Test

There are several factors that combine to prevent the sort of sudden disruption you're discussing. The first thing to understand is the reason why a benchmark like Heaven isn't a good example of a 3D game benchmark. I'm strictly speaking in general terms on this one, but the points apply to Heaven:

In a synthetic benchmark, particularly a flyby benchmark like Heaven, the developer controls everything. He knows exactly which camera angles are used, where the light sources are, and which pixels need to be rendered. There's no need to waste CPU/GPU cycles on AI calculations or rendering muzzle flashes/weapons fire, because none of those events occur—or, if they do occur, they occur on schedule and affect the exact same targets the exact same way every single time.

There's no need to worry about the potential performance impact of high-fidelity 3D audio, because there isn't any (a simple musical loop or some basic sounds don't count, for the same reasons as above). There are also no off-screen, under-the-hood calculations going on. In a multiplayer game, the game engine is constantly receiving updates on your location and the location of other players, vehicles, events, etc. There's also no need to calculate a bunch of messy physics interactions that occur in different ways every time Rocket A intersects Home B.

That leads us to Section 3.

Section 3: Understanding GPU Budgets:

Because Heaven isn't doing a whole lot of other things, it can task the GPU with throwing the vast majority of its horsepower behind doing just one thing--tessellation. That's just fine if your goal is to showcase tessellation (or measure the tessellation power of different GPUs), but it limits the number of situations in which Heaven is *really* going to tell you about overall 3D performance in a real game engine.

The best way to understand 3D performance modeling is to think about it in terms of a budget. A $150 GPU, for our purposes, has $150 to spend on 3D performance.  We can widen the scope of the analogy a little--a $75 CPU + $150 GPU has a total of $225 to spend, etc. Either way, when a developer sits down with his programming team, everyone knows that the game needs to run within the budgetary constraints of as many systems as possible.

If you know the 'average' person has a $100 video card and a $50 CPU, you don't spend $90 on tessellation. Instead, you use tessellation where it makes the most sense and gives the greatest benefits. Since you know not everyone has tessellation-capable cards, you design fallback rendering modes that display the scene as closely as possible to what you want, with as minimal a performance hit as possible. In some cases, lower-end $75 cards with tessellation support may give superior visuals to midrange $125 cards that lack it. Regardless, developers design games that can fit *all* of the necessary 3D rendering capabilities into small GPU budgets rather than dedicating an inordinate amount of time to one particular effect.

(Ok, sometimes they do the latter--but the games tend to suck)
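
If it helps, here's the budget idea in toy form. Every number below is invented purely for illustration; the point is just that the per-frame costs have to fit inside the frame time, so one showcase-heavy effect squeezes everything else out:

    // Toy illustration of the budget idea -- every number is made up.
    #include <cstdio>

    int main()
    {
        const float frameBudgetMs = 1000.0f / 60.0f;   // ~16.7 ms per frame at 60 fps

        // Hypothetical per-frame costs on a hypothetical mid-range card.
        const float shadowsMs      = 3.0f;
        const float lightingMs     = 5.0f;
        const float postFxMs       = 2.5f;
        const float tessellationMs = 9.0f;             // the one showcase effect

        const float total = shadowsMs + lightingMs + postFxMs + tessellationMs;
        std::printf("%.1f ms spent of a %.1f ms budget: %s\n",
                    total, frameBudgetMs,
                    total <= frameBudgetMs ? "ships"
                                           : "over budget -- scale something back");
        return 0;
    }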

The good news is that since most games are designed for consoles these days, DX9 remains a sort-of "floor." Yes, eventually, tessellation and DX11 will become de facto standards everyone supports. Based on comments made by Sony and Microsoft, however, I think we can safely assume that next-generation consoles are three years away at *minimum.* Both companies have said relatively recently that they believe current-gen products have "years" of life left.

In 3-4 years, we'll all have DX11 cards. And in the meantime, developers will still keep investing time/energy into current mapping techniques. Remember what I said about GPU budgets--there will absolutely be cases in the future where a developer that *could* use an 'expensive' solution like displacement maps in a game area will instead opt for a cheaper normal map method when the game is running on $100-$200 cards. That means investing in ways to make the cheap stuff look good, which means all the things that were important a year ago are still important today and will matter in five years. ;) 

The whizbang effects change. The rate of change changes. The process doesn't.

 

Edit:  Ack, yes. Only referring to Heaven. I knew better than to equate the two. *bonk*

 


>> Unigine deserves to be considered more of a tessellation technical demo than any sort of representative 3D benchmark

While I agree with a lot of your logic and even your conclusion, I have to be "that guy" and disagree with you on this one underlying point, Joel:  Unigine is a real, cross-platform game engine with titles in development.  Of course, we've not seen how much practical tessellation those games will implement, so I think your argument still stands.

Update:  It occurs to me now that you probably meant for us to infer "Unigine Heaven", so I withdraw any argument.  :)


Thanks Joel. You fully addressed my concerns in section 3. Principally, that the stagnation in the industry imposed by the consoles and the current economic climate will ensure that developers aren't going to pursue an exclusively tessellated approach at the expense of other techniques any time soon.

I'd still encourage developers to expose finer mesh detail the old-fashioned way, instead of just saying "if you want these finer mesh details you'll need DX11". I know tessellation facilitates smooth LOD variation (which is possibly more significant than using it to add additional detail), but some of us own kit capable of handling higher mesh detail natively (not as high as Heaven, but as you said, that is an unrealistic implementation anyway).


Monk,

IIRC (and I'm reaching back years for this one), one of the interesting technical aspects of the original Doom 3 vs. HL2 debate was the different techniques the two engines used. HL2 took the more standard approach and relied on detailed textures and high-res models (for the time period). Doom 3 used comparatively little texturing, but focused heavily on mapping and geometry.

So long as there are different ways of reaching the same goal I think you'll find developers that want to use them. ;)


3vi1,

OpenGL under Windows? Aside from professional development and rendering tools, is anyone except Carmack still using OGL for non-browser based gaming in the Windows market?

(I am referring to shipping products, including products from small independents.)


Joel H:

3vi1,

OpenGL under Windows? Aside from professional development and rendering tools, is anyone except Carmack still using OGL for non-browser based gaming in the Windows market?

(I am referring to shipping products, including products from small independents.)

wow.exe -opengl

Many games support it, even if it's not the default rendering engine.  Professional game engines should always be abstracted from the graphics API for the purpose of porting, and any serious engine will have OpenGL support since it's the only truly cross-platform solution.
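
To make that concrete, here's a bare-bones sketch of the sort of abstraction I mean (all of the names are made up for illustration): the engine talks to an interface, and each platform plugs in its own backend.

    // Bare-bones sketch of "abstracted from the graphics API."
    struct IRenderBackend {
        virtual ~IRenderBackend() = default;
        virtual void DrawMesh(int meshId) = 0;
        virtual void Present() = 0;
    };

    struct GLBackend : IRenderBackend {      // OpenGL: Windows, Linux, OS X
        void DrawMesh(int) override { /* glDrawElements(...) */ }
        void Present()     override { /* SwapBuffers(...)    */ }
    };

    struct D3DBackend : IRenderBackend {     // Direct3D: Windows only
        void DrawMesh(int) override { /* context->DrawIndexed(...) */ }
        void Present()     override { /* swapChain->Present(...)   */ }
    };

    // The engine proper never calls GL or D3D directly.
    void RenderFrame(IRenderBackend& gpu)
    {
        gpu.DrawMesh(0);
        gpu.Present();
    }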

Of course, everything on the Windows platform today will probably default to a DirectX backend.  When Microsoft left the OpenGL architecture review board, they spread a bunch of FUD that they were removing OpenGL support from their great new upcoming OS:  Vista.  That never happened, but the propaganda served its purpose of getting a bunch of developers to scramble to DirectX.


3vi1,

WoW is developed for Mac simultaneously, which is why the -opengl flag exists. Also, last time I checked (about a year ago), the ATI WoW OGL mode was beyond terrible. Frame rates 50% of what they were in DX, horrible image quality, random bugs, etc.

When I asked, I was thinking of OGL implementations that were either the primary engine of a game or on-par with the DX flavor.


Joel H:
Also, last time I checked (about a year ago), the ATI WoW OGL mode was beyond terrible. Frame rates 50% of what they were in DX, horrible image quality, random bugs, etc.

That sounds like a system/driver problem.  I ran it under Wine on Linux a year or two ago and the framerates were perfect.  I never saw any graphical bugs and can't imagine why it would run worse natively under Windows.

Joel H:

When I asked, I was thinking of OGL implementations that were either the primary engine of a game or on-par with the DX flavor.

Primary engine?  I think you mean "default backend on the Windows platform".  You're probably not going to see more than a handful of indie games default to OpenGL on Windows today because, as I mentioned before, Microsoft got everyone to add native DirectX backends to their game engines with a campaign of deception/FUD.   As long as MS keeps the driver developers tied up with optimizations and fixes for DX over OpenGL, it's going to be the most consistent backend to use *for that one platform*.  Heck, I even contributed a fix or two to the DX backend of SDL back in the day.

But, pretty much any cross platform engine is going to be "primarily" written to OpenGL, because testing should be done on as many platforms as possible, early in the development process.  That's why the id Tech engines you mention above are primarily OpenGL with a DX backend tacked on.


3vi1,

"But, pretty much any cross platform engine is going to be "primarily" written to OpenGL, because testing should be done on as many platforms as possible, early in the development process."

Why? Let me expand on that a bit. I'm not questioning that it's good to have cross-platform technology or arguing that non-Windows platforms aren't worth consideration, but I'm curious how much work really goes into developing 3D programs meant for non-Windows OSes. Has Linux market share in netbooks risen sufficiently for there to be new interest in providing advanced 3D capabilities for these devices? ("Advanced" relative to what they can do, of course.)

Similarly, how much native OSX development goes on? IIRC, Apple only added support for OGL 3.2 in the last few months. There's no word on OGL 4 at all.

Do you think the expanding capabilities of the smartphone/smartbook/netbook segment are collectively driving an uptick in the importance and prevalence of OpenGL? Coming from the other direction, do you think Microsoft's decision to *not* publish a "Mobile DirectX" gives WebGL a significant advantage MS will eventually rue?


>> Why?

Because if you're truly interested in having a cross-platform engine, you don't want to find you've gone down a path that only performs well on devices running one manufacturer's OS. Ideally, you should support not just OpenGL but at least two backends, to make sure you're abstracting to a level that isn't suited to just a single API. Minimally, though, I would start with OpenGL, since it's available on all platforms, including Windows.

I have a crazy conspiracy theory (which I certainly can't confirm, as I've done no DX programming since DX9) that MS's API changes in DX10 and DX11, like the D3DX libraries before them, are part of a long term internal strategy to increase the difficulty of porting DX-only titles off-platform by making new developers think they shouldn't abstract their graphics backend from the game engine at all.

>> how much native OSX development goes on

Every iPhone game?  Not much on the desktop, admittedly.

>> Do you think the expanding capabilities of the smartphone/smartbook/netbook segment are collectively driving an uptick in the importance and prevalence of OpenGL?

I hadn't really thought about that, but I hope so.  I'm always going to root for the solution that's available to the most people with the least lock-in.

What I'd really like to see would be major vendors getting behind a project similar to (or being) SDL, optimizing a truly open/standardized API that has unification of the audio and input routines.
