
DirectX 11 Set to Help Game Developers to Boost Performance of New Titles

As the release of the DirectX 11 application programming interface (API) and of the supporting hardware and software approaches, more details are emerging about the benefits they bring, as well as certain peculiarities. While nobody involved in the development of next-generation games will disclose which DirectX 11 features their titles will use, educated guesses can shed some light on likely usage models of the new API.
The First Three Things to Do with DirectX 11

According to Richard Huddy, senior manager of software developer relations at ATI, the graphics business unit of Advanced Micro Devices, game developers can easily boost the performance of their next-generation titles by implementing certain DirectX 11 capabilities. In addition, by using some of the new API's features they can tangibly improve the visual appeal of their future games.

“If I was making a decision about [which DirectX 11 features to implement first], I would go for three things fairly quickly,” said Mr. Huddy.

Those things are, apparently, multi-threading support, compute shaders and tessellation.
Multi-Threading: +20% Performance

“As soon as I got hold of the DirectX 11 API from Microsoft and the Windows 7 beta, which they’ve had for a considerable number of months now, I would add multi-threading support through the use of Display Lists, because that can guarantee me a speed-up in almost any application I can think of. The typical benefit is going to be around 20%, but I expect a large variation around that; it can be north of 50%,” explained the developer relations specialist.

What is even more important, software developed with DirectX 11 Display Lists in mind will run faster on any hardware that is compatible with Windows 7, provided that appropriate graphics drivers are available.

“That would be attractive to game developers since that would not require that I have DirectX 11 hardware at hand: the use of Display Lists will give a benefit on any Windows 7 hardware, which includes DirectX 9, DirectX 10, DirectX 11 hardware, once AMD, Nvidia and others ship appropriate drivers that offer the acceleration,” said Richard Huddy.
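In Direct3D 11 terms, the Display Lists Huddy describes map onto deferred contexts and command lists: worker threads record rendering commands in parallel, and the main thread replays them on the immediate context. The sketch below is a minimal illustration of that pattern; the function name and the placeholder draw calls are hypothetical, not taken from the article.

```cpp
// Minimal sketch of D3D11 multi-threaded rendering via a deferred context and
// command list. Error handling is omitted; the draw calls are placeholders.
#include <d3d11.h>

void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediateCtx)
{
    // 1. A worker thread gets its own deferred context for recording.
    ID3D11DeviceContext* deferredCtx = nullptr;
    device->CreateDeferredContext(0, &deferredCtx);

    // 2. The worker records state changes and draw calls as usual, e.g.:
    //    deferredCtx->IASetVertexBuffers(...);
    //    deferredCtx->DrawIndexed(...);

    // 3. Close the recording into a command list (the "Display List").
    ID3D11CommandList* commandList = nullptr;
    deferredCtx->FinishCommandList(FALSE, &commandList);

    // 4. Back on the main thread, replay it on the immediate context.
    immediateCtx->ExecuteCommandList(commandList, FALSE);

    commandList->Release();
    deferredCtx->Release();
}
```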
Post Processing: +25% Performance

“Then I would go for compute shaders, for the low-hanging fruit: I would do things like post-processing, physics, AI, or maybe running particle systems. But the easiest thing to do is to take all the post-processing and bundle it all up into a compute shader; the way it handles [data] transactions is more efficient,” said the developer relations chief.

Presently, post-processing effects such as motion blur or depth of field are done using pixel shaders, which may not be very efficient in terms of performance, since pixel shaders still depend on the graphics pipeline and require data transfers, texture instructions, lots of memory reads and so on. Since compute shaders are independent of the rendering pipeline and require far fewer data transactions, texture reads, etc., implementing post-effects with them may be substantially faster.

“At the moment post-processing is done using pixel shaders and it can be quite hard to get the efficiency out of the hardware. So, we think compute shaders are going to give a significant performance win in that specific area of code. Typically we see that post-processing costs from 10% to 25% of frame time (depending on the post-processing),” explained Mr. Huddy.
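As a rough illustration of what moving post-processing to a compute shader looks like on the application side, the sketch below dispatches one thread per pixel over the frame. The shader object, the views and the 8x8 thread-group size are assumptions for illustration only and do not come from the article.

```cpp
// Minimal sketch of dispatching a post-process pass as a DX11 compute shader.
// The shader, views and 8x8 thread-group size are illustrative assumptions.
#include <d3d11.h>

void RunPostProcessCS(ID3D11DeviceContext*       ctx,
                      ID3D11ComputeShader*       postFxCS,       // compiled cs_5_0 shader
                      ID3D11ShaderResourceView*  sceneColourSRV, // rendered frame (input)
                      ID3D11UnorderedAccessView* outputUAV,      // processed frame (output)
                      UINT width, UINT height)
{
    ctx->CSSetShader(postFxCS, nullptr, 0);
    ctx->CSSetShaderResources(0, 1, &sceneColourSRV);
    ctx->CSSetUnorderedAccessViews(0, 1, &outputUAV, nullptr);

    // One thread per pixel, in 8x8 tiles (must match the shader's [numthreads]).
    ctx->Dispatch((width + 7) / 8, (height + 7) / 8, 1);

    // Unbind the UAV so the result can be read later in the frame.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, nullptr);
}
```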

It should be noted that compute shaders 4.0 and 4.1, which are supported by existing ATI Radeon and Nvidia GeForce graphics cards, have many limitations and therefore may not work as well as compute shaders 5.0 when it comes to post-processing and similar usage. As a result, Richard Huddy does not expect CS 4.x to become popular, especially keeping in mind that those who buy leading-edge hardware are more likely to acquire the latest games.

ATI has already demonstrated DirectX 11-capable hardware, so at least select game developers already have ATI Radeon HD 5000-series graphics cards on hand.
Tessellation: When Performance Meets Quality

Tessellation is not exactly a performance-boosting feature, but it can generate highly detailed objects using fewer resources than traditional techniques; as a result, game developers will either be able to offer better graphics or render existing graphics at higher speed.

“After the post-processing work, I would probably switch to tessellation. I would use […] patch approximation to smooth out jagged objects, things like pipes, which are supposed to have smooth form, use N-patches there. Or I will be even more aggressive and take something like parallax occlusion mapping, which is a rather attractive kind of trick for improving the quality of pixels within an object. I would instead extrude the geometry and load a parallax occlusion height map and then would generate much improved silhouettes using the tessellator,” said Mr. Huddy.

Obviously, to use tessellation, developers will have to rely on DirectX 11 or, at least, on hardware with tessellation support. Still, tessellation is yet another advantage of DX11.
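For context, the tessellation Huddy describes is exposed in Direct3D 11 through two new pipeline stages, the hull and domain shaders, which the application binds before drawing patches of control points. The sketch below shows only that plumbing; the actual N-patch or displacement logic would live in hull and domain shaders that are assumed to be compiled elsewhere.

```cpp
// Minimal sketch of binding the new DX11 tessellation stages for a draw call.
// The hull/domain shaders (where the N-patch or displacement logic would live)
// are assumed to be compiled elsewhere.
#include <d3d11.h>

void DrawTessellatedPatches(ID3D11DeviceContext* ctx,
                            ID3D11VertexShader*  vs,
                            ID3D11HullShader*    hs,  // chooses tessellation factors
                            ID3D11DomainShader*  ds,  // positions/displaces new vertices
                            ID3D11PixelShader*   ps,
                            UINT controlPointCount)
{
    // Feed the pipeline patches of control points instead of plain triangles.
    ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);

    ctx->VSSetShader(vs, nullptr, 0);
    ctx->HSSetShader(hs, nullptr, 0);  // hull shader: new in DX11
    ctx->DSSetShader(ds, nullptr, 0);  // domain shader: new in DX11
    ctx->PSSetShader(ps, nullptr, 0);

    ctx->Draw(controlPointCount, 0);
}
```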
More to Come!

There are a lot more things to implement, though, to boost performance of future games. One of them is HDR compression.

“If I had a spare day and HDR, then I would convert the HDR effects into Microsoft’s new format, because they would be twice as compact that way.”
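The “new format” most likely refers to one of DirectX 11’s block-compressed HDR texture formats (BC6H); that reading is an assumption on our part, not stated in the article. A minimal sketch of describing a texture in that format follows; the compression itself is normally done offline by tooling.

```cpp
// Minimal sketch, assuming the "new format" refers to DX11's BC6H
// block-compressed HDR textures; the compression itself is done offline.
#include <d3d11.h>

HRESULT CreateCompressedHdrTexture(ID3D11Device*                 device,
                                   const D3D11_SUBRESOURCE_DATA* compressedMip0,
                                   UINT width, UINT height,
                                   ID3D11Texture2D**             texOut)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_BC6H_UF16;  // unsigned half-float HDR blocks
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

    return device->CreateTexture2D(&desc, compressedMip0, texOut);
}
```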

Important notice: Richard Huddy is a specialist who recommends that game developers implement certain features and helps programmers implement those technologies in the most efficient way. Nevertheless, his answers should by no means be taken as the opinion of game developers in general; they should be treated as his own outlook.

http://www.xbitlabs.com/news/video/...osting_Features_of_DirectX_11_First__ATI.html
 
yes, everything we've missed for a couple of years thanks to Nvidia; the spare shader power on the 2900-4800 cards isn't used as efficiently as possible because, well, they were expecting all dx10 games to be as highly programmable and efficient as what dx11 will now bring.

But dx11 should be the decent jump forwards we need, real benefits and performance increases, quality increases for no performance hit. Basically dx11 rocks and dx10 should have rocked.

The interesting thing is once tessellation is introduced and games actually use it as an option, it "should" work on current ATi cards with tessellation units, and some of the performance benefits from dx11 being properly used in games will filter through to dx10.1 ATi cards also. That means when we finally get dx11 games, ATi's past generation (or three) could all get a performance boost while Nvidia get smeg all; it's another reason to go for an ATi card now if you ask me.

DX11 should also be taken up very, very quickly, not least because developers all started to program for the original dx10 before it was neutered years ago, so most will have experience of essentially dx11 coding from the past few years. I'd imagine (a la Assassin's Creed) that many games have dx10.1/dx11 features in the engine that went completely unused; games were in development for dx10 (original) way before dx10 cards were out, so it's likely they were aiming for the original standard and simply disabled things at a late stage, or cards simply ignore the dx10 features.
 
I take it the lessons of the past have gone straight over the heads of you guys... I love the optimism all the same...

Very few programmers have extensive DX10 experience - most game developers have played it safe and run with OpenGL or DirectX 9.

DX11 should also be taken up very, very quickly, not least because developers all started to program for the original dx10 before it was neutered years ago, so most will have experience of essentially dx11 coding from the past few years. I'd imagine (a la Assassin's Creed) that many games have dx10.1/dx11 features in the engine that went completely unused; games were in development for dx10 (original) way before dx10 cards were out, so it's likely they were aiming for the original standard and simply disabled things at a late stage, or cards simply ignore the dx10 features.

I'd love to know where you get your information from - coz I've been around the game development industry for some time* and seen very little focus on DX10 at all.

* Not even taking direct development into account I've been involved in closed game testing and more public beta testing of approx. 1 in 5 video game titles released on the PC since around 2003 or 2004.
 
Oh, and I have something else to add to this... it's been a while so I'd forgotten...

When DX10 came out... "neutered" as it might have been...

Most game developers that even bothered with DX10 killed most of the features they had been planning... not because of a lack of support on nVidia hardware... the effects ran flawlessly on an 8800GTX... the 2900 wouldn't even render half the effects properly, or the features were completely lacking in the drivers - take a look at early reports on Lost Planet and ETQW, being the most pertinent and easy to find examples, if you want more information... it took ATI over 6 months AFTER nVidia to even implement these features properly in the drivers.

So all you people saying nVidia killed DX10 are very poorly informed - to lay it on thick - if ATI couldn't even get functionality up and running for a "neutered" feature set until many months later - imagine the trouble they'd have had with the full blown deal...
 
Yet despite all these troubles, how do the current generation of ATI cards manage to outsell and generally outperform (at least at the various price points) the 'current generation' (i.e. last generation) of Nvidia cards?
 
Oh, the cards themselves aren't the problem; I'm not a big fan of ATI/AMD decision making and feature support, but that's a different story...

I just find it amusing when people hype up features when the past has shown us that they are for the most part irrelevant, as if something is going to change overnight - while slating nVidia for things that are six of one and half a dozen of the other...
 
So all you people saying nVidia killed DX10 are very poorly informed - to lay it on thick - if ATI couldn't even get functionality up and running for a "neutered" feature set until many months later - imagine the trouble they'd have had with the full blown deal...

To be fair, NV were not exactly covering themselves with glory back in the early days of DX10, what with their drivers being responsible for a huge percentage of Vista crashes. Given the turnaround in fortune shown by the 4x series and the rapid improvement of drivers in the same time frame, it is a bit unfair to be looking on future support in the same way as it was back in the prime cluster**** of the merger and the 360 deal. Given the benefits DX11 brings to even previous generation hardware (something DX10 did not offer), I would imagine the take up will be a bit quicker.

I'm not a big fan of ATI/AMD decision making and feature support, but that's a different story...
Really!? I'm shocked. :D
 
To be fair, NV were not exactly covering themselves with glory back in the early days of DX10, what with their drivers being responsible for a huge percentage of Vista crashes. Given the turnaround in fortune shown by the 4x series and the rapid improvement of drivers in the same time frame, it is a bit unfair to be looking on future support in the same way as it was back in the prime cluster**** of the merger and the 360 deal. Given the benefits DX11 brings to even previous generation hardware (something DX10 did not offer), I would imagine the take up will be a bit quicker.

I couldn't care less about vista personally :P

I guess the last bit is a fair point tho.
 
I don't think it'll be as widely taken up as people like to think, but the top graphical games will likely want to make use of it where they can - naturally the developers want to provide the highest possible framerate for the best possible graphics; that's how you attract gamers. Well, along with gameplay and storyline, but even with the best of both, nobody will buy it if it runs at 8fps.
 
I will do what I always do: hope for the best from it but wait and see what it actually delivers. I find it funny, Rroff, that you're constantly so quick to downplay both ATi and what they are doing and the part Nvidia played in the mess that was dx10. It is not exactly a secret how much interfering Nvidia did over dx10, and the fuss they kicked up about 10.1 was very public, even using their influence to get at least one game stripped of 10.1 features that gave an improvement, though not to Nvidia cards.

Personally I am looking forward to seeing what Intel bring to the table as well as the other two established companies, but as a customer I just hope we get decent value for our money from them all, and healthy competition preventing any of them getting comfortable again, as that really wasn't good for us as customers.
 
I really hope DX11 will allow developers to create more detailed worlds. The bad thing is that even with limitless graphical horsepower a developer studio actually needs to create all that content.

Many games have shown that titles tend to get more realistic but shorter, since developing such detailed worlds must take time. While we have some good PC exclusives, most games are either ported from consoles or are designed for the PC and consoles as well, limiting the developers to DX9 and what the consoles are capable of.

I guess we'll see the next leap in graphics when the new consoles are out, since developers will be forced to learn the new API and will have more hardware horsepower from the new consoles, which will allow them to deliver better stuff.
 
Well, it's an ATI man being interviewed; it would be nice to get this sort of "wish list" from a dev working for the likes of id or Crytek. Someone who actually has to implement the stuff.
 
Oh, and I have something else to add to this... it's been a while so I'd forgotten...

When DX10 came out... "neutered" as it might have been...

Most game developers that even bothered with DX10 killed most of the features they had been planning... not because of a lack of support on nVidia hardware... the effects ran flawlessly on an 8800GTX... the 2900 wouldn't even render half the effects properly, or the features were completely lacking in the drivers - take a look at early reports on Lost Planet and ETQW, being the most pertinent and easy to find examples, if you want more information... it took ATI over 6 months AFTER nVidia to even implement these features properly in the drivers.

So all you people saying nVidia killed DX10 are very poorly informed - to lay it on thick - if ATI couldn't even get functionality up and running for a "neutered" feature set until many months later - imagine the trouble they'd have had with the full blown deal...

ET:QW isn't DX10, it's OpenGL. :confused:

Plus, I played through Lost Planet to the last boss using a 2900XT at 1920x1200 and max in-game settings.
 
tbh the HD2XXX was a rubbish card vs Nvidia anyway; the HD38XX was better but still not good enough, so maybe it was AMD/ATI that killed DirectX 10 at first, until they brought out the HD48XX and left Nvidia well behind.

that's my opinion on things. Seems to explain a few things, and yes I did have a HD2600XT and I didn't like it. Didn't have enough for an 8800GT :( got a HD3870, that was fine
 
basically most of it is just bs. I hear people say "oh that won't be this dx or that" - look, a lot of people can't play some games at absolute max on dx9, never mind 10, which is basically non-existent in games. They'll shovel anything down your throat to get you to buy a new card.

all you need to do is look at the games that might be really big and see what they use in the future; then you know basically what you need, for at least the next year.

at the very least it will probably be 2 years before any game uses it mainstream, if that early. 10 is not mainstream and people are worrying about 11 :(
 