Deus Ex: Mankind Divided. DX12, async shaders and TressFX 3.0 support at launch.

Original article from Hexus.net (link)

Deus Ex: Mankind Divided was first teased and then revealed to us back in April. Judging by the comments on that news, this is a highly anticipated game, thanks largely to the compelling and enjoyably playable nature of its predecessors. Now we have just heard some news which may whet your Deus Ex appetites further: the game will launch early next year with DirectX 12 support from day one.

To recap, the new game follows on from Human Revolution, just two years further into the future, in 2029. The hero, Adam Jensen, is tasked with hunting down and capturing augmented terrorists while he himself is under suspicion of being a transhuman. As it's set further in the future, there are new tools and gadgets available to our hero, and you can play in a style that suits you and your gaming skills, with a stealth or action emphasis.


AMD confirms DX12 from day one

TweakTown reporter Anthony Garreffa recently enjoyed a chat with AMD's Chief Gaming Scientist, Richard Huddy. The talk got onto the subject of "all things DX12 and Asynchronous Shaders," and the support for the API in Deus Ex: Mankind Divided, since the game was used in AMD's DirectX 12 showcase.

AMD's Huddy confirmed to Garreffa that Deus Ex: Mankind Divided would support DirectX 12 straight away, from its launch date. Furthermore, TressFX Hair 3.0 technology will feature in the upcoming title.

Built-in benchmark

Deus Ex: Mankind Divided will include a built-in benchmark, confirmed Huddy. Given that it's built into such a sure-fire PC gaming success, the benchmark will probably be widely used by gamers and the tech press alike to compare DirectX 11 and DirectX 12 performance across hardware.
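
For anyone planning to run the benchmark themselves, here is a minimal sketch (C++, not from the article) of how such comparisons are usually reduced to numbers: take the frame times captured under each API and boil them down to average FPS and a 99th-percentile frame time, the two figures most reviews quote. The sample frame times below are made up.

```cpp
// Minimal sketch: summarising two hypothetical benchmark runs (frame times in ms)
// into average FPS and 99th-percentile frame time for a DX11 vs DX12 comparison.
#include <algorithm>
#include <cstdio>
#include <vector>

struct RunStats {
    double avgFps;      // average frames per second over the run
    double p99FrameMs;  // 99th-percentile frame time (boundary of the worst 1%)
};

RunStats summarise(std::vector<double> frameMs) {
    double totalMs = 0.0;
    for (double t : frameMs) totalMs += t;
    std::sort(frameMs.begin(), frameMs.end());
    size_t idx = static_cast<size_t>(frameMs.size() * 0.99);
    if (idx >= frameMs.size()) idx = frameMs.size() - 1;
    return { 1000.0 * frameMs.size() / totalMs, frameMs[idx] };
}

int main() {
    // Hypothetical captures of the same scene under each API.
    std::vector<double> dx11 = { 18.2, 17.9, 19.4, 25.1, 18.0, 18.3 };
    std::vector<double> dx12 = { 14.1, 13.8, 14.5, 16.0, 13.9, 14.2 };
    RunStats a = summarise(dx11), b = summarise(dx12);
    std::printf("DX11: %.1f fps avg, %.1f ms p99\n", a.avgFps, a.p99FrameMs);
    std::printf("DX12: %.1f fps avg, %.1f ms p99\n", b.avgFps, b.p99FrameMs);
}
```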

How long will we have to wait to enjoy this game and delve into its benchmarks? According to the official site, and the YouTube pre-order promo video embedded below, Deus Ex: Mankind Divided will become available on 23rd February on PlayStation 4, Xbox One and PC (or four days earlier for pre-order customers).
 
The article makes no mention of Deus Ex using asynchronous shaders, and even if it does, the insinuation that Nvidia will be in trouble because of it is yet to be proven.

AMD using dodgy marketing tricks once again.

There's a good article about it all here btw, not from AMD or one of their developer friends.

https://scalibq.wordpress.com/2015/09/02/directx-12-is-out-lets-review/

And don’t get me started on Oxide… First they had their Star Swarm benchmark, which was made only to promote Mantle (AMD sponsors them via the Gaming Evolved program). By showing that bad DX11 code is bad. Really, they show DX11 code which runs single-digit framerates on most systems, while not exactly producing world-class graphics. Why isn’t the first response of most people as sane as: “But wait, we’ve seen tons of games doing similar stuff in DX11 or even older APIs, running much faster than this. You must be doing it wrong!”?
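
To make the "you must be doing it wrong" point concrete, here is a hedged sketch of the usual DX11 fix, assuming an already-initialised device context and mesh buffers (setup omitted). The function names and the idea of a per-object constant buffer are hypothetical illustration; DrawIndexed and DrawIndexedInstanced are the real D3D11 calls.

```cpp
#include <d3d11.h>

// Naive path: one draw call per object. With tens of thousands of objects,
// CPU-side driver overhead dominates long before the GPU is actually busy.
void DrawScenePerObject(ID3D11DeviceContext* ctx, UINT indexCount, UINT objectCount)
{
    for (UINT i = 0; i < objectCount; ++i) {
        // ...update a per-object constant buffer here (omitted)...
        ctx->DrawIndexed(indexCount, 0, 0);
    }
}

// Batched path: identical meshes submitted in a single instanced draw call,
// with per-object data read in the vertex shader via SV_InstanceID.
// Same scene, a tiny fraction of the API overhead.
void DrawSceneInstanced(ID3D11DeviceContext* ctx, UINT indexCount, UINT objectCount)
{
    ctx->DrawIndexedInstanced(indexCount, objectCount, 0, 0, 0);
}
```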

But here Oxide is again, in the news… This time they have another ‘benchmark’ (do these guys ever actually make any games?), namely “Ashes of the Singularity”.

And, surprise surprise, again it performs like a dog on nVidia hardware. Again, in a way that doesn’t make sense at all… The figures show it is actually *slower* in DX12 than in DX11. But somehow this is spun into a DX12 hardware deficiency on nVidia’s side. Now, if the game can get a certain level of performance in DX11, clearly that is the baseline of performance that you should also get in DX12, because that is simply what the hardware is capable of, using only DX11-level features. Using the newer API, and optionally using new features should only make things faster, never slower. That’s just common sense.

Now, Oxide actually goes as far as claiming that nVidia does not actually support asynchronous shaders. Oh really? Well, I’m quite sure that there is hardware in Maxwell v2 to handle this (nVidia has had asynchronous shader support in CUDA for years, via a technology they call Hyper-Q, long before AMD had any such thing. The only change in DX12 is that a graphics shader should be able to run in parallel with the compute shaders. That is not something that would be difficult to add to nVidia’s existing architecture, and it is therefore quite implausible that nVidia didn’t do this properly, or even ‘forgot’ about it). This is what nVidia’s drivers report to the DX12 API, and it is also well documented in the various hardware reviews on the web.
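
For reference, this is roughly what "exposing async compute" looks like at the D3D12 API level: a minimal sketch, assuming an already-created ID3D12Device, in which the application creates a compute queue alongside the usual direct queue. The API only exposes the queues; whether work submitted to them genuinely overlaps on the GPU is decided by the hardware and driver.

```cpp
#include <d3d12.h>

// Sketch: create a direct (graphics) queue and a separate compute queue.
// Error handling omitted for brevity.
void CreateQueues(ID3D12Device* device,
                  ID3D12CommandQueue** graphicsQueue,
                  ID3D12CommandQueue** computeQueue)
{
    // The direct queue accepts graphics, compute and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, __uuidof(ID3D12CommandQueue),
                               reinterpret_cast<void**>(graphicsQueue));

    // The compute queue accepts compute and copy work only. Work submitted
    // here *may* run concurrently with the direct queue's graphics work.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&compDesc, __uuidof(ID3D12CommandQueue),
                               reinterpret_cast<void**>(computeQueue));
}
```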

It is unlikely for nVidia to expose functionality to DX12 applications if it is only going to make performance worse. That just doesn’t make any sense.

There’s now a lot of speculation out there on the web, by fanboys/’developers’, trying to spin whatever information they can find into an ‘explanation’ of why nVidia allegedly would be lying about their asynchronous shaders (they’ve been hacking at Ryan Smith’s article on Anandtech for ages now, claiming it has false info). The bottom line is: nVidia’s architecture is not the same as AMD’s. You can’t just compare things such as ‘engine’ and ‘queue’ without taking into account that they mean vastly different things depending on which architecture you’re talking about (it’s similar to AMD’s poorly scaling tessellation implementation. Just because it doesn’t scale well doesn’t mean it’s ‘fake’ or whatever. It’s just a different architecture, which cannot handle certain workloads as well as nVidia’s).

Edit: There have been some updates on the async compute shader story between nVidia, AMD and Oxide. See ExtremeTech’s coverage for the details. The short story is exactly as I said above: nVidia’s and AMD’s approach cannot be compared directly. nVidia does indeed support async compute shaders on Maxwell v2, and indeed, there are workloads where nVidia is faster than AMD, and workloads where AMD is faster than nVidia. So Oxide did indeed (deliberately?) pick a workload that runs poorly on nVidia. Their claim that nVidia did not support it at all is a blatant lie. As are claims of “software emulation” that go around.

The short story is that nVidia’s implementation has less granularity than AMD’s, and nVidia also relies on the CPU/driver to handle some of the scheduling work. It looks like nVidia is still working on optimizing this part of the driver, so we may see improvements in async shader performance with future drivers.
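
A small sketch of what that driver-side scheduling has to work with (queues, fence and fence value assumed created elsewhere): the application only declares cross-queue dependencies with fences; everything between two fence points is the driver's and hardware's to interleave as coarsely or finely as the architecture allows.

```cpp
#include <d3d12.h>

// Sketch: make the graphics queue wait for the compute queue at one point.
void SyncComputeToGraphics(ID3D12CommandQueue* computeQueue,
                           ID3D12CommandQueue* graphicsQueue,
                           ID3D12Fence* fence,
                           UINT64 value)
{
    // GPU-side signal: the compute queue sets the fence when it gets here.
    computeQueue->Signal(fence, value);
    // GPU-side wait: the graphics queue (not the CPU) stalls at this point
    // until the fence has reached the signalled value.
    graphicsQueue->Wait(fence, value);
}
```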
 
Well I hope TressFX Hair looks better than it did in Tomb Raider; it looks like all hype again.

Tbh the same could be said for most games which claim to have realistic hair, Spook. Witcher 3 was a prime example for me: I had no problems maxing all the HairWorks stuff out on my system, but it still doesn't look amazing imho.

I'd agree with you though; hopefully we'll get some fantastic-looking hair in this game that doesn't wreck performance on either team.
 
The article makes no mention of Deus Ex using asynchronous shaders, and even if it does, the insinuation that Nvidia will be in trouble because of it is yet to be proven...

It has been confirmed before that the game supports asynchronous shaders. :)

I was highlighting DX12 support on day 1 more than anything.
 
Thing to note about TressFX 3.0: it isn't only for hair, it can do other things as well. But I feel Tomb Raider will show it off better.

The great thing about TressFX is that everyone can run it, and at decent performance. Sadly the same can't be said for other hair tech.
Who doesn't like Deus Ex??? :eek:
I know, man. I've really been wanting to jump into it but just haven't had the time. I did buy Deus Ex: Human Revolution, just haven't had time to play it.
 
This is the type of game that will suit me, and with it being DX12 we will get to see some of what DX12 does to AMD and Nvidia hardware. Hopefully it looks great and plays great. DE:HR I never really got into, but that was more because I had other games to play, so it took a back seat.
 
This is the type of game that will suit me, and with it being DX12 we will get to see some of what DX12 does to AMD and Nvidia hardware...

You too, Greg! Give it a go when you get the chance, bud.
 