
DirectX® 12 for Enthusiasts: Explicit Multiadapter

But why would Microsoft go through the effort of adding asymmetric multi-GPU into the DirectX 12 API if, as you put it, it's something that can't be done?

As Robert said:
Even with these things in mind, I’m excited about the future of PC gaming because developers already have expressed interest in explicit multi-adapter’s benefits—that’s why the feature made it into the API! So with time, demand from gamers, and a little help from AMD, we can make high-end PC gaming more powerful and versatile than ever before.

Perhaps Microsoft looked in their crystal ball and realised that in 10 years' time people will be refusing to upgrade from Win 10 lol. :D
 

:D
There must be some use for it though. I doubt MS would add something that wasn't possible to get working.
Who knows what the future holds in the PC scene; I predict big changes in the next couple of years.
 

I don't think it is a matter of them adding the ability. It is just something that is now possible, since you talk to ALL GPUs independently and directly. So beyond that, it is a matter of what kind of work you want to do on each GPU.
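That "you decide the split" idea can be shown with a toy Python sketch. To be clear, this is not real D3D12 code; the two worker threads just stand in for two adapters, and every function name here is made up for illustration:

```python
# Toy sketch (NOT real D3D12): with an explicit API the application,
# not the driver, decides which "GPU" gets which work.
from concurrent.futures import ThreadPoolExecutor

def render_scene(frame):
    # stand-in for the primary GPU drawing the geometry
    return f"frame{frame}:geometry"

def post_process(image):
    # stand-in for the second GPU applying effects afterwards
    return image + "+effects"

def draw_frame(frame):
    # the app explicitly chains the two stages across the two adapters
    return post_process(render_scene(frame))

with ThreadPoolExecutor(max_workers=2) as pool:
    frames = list(pool.map(draw_frame, range(3)))

print(frames)  # ['frame0:geometry+effects', 'frame1:geometry+effects', 'frame2:geometry+effects']
```

The point is only that the work assignment lives in application code, not in a driver profile.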
 
I think the issue with any multi-GPU tech, and really taking advantage of it, is that you can't guarantee everyone will have it. So I think we're unlikely to see anything new because of multi-GPU; the best we can really hope for is that it does what it always did, only better.

The game has to be doable on one GPU, so the only thing two can add is speed/performance, which really isn't anything new.

In fact the game doesn't need to be doable on one GPU; it needs to be doable on an Xbox One. Anything that can't be done on an Xbox One probably won't make it into the game.

So I really can't see any fancy new effects being added because of multi-GPU unless it's a PC-only game. Even then most will run it on a single GPU, so we're unlikely to see anything on multiple GPUs that can't be done on one.

Just because they could do something doesn't mean they will. The people who actually do these things rarely control the purse strings or timelines for the games they make.
If, in order to release on time and in budget, you have to choose between cutting the last level of your game or a neat effect that a tiny minority of your customers can use/see, which are you going to drop?
 

Not really; think of it another way, and not just about adding performance.

Let's say one GPU on Battlefield 5 gets you Ultra graphics, but when you add in another GPU you can get Ultra+. What Ultra+ might add is better graphics rendered on the second GPU, for example the second GPU doing only effects.

There are a lot of things I'm thinking of right now that could work very well.
 

Mantle was out for what, 18 months? And we had what, 7, 8, maybe 9 games?
We didn't see any new effects from Mantle that I'm aware of.

We saw more new effects from TressFX, HairWorks and PhysX than we did from Mantle.

The stuff they can add will be minor, as they can't drastically change the experience from what everyone else sees.
Sure, they can add more particles to an explosion or something, but it'll just be tweaks on something we already have, not something new.

What effects do you imagine they could add that would be worthwhile?
 
Not really; think of it another way, and not just about adding performance.

Let's say one GPU on Battlefield 5 gets you Ultra graphics, but when you add in another GPU you can get Ultra+. What Ultra+ might add is better graphics rendered on the second GPU, for example the second GPU doing only effects.

There are a lot of things I'm thinking of right now that could work very well.

Real-time ray-based GI on the second card? :P

Still waiting on someone to do wave-based 3D surround using the TrueAudio DSPs, like Aureal did with their own 3D sound cards in the '90s.
 
Mantle was out for what, 18 months? And we had what, 7, 8, maybe 9 games?
We didn't see any new effects from Mantle that I'm aware of.

We saw more new effects from TressFX, HairWorks and PhysX than we did from Mantle.

The stuff they can add will be minor, as they can't drastically change the experience from what everyone else sees.
Sure, they can add more particles to an explosion or something, but it'll just be tweaks on something we already have, not something new.

What effects do you imagine they could add that would be worthwhile?

The difference here is that everyone can now use DirectX 12, so devs have more of a reason to push games forward. Mantle was AMD-only, and only two games pushed something new under Mantle: Civilization: Beyond Earth was, I believe, the first game to officially support SFR, and Thief used some CrossFire Mantle tech whose name I forget; I'm sure AMDmatt will help me here.

Back to DirectX 12 though; after all, this is not a Mantle thread.
There are loads of very demanding graphics features that a game would benefit from running on another GPU: almost all demanding shadow effects at much higher detail, smoke and particle effects, or even just draw distance.

You're correct that not everyone will see the same effects if they don't have the hardware, but don't we already have that now? In BF4 I can switch settings down or enable settings other players might not be running; why is this any different?

Real-time ray-based GI on the second card? :P

Still waiting on someone to do wave-based 3D surround using the TrueAudio DSPs, like Aureal did with their own 3D sound cards in the '90s.

Yeah, even that, man, is very demanding! There are all kinds I would love to see! :D
 
Thief was also the first game to use asynchronous shaders, although we can't tell how much of a performance boost it gave, as it was not a switchable option.
 
But why would Microsoft go through the effort of adding asymmetric multi-GPU into the DirectX 12 API if, as you put it, it's something that can't be done?

As Robert said:
Even with these things in mind, I’m excited about the future of PC gaming because developers already have expressed interest in explicit multi-adapter’s benefits—that’s why the feature made it into the API! So with time, demand from gamers, and a little help from AMD, we can make high-end PC gaming more powerful and versatile than ever before.

It has been added largely for compute reasons.

It is also probably not much effort to add to the API for the rare odd occasion it could be useful. But there are some basic facts that simply can't be argued with: if both GPUs need access to the same texture, then that texture will need to be duplicated in each GPU's VRAM.
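The duplication point is just arithmetic, and a back-of-envelope Python sketch makes it concrete. All of the numbers and the function below are invented for illustration, assuming shared assets must be mirrored on every card:

```python
# Back-of-envelope sketch: if every shared texture must be mirrored in
# both GPUs' VRAM, a second card only adds capacity for *unshared* data.
def effective_vram(per_gpu_gb, shared_gb, gpus=2):
    # shared assets are duplicated on every GPU, so they count once;
    # the remainder of each card's memory can hold unique data
    unique_per_gpu = per_gpu_gb - shared_gb
    return shared_gb + gpus * unique_per_gpu

# Two 4 GB cards mirroring 3 GB of common textures behave like 5 GB,
# not 8 GB, of usable memory.
print(effective_vram(4.0, 3.0))  # 5.0
```

Only when nothing is shared (the `shared_gb = 0` case) does memory actually add up.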
 
Not really; think of it another way, and not just about adding performance.

Let's say one GPU on Battlefield 5 gets you Ultra graphics, but when you add in another GPU you can get Ultra+. What Ultra+ might add is better graphics rendered on the second GPU, for example the second GPU doing only effects.

There are a lot of things I'm thinking of right now that could work very well.

No, that is a very bad way of thinking about it. Each GPU will be rendering the same scene in most scenarios, either different parts of a single frame or different frames. Image composition of separately rendered objects won't happen.

What you talk about may exist for some effects, like environment mapping or shadow map creation, but it all goes back to the same issue: you never want to be sending data from one GPU to the other unless you absolutely have to, and the easiest way to ensure that is to keep rendering unique on each GPU and only merge the final frames.
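The "keep rendering unique, merge only at the end" idea is basically split-frame rendering, and a toy Python sketch shows the shape of it. Again, nothing here is real graphics API code; the two calls to `render_half` stand in for two GPUs, and `shade` is a made-up checkerboard "scene":

```python
# Toy split-frame rendering: each "GPU" renders its own rows of the
# frame independently, and only the finished halves are merged, so no
# mid-frame data has to cross between the cards.
def shade(x, y):
    return (x + y) % 2  # trivial checkerboard "scene"

def render_half(scene, rows, width=4):
    # stand-in for one GPU rasterising only its assigned rows
    return [[scene(x, y) for x in range(width)] for y in rows]

top = render_half(shade, range(0, 2))     # "GPU 0": rows 0-1
bottom = render_half(shade, range(2, 4))  # "GPU 1": rows 2-3
frame = top + bottom                      # merge only the final halves

print(frame)  # [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]
```

The merged result is identical to rendering the whole frame on one device; the only cross-device traffic is the finished half-frames.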
 
It has been added largely for compute reasons.

It is also probably not much effort to add to the API for the rare odd occasion it could be useful. But there are some basic facts that simply can't be argued with: if both GPUs need access to the same texture, then that texture will need to be duplicated in each GPU's VRAM.

Guess time will tell then. I firmly believe PC gaming is due a massive overhaul.
 

Maybe when we get NVLink and AMD's fabric tech.


Parallel computation is not easy at the best of times; when there is no shared memory, the performance possibilities are limited by data-transfer rates and by the ability to separate tasks at the algorithmic level, something which is not always possible.
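How transfer rates cap the gain can be put in one line of arithmetic. A toy Python model, with all numbers invented for illustration (16 ms of frame work, a fixed per-frame transfer cost):

```python
# Toy model of why transfers cap multi-GPU scaling: splitting a frame's
# work perfectly across N cards only helps while the per-frame transfer
# cost stays small relative to the work saved.
def speedup(work_ms, transfer_ms, gpus=2):
    # ideal split of the work, plus a fixed cost to move results between cards
    return work_ms / (work_ms / gpus + transfer_ms)

print(round(speedup(16.0, 0.0), 2))  # 2.0  -> ideal scaling, free transfers
print(round(speedup(16.0, 4.0), 2))  # 1.33 -> transfers eat most of the gain
```

It is the same shape of argument as Amdahl's law, with the transfer acting as the serial part.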
 