
The state of Multi GPU (SLI/Crossfire) in DX12

Caporegime
Joined: 30 Jul 2013
Posts: 29,681
So we're finally starting to see a trickle of DX12 titles available to play:

Gears of War: Ultimate Edition
Rise of the Tomb Raider
HITMAN
Ashes of the Singularity

Now, I know it's very early days, but I thought DX12 was supposed to make it easier for developers to integrate multiadapter/multi-GPU options, to the point where you can even run AMD and Nvidia GPUs together.

Having tested 3 of the 4 titles listed above (I don't own Ashes of the Singularity) it seems that none of them currently support SLI using the DX12 renderer.

I understand the controversy over the Windows 10 Store restrictions, which could impact Gears of War, and it seems Crossfire won't work on any Windows Store games until they allow the games to run in Exclusive Fullscreen (correct me if I'm wrong)?

But with regards to both HITMAN and Rise of the Tomb Raider, they work fine with SLI in DX11.

Tomb Raider in particular always shows 99% usage on both my Titan X GPUs in DX11, and yet MSI Afterburner confirms only one GPU in use in DX12.

I only ran the HITMAN benchmark quickly, and I can see SLI needs work as the usage fluctuates wildly, but again it is utilising both graphics cards in DX11 and only one in DX12.

So is this currently a developer issue or do Nvidia/AMD need to release some updated DX12 specific drivers with SLI/Crossfire profiles?
 
The developer does have more control but multi-GPU is such a niche that it will be a very low priority.
 
To be able to utilise multiple adapters in DX12, I think you need to architect the game from the ground up so it can hand off different parts of the work to different GPUs, which is a very different approach from building an engine designed to also work on DX11 or older.

Using multi GPU in DX12 is a very different concept to the driver level multi GPU of SLI or Crossfire.
 
I dunno why GPUs cannot just be scalable - why doesn't DX12 just handle multiple GPUs without the need for any further thought? There's probably a good reason...

We can use 2 or 4 sticks of memory and it just works. We can RAID hard disks with minimal setup. But buy two expensive GPUs and they may or may not work well depending on the driver, the developer, etc.
 
Maybe this'll be my last SLI config, as multi-GPU has seemed less and less of a priority to Nvidia lately, even though they bang on about SLI for VR. Support is lacking more and more. It'll be a shame though, a damn shame; I've had SLI since the 680 came out and always loved the extra oomph it gives you.
 
I dunno why GPUs cannot just be scalable - why doesn't DX12 just handle multiple GPUs without the need for any further thought? There's probably a good reason...

We can use 2 or 4 sticks of memory and it just works. We can RAID hard disks with minimal setup. But buy two expensive GPUs and they may or may not work well depending on the driver, the developer, etc.

Because memory is just reads and writes; that's all it is. GPUs do far more complicated things than that. Some multi-GPU drivers have over a million lines of code.
 
Because memory is just reads and writes; that's all it is. GPUs do far more complicated things than that. Some multi-GPU drivers have over a million lines of code.

Of course, GPUs are much more complex. Then again, servers have multi-CPU support.

Hopefully it will be possible in the future. Game developers use DirectX, and that handles the distribution of work to the attached GPUs - easy :) (sounds easy, anyway).
 
For DX12 they have to make the game's renderer mGPU-aware and choose an mGPU rendering system: AFR or some type of SFR.

This explicit control means better mGPU performance when done right, but mGPU was always left in the IHVs' hands, which means very few people have any experience with it. Devs may have experience with low-abstraction APIs from making console ports, but explicit mGPU is an entirely different monster.

But AMD are working on releasing info about mGPU coding during and after GDC, so we will more than likely see some AFR mGPU sample code on the GPUOpen site after GDC.
 
Of course, GPUs are much more complex. Then again, servers have multi-CPU support.

Hopefully it will be possible in the future. Game developers use DirectX, and that handles the distribution of work to the attached GPUs - easy :) (sounds easy, anyway).

It's very hard to write multi-threaded code in general. It adds a lot of developer time. Even the simplest calls can be broken; all your favourite functions just no longer work safely (and worse still, you get no warning unless you study the documentation in excruciating detail).
For example, I spent a couple of weeks getting a simple single-threaded app to be thread safe. Simple things like rand() no longer work correctly. It took me a few days to get a random number generator that was thread safe and worked appropriately with fixed seeds, and it still doesn't work as expected if you change the number of threads.
 
AMD and Nvidia need to stop their BS marketing of mGPU setups. Motherboard manufacturers might as well stop making boards with Crossfire/SLI PCIe configurations while we're at it. It just winds me up: the power of mGPU is pretty much one big lie, neither company gives two ***** about it, and game devs seem to give about the same amount of care.
 
^ That annoys me as well: they say it's awesome, yet they never seem to want to completely address micro-stuttering or deliver much better compatibility with games.

I thought multi-GPU would get better eventually, but it seems it won't, and that's a problem for people running high resolutions.
 
AMD and Nvidia need to stop their BS marketing of mGPU setups. Motherboard manufacturers might as well stop making boards with Crossfire/SLI PCIe configurations while we're at it. It just winds me up: the power of mGPU is pretty much one big lie, neither company gives two ***** about it, and game devs seem to give about the same amount of care.

+1

It's mainly just a cash cow now.

Going to get worse with DX12 (as the onus falls mainly on the dev), especially with game sponsorship. ;)
 
I'd agree, CF and SLI are one big massive bag of balls; the only thing I found CF Fury X good for was making my system look good, and that's quite the price to pay for aesthetics.
 