
Unreal Engine 5 - unbelievable.

Ever wondered why they removed NVLink/SLI from the 40 series? Well, in Unreal Engine 5.1, this is how simple it is to add dual-GPU NVLink/SLI support.

This will only work with GPUs that support NVLink/SLI and have an NVLink/SLI bridge fitted. See how Nvidia removed a feature that is actually useful and could save people money and time by buying cards that support it.

Let's hope they bring it back for the 50xx series. This generation's 4090 is really nothing more than a 4080 Ti.
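The settings themselves aren't reproduced here, so purely as an illustration of the underlying mechanism (a minimal sketch, not Unreal's code; the adapter index and feature level below are arbitrary choices): under D3D12, a driver-linked NVLink/SLI pair is exposed as extra "nodes" on a single device, and that node count is what any linked multi-GPU path has to detect.

// Minimal sketch (assumes Windows + Windows SDK): does the driver expose the
// NVLink/SLI pair as one linked-node D3D12 device?
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;                 // first adapter only, for brevity
    if (factory->EnumAdapters1(0, &adapter) == DXGI_ERROR_NOT_FOUND) return 1;

    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) return 1;

    // With SLI/NVLink enabled in the driver, the linked GPUs appear as
    // multiple nodes on this one device; without a bridge this stays at 1.
    printf("D3D12 device exposes %u GPU node(s)\n", device->GetNodeCount());
    return 0;
}

Without the bridge (and the driver configured for it) the node count stays at 1, which is why the post above says it only works with an NVLink/SLI bridge fitted.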
 
Errr..No

They removed SLI because 99% of new games, even during SLI's heyday, never used it, and once DX12 arrived and game developers had to do the implementation themselves, even fewer games got SLI support.

It's even easier to enable DLSS in Unreal Engine, yet many games don't have it because devs are lazy, but you think they're going to enable SLI and test it?

And regardless, even when SLI worked it always had issues with micro-stutter, and if you're going to accept micro-stutter you may as well play games on a console.

SLI is only good for setting benchmark scores in 3DMark, not for gaming, which is why Kaapstad quit the forum: there are no more 4-way SLI GPUs he can buy to sit at the top of the leaderboards' e-peen contests.
 
 
Dual GPU won't be coming back. For starters, the power consumption and heat output are too high even for one card these days, and the cards themselves take up three slots. What kind of motherboard and case do you think would give TWO cards like that enough breathing room not to cook each other the first time you load up a demanding game?

It had its time, and it was a total niche with no real interest from anyone beyond a small pocket of diehards. It isn't coming back.
 
This looks like Unreal in its offline render mode, not the interactive, in-game rendering mode.

I'm surprised it even needs SLI/NVLink to work. Offline render engines generally don't care about that; they detect all the GPUs in the system and work just fine.

NVLink was removed to push everyone onto the A-series cards, since the 3090 was too good a deal for people who make money with their graphics cards.
 
And it's not needed. Ironically, DX12 makes multi-GPU easier to implement than ever, but only Ashes of the Singularity ever did it, AFAIK...
 

People used to say DX12 was the reason multi-GPU died off; supposedly it killed support for it. But considering the size and power requirements these days, it's a good thing it's no longer around.
 
AOTS was the DX12 showcase and it uses multi-GPU effectively; if you have a decent iGPU, it can use that to help your main GPU as well.
It's simply a matter of devs not bothering, otherwise an 8800G would have become a very interesting value proposition when paired with a mid-range GPU...
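To put the "DX12 makes it easier" point in concrete terms: explicit multi-adapter doesn't need a bridge at all; every adapter in the system, including an iGPU, can be enumerated and given its own device, and the game decides how to split work between them. A minimal sketch, assuming Windows 10+ and the Windows SDK (the GPU preference and feature level are just illustration choices):

// Minimal sketch of DX12 explicit multi-adapter setup: one device per GPU,
// no NVLink/SLI bridge required. The hard part (splitting work and copying
// results between adapters) is up to the game and is omitted here.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapterByGpuPreference(i, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE,
                                             IID_PPV_ARGS(&adapter)) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;   // skip the WARP software adapter

        ComPtr<ID3D12Device> device;                             // independent device per GPU
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"Created device on: %ls\n", desc.Description);
            devices.push_back(device);
        }
    }
    printf("%u GPU(s) available for explicit multi-adapter work\n",
           (unsigned)devices.size());
    return 0;
}

The catch, as the post above says, is that the developer has to do all the scheduling and cross-adapter copies themselves, which is exactly the work almost nobody bothered with after AOTS.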
 

I only played with multi-GPU for 3D back in the HD 4850 days, and while some games stuttered, others were very smooth. Basically, once it was over 60 fps it was perfect, and it was even better with the frame rate manually capped.

Dual GPU would still be doable with relative ease for something like a 4080 and below, even a 4090 if you had the PSU. :)
 
The way it worked, it at best doubled the performance of your slowest GPU.
I tried it with a 980 Ti and an RX 480, and while it worked, it gave me only marginally better performance than the 980 Ti on its own.
An 8800G would have to be paired with a really, really slow graphics card to make sense.
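For what it's worth, that "double the slowest card" ceiling is what you'd expect if frames are simply alternated between the two GPUs with even pacing (an assumption here; AOTS may split work differently), since the slower card sets the pace:

% Assumption: frames alternate between the two GPUs and output is evenly paced,
% so each GPU has two frame intervals i to finish one frame: 2i >= t_slow.
\[
\text{fps}_{\text{combined}} = \frac{1}{i} \;\le\; \frac{2}{t_{\text{slow}}} = 2 \times \text{fps}_{\text{slow}}
\]

So with, say, a 980 Ti at 90 fps and an RX 480 at 60 fps on their own (hypothetical numbers), the theoretical best is around 120 fps, and real-world sync overhead eats most of that gap, which fits the "marginally better than the 980 Ti alone" result above.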
 

However, arguably the most exciting part of Unreal Engine 5.4 is the performance upgrade across the board. Simon Tourangeau, Vice President of Engineering for UE at Epic, took the stage to say:

We've made a lot of performance improvements. We now have faster Lumen, Shadows, and ray tracing. We've added Variable Rate Shading (VRS) for Nanite. We massively improved instance culling and we significantly improved parallelism in the renderer.

Any claim of a performance upgrade has to be backed by some data, so Tourangeau said that the City Sample, which originally shipped with version 5.0, is now much faster in Unreal Engine 5.4 when performing a console test. More specifically, render thread time has been halved, while GPU time was reduced by 25%.

Epic also stated that 5.4 needs far fewer shader precompilations, which means maybe... maybe... Stuttergate can be laid to rest.
 