Everyone can create fancy tech demos but no one seems to be able to make a good game. Very strange.

Nothing strange about it. Making tech demos is easy. It's very, very difficult to ship a game.
You wondered why they removed NVLINK/SLI from the 40 series? Well, regarding Unreal Engine 5.1 ... this is how simple it is to add dual-GPU NVLINK/SLI support. This will only work with GPUs that have NVLINK/SLI support and an NVLINK/SLI bridge. See how Nvidia removed a feature that is actually useful and could save people money and time if they bought cards that support it. Let's hope they bring it back for the 50xx series. The 4090 this generation is really nothing more than a 4080 Ti.

This seems like Unreal is in offline render mode and not the interactive "gameplay" rendering mode.
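That matches what Unreal's multi-GPU support is actually for: as far as I know it targets editor/offline work (GPU Lightmass and nDisplay-style setups), not in-game rendering. A rough sketch of the setup usually described for it is below; the exact flag and cvar names are from memory and may differ between engine versions, so treat them as assumptions rather than a verified recipe.

```
# Assumed setup for Unreal's editor/offline multi-GPU path (not gameplay rendering):
# 1. Two NVLink-capable GPUs joined with a bridge, SLI enabled in the NVIDIA Control Panel.
# 2. Launch the editor requesting more than one GPU:
UnrealEditor.exe MyProject.uproject -MaxGPUCount=2
# 3. Allow multi-GPU use in the editor (console variable, e.g. in ConsoleVariables.ini):
r.AllowMultiGPUInEditor=1
```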
Dual GPU won't be coming back. For starters, the power consumption and heat generation are just too high for even one card these days, and the cards themselves take up three slots. What kind of mobo and case do you think will be able to give TWO cards like these enough breathing room to not cook each other the first time you load up a demanding game? It had its time and was a total niche with no real interest from anyone other than a small pocket of diehards; it isn't coming back.

And it's not needed. Ironically, DX12 makes it easier to do multi-card than ever, but only Ashes of the Singularity implemented it AFAIK...
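For context on the "easier than ever" claim: DX12 moved multi-GPU from driver-managed SLI/CrossFire profiles to explicit multi-adapter, where the game itself enumerates every GPU (including an iGPU) and decides how to split work across them. A minimal sketch of that discovery step, not taken from any particular engine, might look like this:

```cpp
// Sketch of DX12 "explicit multi-adapter" discovery: the application, not the
// driver, enumerates every GPU and decides how to use them. Link with
// d3d12.lib and dxgi.lib. Error handling trimmed for brevity.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, adapter.ReleaseAndGetAddressOf()) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the software (WARP) adapter

        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;

        // NodeCount > 1 means linked GPUs (an NVLink/SLI group) exposed as one
        // device; separate adapters (e.g. a dGPU plus an iGPU) each show up as
        // their own device and have to be fed work explicitly by the engine,
        // which is exactly the per-game effort most developers never invested.
        wprintf(L"Adapter %u: %s (nodes: %u)\n",
                i, desc.Description, device->GetNodeCount());
    }
    return 0;
}
```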
People used to say it was because of DX12 that multi-GPU died off; supposedly it killed support for it. But considering the size and power requirements these days, it's a good thing it's no longer around.

AOTS was the DX12 showcase and it uses multi-GPU effectively; if you have a decent iGPU, it can use that to help your main GPU as well.
Errr... no.
They removed SLI because 99% of new games, even during SLI's heyday, never used it, and once DX12 was introduced and game developers had to do the implementation themselves, even fewer games received SLI support.
It's even easier to enable DLSS in Unreal Engine, yet many games don't have it because devs are lazy, but you think they're going to enable SLI and test it?
And regardless, even when SLI worked it always had issues with micro-stutter, and if you're going to accept micro-stutter you may as well play games on a console.
SLI is only good for setting benchmark scores in 3DMark, not for gaming, which is why Kaapstad quit the forum: there are no more 4-way SLI GPUs he can buy to sit at the top of the leaderboard e-peen contests.
SLI looked amazing though.

Yes, but I don't miss the bloody heat!!! 290Xs in CrossFire were just the *worst*.
Can attest to that.
"AOTS was the DX12 showcase and it uses multi-GPU effectively; if you have a decent iGPU, it can use that to help your main GPU as well."

The way it worked, it was doubling the performance of your slowest GPU.
It's simply a matter of devs not bothering; otherwise an 8800G would have started being a very interesting value proposition when paired with a mid-range GPU...
"Yes, but I don't miss the bloody heat!!! 290Xs in CrossFire were just the *worst*."

So.... like a 4090 and an overclocked CPU?
No man, my 4090 runs at about 42-43°C, thanks.
However, arguably the most exciting part of Unreal Engine 5.4 is the performance upgrade across the board. Simon Tourangeau, Vice President of Engineering for UE at Epic, took the stage to say:
We've made a lot of performance improvements. We now have faster Lumen, Shadows, and ray tracing. We've added Variable Rate Shading (VRS) for Nanite. We massively improved instance culling and we significantly improved parallelism in the renderer.
Any claim of a performance upgrade has to be backed by data, so Tourangeau noted that the City Sample originally shipped with version 5.0 now runs much faster in Unreal Engine 5.4 in a console test: render thread time has been halved, while GPU time has been reduced by 25%.