Unreal Engine 5 - unbelievable.

Unreal could also bake multi-GPU support into the engine by default; DX12 has the spec built in, but developers are responsible for implementation.
"Developers are responsible of implementation" of DX12's multi GPU supports has really turned out so badly that both SLI and CrossFire are now dead.

If some middleware like the game engine could be made to handle the implementation, then multi-GPU could make a comeback.
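If anyone wants a feel for what that middleware would actually have to own, here's a minimal sketch (my own illustration using the standard Windows DXGI/D3D12 headers - not anything Unreal ships) of the very first step: enumerating the adapters and GPU nodes that DX12's explicit multi-adapter model exposes.

```cpp
#include <dxgi1_6.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Explicit multi-adapter: every physical GPU shows up as its own DXGI adapter,
    // and it is up to the application (or engine/middleware) to create a device
    // per adapter and divide the work itself.
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasteriser

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            // GetNodeCount() > 1 means a "linked node" adapter (SLI/CrossFire-style
            // bridge exposed as one device); unlinked GPUs appear as separate adapters.
            wprintf(L"Adapter %u: %s, GPU nodes: %u\n",
                    i, desc.Description, device->GetNodeCount());
        }
    }
    return 0;
}
```

Everything after this point - splitting the work, copying results between GPUs, syncing fences - is on the application or engine, which is exactly the gap the middleware would have to fill.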

Although with multi-chiplet GPUs now a thing (Nvidia is a bit behind but - like Intel following AMD with CPU chiplets - I'm sure it's on their internal roadmap too), the GPU vendors may not like this.

Sure, one customer may buy multiple GPUs at the same time, but I suspect once chiplets are widespread that pricing will be something like:
X chiplets give 100% performance and cost 100%
2 * X chiplets give 180% performance but cost 300%.
And so on.

DX12 multi-GPU support from the game engine? Nvidia and AMD would probably consider that eating into "their" margins and would want to do something like the "pre-scalped for your convenience" cards they did during the last mining boom!

The chiplet thing is not turning out to be very promising for games.
Blackwell B100 is a multi-chip die, so I would guess Nvidia has tried, or is trying, this for games as we speak.
Regarding middleware support, major studios are shifting development to Unreal, so I was thinking more along the lines of: if Unreal could implement this in its compiler, or build standard interfaces to leverage multi-GPU, it could be the next big thing.
 
Well, chiplets have to be tackled at some point, as nodes barely move much ATM and the cost per transistor is stagnant or going up. That rumours currently point to only the non-gaming Blackwell GB100 being chiplet-based probably means Nvidia is in no great hurry.

Yes, AMD's approach was disappointing - I've said this before: at the Navi 32 (7800 XT) level, chiplets made little sense, nor did making their biggest chip, Navi 31, with only a ~300mm² graphics chiplet. A true high-end ~600mm² graphics chiplet plus the eight memory chiplets (512-bit), running at lower clocks - since the 7900 XTX is already power hungry - i.e. wide but clocked at the best perf/watt the 5nm node can do, would have made far more sense. Probably low sales volume, but sometimes halo cards are about mindshare.

Yes, the engine doing this is what I meant by middleware.

The problem with anyone else doing it - like the game developer - is that DX12 multi-GPU is something that has to be kept constantly updated.

Hence why we had some DX12 tech demos (Ashes of the Singularity springs to mind), but even then the game developer would have to constantly update their game as new GPUs came out.

In the days of SLI/CrossFire the GPU driver vendors did this work - and with DX12 they have now got away with doing none of that work.

And that would only be viable if it were the game engine developer doing that work.

Although even then it might be problematic: we might have UE5 v5.123 supporting GPUs up to mid-2023, then UE5 v5.124 supporting GPUs up to end-2024, etc., but surely actual released games would be stuck with whatever version of the engine they shipped with / were developed with.
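To make concrete the kind of work the SLI/CrossFire drivers used to hide and an engine would now have to own, here's a rough sketch (my own illustration of the DX12 linked-node API, not something any engine actually ships): one command queue per GPU node plus a round-robin pick per frame, i.e. bare-bones alternate-frame rendering.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdint>
#include <vector>

using Microsoft::WRL::ComPtr;

struct PerNode
{
    ComPtr<ID3D12CommandQueue>     queue;
    ComPtr<ID3D12CommandAllocator> allocator;
};

// One direct command queue + allocator per GPU node of an already-created device.
// On a linked-node adapter GetNodeCount() reports how many GPUs sit behind it.
std::vector<PerNode> CreatePerNodeQueues(ID3D12Device* device)
{
    std::vector<PerNode> nodes(device->GetNodeCount());
    for (UINT node = 0; node < nodes.size(); ++node)
    {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
        desc.NodeMask = 1u << node; // bind this queue to one GPU node
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&nodes[node].queue));
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&nodes[node].allocator));
    }
    return nodes;
}

// Alternate-frame rendering in its simplest form: frame N is recorded and
// submitted on node N % nodeCount.
PerNode& SelectNodeForFrame(std::vector<PerNode>& nodes, uint64_t frameIndex)
{
    return nodes[frameIndex % nodes.size()];
}
```

Even this toy version leaves out the cross-node copy of the finished frame and the per-node fencing, which is precisely the upkeep that has to track every new GPU generation.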
 
I still despise this engine. It still has stuttering issues. Tired of the promises: "it will be fixed in UE 4.2", "just wait for UE5", "fixed in UE 5.4 for sure!".

Can't wait for The Witcher 4 with traversal stutter :D
 
That's a jab at @mrk, as he was against the switch from REDengine to UE5 for some reason :cry:
I wasn't against it at all though. I've always known that RED was ancient; it's well optimised now and looks great, but it lacks potential for all the new stuff now out or coming out, so it's time to move on to something more photorealistic with wider scope.

As for UE5 now:


That's at 5160x2160 using only DLAA on Epic settings. Rather excellent performance:
Average framerate : 81.5 FPS
Minimum framerate : 65.3 FPS
Maximum framerate : 100.4 FPS
1% low framerate : 60.4 FPS
0.1% low framerate : 31.1 FPS
 
Won't be long now before Unreal Engine is pretty much the majority. There seem to be more and more upcoming releases using it.

Unreal Engine 4 was widely used; 5 will be much more so, helped along by Unity playing silly beggars with their charging scheme.
 

Don't make me go digging, bro. I give you until the morning to go edit those posts!

I was all for the move, and you were like "why are they doing it, they should stick to REDengine", which I found an odd thing to think :p
 
REDengine is great, but the people who knew how to utilize it left CDPR a long time ago. Hence the move to something more generic :D

There was a good interview about why they originally built REDengine for The Witcher 2 instead of using a generic engine like UE3 back then.

Now we have all these UE games that look the same, and there's less and less innovation. I liked it when we had a lot of unique engines built for specific games/genres and they'd push different, new technology.
 

I agree more engines would be a good thing. I also think moving away from REDengine to UE5 for Cyberpunk 2 is a good thing. Mrk did not seem so thrilled about it though - well, at least until recently, when he seems to have changed his tune :p
 
I was specifically on about how well optimised REDengine is nowadays versus how Unreal Engine has historically not been well optimised, but that has changed massively in recent times. Since UE 5.3, Epic has built in tools for devs that remove shader compilation stutter; we still have some minor traversal stutter, but it's a night and day difference. This is fine, since the current REDengine also has minor traversal stutter in Cyberpunk, so much of a muchness.

The problem with RED is its inability to support some of the rendering techniques used by Unreal 5. Distance rendering is non-existent in RED, which is why far-away stuff looks like something out of the 1990s when you zoom in with your Kiroshi optics; it's also why lights don't render properly on distant objects and why ray tracing doesn't cover that far ahead either.

In Unreal 5, Lumen and Nanite completely solve those issues, and distance rendering has no pop-in for anything when using the hardware ray tracing paths. Currently all UE5 games use software Lumen or hybrid Lumen, along with Nanite in the latest titles. Still, once HW Lumen is being used, we will say goodbye to texture detail loss at a distance and shadow/texture pop-in. This is something RED cannot do and never will, because the people who knew how to implement it left CDPR, and in 2022 CDPR signed a 15-year deal with Epic to develop only with UE, which is probably why some of those RED experts left too.
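For anyone curious what "using HW Lumen" looks like from the developer side, here's a tiny sketch (my own guess at a typical approach via UE5's console variable system; I'm assuming the documented r.Lumen.HardwareRayTracing cvar and a project with hardware ray tracing support enabled):

```cpp
#include "HAL/IConsoleManager.h"

// Flip Lumen's global illumination/reflections over to the hardware ray tracing
// path at runtime. This only takes effect if the project has HWRT support
// compiled in and the GPU is capable; otherwise Lumen stays on its software path.
void EnableHardwareLumen()
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.Lumen.HardwareRayTracing")))
    {
        // 0 = software Lumen (screen traces + distance fields), 1 = hardware ray tracing
        CVar->Set(1, ECVF_SetByGameSetting);
    }
}
```

In practice most projects would just tick the equivalent option in Project Settings or a device profile rather than doing it in code.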

People are forgetting that maintaining an in-house engine costs a lot of money and time, so why do that when you can literally make an entire movie in UE5 and make it playable? The creative side of UE5 comes down to the individual devs: most games typically look "Unreal" because they use similar assets, camera angles and whatnot. It's the genuinely creative devs who use their own assets and import them into UE5 that will show what it's truly capable of. Look at upcoming games like Perfect Dark, or current releases like Still Wakes the Deep - these don't use bundled assets but their own textures, because the devs went out on location to photograph and 3D model, etc., and then used that in UE5.
 
There's a fair bit of stuttering in that.

I’m assuming it’s the engine, but it was running much better when not capturing footage.

It was packaged directly from UE 5.4 using default parameters, and the resulting Windows package was used for the footage without any delay after execution.

Apparently there’s a way to reduce stuttering by waiting some time for shaders, but I didn’t try any of that.

It didn’t seem that bad to me after I made some changes with capture FPS.
 