
What do gamers actually think about Ray-Tracing?

Who (I assume you mean the enthusiasts on YouTube) ignores it when it works, and where? Does RT need constant praise, like DF gives it, to be viable?

So, should Hogwarts not be criticised for bad reflections, or what is your point here? At least from my side the critique is always aimed at lazy devs, where I can often see very good results from indie studios and very bad ones from AAA studios. One would think it would be the opposite, but here we are.

Sure, if you go down to 480p on a 4090, then it can handle PT relatively well. :) And I wish I was exaggerating, but I'm not - that's what I see in Indiana Jones, for example, on my 4090 with PT, as I have to go to DLSS Ultra Performance from 1440p to get to 60FPS without FG. To be clear, FG is horribly bugged for me in that title, so it's not usable (horrible stutter making the game unplayable, even on clean Windows with clean drivers etc., as I tested) - it seems to be relatively common, judging by Steam and other forums being full of complaints about it on the 4090, though it could be related to 16-core CPUs as well; hard to say in this specific game (I've seen people confirming both). I don't think that's what people actually want - to go back to 480p gaming. :) Even with the best DLSS crutch it's a blurfest.
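To put numbers on the "Ultra Performance from 1440p" point, here's a quick sketch of DLSS internal render resolutions using the commonly published scale factors (treat the exact ratios as approximations, not official values):

```python
# Approximate DLSS render-resolution arithmetic. Scale factors are the
# commonly published per-mode ratios (approximate, not official values).
SCALE = {
    "Quality": 1 / 1.5,          # ~0.667
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,  # ~0.333
}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1440p output at Ultra Performance renders internally at ~480p:
print(render_resolution(2560, 1440, "Ultra Performance"))  # (853, 480)
# 4K output at Performance renders internally at 1080p:
print(render_resolution(3840, 2160, "Performance"))        # (1920, 1080)
```

Which is why "1440p Ultra Performance" and "480p gaming" are, internally, roughly the same thing.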


That would result in a slideshow even on a 4090 at 480p (or just horrible noise all over the screen) - you can see that in actual 3D rendering software (for example, by moving the camera live in the scene). The devs might be lazy, but they're not that crazy. :)


As I wrote above, updating often breaks things, costs money, and there's no profit in it. If a company gets funded by, for example, NVIDIA to do it to promote new features in a successful title, they'll do it. Otherwise, forget about it.

I would be fine with his little rant, pointing out lazy development and the usual shortcomings of the tech that can appear here and there, just like you or I can go on about the same issues with typical raster. But then he goes and says: "As we're moving towards this more realistic, more accurate lighting system in modern games, we're sacrificing image clarity, detail and image stability", as a sort of universal constant which... just ain't true.

He can critique Hogwarts - that's a bad implementation for reflections - but it isn't as bad for shadows, for instance.

I only have a 4080. I'm not sure what version of Indiana Jones you're playing, but I'm in Italy now and 4K @ DLSS Balanced is around 50fps+ (where CPU limited, even with a 5800X3D :)) ) or usually above 60fps when not. That's what, 1440p internally? BTW, using Vulkan here is one of the smoothest experiences I've seen, so hats off to them! On Game Pass, btw.

CP2077 Path Tracing is 60fps+ with DLSS Quality at 1080p or DLSS Balanced at 1440p. 4K @ Performance is around 50fps I believe, with 60-70fps+ at Ultra Performance.
AW2 is 60fps with DLSS Quality at 1080p; above that it depends, I haven't messed around much.

Usually 4K with Ultra Performance (720p internal) stays above 60fps pretty well, but thanks to the flexibility of DLSS I can use DLSSTweaks and play with the ratio, so Ultra Performance can get closer to Performance and I can find a nice balance.
Moreover, I downsample from 4K since I have a native 1080p screen, so perhaps some performance is lost there as well.

As for updating the engine version, let me quote you on it:

Tinek said:
Again, gamers don't need to know, don't want to know etc. how things are made - they just want good final product, as that's what they pay for. Sadly, often they get a poo these days, for a lot of monies - badly optimised, lazily done, buggy as hell and priced higher than ever before.

It did happen: Immortals of Aveum got upgraded to UE5.2, even though it didn't sell well and it was under the EA umbrella. Other devs, including those behind Stalker 2, have no excuse.
Actually, the engine got changed twice: from UE4 to UE5 and then to UE5.2.
There were so many design meetings for Immortals over the course of five years, but I remember one very clearly being a pivotal moment for us. Bret, our writer, and our combat team were dreaming up a huge battle that would serve as a big moment in the game’s story. They were talking about an entire level taking place on a giant 400-foot mech that walks across the ocean and comes under attack by the enemy. The player would fight inside of the mech’s chest, and on scaffolding outside, and at one point fall off the mech and catch themselves with their lash ability. All the while the mech is taking heavy fire from flying enemy ships and smashing them out of the sky. As the person in charge of environment art, I was in heaven. I thought this sounded like a really memorable set-piece moment in the making. Mark, our CTO was more like, “how the hell will we actually make this level? It’s not technically possible in this time frame. You content people are absolutely dreaming!”

And that’s when it hit us. Well, it hit Mark first, really. UE5 would be entering Early Access soon and in theory its new features could solve most of the technical impossibilities Mark said were standing in our way of making this level a reality.

Within two months of UE5’s Early Access in 2021, our team was working in the untested and in-progress version of the engine. A year later, we moved our full game from UE4 to UE5 as it entered Preview.

Switching to UE5 wasn’t an instant-win button for us. After all, we started work in UE5 before it was even production ready. We faced many challenges along the way. But with constant communication, trial and error, and great teamwork, we worked through these challenges.


Our studio’s journey continues as we update Immortals of Aveum to UE 5.2 and begin work on our next project in UE 5.3. Here’s a sneak peek into some of what we’ll be exploring with the engine upgrade:

  • Lumen – In UE 5.1, Lumen solved the indoor-to-outdoor lighting transition seamlessly, allowing four lighters to light over 15 levels, and also allowed our modelers to instantly view assets in a variety of lighting scenarios. In 5.2, we want to take that even further by improving lighting detail around characters and visual fidelity of animations.
  • Nanite – Nanite gave us unprecedented geometric fidelity, while saving our artists countless hours of setting LODs. In 5.2, we’re looking to further enhance overall game visuals as well as achieve faster geometry calculations that can help further reduce pop-in.
Players can expect to see a number of visual and performance improvements with the move to 5.2, and we’re excited to share what we learn along the way.
 
I would be fine with his little rant, pointing out lazy development and the usual shortcomings of the tech that can appear here and there, just like you or I can go on about the same issues with typical raster. But then he goes and says: "As we're moving towards this more realistic, more accurate lighting system in modern games, we're sacrificing image clarity, detail and image stability", as a sort of universal constant which... just ain't true.
But he's absolutely right, and it's empirically evident - plus it IS a universal constant in modern AAA games. Not only did he show why he says so, the TI guy said exactly the same thing and showed why (plenty of examples). It's not just RT games, it's raster games too: proper transparency got replaced in games with just textures and TAA filtering, all the noise (both from RT and from some raster effects) is being filtered by TAA (aside from the actual denoiser), plenty of other bad raster effects have TAA covering all kinds of artefacts, TAA is used badly as AA (multiple frames instead of just 2), etc. We all know how blurry TAA can be, and it's everywhere in modern games regardless of turning it on or off in settings, as it's embedded into almost all effects and transparency. It turns the whole image into a blur-fest. Then add ever-present upscaling on top, because devs forget about basic optimisation (also in raster games), and you have lots of detail loss on textures and blur everywhere, with ghosting on top (noise is a different issue altogether). It's all very apparent, with multiple examples shown online from various sources. That's the reality: the more detailed games became, the more cost-cutting was introduced, the more bad TAA filtering, and the more blur we got as a result. What you said later in your post explains why you don't see it (Supersampling - more about that later).

I have only a 4080. I'm not sure what version of Indiana Jones you're playing
The GamePass one, "free" - I wouldn't have paid for it. :) And I've stopped playing it relatively quickly, for the time being. It works fine with RT (~150FPS with DLSS Quality, no FG), but I'm waiting for them to fix the bugs first. There's a reason I don't play games soon after release, as they tend to be bug-fests for a while.

but in Italy now and doing 4k @ Balanced DLSS is around 50fps + (where CPU limited, even with a 5800x3d :)) ) or usually above 60fps when not. That's what, 1440p? BTW, using Vulkan is one of the smoothest experiences that I've seen, so hats off to them! On Game Pass, btw.
The game's very CPU limited, even on a 7950X3D - most of the time it's the CPU that holds my FPS down, not the GPU, as per the built-in monitoring (usually the CPU needs 2x more time per frame than the GPU). It might well be something wrong between the engine and 16-core CPUs, as I mentioned, since the GPU often sits at only about 60% usage. I've read that someone used a debugging tool on it and found that on some configurations the game runs in Vulkan debug mode (triggered by detecting certain dlls in the OS), which uses a lot of CPU power. They were able to fix it on their machine, which instantly got their GPU to 100% usage, but so far I haven't been able to do the same on mine - and it's not really my job anyway; the devs have to fix it in the end. I have plenty of other games to play in the meantime. :)
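As a rough illustration of why a CPU-bound game leaves the GPU partly idle, here's a toy frame-time model (the numbers are hypothetical, not measurements from the game):

```python
# Toy model of a frame loop where CPU and GPU work overlap: the slower
# stage sets the frame time, and the faster one idles for the rest.
# Illustrative numbers only, not measured from any specific game.
def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)            # slowest stage wins
    fps = 1000.0 / frame_ms
    gpu_util = min(1.0, gpu_ms / frame_ms)    # fraction of frame GPU is busy
    return round(fps, 1), round(gpu_util * 100)

# If the CPU takes ~2x the GPU's per-frame time, the GPU sits half idle
# and a faster GPU wouldn't raise the FPS at all:
print(frame_stats(cpu_ms=16.0, gpu_ms=8.0))  # (62.5, 50)
```

That matches the symptom described: FPS capped well below what the GPU could deliver, with GPU utilization hovering far under 100%.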

CB77 Path Tracing is 60fps + with 1080p DLSS Quality or 1440p with DLSS Balanced. 4k @ Performance is around 50fps I believe with 60-70fps+ at Ultra Performance.
CP2077 works very well for me - as in, it requires FG but is very playable with DLSS on Quality. And among UE5 games, Hellblade II works well enough and looks stunning, though it's a slow walking simulator - still, visually it's 10/10 for me, and that's just software Lumen with good optimisation, but most of all very good effects, textures and overall design. The devs there really know what they're doing.

Moreover, I downsample from 4k since I have a native 1080p screen, so perhaps some performance lost there as well.
What you do is pretty much Supersampling - the best way to do it for visual clarity. That explains a lot about why you don't have a problem with the ever-present blur in modern games, as Supersampling gets rid of most of it. This is exactly why people do it in the first place (Nexus18 loved it for the same reason) - the image in modern games is NOT clear, it's full of blur. You wouldn't have to do it if games weren't a blur-fest to begin with, which in itself proves what TI, HUB and others said about visual clarity. Maybe now you understand why they said it in the first place. Or just play 1080p native for a while and come back to tell me how it looks. :)
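For anyone curious what that downsampling actually does: conceptually it's just averaging blocks of rendered pixels, so every output pixel is built from several samples. A minimal sketch of a 2x2 box filter (grayscale, pure Python, purely illustrative - real drivers use fancier filters):

```python
# Minimal 2x2 box-filter downsample: render at 2x the target resolution
# in each axis, then average each 2x2 block into one output pixel.
# This is the simplest form of the 4K -> 1080p supersampling discussed.
def downsample_2x2(img):
    # img: rows of grayscale pixel values; both dimensions must be even
    out = []
    for y in range(0, len(img), 2):
        row = []
        for x in range(0, len(img[0]), 2):
            avg = (img[y][x] + img[y][x + 1]
                   + img[y + 1][x] + img[y + 1][x + 1]) / 4
            row.append(avg)
        out.append(row)
    return out

# Four rendered samples collapse into one clean output pixel:
print(downsample_2x2([[0, 4], [8, 4]]))  # [[4.0]]
```

The averaging is what suppresses shimmer and TAA-style blur artefacts: each final pixel carries 4x the information of a natively rendered one.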

As for updating the engine version, let me quote you on it: (...)
Did you really just bring up, as a good example, a game that cost $125mil to make and bombed financially, pretty much killing the studio? You just proved my point, thank you. :) Also, with all the mismanagement and money-wasting, it still took them a year, relatively early in development, to switch from UE4 to UE5 - that is not a simple update after release, is it? Other studios and publishers most likely took notice, learned that hard lesson, and we will likely never see it happen again. There's no money in it, as I said. No incentive.
 
What is forced about it? Baked lighting takes longer development time and, contrary to what some would have you believe, is more "fake" than these so-called fake frames, as it's an interpretation instead of an actual light simulation - developers have shown this in many behind-the-scenes documentaries posted here and elsewhere over the years. I would say one of the main reasons this game looks so good even in normal HW RT mode is exactly that: HW RT is doing all of the GI, leaving the devs to focus more on the game's story and everything else.

There is nothing forced about it; people can't stay stuck on 5+ year old GPUs that aren't capable of HW RT forever. You're also forgetting that back in the early days of DX9 etc., a GPU generation only had a 2-year lifespan before everyone was "forced" to upgrade, as the API had just updated, new shader models etc. were a thing, and devs started using them in new games.

Being stuck in the past does nobody any favours: either stay up to date every few years or get left behind. This week's DF Direct talks about this in detail, and they're right - expecting to run the latest engine tech on old HW is silly.
It's costly keeping up with the Joneses.
 
But he's absolutely right, and it's empirically evident - plus it IS a universal constant in modern AAA games. Not only did he show why he says so, the TI guy said exactly the same thing and showed why (plenty of examples). It's not just RT games, it's raster games too: proper transparency got replaced in games with just textures and TAA filtering, all the noise (both from RT and from some raster effects) is being filtered by TAA (aside from the actual denoiser), plenty of other bad raster effects have TAA covering all kinds of artefacts, TAA is used badly as AA (multiple frames instead of just 2), etc. We all know how blurry TAA can be, and it's everywhere in modern games regardless of turning it on or off in settings, as it's embedded into almost all effects and transparency. It turns the whole image into a blur-fest. Then add ever-present upscaling on top, because devs forget about basic optimisation (also in raster games), and you have lots of detail loss on textures and blur everywhere, with ghosting on top (noise is a different issue altogether). It's all very apparent, with multiple examples shown online from various sources. That's the reality: the more detailed games became, the more cost-cutting was introduced, the more bad TAA filtering, and the more blur we got as a result. What you said later in your post explains why you don't see it (Supersampling - more about that later).


The GamePass one, "free" - I wouldn't have paid for it. :) And I've stopped playing it relatively quickly, for the time being. It works fine with RT (~150FPS with DLSS Quality, no FG), but I'm waiting for them to fix the bugs first. There's a reason I don't play games soon after release, as they tend to be bug-fests for a while.


The game's very CPU limited, even on a 7950X3D - most of the time it's the CPU that holds my FPS down, not the GPU, as per the built-in monitoring (usually the CPU needs 2x more time per frame than the GPU). It might well be something wrong between the engine and 16-core CPUs, as I mentioned, since the GPU often sits at only about 60% usage. I've read that someone used a debugging tool on it and found that on some configurations the game runs in Vulkan debug mode (triggered by detecting certain dlls in the OS), which uses a lot of CPU power. They were able to fix it on their machine, which instantly got their GPU to 100% usage, but so far I haven't been able to do the same on mine - and it's not really my job anyway; the devs have to fix it in the end. I have plenty of other games to play in the meantime. :)


CP2077 works very well for me - as in, it requires FG but is very playable with DLSS on Quality. And among UE5 games, Hellblade II works well enough and looks stunning, though it's a slow walking simulator - still, visually it's 10/10 for me, and that's just software Lumen with good optimisation, but most of all very good effects, textures and overall design. The devs there really know what they're doing.


What you do is pretty much Supersampling - the best way to do it for visual clarity. That explains a lot about why you don't have a problem with the ever-present blur in modern games, as Supersampling gets rid of most of it. This is exactly why people do it in the first place (Nexus18 loved it for the same reason) - the image in modern games is NOT clear, it's full of blur. You wouldn't have to do it if games weren't a blur-fest to begin with, which in itself proves what TI, HUB and others said about visual clarity. Maybe now you understand why they said it in the first place. Or just play 1080p native for a while and come back to tell me how it looks. :)


Did you really just bring up, as a good example, a game that cost $125mil to make and bombed financially, pretty much killing the studio? You just proved my point, thank you. :) Also, with all the mismanagement and money-wasting, it still took them a year, relatively early in development, to switch from UE4 to UE5 - that is not a simple update after release, is it? Other studios and publishers most likely took notice, learned that hard lesson, and we will likely never see it happen again. There's no money in it, as I said. No incentive.
I was talking about RT from the HUB video, not the entire gaming industry.

For Indiana you said it required Ultra Performance to get to 60fps, but now you say you get 150? So which is it? 150 seems more than fine.

What's the difference between 4K native and Supersampling from 4K down to 1080p? Don't they look similar? Anyway, I do play at plain 1080p too and it's fine. 3x1080p as well.

As for Immortals, which is it in the end? As players we don't care: should devs change the engine 1000 times until it performs perfectly, or do tools matter, so getting it done cheaply makes better sense?
 