What do gamers actually think about Ray-Tracing?

Who ignores it when it works (I assume you mean the enthusiasts on YouTube), and where? Does RT need constant praise, like DF gives it, to be viable?

So, should Hogwarts not be criticised for bad reflections, or what is your point here? At least from my side, the critique is always aimed at lazy devs - I often see very good results from indie studios and very bad ones from AAA studios. One would think it would be the opposite, but here we are.

Sure, if you go down to 480p on a 4090, then it can handle it relatively well with PT. :) And I wish I were exaggerating, but I am not - that's what I see in Indiana Jones, for example, on my 4090 with PT, as I have to go to DLSS Ultra Performance from 1440p resolution to get to 60FPS without FG. To be clear, FG is horribly bugged for me in that title, so not usable (horrible stutter making the game unplayable, even on a clean Windows install and clean drivers etc., as I tested) - it seems to be relatively common, judging by Steam and other forums being full of complaints about it on the 4090, though it could be related to 16-core CPUs as well; hard to say in this specific game (I've seen people confirming both). I don't think that's what people actually want - to go back to 480p gaming. :) Even with the best DLSS crutch it's a blur-fest.
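To put numbers on the "480p" remark, here is a minimal sketch of the arithmetic only; the scale factors are the commonly cited DLSS preset ratios (an assumption here - individual games can deviate), not values read out of any specific title:

```python
# Rough arithmetic only; factors are the commonly cited DLSS preset ratios.
factors = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}
out_w, out_h = 2560, 1440  # 1440p output, as in the post
for name, f in factors.items():
    print(f"{name:>17}: ~{round(out_w * f)}x{round(out_h * f)} internal render")
# Ultra Performance at 1440p output lands around 852x480, i.e. effectively a 480p render.
```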


That would result in a slideshow even on a 4090 at 480p (or just horrible noise all over the screen) - you can see that in actual 3D rendering software (for example, by moving the camera live in the scene). They (devs) might be lazy, but they're not that crazy. :)


As I wrote above, updating often breaks things, costs money and there's no profit in it. If a company gets funded by, for example, NVIDIA to do it to promote new features in a successful title, they'll do it. Otherwise, forget about it.

I would be fine with his little rant, pointing out lazy development and the usual shortcomings of the tech that can appear here and there, just like you or I can go on about the same issues with typical raster. But then he goes and says: "As we're moving towards this more realistic, more accurate lighting system in modern games, we're sacrificing image clarity, detail and image stability", as a sort of universal constant which... just ain't true.

He can critique Hogwarts - that's a bad implementation for reflections - but it isn't the worst for shadows, for instance.

I have only a 4080. I'm not sure what version of Indiana Jones you're playing, but I'm in Italy now and at 4k @ Balanced DLSS it's around 50fps+ (where CPU limited, even with a 5800X3D :)) or usually above 60fps when not. That's what, 1440p? BTW, using Vulkan is one of the smoothest experiences that I've seen, so hats off to them! On Game Pass, btw.

CP2077 Path Tracing is 60fps+ with 1080p DLSS Quality, or 1440p with DLSS Balanced. 4k @ Performance is around 50fps I believe, with 60-70fps+ at Ultra Performance.
AW2 is 60fps with DLSS Quality at 1080p and above that it depends, I haven't messed around much.

Usually 4k with Ultra Performance (720p) holds above 60fps pretty well, but thanks to the flexibility of DLSS I can use DLSS Tweaks and play with the ratio, so DLSS Ultra Performance can get closer to Performance and I can find a nice balance.
Moreover, I downsample from 4k since I have a native 1080p screen, so perhaps some performance lost there as well.
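For what "playing with the ratio" amounts to in practice, a small sketch of the arithmetic only; the 0.42 override value is purely hypothetical, and the stock ratios are the commonly cited DLSS preset factors, not anything read from DLSS Tweaks itself:

```python
# 4k output, as in the downsampling setup described above.
out_w, out_h = 3840, 2160
presets = [
    ("stock Ultra Performance", 0.333),
    ("tweaked ratio (hypothetical)", 0.42),   # somewhere between the two stock presets
    ("stock Performance", 0.50),
]
for label, ratio in presets:
    print(f"{label:>28}: ~{round(out_w * ratio)}x{round(out_h * ratio)} internal render")
```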

As for updating the engine version, let me quote you on it:

Tinek said:
Again, gamers don't need to know, don't want to know etc. how things are made - they just want good final product, as that's what they pay for. Sadly, often they get a poo these days, for a lot of monies - badly optimised, lazily done, buggy as hell and priced higher than ever before.

It did happen: Immortals of Aveum got upgraded to UE 5.2, even though it didn't sell well and it was under the EA umbrella. Other devs, including those behind Stalker 2, have no excuse.
Actually, it got changed twice: from UE4 to UE5 and then to UE5.2.
There were so many design meetings for Immortals over the course of five years, but I remember one very clearly being a pivotal moment for us. Bret, our writer, and our combat team were dreaming up a huge battle that would serve as a big moment in the game’s story. They were talking about an entire level taking place on a giant 400-foot mech that walks across the ocean and comes under attack by the enemy. The player would fight inside of the mech’s chest, and on scaffolding outside, and at one point fall off the mech and catch themselves with their lash ability. All the while the mech is taking heavy fire from flying enemy ships and smashing them out of the sky. As the person in charge of environment art, I was in heaven. I thought this sounded like a really memorable set-piece moment in the making. Mark, our CTO was more like, “how the hell will we actually make this level? It’s not technically possible in this time frame. You content people are absolutely dreaming!”

And that’s when it hit us. Well, it hit Mark first, really. UE5 would be entering Early Access soon and in theory its new features could solve most of the technical impossibilities Mark said were standing in our way of making this level a reality.

Within two months of UE5’s Early Access in 2021, our team was working in the untested and in-progress version of the engine. A year later, we moved our full game from UE4 to UE5 as it entered Preview.

Switching to UE5 wasn’t an instant-win button for us. Afterall, we started work in UE5 before it was even production ready. We faced many challenges along the way. But with constant communication, trial and error, and great teamwork, we worked through these challenges.


Our studio’s journey continues as we update Immortals of Aveum to UE 5.2 and begin work on our next project in UE 5.3. Here’s a sneak peek into some of what we’ll be exploring with the engine upgrade:

  • Lumen – In UE 5.1, Lumen solved the indoor-to-outdoor lighting transition seamlessly, allowing four lighters to light over 15 levels, and also allowed our modelers to instantly view assets in a variety of lighting scenarios. In 5.2, we want to take that even further by improving lighting detail around characters and visual fidelity of animations.
  • Nanite – Nanite gave us unprecedented geometric fidelity, while saving our artists countless hours of setting LODs. In 5.2, we’re looking to further enhance overall game visuals as well as faster geometry calculations that can help further reduce pop in.
Players can expect to see a number of visual and performance improvements with the move to 5.2, and we’re excited to share what we learn along the way.
 
I would be fine with his little rant, pointing out lazy development and the usual shortcomings of the tech that can appear here and there, just like you or I can go on about the same issues with typical raster. But then he goes and says: "As we're moving towards this more realistic, more accurate lighting system in modern games, we're sacrificing image clarity, detail and image stability", as a sort of universal constant which... just ain't true.
But he's absolutely right and it's empirically evident, plus it IS a universal constant in modern AAA games - not only has he shown why he says so, the TI guy said exactly the same thing and showed why he says so (plenty of examples). It's not just RT games, it's raster games too - proper transparency got replaced in games with just textures and TAA filtering, all the noise (both from RT and from some raster effects) is being filtered by TAA (aside from the actual denoiser), plenty of other bad raster effects have TAA covering all kinds of artefacts, TAA is used badly as AA (multiple frames instead of just 2), etc. We all know how blurry TAA can be, and it's everywhere in modern games, regardless of turning it on or off in settings, as it's embedded into almost all effects and transparency in games. It turns the whole image into a blur-fest. Then add on top of it ever-present upscaling, because devs forget about basic optimisation (also in raster games), and you have lots of detail loss on textures and blur everywhere, together with ghosting on top (noise is a different issue altogether). It's all very apparent, with multiple examples shown online from various sources. It's the reality - the more detailed games became, the more cost-cutting was introduced, the worse the TAA filtering, and the more blur we got as a result. What you said later in your post explains why you don't see it (Supersampling - more about it later).
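To make the "multiple frames instead of just 2" point concrete, here is a minimal sketch of exponential history blending, the scheme most TAA implementations build on (the blend factors are illustrative, not taken from any engine):

```python
# history = lerp(history, current_frame, alpha) each frame.
# The smaller alpha is, the more past frames still contribute to the final pixel,
# which is where the accumulation softness/ghosting described above comes from.

def frame_weights(alpha: float, n_frames: int) -> list[float]:
    """Contribution of each of the last n_frames to the blended pixel (index 0 = newest)."""
    return [alpha * (1.0 - alpha) ** i for i in range(n_frames)]

for alpha in (0.5, 0.1):  # 0.5 ~ roughly "blend 2 frames", 0.1 ~ a long history
    w = frame_weights(alpha, 10)
    print(f"alpha={alpha}: newest frame weight {w[0]:.2f}, "
          f"a 5-frame-old sample still contributes {w[5]:.3f}, "
          f"~{1 / alpha:.0f} frames effectively blended")
```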

I have only a 4080. I'm not sure what version of Indiana Jones you're playing
GamePass, the free one - wouldn't have paid for it. :) And I've stopped playing it relatively quickly, for the time being. It works fine with RT (~150FPS with DLSS Quality, no FG) but I am waiting for them to fix the bugs first. There's a reason I don't play games soon after release, as they tend to be bug-fests for a while.

but I'm in Italy now and at 4k @ Balanced DLSS it's around 50fps+ (where CPU limited, even with a 5800X3D :)) or usually above 60fps when not. That's what, 1440p? BTW, using Vulkan is one of the smoothest experiences that I've seen, so hats off to them! On Game Pass, btw.
The game's very CPU limited, even on a 7950X3D - most of the time it's the CPU that holds my FPS down, not the GPU, as per the built-in monitoring (usually the CPU needs 2x more time per frame than the GPU). It might well be something wrong with the engine and 16-core CPUs, as I mentioned, as the GPU often sits at only about 60% usage. I've read that someone used a debugging tool on it and found out that on some configurations the game runs in Vulkan debug mode, which uses a lot of CPU power, after detecting some DLLs in the OS. They were able to fix it on their machine, which instantly let them run it with the GPU 100% used, but so far I wasn't able to do it on mine, and it's not really my job to do it anyway - the devs have to fix it in the end. I have plenty of other games to play in the meantime. :)

CP2077 Path Tracing is 60fps+ with 1080p DLSS Quality, or 1440p with DLSS Balanced. 4k @ Performance is around 50fps I believe, with 60-70fps+ at Ultra Performance.
CP2077 works very well for me, as in it requires FG but is very playable with DLSS on Quality. And among UE5 games, Hellblade II works well enough and looks stunning, though it's a slow walking simulator - still, visually it's 10/10 for me, and that is just software Lumen with good optimisation, but most of all very good effects, textures and overall design. The devs here really know what they're doing.

Moreover, I downsample from 4k since I have a native 1080p screen, so perhaps some performance lost there as well.
What you do is pretty much Supersampling - the best way to do it for visual clarity. This explains a lot about why you don't have a problem with the ever-present blur in modern games, as Supersampling gets rid of most of it. This is exactly why people do it in the first place (Nexus18 loved it for the same reason) - the image in modern games is NOT clear, it's full of blur. You wouldn't have to do it if games weren't a blur-fest in the first place, which is proof in itself of what TI, HUB and others said about visual clarity. Maybe now you understand why they said it in the first place. Or just play at native 1080p for a while and come back to say how it looks. :)

As for updating the engine version, let me quote you on it: (...)
Did you really just bring up as a good example a game that cost $125 million to make and bombed financially, pretty much killing the studio? You just proved my point, thank you. :) Also, with all the mismanagement and money wasting, it still took them a year, relatively early in development, to switch from UE4 to UE5 - that is not a simple update after release, is it? Other studios and publishers most likely took notice, learned that hard lesson, and we will likely never see it happen again. There's no money in it, as I said. No incentive.
 
What is forced about it? Baked lighting takes longer development time to do and, contrary to what some would have you believe, is more fake than these so-called fake frames, as it's an interpretation instead of an actual light simulation - developers have shown this in many behind-the-scenes documentaries that have been posted here and elsewhere over the years. I would say one of the main reasons this game looks so good even in normal HW RT mode is exactly that: HWRT is doing all of the GI, leaving the devs to focus more on the game's story and everything else.

There is nothing forced about it; people can't stay stuck on 5+ year old GPUs that aren't capable of HW RT forever. You are also forgetting that back in the early days of DX9 etc., a GPU generation only had a 2-year lifespan before everyone was "forced" to upgrade, as the API technology had just updated, new shader models etc. were a thing, and devs started to use them in new games.

Being stuck in the past does nobody any favours - either stay up to date every few years or get left behind. This week's DF Direct talks about this in detail and they are right: people expecting to be able to run the latest engine tech on old HW is silly.
It's costly keeping up with the Joneses
 
But he's absolutely right and it's empirically evident, plus it IS a universal constant in modern AAA games (...)

GamePass, the free one - wouldn't have paid for it. :) And I've stopped playing it relatively quickly, for the time being. It works fine with RT (~150FPS with DLSS Quality, no FG) (...)

The game's very CPU limited, even on a 7950X3D - most of the time it's the CPU that holds my FPS down, not the GPU (...)

CP2077 works very well for me, as in it requires FG but is very playable with DLSS on Quality (...)

What you do is pretty much Supersampling - the best way to do it for visual clarity (...)

Did you really just bring up as a good example a game that cost $125 million to make and bombed financially, pretty much killing the studio? (...)
I was talking about RT from the HUB video, not the entire gaming industry.

For Indiana you said it required Ultra Performance to get to 60fps, but now you say you can get 150? So which one is it? 150 seems more than fine.

What's the difference between 4k native and Supersampling 4k to 1080p? Don't they look similar? Anyway, I do play at plain 1080p too and it's fine. 3x1080p as well.

As for Immortals, what is it in the end? As players we don't care - does "change the engine 1000 times until it performs perfectly" make better sense, or "tools do matter, get it done cheap"?
 
And so would you two. You slate them but then hand over your money to them :p

I wonder what GPU you guys will get next? :D
Some of the posting here doesn't stop me coming back for more, does it?...

But to answer your question, it didn't work out last time, did it, after the 12GB at launch was running out on my 4070.

Dependent on price, I'd perhaps suffer the enforced DLSS for UW, but 16GB ain't going on my 4K QD-OLED, no matter the spin on their new *expertVramsaver feature. I'd keep what I have first, as I'm not really interested in mirror simulation; for everything else, 24GB is the real plenty.
 
I was talking about RT from the HUB video, not the entire gaming industry.
But what HUB said in the summary you pointed out is about the whole gaming industry. They showed a few examples but the comment was much more general - or so it sounded to me, at least.
For Indiana you said it required Ultra Performance to get to 60fps, but now you say you can get 150? So which one is it? 150 seems more than fine.
PT vs RT, I thought I was clear about it but maybe not clear enough. RT works fine (150+), PT is a horror story (25FPS on average) in my case.

What's the difference between 4k native and Supersampling 4k to 1080p? Don't they look similar? Anyway, I do play at plain 1080p too and it's fine. 3x1080p as well.
Almost no difference, but that's the point - 4k owners don't have much of a problem with the blur, as 4k generates so many pixels that not even TAA can mess it up that easily. The thing is that most people do not have 4k monitors - 1080p is still the majority and 1440p is gaining popularity, but 4k is still a minuscule percentage of gamers. 1440p and lower are much worse, hence I can see a big uplift in image clarity from Supersampling in nearly all games too. This is why Supersampling has such a good reputation, as everyone can instantly see a huge clarity uplift.
Some game devs allow Supersampling to 4k and then DLSS down to native directly in games too, but it's still very rare to see. Most importantly, it wouldn't be needed at all if TAA wasn't abused on everything in the first place. A proper, clear 1080p image with good AA on it is indistinguishable from a 4k image downscaled to 1080p. Sadly, most new games don't give a properly clear native image anymore, which is the problem.
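As a toy illustration of why the 4k-to-1080p route cleans the image up: every displayed pixel ends up being the average of a 2x2 block of rendered pixels, i.e. ordered-grid 4x supersampling. This is just a plain box filter as a sketch, not any vendor's actual downsampling (DSR/DLDSR use fancier filters):

```python
import numpy as np

render = np.random.rand(2160, 3840)                            # stand-in for a 4k render
screen = render.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))    # 2x2 box filter down to 1080p

print(render.size / screen.size)   # 4.0 rendered samples per displayed pixel
```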

As for Immortals, what is it in the end? As players we don't care - does "change the engine 1000 times until it performs perfectly" make better sense, or "tools do matter, get it done cheap"?
I don't understand what's confusing here - gamers don't care one bit; they buy the game, finish it, and rarely revisit it later (unless it's some online live-service one). Studios/publishers do care, as they lose a big amount of money on that and there's no way they can recover it from sales. Someone has to pay for it and it's not gamers who will cover it, so who will? NVIDIA sometimes does - it's cheaper than advertising elsewhere (like with CP2077) - but it's rare. Most studios do not even want to update simple things like the DLSS libraries in their games, as that requires a whole testing process to make sure it doesn't introduce new issues - not worth it for them, even for such a tiny change. And you dream about a whole UE engine update... :)
 
Until Threat Interactive show their own versions of what they say, I consider it all nonsense designed for clickbait antics. All they do is nitpick, point fingers and claim to know XYZ, yet to this day they have yet to show a single shred of their own work.

The main guy even looks AI generated in various videos lol.
 
Let me clarify, as it sounds a bit stupid when I read it now myself :D Supersampling, ergo rendering the game in 4k, is the same as just... rendering in 4k :) Performance-wise and quality-wise. Obviously, downscaling to 1080p will degrade the quality, but it will still look much better than TAA-infested native 1080p rendering.
 
Until Threat Interactive show their own versions of what they say, I consider it all nonsense designed for clickbait antics. All they do is nitpick, point fingers and claim to know XYZ, yet to this day they have yet to show a single shred of their own work.
And yet, as I quoted earlier, the main creator of MegaLights in UE 5.5 said exactly the same thing TI did, in a video in which they analysed the UE5 MegaLights demo scene in the editor (to focus on one specific thing). TI showed how badly it had been designed in the first place and how easily it could be optimised, then showed step by step how to do it - it took him minutes and suddenly the scene looks almost the same but works fine on an RTX 3060 instead of requiring a 4080+ for 60FPS. The steps TI took were the exact steps which said ML creator listed as what devs should do in the first place, with exactly the same reasoning. Ergo, the TI dude knows what he's talking about, and he followed the UE/MegaLights author's best practices plus a bit extra of his own. Hence, your argument that "it's all nonsense designed for clickbait antics" doesn't sound very strong, does it?

The main guy even looks AI generated in various videos lol.
Insulting the person because you have no actual arguments to refute anything they said? Very mature.
 
Let me clarify, as it sounds a bit stupid when I read it now myself :D Supersampling, ergo rendering the game in 4k, is the same as just... rendering in 4k :) Performance-wise and quality-wise. Obviously, downscaling to 1080p will degrade the quality, but it will still look much better than TAA-infested native 1080p rendering.

Much better :D
 
But what HUB said in the summary you pointed out is about the whole gaming industry (...)

PT vs RT, I thought I was clear about it but maybe not clear enough. RT works fine (150+), PT is a horror story (25FPS on average) in my case.

Almost no difference, but that's the point - 4k owners don't have much of a problem with the blur, as 4k generates so many pixels that not even TAA can mess it up that easily (...)

I don't understand what's confusing here - gamers don't care one bit (...)

To me, I'd say HUB was speaking about the state of RT/PT - that's why I've added the transcription for that part. I don't agree with him, because although I "supersample", I also play at 3x1080p, which is my preferred setup over one single screen. Even with just general classical raster, while some games do look less than ideal at 1080p (off the top of my head, Mafia 3), others look decent and OK with some driver sharpening (Immortals of Aveum, even at Balanced/Performance), and others are just fine as they are.

Personally, I do revisit the games that I like, more than once, ergo I expect proper support from devs. So you have this conundrum: gamers want the best, no matter the cost; game devs want the cheapest solution. Well, for me RT/PT combines the two, albeit at the price of performance. I think that's the trend going forward anyway.

Now, with Mega Lights... it seems like PT with extra steps. Frankly, if you start a game now to launch it in 4-5 years, might as well go for hardware RT/PT. There will be 2 more gens of cards by then and perhaps a new console gen.

Until Threat Interactive show their own versions of what they say, I consider it all nonsense designed for clickbait antics (...)

If by optimizing he means cutting down the quality, including disabling Nanite and hardware RT or perhaps PT, then no thanks. If he can do it some other way with those on, by all means, go right ahead.
 
Even with just general classical raster, while some games do look less than ideal at 1080p (off the top of my head, Mafia 3), others look decent and OK with some driver sharpening (Immortals of Aveum, even at Balanced/Performance), and others are just fine as they are.
Sharpening only masks the results of the blur-fest - it makes the image look sharper, but it does not bring back any of the missing detail (which is why it's all soft and blurry in the first place). It's a similar situation to mobile photography, where people often liked older Samsung phones' photos, but on any closer inspection they were just over-sharpened yet noisy and blurry otherwise. At least newer DLSS removed the problem of over-sharpening, as they added much better native sharpening. There's a reason the general opinion about TAA is that it introduces a lot of softness to the image and general blurriness plus ghosting (similar to FXAA before it, with regards to softness). That's been in games for many years and it's the main reason people started to use Supersampling to combat it. Though both FXAA and TAA can be fixed (FXAA in many newer games actually looks fine, after the algorithm was improved considerably) - but that assumes devs know what they're doing and don't just enable defaults. That's getting very rare these days.
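A toy 1D illustration of the "sharpening only masks blur" point, with arbitrary filter sizes and an arbitrary unsharp-mask amount (nothing here models DLSS or any driver sharpener):

```python
import numpy as np

signal = np.zeros(64)
signal[20:44:2] = 1.0                                   # fine alternating detail
box = np.ones(5) / 5
blurred = np.convolve(signal, box, mode="same")         # TAA-style softening
sharpened = blurred + 1.5 * (blurred - np.convolve(blurred, box, mode="same"))  # unsharp mask

# The original detail swings the full 0..1 range; after the blur the swings are a
# fraction of that, and the unsharp mask boosts the remaining contrast without
# restoring the original amplitude - the lost detail stays lost.
print(np.ptp(signal[22:42]), np.ptp(blurred[22:42]), np.ptp(sharpened[22:42]))
```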

Now, with Mega Lights... it seems like PT with extra steps. Frankly, if you start a game now to launch it in 4-5 years, might as well go for hardware RT/PT. There will be 2 more gens of cards by then and perhaps a new console gen.
Mega Lights is used with hardware Lumen and it's not really PT as introduced by NVIDIA to gaming, but Epic's own way of doing RT. It has the potential to be a good middle ground if/when it gets proper hardware support. That means we need much faster RT acceleration in GPUs than the current 4000 series can provide - AMD has described what they're planning to do with RT hardware even in mid-range and lower GPUs (not in the next gen but the one following it), as I mentioned in earlier posts, while NVIDIA focuses on AI crutches (which can be good in their own way). And we need much better optimisation by devs, which doesn't seem like it's going to happen anytime soon. Epic has pretty much already ditched software Lumen support with UE 5.5, as the whole industry moves toward more and more hardware RT. The problem is that devs (I know, I am repeating myself) are way too lazy to optimise things properly, and that limits AAA gaming to high-end GPUs at the moment. And PT is just a stupid idea altogether currently, as only the 4090 can handle it sensibly well and even the 4080 often has trouble with it. Sometimes the 4090 isn't fast enough either (like in Indiana Jones for me), and that's excluding the whole issue with noise, ghosting etc.
If by optimizing he means cutting down the quality, including disabling Nanite and hardware RT or perhaps PT, then no thanks. If he can do it some other way with those on, by all means, go right ahead.
No, he does (as I responded to mrk) follow the exact guidelines written down by the main creator of Mega Lights. That means removing lights that were behind and inside walls yet had their radius set so big they were "shining" into other rooms, and also decreasing the actual radius of lights. Even though they're occluded and can't affect any pixels, they still all had to be calculated on every pass of Mega Lights, which is just horribly inefficient scene creation, slowing rendering down to a fraction of what it could be if you just limited the radius of these lights and tried to avoid overlapping too many of them. Lumen usually uses 4-5 rays per pixel, which means no more than that number of lights should be affecting any single pixel - again, as per the ML devs. Otherwise it's a performance hit and you introduce a lot of noise that will overwhelm the denoiser.
They (the UE/ML devs) even provide a debugger, so devs can see exactly where lights are set up wrong and easily correct them. And yet it's been ignored by the 3rd-party devs creating demos with ML (same as they do in games later!), committing all these optimisation sins and then claiming only a 4090 can handle it, which is absolutely not true - a 3060 can handle it easily enough if you do it properly, as proven, with minimal changes to the scene's look. We're really talking about such simple optimisation steps here. There were a few others too - for example, the demo devs made a flat floor full of polygons instead of just a flat texture, for no logical reason at all aside from claiming "We use Nanite!", whilst, again, killing a chunk of performance.
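A toy sketch of the light-radius point, with made-up numbers and no UE API involved; it only illustrates why oversized attenuation radii blow the per-pixel light count far past a small ray budget, and why clamping them brings it back down:

```python
import random

random.seed(1)
ROOM = 50.0
# 40 lights scattered around a 50x50 "floor plan"; positions are random stand-ins.
lights = [(random.uniform(0, ROOM), random.uniform(0, ROOM)) for _ in range(40)]

def lights_affecting(point, radius):
    """How many lights' attenuation circles cover this shaded point."""
    px, py = point
    return sum((px - lx) ** 2 + (py - ly) ** 2 <= radius ** 2 for lx, ly in lights)

sample = (25.0, 25.0)
# Oversized radius: every light in this toy level reaches the point.
print("radius 40:", lights_affecting(sample, 40.0))
# Clamped radius: only a handful of nearby lights remain, in the same ballpark
# as the 4-5 rays-per-pixel budget mentioned above.
print("radius 10:", lights_affecting(sample, 10.0))
```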
 
Sharpening only masks the results of the blur-fest - it makes the image look sharper, but it does not bring back any of the missing detail (...)

Mega Lights is used with hardware Lumen and it's not really PT as introduced by NVIDIA to gaming, but Epic's own way of doing RT (...)

No, he does (as I responded to mrk) follow the exact guidelines written down by the main creator of Mega Lights (...)

Optimizing Mega Lights is one thing, but using a texture to avoid actual geometry for a relatively "flat" floor only "optimizes" that one scene (assuming you don't have other complex assets in it). Once you go outside, you encounter vegetation (also Nanite), possibly plenty of characters (also Nanite), perhaps snow/mud (also dynamic Nanite tessellation), to which, if you apply the same "optimization" pass, you'd need to make them "old gen".

As for PT, as I've said, it runs fine on my 4080... Even a 4070 does decently.
 