The RT Related Games, Benchmarks, Software, Etc Thread.

I see DLSS and FSR (and XeSS) as great 'value-adds' - useful technologies that *should* be used to extend the useful life of a GPU, or to let users enable visual features they otherwise wouldn't have the performance for. DLAA (and hopefully FSR Native) is transformative and is the best AA solution currently available (TAA looks stone-age by comparison) - I'd love it if all PC users could use DLAA (or FSR Native) to get better AA, as AA has been terrible ever since everyone switched to deferred renderers (this is also why we need all these technologies in all PC releases).

What I'm not keen on is these features being used as a crutch to prop up shoddy PC ports or unoptimized engines like UE5.x - when DLSS first appeared it was speculated that this would happen and, well, the prophecy appears to be coming true, unfortunately.
 
Every time someone comments about stagnating performance, all we hear is DLSS has better IQ than FSR. :cry:

Which absolutely no one disagrees with.

Must be a complete and utter shock for some to hear Alex@DF commenting, after all these years of keeping quiet, on how upscaling degrades IQ, especially with RT.
I haven't watched the video fully, but are you sure they aren't referring more to just the ray tracing quality, with regards to reflections etc.? That has always been acknowledged to drop in quality when using upscaling (more so when you go down to the lower presets), since upscaling has never worked on RT reflections etc.

Any time the "better than native" comment is made (which btw isn't just from DF/Alex, but also Gamers Nexus, HUB, TPU, ComputerBase, OC3D etc.), it's been in reference to the aspects which upscaling does improve over native IQ, i.e. reduced shimmering and aliasing, better temporal stability on the whole, and more rendered detail - basically everything aside from RT, hence why ray reconstruction is seeking to resolve the IQ issues where RT and upscaling are concerned.

I see DLSS and FSR (and XeSS) as great 'value-adds' - useful technologies that *should* be used to extend the useful life of a GPU, or to let users enable visual features they otherwise wouldn't have the performance for. DLAA (and hopefully FSR Native) is transformative and is the best AA solution currently available (TAA looks stone-age by comparison) - I'd love it if all PC users could use DLAA (or FSR Native) to get better AA, as AA has been terrible ever since everyone switched to deferred renderers (this is also why we need all these technologies in all PC releases).

What I'm not keen on is these features being used as a crutch to prop up shoddy PC ports or unoptimized engines like UE5.x - when DLSS first appeared it was speculated that this would happen and, well, the prophecy appears to be coming true, unfortunately.
I can't see AMD's DLAA competitor being any good tbh - DLAA is good because of how it resolves the issues with current AA methods (which is what makes DLSS so great), whereas FSR in its current form only increases the artifacts/issues with things like shimmering, aliasing etc. further.

And yup, sadly I said that from the very beginning - developers have even said it themselves :( Can't blame them tbf.
 
No, magic AI super super upscaling is the answer to hardware stagnation.
Every time someone comments about stagnating performance, all we hear is DLSS has better IQ than FSR. :cry:

What I'm not keen on is these features being used as a crutch to prop up shoddy PC ports or unoptimized engines like UE5.x - when DLSS first appeared it was speculated that this would happen and, well, the prophecy appears to be coming true, unfortunately.

It was also predicted it would be used to sell weaker dGPUs for more money. Many here argued that it wouldn't happen, and now that it's happening it's all OK. Irrespective of how optimised games are, the RTX 4060 Ti/RTX 4060/RX 7600 are barely faster than the last-generation dGPUs they replaced at similar price-points (especially if we ignore the pandemic price increases). Also, the upscaling wars are all fine and dandy, except PC users tend to sit much closer to their display than console users, so they notice image quality issues much more easily - upscaling was always an easier sell for consoles.

Most mainstream gamers buy according to price-points, so as performance stagnates I suspect people will just keep their dGPUs longer. Those of us on tech forums probably upgrade far more often than most people. So it seems really weird that tech companies want to hold back PC gaming this way. But then I doubt many here would even consider buying mainstream cards anyway.
 
Every time someone comments about stagnating performance, all we hear is DLSS has better IQ than FSR. :cry:

Which absolutely no one disagrees with.

Must be a complete and utter shock for some to hear Alex@DF commenting, after all these years of keeping quiet, on how upscaling degrades IQ, especially with RT.
OMG you're right - now that I've seen it I can't unsee it. LITERALLY UNPLAYABLE!

(By the way, without DLSS at this quality it *is* unplayable, even on a 4090).

[Screenshots attached: 01.png, 02.png]
 
Last edited:
Every time someone comments about stagnating performance, all we hear is DLSS has better IQ than FSR. :cry:

Which absolutely no one disagrees with.

Must be a complete and utter shock for some to hear Alex@DF commenting, after all these years of keeping quiet, on how upscaling degrades IQ, especially with RT.

:cry: TV tech and consoles. Another elephant in the room I guess.
 
The 'better than native' comment has been bandied about and argued as 'it's true, it's true' since DLSS 1.0!

Every time artifacting is reduced it's met with 'better than native'. FFS, they've improved and reduced ghosting yet again, but it was argued years ago that it doesn't ghost anymore...

Upscaling has primarily been used as a tool to convince people their underspecced card (be it grunt- or VRAM-related) is running higher in-game settings and therefore increasing IQ, when you've always been able to reduce res for higher settings - and it's been getting lapped up like milk ever since.

Where it really does shine is getting playable fps on settings your GPU can't handle, on older hardware, smaller panels and handhelds. It's good, but it's not native-IQ good.
 
Where it really does shine is getting playable fps on settings your GPU can't handle, on older hardware, smaller panels and handhelds. It's good, but it's not native-IQ good.

Exactly. It's better than native good :cry:

I argued this a lot when DLSS 1 was out and many were saying better than native. Made no sense. But guys like Grim were lapping it up.

It has come a long way since and I do really like DLSS 2. It is worth any small issues it has, as I usually don't see them. Could be mainly because I use DLDSR and run at 5160x2160, which helps DLSS a lot.
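For what it's worth, the arithmetic behind that combo is easy to sketch - assuming the commonly cited per-axis DLSS render scales (approximate; the exact factors can vary by game and DLSS version):

```python
# Commonly cited per-axis DLSS render scales - approximations only,
# the exact factors can differ per game/version.
DLSS_SCALES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_resolution(out_w, out_h, mode):
    """Approximate internal render resolution for a given DLSS mode."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

# DLDSR 2.25x on a 3440x1440 panel gives a 5160x2160 output target.
for mode in DLSS_SCALES:
    w, h = internal_resolution(5160, 2160, mode)
    print(f"{mode}: ~{w}x{h} internal -> 5160x2160 output")
# Quality lands at ~3442x1441 - essentially the panel's native
# 3440x1440 pixel count, which helps explain why the combo looks clean.
```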
 
The 'better than native' comment has been bandied about and argued as 'it's true, it's true' since DLSS 1.0!

Every time artifacting is reduced it's met with 'better than native'. FFS, they've improved and reduced ghosting yet again, but it was argued years ago that it doesn't ghost anymore...

Upscaling has primarily been used as a tool to convince people their underspecced card (be it grunt- or VRAM-related) is running higher in-game settings and therefore increasing IQ, when you've always been able to reduce res for higher settings - and it's been getting lapped up like milk ever since.

Where it really does shine is getting playable fps on settings your GPU can't handle, on older hardware, smaller panels and handhelds. It's good, but it's not native-IQ good.

Again, got links to this? I could link all the content by Gamers Nexus, HUB, OC3D, ComputerBase, DF etc., but I imagine that will be ignored.

Ghosting has always been a problem with motion vector based AA such as TAA (see the sketch below) - the difference is people never complained about it until DLSS came along, because "nvidia", and even then TAA issues were/are still ignored (no point claiming better AA methods are out there, as there simply aren't, as shown time and time again). My favourite is Days Gone - no one batted an eyelid at the terrible issues of its TAA...

To say/insinuate that simply running at a lower res achieves the same thing as DLSS/upscaling is pure nonsense and you know it. Unless, again, you have something to back up this argument of yours?
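To make the motion vector point concrete, here's a toy sketch (purely illustrative, not any engine's actual implementation) of the history blend that TAA/DLSS-style techniques perform - when the motion vector is wrong, stale colour survives the blend, and that trail is the ghosting:

```python
import numpy as np

def temporal_accumulate(current, history, motion, alpha=0.1):
    """Toy 1D motion-vector-based history blend (TAA-style).

    alpha is the weight of the current frame: small = very stable
    but ghost-prone, large = responsive but aliased/shimmery.
    """
    n = len(current)
    # Reproject: fetch each pixel's colour from where it was last frame.
    src = np.clip(np.arange(n) - np.round(motion).astype(int), 0, n - 1)
    reprojected = history[src]
    # If 'motion' is wrong (disocclusion, thin hair/fur as in RDR2),
    # 'reprojected' is stale colour, and the blend drags it along for
    # many frames - that lingering trail is the ghosting.
    return alpha * current + (1 - alpha) * reprojected

# Wrong (zero) motion vectors leave a ghost at the object's old spot:
frame = np.zeros(8); frame[2] = 1.0   # object is now at index 2
hist = np.zeros(8);  hist[5] = 1.0    # last frame it was at index 5
print(temporal_accumulate(frame, hist, np.zeros(8)))  # ghost at index 5
```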
 
I haven't watched the video fully, but are you sure they aren't referring more to just the ray tracing quality, with regards to reflections etc.? That has always been acknowledged to drop in quality when using upscaling (more so when you go down to the lower presets), since upscaling has never worked on RT reflections etc.

Any time the "better than native" comment is made (which btw isn't just from DF/Alex, but also Gamers Nexus, HUB, TPU, ComputerBase, OC3D etc.), it's been in reference to the aspects which upscaling does improve over native IQ, i.e. reduced shimmering and aliasing, better temporal stability on the whole, and more rendered detail - basically everything aside from RT, hence why ray reconstruction is seeking to resolve the IQ issues where RT and upscaling are concerned.

I can't see AMD's DLAA competitor being any good tbh - DLAA is good because of how it resolves the issues with current AA methods (which is what makes DLSS so great), whereas FSR in its current form only increases the artifacts/issues with things like shimmering, aliasing etc. further.

And yup, sadly I said that from the very beginning - developers have even said it themselves :( Can't blame them tbf.


You have basically stated:

- DF saying DLSS has always been worse than native IQ - like I said, I haven't watched the video, so got a link to this? I'm pretty sure they are mainly referring to where RT is affected, and to ray reconstruction as part of DLSS 3.5
- ghosting always being there and being reduced/improved - well yeah, because it relies on motion vectors the same way TAA does (which is used for 99% of games), hence why TAA has always exhibited ghosting to some extent too
- "therefore it's increasing IQ when you've always been able to reduce res for higher settings, and it's been getting lapped up like milk ever since" - that's directly saying DLSS/upscalers are the exact same as simply lowering your res, when you know fine well they're not, otherwise don't you think every tech reviewer would have stated this too? If you want, I'll take a screenshot later to show you just how much worse a lower res vs DLSS really is - the rough numbers below show why they aren't equivalent.
 
Exactly. It's better than native good :cry:

I argued this a lot when DLSS 1 was out and many were saying better than native. Made no sense. But guys like Grim were lapping it up.

It has come a long way since and I do really like DLSS 2. It is worth any small issues it has, as I usually don't see them. Could be mainly because I use DLDSR and run at 5160x2160, which helps DLSS a lot.
This right here - and it wasn't hard to agree either, as what you're saying makes complete sense. It has its issues, but it can be worth those issues - that's where upscaling excels. And if your eyesight's ****, jackpot, win win. :p
 
I think the whole 'DLSS > Native' thing has always had a few qualifiers - if you asked me, "is DLSS better than 8x MSAA in Forza Horizon 5?" I'd say no because (to my preference) MSAA gives a cleaner, sharper look than DLSS in that game - however, even with 8x MSAA, there's still some visible aliasing that simply isn't there when you use DLSS - you *could* argue that's better image quality than native but in that instance, I prefer native.

Now, 'native' in Resident Evil 4 means TAA+FXAA (spit) at its max quality settings - and it looks terrible - even if you turn off FXAA it looks bad 'cause you get TAA artifacts everywhere - it's simply dreadful. RE4 with DLSS/DLAA added via REFramework looks stunning and way better than what Capcom gives you as 'native'.

Sadly Forza is an outlier in keeping MSAA as an option - almost every game now has TAA as its AA solution, and DLSS is objectively superior to TAA.

Oh, and I think resolution and quality modes should be part of the discussion whenever DLSS vs native is being brought up - do I consider DLSS Performance to be better than native? Of course not - Quality? Possibly - depends on how good (or bad) the game looks to begin with and what kind of AA solution it's using.

*Edit* Lastly - DLSS has had a lot of revisions since it first debuted - some of them (2.5.1) are widely acknowledged as being excellent, others less so - and not everyone gaming with DLSS is swapping DLLs to get the best image quality, so there's some leeway in the 'argument' there too - 'cause some DLSS versions are just plain bad (and devs rarely update them).
 
I think the whole 'DLSS > Native' thing has always had a few qualifiers - if you asked me, "is DLSS better than 8x MSAA in Forza Horizon 5?" I'd say no because (to my preference) MSAA gives a cleaner, sharper look than DLSS in that game - however, even with 8x MSAA, there's still some visible aliasing that simply isn't there when you use DLSS - you *could* argue that's better image quality than native but in that instance, I prefer native.

Now, 'native' in Resident Evil 4 means TAA+FXAA (spit) at its max quality settings - and it looks terrible - even if you turn off FXAA it looks bad 'cause you get TAA artifacts everywhere - it's simply dreadful. RE4 with DLSS/DLAA added via REFramework looks stunning and way better than what Capcom gives you as 'native'.

Sadly Forza is an outlier in keeping MSAA as an option - almost every game now has TAA as its AA solution, and DLSS is objectively superior to TAA.

Oh, and I think resolution and quality modes should be part of the discussion whenever DLSS vs native is being brought up - do I consider DLSS Performance to be better than native? Of course not - Quality? Possibly - depends on how good (or bad) the game looks to begin with and what kind of AA solution it's using.

*Edit* Lastly - DLSS has had a lot of revisions since it first debuted - some of them (2.5.1) are widely acknowledged as being excellent, others less so - and not everyone gaming with DLSS is swapping DLLs to get the best image quality, so there's some leeway in the 'argument' there too - 'cause some DLSS versions are just plain bad (and devs rarely update them).

Yup, sadly the majority of AA methods and implementations are just **** poor now. MSAA is generally great, more so in older games, but a lot of games are made with motion vectors etc. in mind, so when you use anything but a TAA-based solution it looks beyond awful, to the point of the game literally looking broken and artifacting, e.g. RDR2, especially around hair and fur.


You then get people saying "SMAA is the best", but as also shown, this isn't necessarily "better", as issues like shimmering are extremely noticeable and immersion-breaking.


It definitely does depend a lot on the game too, e.g. DLSS Balanced and Performance in some games can look extremely good, but this obviously depends on the res and the quality of textures and so on. As shown by various technical press now, "most" of the time the likes of DLSS can achieve a better image "on the whole".

Agree, it is annoying that devs don't update FSR and DLSS themselves, but thankfully it is possible to do it yourself with DLSS, and it's a pretty pain-free process too.
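For anyone who hasn't tried it, the swap really is just a file copy - a minimal sketch with hypothetical paths (the DLL's folder varies per game, but the file itself is always named nvngx_dlss.dll):

```python
import shutil
from pathlib import Path

# Hypothetical paths - adjust for your game; the DLL usually sits
# near the game executable.
game_dir = Path(r"C:\Games\SomeGame")           # assumed example path
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS version

old_dll = game_dir / "nvngx_dlss.dll"
shutil.copy2(old_dll, old_dll.with_suffix(".bak"))  # back up first
shutil.copy2(new_dll, old_dll)
print("Swapped - check the DLL's file properties for the new version.")
```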
 
Yup, sadly the majority of AA methods and implementations are just **** poor now. MSAA is generally great, more so in older games, but a lot of games are made with motion vectors etc. in mind, so when you use anything but a TAA-based solution it looks beyond awful, to the point of the game literally looking broken and artifacting, e.g. RDR2, especially around hair and fur.
At least FXAA seems to have been abandoned as the terrible idea it always was (well, apart from bloody Capcom, who just loooove making RE Engine look dreadful :rolleyes:)
 
CD Projekt Red says the new expansion for Cyberpunk is very CPU-heavy: they recommend a minimum of 8 cores, and users should expect an 8-core CPU to run at almost 100% load. They recommend users run the Cinebench multi-core test on their PC to make sure their system is stable with the CPU at full load, because that's what it will be doing playing Cyberpunk.
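Cinebench is the sensible tool for this (a real rendering workload with a score), but as a rough DIY stand-in, something like the sketch below will pin every core at 100% while you watch temperatures and stability - illustrative only, not CDPR's actual recommendation:

```python
import multiprocessing
import time

def burn(seconds):
    """Spin pointless float math to keep one core pegged at 100%."""
    end = time.time() + seconds
    x = 1.0001
    while time.time() < end:
        x = (x * x) % 1e9

if __name__ == "__main__":
    duration = 300  # five minutes; extend for a proper soak test
    procs = [multiprocessing.Process(target=burn, args=(duration,))
             for _ in range(multiprocessing.cpu_count())]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print("Completed full-load run without crashing.")
```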

 
Will be interesting to see how well it utilises 12 cores. Might be the first time my 5900X stretches its legs in a game.
 