
It looks like the 'real'/affordable RDNA3 + next-gen NV desktop cards won't launch until September. Thoughts?

So did I, right from the 3850 up to the Vega 56, every gen. Sadly, AMD's lack of a UK store for RDNA 2 meant Nvidia offered the best bang per buck, as I could get an Nvidia GPU for MSRP. Having DLSS and RT grunt (things that matter to me and that I use every day) meant I was also getting a much better deal since, as we all know, AMD fell behind here. Never really cared for PhysX or the other things Nvidia offers/does better, but as shown, DLSS, DLDSR and RT are worth it IMO.

My question is mainly aimed at the vocal ones, as every time people claim that things like DLSS don't matter, and that people are "idiots" or "brainwashed" for buying Nvidia because of its features, they always avoid this question:



That's the problem with you: either you are happy with poor image quality, or you really do need to get your eyes checked.

DLSS isn't a solution when it makes images look worse or introduces inconsistency.

My question to you is: why do you resist good image quality?

No one avoided answering; you just didn't like the answer that you need glasses or an eye check.
 
Last edited:
That's the problem with you: either you are happy with poor image quality, or you really do need to get your eyes checked.

DLSS isn't a solution when it makes images look worse or introduces inconsistency.

My question to you is: why do you resist good image quality?

No one avoided answering; you just didn't like the answer that you need glasses or an eye check.

I'll bite...

You and others keep saying this, yet you lot still can't post anything of substance to back up your statements outside of a couple of cherry-picked scenes. Meanwhile, OC3D, DF, PC Games Hardware, ComputerBase, TPU, Gamers Nexus and even HUB have shown how and when DLSS can often look as good as native or better. And to reiterate: had HUB updated the DLSS file for the games where they picked native as better, DLSS would have come out on top there too, but you know, as a PC gamer, apparently it is such a faff copying and pasting one file....

Until you can show something that conclusively proves your point and debunks the general consensus that DLSS is in fact pretty much as good as native or better, I'll stick with what reputable and knowledgeable folk show, especially when their analysis is in-depth.
 
I'll bite...

You and others keep saying this, yet you lot still can't post anything of substance to back up your statements outside of a couple of cherry-picked scenes. Meanwhile, OC3D, DF, PC Games Hardware, ComputerBase, TPU, Gamers Nexus and even HUB have shown how and when DLSS can often look as good as native or better. And to reiterate: had HUB updated the DLSS file for the games where they picked native as better, DLSS would have come out on top there too, but you know, as a PC gamer, apparently it is such a faff copying and pasting one file....

Until you can show something that conclusively proves your point and debunks the general consensus that DLSS is in fact pretty much as good as native or better, I'll stick with what reputable and knowledgeable folk show, especially when their analysis is in-depth.
Again, it's extra funny when the guy talking about image quality uses a PS5. You know, the one that drops as low as 768p. Gosh, you can't make this up...
 
"See" yes, read my post again.
But that is the point of fg, visual smoothness. Not my cup of tea, I prefer playing without fg at 80 fps then with it at 120, but I know lots of people who prefer the opossite. Having the option to do it is great. Not having the option cannot be better than having the option, so no matter how bad you think fg is, not even having the option is worse.
 
I'll bite...

You and others keep saying this, yet you lot still can't post anything of substance to back up your statements outside of a couple of cherry-picked scenes. Meanwhile, OC3D, DF, PC Games Hardware, ComputerBase, TPU, Gamers Nexus and even HUB have shown how and when DLSS can often look as good as native or better. And to reiterate: had HUB updated the DLSS file for the games where they picked native as better, DLSS would have come out on top there too, but you know, as a PC gamer, apparently it is such a faff copying and pasting one file....

Until you can show something that conclusively proves your point and debunks the general consensus that DLSS is in fact pretty much as good as native or better, I'll stick with what reputable and knowledgeable folk show, especially when their analysis is in-depth.
It's not the reviewers' job to patch DLSS files. 99% of players won't do it, and they shouldn't have to either.
 
It's not the reviewers' job to patch DLSS files. 99% of players won't do it, and they shouldn't have to either.
I've got RD2 on Rockstar; you can't mod the DLSS file, it's locked.

While on the subject,

HUB's Tim trying his hardest not to offend anyone...

hub1.png

hub2.png
 
DLSS2 (and FSR2) can look better than native in one specific case: when the native TAA implementation is so botched that it reduces image quality.
Remember that most AA techniques are compromises that take shortcuts, as the original AA (supersampling) is extremely expensive computationally and was basically abandoned in 3D games.

Luckily, some games allow you to turn AA off and do your own supersampling via virtual resolution. However, this works best on lower-resolution monitors, as no card on this planet can supersample at decent performance if you're already playing at 4K (which was originally hailed as the "we don't need anti-aliasing anymore" resolution!).
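To put rough numbers on that cost, here's a quick back-of-the-envelope sketch in Python (the 4K resolution and the 2x-per-axis SSAA factor are illustrative assumptions, not figures from any specific game):

```python
# Pixel-count arithmetic for classic 2x2 supersampling (SSAA):
# each output pixel is resolved from 4 shaded samples, so the GPU
# effectively renders at double the width and double the height.

def ssaa_pixels(width, height, factor=2):
    """Pixels actually shaded when supersampling by `factor` per axis."""
    return (width * factor) * (height * factor)

native_4k = 3840 * 2160              # ~8.3 million pixels
ssaa_4k = ssaa_pixels(3840, 2160)    # ~33.2 million pixels (an 8K render)

print(f"4K native:       {native_4k:,} px")
print(f"4K with 2x SSAA: {ssaa_4k:,} px ({ssaa_4k // native_4k}x the shading work)")
```

In other words, supersampling at 4K means shading an 8K image every frame, which is why the virtual-resolution trick is far more practical on 1080p or 1440p monitors.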
 
When people first started claiming that 4K resolution wouldn't need anti-aliasing, the average gaming monitor was around 20 inches, and yeah, 4K on a 20-inch screen probably looks very clean. But now that gamers have 4K screens, they are generally 32 inches and bigger, and at that size you still have to use anti-aliasing.
 
I'll bite...

You and others keep saying this, yet you lot still can't post anything of substance to back up your statements outside of a couple of cherry-picked scenes. Meanwhile, OC3D, DF, PC Games Hardware, ComputerBase, TPU, Gamers Nexus and even HUB have shown how and when DLSS can often look as good as native or better. And to reiterate: had HUB updated the DLSS file for the games where they picked native as better, DLSS would have come out on top there too, but you know, as a PC gamer, apparently it is such a faff copying and pasting one file....

Until you can show something that conclusively proves your point and debunks the general consensus that DLSS is in fact pretty much as good as native or better, I'll stick with what reputable and knowledgeable folk show, especially when their analysis is in-depth.
You literally plastered Cyberpunk screenshots thinking they were amazing, and for every single one I pointed out why it was more bad than good.
 
DLSS2 (and FSR2) can look better than native in one specific case: when the native TAA implementation is so botched that it reduces image quality.
Remember that most AA techniques are compromises that take shortcuts, as the original AA (supersampling) is extremely expensive computationally and was basically abandoned in 3D games.

Luckily, some games allow you to turn AA off and do your own supersampling via virtual resolution. However, this works best on lower-resolution monitors, as no card on this planet can supersample at decent performance if you're already playing at 4K (which was originally hailed as the "we don't need anti-aliasing anymore" resolution!).
TAA is awful. I will happily force supersampling for this reason, and it's a shame that good AA methods are not being adopted more widely, even though they work so well in World of Warcraft.
 
You literally plastered Cyberpunk screenshots thinking they were amazing, and for every single one I pointed out why it was more bad than good.

That's great and all, but you're in a very small minority of people who think CP2077's visuals are bad :cry: Each to their own and all.

Btw, what's that got to do with DLSS image quality anyway?
 
DLSS2 (and FSR2) can look better than native in one specific case: when the native TAA implementation is so botched that it reduces image quality.
Remember that most AA techniques are compromises that take shortcuts, as the original AA (supersampling) is extremely expensive computationally and was basically abandoned in 3D games.

Luckily, some games allow you to turn AA off and do your own supersampling via virtual resolution. However, this works best on lower-resolution monitors, as no card on this planet can supersample at decent performance if you're already playing at 4K (which was originally hailed as the "we don't need anti-aliasing anymore" resolution!).
Comparing supersampling with DLSS is just silly: one cuts your fps in half, the other increases it by 50%.
 
I am not comparing DLSS to supersampling; I am comparing TAA to supersampling. DLAA would be a valid comparison, though.
 
My main point is that a proper comparison should take the framerate into account. If you target a specific framerate, say 100 fps, and see which method gives you the best image quality while holding that framerate, you will find that DLSS wins, and quite easily, might I add.
 
When people first started claiming that 4K resolution wouldn't need anti-aliasing, the average gaming monitor was around 20 inches, and yeah, 4K on a 20-inch screen probably looks very clean. But now that gamers have 4K screens, they are generally 32 inches and bigger, and at that size you still have to use anti-aliasing.
It was always incorrect, and we pointed it out even then. AA will still be necessary even 20 years from now, easily.


 
My main point is that a proper comparison should take the framerate into account. If you target a specific framerate, say 100 fps, and see which method gives you the best image quality while holding that framerate, you will find that DLSS wins, and quite easily, might I add.
That depends on your priorities.
In my case, I prefer to minimize distortion and aberration, and I'm fine as long as the 1% low is above 30 fps, but to each their own.
My position is not a majority one, but hey, choice is great when available, and I'm the kind of guy who still buys music CDs so I can rip them to lossless.

That said, I'd be happy to try something akin to DLAA, where what I'm getting is a different anti-aliasing compromise starting from native resolution; that might look pretty close to supersampling indeed.

After all, we don't have unlimited computing power, so we just have to accept the compromise we like best.
 
That depends on your priorities.
In my case, I prefer to minimize distortion and aberration, and I'm fine as long as the 1% low is above 30 fps, but to each their own.
My position is not a majority one, but hey, choice is great when available, and I'm the kind of guy who still buys music CDs so I can rip them to lossless.

That said, I'd be happy to try something akin to DLAA, where what I'm getting is a different anti-aliasing compromise starting from native resolution; that might look pretty close to supersampling indeed.

After all, we don't have unlimited computing power, so we just have to accept the compromise we like best.
So if you target a 1% low of 30 fps while using DLSS, you'll get much better image quality. DLAA is kinda pointless when DLDSR + DLSS does the same thing but better, and it doesn't need specific game support; it just works with everything.
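For what it's worth, the DLDSR + DLSS combo comes down to simple pixel arithmetic. A sketch, assuming a 1440p monitor, DLDSR's 2.25x factor (1.5x per axis), and the commonly cited ~0.667 per-axis render scale for DLSS Quality:

```python
# DLDSR renders to a higher "virtual" resolution and downscales to the
# monitor; DLSS then reconstructs that virtual target from a smaller
# internal render. Net effect here: shading cost roughly equal to native
# 1440p, with output resolved through the downscaled 4K image.

MONITOR = (2560, 1440)        # assumed 1440p display
DLDSR_PER_AXIS = 1.5          # DLDSR 2.25x total pixels = 1.5x per axis
DLSS_QUALITY_SCALE = 2 / 3    # ~0.667 per axis for DLSS Quality

w, h = MONITOR
virtual = (round(w * DLDSR_PER_AXIS), round(h * DLDSR_PER_AXIS))
internal = (round(virtual[0] * DLSS_QUALITY_SCALE),
            round(virtual[1] * DLSS_QUALITY_SCALE))

print(f"monitor:  {w}x{h}")
print(f"virtual:  {virtual[0]}x{virtual[1]}  (DLDSR target)")
print(f"internal: {internal[0]}x{internal[1]}  (what the GPU actually shades)")
```

With these numbers the internal render lands right back at 2560x1440, i.e. roughly native pixel cost, while the final image is reconstructed and downscaled through the 3840x2160 virtual target.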
 