
RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
In a number of cases, based on personal experience, I'd say DLSS has very little negative impact on image quality, especially in actual gameplay as opposed to screenshots. Bear in mind that I tend to use the highest-quality DLSS setting available; YMMV with lower levels.

However, it's a tool I'd apply to improve framerate, not image quality. If I were already pushing my monitor's max refresh rate, I wouldn't apply it.
 
So you do believe DLSS is "better than native image quality"?
Based on my "own" testing, yes; it fits exactly what HUB, TPU, DF, Gamers Nexus and OC3D have shown with their testing.

As posted in another thread, I wish this game had DLSS, because the shimmering and aliasing are awful and completely take me out of the game, and that's not even the worst case...

Here's another example: which one do you think is better whilst playing the game, based on these images (taken in motion)? Of course, neither is "perfect", but one is clearly better for motion clarity... I must redo this test in RDR2, as this is using an old version of DLSS.

[Comparison screenshots taken in motion: TTSLtQX.jpg and L8j7yb2.jpg]

Funny thing is, I posted a blind test of DLSS vs native ages ago and people couldn't guess which was which; some didn't even attempt to answer :cry: I now find DLSS superior in motion compared to native/TAA (which was its weakest point prior to 2.3, or 2.2 IIRC).

I'll always go with DLSS Quality if possible, but if the game is crying out for extra FPS, e.g. CP2077 with RT, then I'll step down. Surprisingly, DLSS Performance has come a long way, to the point of being usable; again, based on my "own" testing.
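For context on the Quality/Performance modes being discussed, here is a rough sketch of the internal render resolution each DLSS mode implies. The per-axis scale factors used are the commonly cited ballpark figures (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333), not values taken from this thread, and individual games can deviate from them:

```python
# Approximate per-axis render-scale factors for DLSS modes.
# These are the commonly cited defaults; games may override them.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(output_w, output_h, mode):
    """Return the approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(output_w * s), round(output_h * s)

# At 4K output, DLSS Performance renders internally at roughly 1080p,
# which is why the FPS uplift is so large.
for mode in DLSS_SCALE:
    print(mode, internal_resolution(3840, 2160, mode))
```

So stepping from Quality down to Performance at 4K roughly halves the pixel count being rendered, which is where the extra headroom for RT in something like CP2077 comes from.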
 
If a GPU does 120 FPS at 1440p then DLSS isn't needed at the moment, but it's a nice feature to have for future games. Some people buy a GPU and keep it for 5+ years.

The ones who go for the actual high-end GPUs tend to be more active on upgrade cycles than those with lower-end GPUs.

My further point is that people who go for high-end hardware don't want to compromise.
 
I've got a 3090 and I still love a bit of DLSS; it means I can turn all those bells and whistles up to max and gain back what I'd otherwise have lost.

As for whether it looks better than native: it's certainly no worse, and it gains FPS, so what's not to love?
 
Absolutely, that's what it's all about, although with a 3080 it sometimes only gets to nearly full max :p. However, when I turn off DLSS/FSR just to compare with native, native wins. I'm just going off what my own eyes are showing me, but I don't watch a lot of tech sites. :)
 
That's not true. In my case I buy the best I need at the time and keep it as long as possible, because sometimes I need a VRAM bump or more performance. The reality is I used to upgrade every year or two, back in the days when we got real upgrades rather than the 20-30% gen-on-gen gains we've had for a while now. Since then I've gone from a one-to-two-year cycle to every other gen, so roughly 4-6 year upgrade cycles, starting around the GTX 580 era. From the 580 I went to a 780 Ti for a few months, then a 980 Ti, and now dual 3090s, and I won't be upgrading until the Nvidia 5000 or 6000 series, and only if they're worth it; if not, I won't be moving from the 3090s.

I like to buy the best at the time of purchase to stop the non-stop upgrading for small increases that are neither here nor there. In the past I enjoyed upgrading when we got real gains; now it's nothing but a waste of time and money, plus headaches from the gremlins in new tech for the first few months to a year. I just want a system that works, so I normally only upgrade once the gremlins have been ironed out. New-tech bugs and driver issues get old very quickly at my age; I don't have the patience or the time I once had for it.
 
Builds can last for years if you want them to. People rarely need to upgrade; the usual trigger is moving the display to something current (bigger size, higher resolution, higher Hz/FPS) and finding the old card lacks the features, the DisplayPort version or the raw grunt.

Some people will have bought a 30-series card to last them years, not months. If you bought an Ampere card and then have to get an Ada, you're upgrading every cycle and Jensen loves you!
 
RDNA3 will make sense to me if I can bag a decent card under 600€, and I hope this generation will make RT useful on every step of the ladder instead of just at the high end.
My game-purchasing habits mean I tend to lag 2-3 years behind the curve, so a GPU can easily last me 5 years, and if there's another launch shortage I'll happily bag a discounted RX 5700xt instead.

That said, I'm waiting for the industry to rediscover voxels and put some resources into optimisation; that would be a technology with real gameplay impact instead of just visual...
 
OK, that's you, and that's fine if that's what you do, but plenty of others upgrade on a shorter cycle.
 
Voxels because of the physics? I don't think there's hardware powerful enough to do that at a massive scale; for geometry there's Nanite.
When it comes to physics, developers don't seem to care, and at times they even downgrade what worked fine in previous installments. There's no Red Faction-style physics game around...

RT/PT "just works", while the alternatives require a lot of work... to work.
 