
RTX 4070 12GB, is it Worth it?

Status
Not open for further replies.
But WHY would you upscale in the first place? That's the question you are not answering, because it would prove your whole point wrong. LOL, this is nuts.
Are you doing this deliberately? The clue is literally in the name: you'd upscale an image because you wanted to make it bigger, because you wanted to scale it up, because you want a 10x10 pixel image to fit into a 40x40 display. If you think...
The goal of DLSS isn't to improve image quality, it's to improve your framerate.
Then tell me how many frames you can see here? How many images do you see?
Dude, a picture is worth a thousand words, I guess. You can't seriously tell me it's just Nvidia marketing that makes me think the right AI image is higher quality than the left native image.
[Attached image: DLSS-power.png]
Because I see one, or if you want to be pedantic, two separately rendered images side by side: one rendered at 1080p with temporal anti-aliasing image enhancement applied to it, and another rendered at 960x540 that's been upscaled with DLSS to 1080p and image-enhanced with DLSS.
 
Are you doing this deliberately? The clue is literally in the name: you'd upscale an image because you wanted to make it bigger, because you wanted to scale it up, because you want a 10x10 pixel image to fit into a 40x40 display. If you think...

Then tell me how many frames you can see here? How many images do you see?

Because I see one, or if you want to be pedantic, two separately rendered images side by side: one rendered at 1080p with temporal anti-aliasing image enhancement applied to it, and another rendered at 960x540 that's been upscaled with DLSS to 1080p and image-enhanced with DLSS.
That image, if I'm not mistaken, shows on the left a 1080p render on a 4k screen. It's native, because there is no upscaling tech applied to it, just a dropped resolution from 4k to 1080p for when you don't have a powerful enough graphics card to run at 4k. It won't be great, as the monitor's upscaler isn't good.

On the right, the resolution is set to the display's native 4k; however, through DLSS it is again rendered at 1080p (Performance mode), but now it is upscaled to 4k with a lot more detail.

So, thanks to DLSS, you get better image quality than you would if you just dropped the resolution or used the game's internal upscaler. Also, on a native 1080p screen, you could run supersampling by choosing a 4k internal render; then, on that image, you could apply DLSS Performance and, perhaps, you'd have about the same frame rate as native but at higher quality. I haven't played much with this, so I don't know how well it actually works.
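To put rough numbers on that, here's a quick Python sketch of the render-resolution arithmetic. The per-axis scale factors are just the commonly quoted DLSS defaults, so treat them as approximations rather than anything taken from the screenshot:

```
# Rough sketch of the render-resolution arithmetic; the scale factors are
# the commonly quoted per-axis defaults for each DLSS mode (an assumption,
# not something taken from the screenshot).

DLSS_AXIS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution DLSS actually renders at before upscaling to the output."""
    s = DLSS_AXIS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4k output in Performance mode shades roughly the same pixel count as
# the plain 1080p image on the left of the comparison.
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```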
 
I stand corrected then, as I just read the caption on the bottom. :)

Point still stands though: you upscale an image to make it look better, not to increase the frame rate. If you want to increase frame rates, you lower the resolution.
 
I stand corrected then, as I just read the caption on the bottom. :)

Point still stands though: you upscale an image to make it look better, not to increase the frame rate. If you want to increase frame rates, you lower the resolution.

[Attached GIF: why-not-both-why-not.gif]

:D
 
So - different question. Has anyone seen a generational analysis based on power usage? The price of cards has varied hugely in the last few years, because of mining, scalping etc. But the power used to push the pixels should be a fairly independent measure.

I remember being surprised at the power level of the 3000-series, and the 4090 is nuts. But how this changes over generations would be interesting - if anyone has seen anything like that?
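If nothing like that turns up, it would be easy enough to roll your own from published reviews: take average fps at fixed settings and divide by measured board power. A quick Python sketch of the idea, with placeholder numbers rather than real measurements:

```
# Back-of-the-envelope efficiency comparison: fps per watt at fixed settings.
# All figures below are placeholders; substitute real benchmark fps and
# measured board power from reviews.

cards = {
    "card A (older gen)": {"avg_fps": 60, "board_power_w": 180},
    "card B (newer gen)": {"avg_fps": 95, "board_power_w": 200},
}

for name, d in cards.items():
    eff = d["avg_fps"] / d["board_power_w"]
    print(f"{name}: {eff:.3f} fps/W")
```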
 
For what games? And try not to say Cyberpunk.

Arguably the game with the highest and most consistent box sales on PC, World of Warcraft, doesn't use DLSS.

Using your argument, the DLSS portion does nothing there, and where it does do something it introduces image degradation and inconsistency in various games, which has been evident.

Every time I've used DLSS, it has been very easy to spot the degradation in image quality. I don't use TAA, but where MSAA is available or there's no in-game SSAA, I will supersample via a custom resolution to get sharper quality 100% of the time with no inconsistency.

Every game needs a render scale option and comprehensive AA options like games used to have.
Not this again... It's like people want to stick to their opinion even when evidence is presented to the contrary. You just don't budge, do you? Let me try this again. Try 1440p native vs 4k DLSS Quality. Both have the same internal resolution, but the latter will always, ALWAYS look better. Much, much, much better.
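If you want to sanity-check that claim, here's the arithmetic as a quick Python sketch; the ~2/3-per-axis Quality factor is a commonly quoted value, not an official figure:

```
# Sanity check: 4k DLSS Quality and native 1440p render the same number of
# pixels per frame internally (before DLSS reconstructs up to 4k).
# The 2/3-per-axis Quality factor is an assumed common value.

native_1440p = (2560, 1440)
dlss_quality_4k = (round(3840 * 2 / 3), round(2160 * 2 / 3))

print(native_1440p, dlss_quality_4k)    # both (2560, 1440)
print(native_1440p == dlss_quality_4k)  # True: same internal render resolution
```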
 
Are you doing this deliberately? The clue is literally in the name: you'd upscale an image because you wanted to make it bigger, because you wanted to scale it up, because you want a 10x10 pixel image to fit into a 40x40 display. If you think...
But we are not talking about an image, we are talking about a game. You know, you can choose your monitor's resolution in games; you don't need to upscale. So why would you, if not for performance reasons?
 
But that is fundamentally wrong. Firstly because, even when used as an upscaler, DLSS improves image quality (going by Hardware Unboxed's video), and secondly because you don't need to use it as an upscaler.

But native does not offer you more pixels. Native 1080p offers you the exact same internal render resolution as 4k DLSS Performance, and the latter is painfully better in image quality. And so is FSR, btw. Native is dead, either because people use DLSS / FSR as upscalers to gain fps boosts (while sometimes improving image quality) or because people with spare GPU horsepower use DLDSR (which is basically a reverse DLSS) to downscale from higher resolutions. With DLDSR and DLSS existing there is no reason to play at native, ever.

Or to put it better, let's say you have a 1440p monitor. Your option is to either play at native 1440p, or to render at 4k and then use DLSS Quality. The latter WILL look better while offering the same performance.
Except when you play one of the 99% of games that has no DLSS support.

DLDSR is interesting, but it is not a full-on replacement for native resolution; it's not perfect.
 
Also, on a native 1080p screen, you could run supersampling by choosing a 4k internal render; then, on that image, you could apply DLSS Performance and, perhaps, you'd have about the same frame rate as native but at higher quality. I haven't played much with this, so I don't know how well it actually works.
Actually, if your card has the horsepower for 4k native in a specific game, it's better to use DLDSR. Say you are getting 80 fps at 4k and you want better image quality: you use DLDSR to render at 5461x2880 and then use DLSS on top of that. Performance should be a little lower than 4k native, but the image quality will be, oh mama.
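Roughly, the chain works out like this (quick Python sketch; the ~2/3-per-axis factor is an assumed DLSS Quality value, and 5461x2880 is just the render target from my example):

```
# Sketch of the DLDSR + DLSS chain described above: DLDSR raises the render
# target above the display resolution, then DLSS shades a lower internal
# resolution and reconstructs up to that target, which DLDSR finally
# downscales back to the panel. The ~2/3-per-axis factor is an assumed
# DLSS Quality value; 5461x2880 is the target from the example above.

display = (3840, 2160)          # native 4k panel
dldsr_target = (5461, 2880)     # what the game thinks it is rendering at

def dlss_internal(w, h, axis_scale=2 / 3):
    return round(w * axis_scale), round(h * axis_scale)

shaded = dlss_internal(*dldsr_target)
print("shaded:", shaded)                         # ~(3641, 1920)
print("target:", dldsr_target, "-> downscaled back to", display)
```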
 
But we are not talking about an image, we are talking about a game. You know, you can choose your monitor's resolution in games; you don't need to upscale. So why would you, if not for performance reasons?
So games don't display images? And you can choose your monitor's resolution in games? Damn, why didn't someone tell me, that would've saved me a lot of consternation when I bought a new monitor last year.

You really have no idea how games, monitors, or even GPUs display images, do you?
 
So games don't display images? And you can choose your monitor's resolution in games? Damn, why didn't someone tell me, that would've saved me a lot of consternation when I bought a new monitor last year.

You really have no idea how games, monitors, or even GPUs display images, do you?
The fact that you keep not answering a simple question after 10 posts has me convinced that you know I'm right, and of course you've now pulled the personal attacks card. Okay, you got me!
 
IMO you shouldn't be comparing 4k with DLSS to 1440p or 1080p depending on whether you're using Performance or Quality mode. You should be comparing to 4k native, as 4k is what DLSS will be upscaling to.
Also, still images don't really give a good representation of DLSS, as most of the artifacts introduced by DLSS show up in motion.
 
The fact that you keep not answering a simple question after 10 posts has me convinced that you know I'm right, and of course you've now pulled the personal attacks card. Okay, you got me!
I think the very fact that you think I've not answered the question despite having answered it multiple times over multiple posts only goes to demonstrate that you have no idea how games, monitors, or even GPUs display images.

Also, what personal attack? Because if you think the post you replied to contains a personal attack, then talk about being oversensitive.
 
Not this again... It's like people want to stick to their opinion even when evidence is presented to the contrary. You just don't budge, do you? Let me try this again. Try 1440p native vs 4k DLSS Quality. Both have the same internal resolution, but the latter will always, ALWAYS look better. Much, much, much better.
Sorry, why would I use 1440p when I want to target 4k? That's not how people want their experience to be.

They will want to select 4k native or 4k DLSS.

Why? Especially when I can supersample at an even greater resolution in any game in existence to get superior image quality.

Every time I supersample, I get guaranteed image quality improvements; every time I've used DLSS, I've seen degraded image quality. In case you haven't noticed, I rock an Nvidia GPU currently.

TAA is garbage; I actually think games look better with it off in every game that has it, funnily enough mainly ones that are Nvidia optimised...
 
Sorry, why would I use 1440p when I want to target 4k? That's not how people want their experience to be.

They will want to select 4k native or 4k DLSS.

Why? Especially when I can supersample at an even greater resolution in any game in existence to get superior image quality.

Every time I supersample, I get guaranteed image quality improvements; every time I've used DLSS, I've seen degraded image quality. In case you haven't noticed, I rock an Nvidia GPU currently.

TAA is garbage; I actually think games look better with it off in every game that has it, funnily enough mainly ones that are Nvidia optimised...
Oh gosh, there we go again with the flawed comparison. Yes, obviously supersampling is better than 4k DLSS Quality, but the latter gives you triple the framerate, lol. If you want to compare apples to apples, you target the same framerate and THEN compare image quality, in which case DLSS walks all over native and supersampling. Your argument is basically "if I drop to 30 fps I get better image quality than when I get 90 fps". Well, DUH, obviously. If it's image quality you are after, supersampling is the wrong way to do it; DLDSR + DLSS Quality is the way to go.
 
Sorry, why would I use 1440p when I want to target 4k? That's not how people want their experience to be.

They will want to select 4k native or 4k DLSS.
Well, that's my point: if DLSS lowers image quality like you are suggesting, then native 1440p should look better than 4k DLSS Quality, since they both render at 1440p. But it doesn't; it actually looks much worse. Which is the point.
 
So - different question. Has anyone seen a generational analysis based on power usage? The price of cards has varied hugely in the last few years, because of mining, scalping etc. But the power used to push the pixels should be a fairly independent measure.

I remember being surprised at the power level of the 3000-series, and the 4090 is nuts. But how this changes over generations would be interesting - if anyone has seen anything like that?
Something like this?

 