
Upscaling & Fake frames?

Associate · Joined: 18 Jan 2006 · Posts: 190
So, this is a genuine question, as I don't know the answer and am interested.

I see a lot of conversation about the relative performance of GPUs based on rasterization, without the DLSS/FSR/whatever funky software enhancements that are offered. And I understand that the manufacturers are being somewhat disingenuous when they claim, for example, 4090 performance from a 5070, if one expects that to apply to raw frame rendering. But in the real world, does anyone actually switch all the enhancements off in their games? And given that reviews provide no real metric for image quality, how can one actually work out the best GPU for a given setup?

My preference is single-player games, with as much eye candy as possible on a 49" super-ultrawide monitor. That monitor will "only" manage a 144Hz refresh, so anything over 144fps is essentially pointless (rough arithmetic below). But I want the game to be as smooth as possible, with all the fancy lighting and stuff. If DLSS4 with tons of "fake frames" will provide an objectively better experience than DLSS3 or no DLSS at all, then why should I care about raw rasterization performance? Am I missing something?
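As a sanity check on the "anything over 144fps is pointless" bit, here's a minimal sketch of the arithmetic. The 2x multiplier is just an assumed frame-gen factor for illustration, not any particular card's behaviour:

```python
# Rough frame-time arithmetic for a 144Hz panel; numbers are illustrative.
REFRESH_HZ = 144
frame_budget_ms = 1000 / REFRESH_HZ  # ~6.94 ms between refreshes

for native_fps in (48, 72, 110):
    fg_fps = native_fps * 2              # assumed 2x frame-generation multiplier
    shown_fps = min(fg_fps, REFRESH_HZ)  # frames beyond the refresh rate never reach the panel
    print(f"{native_fps} fps native -> {fg_fps} fps with FG, "
          f"{shown_fps} fps displayed ({1000 / shown_fps:.2f} ms per shown frame)")
```

So on my monitor, any generated frames beyond 144fps are simply never displayed, which is why I only care about smoothness up to that cap.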

Alternatively, I can completely understand that if you have a 1080p monitor running at 540Hz playing competitive FPS games, and care not one bit about image quality, then your priorities will be different. But again, if performance is better, why would you care whether the frames are generated by the engine or the GPU?

What am I missing? Is there a reason why I should particularly care about performance without frame gen, etc? I appreciate this is probably a somewhat contentious question, but I'm hoping the responses stay friendly. The question is a genuine one.

TIA
 
I don't like the frame delivery of FG; it's hard to describe, but it's especially jarring when moving side to side in first-person games.

My current game is Satisfactory, which I play at 3840x1600. I get an average of 110fps with my 4090 with no DLSS or FG turned on, everything maxed to the hilt, and it's a lovely experience. Not so with DLSS, and worse still with FG turned on. In Satisfactory you can easily change the DLSS and FG settings to experience their effects - yes, the FPS jumps up to a 190fps average, but that comes at a cost of visual fidelity and I really don't like it.

Increasing rasterization performance is getting harder and harder each generation, so they're using AI, with its higher FPS numbers, to lull us into thinking it's acceptable.

I'm surprised it's not a more discussed subject on the forum, as it looks like this technology is here to stay, at least until the next paradigm shift in graphics technology.
I have only really used DLSS on my card with Cyberpunk, and honestly didn't notice any graphical artefacts, so this is useful to hear. I guess improvements of that kind may be among the enhancements they are making with DLSS4, which should flow down to older cards, so perhaps it'll be less noticeable or troublesome.

Or perhaps not! But either way, I am ever less convinced that raw fps numbers mean anything anymore unless you are very deep into the competitive stuff.

I'm also of an age where I suspect the worst part of the code-to-brain chain is the optical bits in my head. I suspect that the frame gen thing is, as you say, here to stay, and so understanding the tradeoffs is still a worthwhile thing... even if I am already committed to a new GPU in the next few weeks, as I have already promised my current PC to my 15-year-old daughter, and she is too excited to disappoint!
 
Thanks, this is all exceedingly educational. There seems to be more acceptance of DLSS for upscaling than of frame gen, and if I understand correctly that's because the latter impacts input lag, causes ghosting, etc...
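If it helps anyone else picture why frame gen hurts input lag, here's a toy model I knocked up. The one-frame hold-back is my assumption about how interpolation-style frame generation works in general, not a claim about DLSS specifically:

```python
# Toy latency model: why FG smoothness doesn't buy responsiveness.
# Assumption: interpolation-style FG buffers one real frame so it can blend
# between two rendered frames, so input latency tracks the NATIVE frame
# rate, not the displayed one. Real pipelines differ in the details.
def input_latency_ms(native_fps: float, frame_gen: bool) -> float:
    native_frame_ms = 1000 / native_fps
    # one native frame of render time, plus one more frame of hold-back
    # when interpolating between rendered frames
    return native_frame_ms * (2 if frame_gen else 1)

print(input_latency_ms(110, frame_gen=False))  # ~9.1 ms at 110 fps native
print(input_latency_ms(110, frame_gen=True))   # ~18.2 ms, despite ~220 fps displayed
```

Which would explain why the competitive crowd cares where the frames come from: the displayed fps goes up, but the game doesn't respond to your inputs any faster.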

This article seems cautiously positive about improvements coming with this gen:
 
... Considering these factors I think it pointless talking about upscalers when purchasing any high end GPU unless you plan on keeping it for multiple generations...
This is an interesting point. I didn't think I had any particularly regular upgrade cadence, but looking at my more recent GPU history (HD7970, GTX1080, RTX3080), I do seem to upgrade every couple of generations. I wonder whether one's view on the value of these technologies is directly influenced by whether you upgrade every gen, look to sweat assets for a long time, or sit somewhere in between.

Kinda off topic, but you set me to musing.
 