Upscaling & Fake frames?

So, this is a genuine question, as I don't know the answer and am interested.

I see a lot of conversation about the relative performance of GPUs based on rasterization, without the DLSS/FSR/whatever funky software enhancements. And I understand that the manufacturers are being somewhat disingenuous when they claim, for example, 4090 performance from a 5070, if one expects that to apply to raw frame processing. But, in the real world, does anyone switch all the enhancements off in their games? And given that there is no real metric for image quality provided in reviews, how can one actually understand the best GPU for a given setup?

My preference is single player games, with as much eye candy as possible on a 49" super UW monitor. That monitor will "only" manage 144hz refresh, so anything over 144fps is essentially pointless. But I want the game to be as smooth as possible, with all the fancy lighting and stuff. If DLSS4 with tons of "fake frames" will provide an objectively better experience than DLSS3 or no DLSS at all, then why should I care about raw rasterization performance? Am I missing something?

Alternatively, I can completely understand that if you have a 1080p monitor running at 540hz playing competitive fps games, and care not one bit about image quality, then your priorities will be different. But again, if performance is better, why would you care whether the frames are generated by the engine or the GPU?

What am I missing? Is there a reason why I should particularly care about performance without frame gen, etc? I appreciate this is probably a somewhat contentious question, but I'm hoping the responses stay friendly. The question is a genuine one.

TIA
 
At 144 Hz I think the new version of FG will be awful. Basically how it works is you halve or third your target framerate, depending on which FG mode you go for, to work out the base game framerate. Anything under about 50 fps base feels pants - really laggy and sluggish to turn in an FPS.

So for the new FG to shine I'd think you'll need to be targeting 180+ fps really.
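
Quick sketch of that maths, for anyone who wants to plug in their own monitor (the ~50 fps threshold is just the rule of thumb above, not an official figure):

```python
# Rough sketch: FG shows `multiplier` frames for every one the game actually
# renders, so the base framerate is target / multiplier.
# The ~50 fps comfort threshold is the rule of thumb above, not an official figure.

COMFORT_THRESHOLD = 50  # fps; below this, FG tends to feel laggy

def base_framerate(target_fps: float, fg_multiplier: int) -> float:
    return target_fps / fg_multiplier

for mult in (2, 3, 4):
    base = base_framerate(144, mult)
    verdict = "ok" if base >= COMFORT_THRESHOLD else "feels pants"
    print(f"{mult}x FG at a 144 fps target -> {base:.0f} fps base ({verdict})")

# 2x FG at a 144 fps target -> 72 fps base (ok)
# 3x FG at a 144 fps target -> 48 fps base (feels pants)
# 4x FG at a 144 fps target -> 36 fps base (feels pants)
```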

Then it's just a question of whether the game implements it correctly so you don't get ghosting etc. When tuning a game on a 4090, the FG part is about the lowest priority to use, as it usually causes the most issues.

There's also a lot of older games with no FG support
 
I don't like the frame delivery of FG, it's hard to describe but it's especially jarring when moving side to side on first person perspective games.

My current game is Satisfactory, which I play at 3840x1600, and I get an average of 110fps with my 4090 with no DLSS or FG turned on, everything maxed to the hilt, and it's a lovely experience. Not so with DLSS, and worse with FG turned on. In Satisfactory you can easily change the DLSS and FG settings to experience their effects - yes, the FPS jumps up to a 190fps average, but that comes at a cost of visual fidelity and I really don't like it.

Increasing rasterization performance is getting harder and harder each generation, so they're using AI, with its higher FPS, to lull us into thinking it's acceptable.

I'm surprised it's not a more discussed subject on the forum as it looks like this technology is here to stay, until at least the next paradigm shift in graphics technology.
 
Yeah, this FG thing is going a bit too far. I believe it's going to be a constant fixture each time Nvidia releases a new card.. friends, now we have MFG 164x: 164 predicted frames for 1 actual frame.
I'm no longer sure what's supposed to be the essence of graphics performance.
 
I don't like the frame delivery of FG, it's hard to describe but it's especially jarring when moving side to side on first person perspective games.
I have only really used DLSS on my card with Cyberpunk, and honestly didn't notice any graphical artefacts, so this is useful to hear. I guess improving that kind of thing may be among the enhancements they're making with DLSS4, which should flow down to older cards, so perhaps it'll be less noticeable or troublesome.

Or perhaps not! But either way, I am ever less convinced that fps has any meaning anymore unless you are very deep into that competitive stuff.

I'm also of an age where I suspect the worst part of the code-to-brain chain is the optical bits in my head. I suspect that the frame gen thing is, as you say, here to stay, and so understanding the tradeoffs is still a worthwhile thing... even if I am already committed to a new GPU in the next few weeks, as I have already promised my current PC to my 15 year old daughter, and she is too excited to disappoint!
 
DLSS renders the image at a lower resolution and upscales it using magical AI wizardry (so 720-1440p base resolution for 4k output, depending on which preset - at least I think those are the numbers, could be wrong!) - at the 'quality' setting (i.e. 1440 > 4k) it's pretty good in most games.
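
For what it's worth, the commonly quoted per-axis scale factors for the presets work out roughly like this (a sketch - the 0.667/0.58/0.5/0.333 numbers are the ones usually cited, so treat them as approximate):

```python
# Rough sketch: internal render resolution per DLSS preset at 4K output.
# The per-axis scale factors (Quality 0.667, Balanced 0.58, Performance 0.5,
# Ultra Performance 0.333) are the commonly quoted ones - treat as approximate.

PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    # DLSS renders at roughly (scale * width) x (scale * height), then upscales
    return round(out_w * scale), round(out_h * scale)

for name, scale in PRESETS.items():
    w, h = internal_resolution(3840, 2160, scale)
    print(f"{name:>17}: {w}x{h}")

# Quality: ~2560x1440 (the '1440 > 4k' case above), Balanced: ~2227x1253,
# Performance: 1920x1080, Ultra Performance: ~1280x720
```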

Framegen inserts extra frames, again using AI wizardry, between the rendered frames, multiplying your fps by 2x, 3x or 4x at the expense of temporal artifacts and input lag. As such it's probably fine for casual games if your framerate is marginal, but I would never want it enabled for anything competitive - it would be better to just lower your settings to improve responsiveness.
 
Like someone said somewhere else, raster is like humping a dead pig, it's all going AI, the AI portion of the chips will get bigger and bigger, the raster will get smaller and smaller, and there is nothing anyone can do about it.
 
So, this is a genuine question, as I don't know the answer and am interested.
Even with single player games, do you care how well you play?

If you care, then fake frames aren't desirable.

It's also inconsistent from game to game, and where it doesn't work it's an issue.
 
We really, really need reviewers to focus on IQ and the overall experience (as well as quantitative benchmarks). HardOCP, Anand and a few others used to do this... but those reviewers are unfortunately defunct. If we had this, people would have a better idea of the deficiencies inherent to DLSS/FSR/frame gen. We have absolutely no one giving (as objective as possible) assessments here, because a) it's difficult and expensive to show, and b) I suspect nV and AMD would try to shut them down at this point.

For me, I can smell these methods in game like a fart in a car. I have a background in AI and have an inkling of what's going on behind the scenes, granted, but I think anyone who has seen clean raster vs DLSS/FSR and frame gen side by side would "taste" the difference and be largely put off.

As for manufacturers advertising 1:1 performance numbers between cards while employing these methods - it's patently deceptive, and I take exception to it as a graphics card enthusiast of the past 30 years.
 
I have only really used DLSS on my card with Cyberpunk, and honestly didn't notice any graphical artefacts,
I'm with you here on DLSS. I've got 2 PCs (home, and home where I work): the 1st is a B550E-E/5800X/32GB 3600C18/3080 FE, which my eldest mainly uses. I'm using a B650E-E/7800X3D/32GB 6000C30 and a 3070 Ti, both paired with 4K LG OLED screens (a C2 and a C4). I can't tell much difference, if any, in image quality when I turn DLSS on (I do have it set to Quality though, not Performance or anything). I turn input lag reduction on as well and it works fine. Never tried FG (as of course the 3000 series doesn't have it), but everyone says the input lag goes through the roof, so really not interested in that... but DLSS, yes please.
Unlike my son, I'm def with you in the single player, everything looking good category. My hand-eye coordination has gone through the floor now I'm older, so I leave the PvP/first person shooter type games to my son. That, and just going round shooting people all the time gets a bit boring and repetitive to me.. prefer a little bit of a story.
I really hope these new cards are good. A 3070 Ti is not really meant for 4K. It does an OK job, but it runs at 99% while my 7800X3D sits in the 0% realm. I could def do with an upgrade.
Think FG really shines in games such as flight sims, where input lag is not such an issue.
 
People should start calling them out; gamers don't want gimmicks.


It's like claiming a 5090 is 20x faster if you purposely bottleneck the 4090 with 100x AA or some other nonsense.

It's not real world conditions - it never has been - but now it's such a huge stretch, it's like a snake oil salesman.

Give it time, the number of fake frames will exceed the number of real frames.
  • 100+ FPS: Each generated frame is shown for less than 10 milliseconds
  • 80 FPS: Each generated frame is shown for 12.5 milliseconds
Wonder what that is for the 5000 series.

With the 4000 series it's every other frame? So:
real - fake - real

But with a 5000 series, if 3 frames are fake, it becomes what?
real - fake - fake - fake - real

Doesn't it also just mean even more artifacts and weird behaviour? Or do people think it will hide them better?

I wonder how many fake frames they can add before it feels like you're playing with a 200ms ping.
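
Back-of-envelope for both of those questions below - the on-screen time is just 1000 / output fps whether a frame is real or fake, and the one-real-then-N-fake pattern is my reading of how it's described, not a confirmed spec:

```python
# Back-of-envelope for the frame sequence and per-frame display time.
# Assumes FG shows one real frame followed by (multiplier - 1) generated
# ones - my reading of how it's usually described, not a confirmed spec.

def frame_pattern(multiplier: int, cycles: int = 2) -> str:
    return " - ".join((["real"] + ["fake"] * (multiplier - 1)) * cycles)

for mult, label in ((2, "4000 series FG (2x)"), (3, "MFG 3x"), (4, "MFG 4x")):
    print(f"{label}: {frame_pattern(mult)} - ...")

# How long each frame is on screen only depends on the *output* framerate:
for out_fps in (80, 100, 144):
    print(f"{out_fps} fps output -> each frame shown for {1000 / out_fps:.1f} ms")

# 4000 series FG (2x): real - fake - real - fake - ...
# MFG 3x: real - fake - fake - real - fake - fake - ...
# MFG 4x: real - fake - fake - fake - real - fake - fake - fake - ...
# 80 fps output -> each frame shown for 12.5 ms
# 100 fps output -> each frame shown for 10.0 ms
# 144 fps output -> each frame shown for 6.9 ms
```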


Do any competitive games support frame gen? Surely it would be a huge disadvantage?

Like, a 4090 with frame gen has an advantage over a 5090 with frame gen, both at the same 100 fps or whatever, purely because you're seeing more real data on the 4090.

Wonder how weird it feels in driving games.
 
The time between the real frames doesn't change much between the 2x, 3x and 4x frame gen versions. There are just more frames between them now, so it shouldn't feel much different, as in lumpy or inaccurate.
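
To put numbers on that (a sketch assuming the base render rate stays constant when you switch modes, which in practice it won't quite):

```python
# Sketch of the point above: for a fixed base render rate, the gap between
# *real* frames is identical across 2x/3x/4x FG; only the output fps changes.
# Assumes the base rate stays constant when you switch modes (hypothetical).

BASE_FPS = 60  # hypothetical base render rate

for mult in (2, 3, 4):
    real_gap_ms = 1000 / BASE_FPS     # time between real frames - same for all
    output_fps = BASE_FPS * mult      # what the fps counter reports
    frame_gap_ms = 1000 / output_fps  # gap between *displayed* frames
    print(f"{mult}x: real frame every {real_gap_ms:.1f} ms, "
          f"{output_fps} fps shown, frames {frame_gap_ms:.1f} ms apart")

# 2x: real frame every 16.7 ms, 120 fps shown, frames 8.3 ms apart
# 3x: real frame every 16.7 ms, 180 fps shown, frames 5.6 ms apart
# 4x: real frame every 16.7 ms, 240 fps shown, frames 4.2 ms apart
```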
 