AMD FSR 3.0 has exposed the ugly truth about most PC gamers

Associate
Joined
13 Jul 2020
Posts
501
That's a bit of a weird take. I've been using FG on my 4090 since it launched. Yes, it had UI artifacts at launch in some games, but now they're completely gone and it's 99% perfect. At least in the games I've played (CP2077, Witcher 3, Portal RTX, and probably others I don't recall right now, since I just turn it on and forget about it). But these were the games I played most, so…

The only issue I had was a small bug with CP2077, which was fixed by restarting (I was running it at DLDSR 3440x1440 @ 2.25x with DLSS Balanced and path tracing, and FG wasn't doing anything; a restart fixed it). Probably a bit too much res to run PT with, and it got buggy or something due to overhead? No idea.

So I'm not seeing it. Not sure what you're seeing on your mate's computer.

Edit: the only crashes I ever saw were in Starfield with PureDark's DLSS3 FG mod. But it was a mod, and that was also fixed.
 
Last edited:
Soldato
Joined
3 Jan 2006
Posts
24,968
Location
Chadderton, Oldham
People don't want "fake frames" even though most of the time it's not noticeable; it just boosts frame rates and in a lot of cases creates a smoother experience. Everyone needs to learn one thing, and I keep thinking: how can I teeech these gamers?

"IT DOESN'T MATTER WHAT'S UNDER THE HOOD, IT'S THE EXPERIENCE THAT MATTERS"

4K native vs DLSS Quality + FG: if the experience is the same, IDGAF what's going on under the hood. It could be fake framing x 1000 with DLSS Super Dooper Quality 100P for all I care.
 
Last edited:
Soldato
Joined
19 Feb 2007
Posts
14,485
Location
ArcCorp
People don't want "fake frames" even though most of the time it's not noticeable; it just boosts frame rates and in a lot of cases creates a smoother experience. Everyone needs to learn one thing, and I keep thinking: how can I teeech these gamers?

"IT DOESN'T MATTER WHAT'S UNDER THE HOOD, IT'S THE EXPERIENCE THAT MATTERS"

4K native vs DLSS Quality + FG: if the experience is the same, IDGAF what's going on under the hood. It could be fake framing x 1000 with DLSS Super Dooper Quality 100P for all I care.

Exactly. Something else people need to realize... even without FG it's ALL fake frames. In real life there's no such thing as a frame.
 
Last edited:
Soldato
Joined
21 Apr 2007
Posts
2,507
People don't want "fake frames" even though most of the time it's not noticeable; it just boosts frame rates and in a lot of cases creates a smoother experience. Everyone needs to learn one thing, and I keep thinking: how can I teeech these gamers?

"IT DOESN'T MATTER WHAT'S UNDER THE HOOD, IT'S THE EXPERIENCE THAT MATTERS"

4K native vs DLSS Quality + FG: if the experience is the same, IDGAF what's going on under the hood. It could be fake framing x 1000 with DLSS Super Dooper Quality 100P for all I care.

Ah, but it's only the same if you as an individual perceive it as such and benefit from it, for example by playing only games that support FG and not being particularly sensitive to input lag... which is fine if that applies to your usage.

Take someone like me, for example: PCVR (which could really benefit from FG) isn't supported, legacy games like DayZ aren't supported, and in my experience FG in, say, Witcher 3 Enhanced causes a ton of instability and game crashes.

So I prefer 4K native (native in general), not because I'm hating on FG; it's just not doing anything useful for me until I get around to playing CP2077 or Alan Wake for 30 hrs. What I want from a GPU (and what I get for my money) is raw performance and graphics acceleration applied across everything. If FG could do that then yes, I'd agree it doesn't matter, but in reality it does matter for the reasons I've laid out... at least to me.
 
Associate
Joined
15 Sep 2009
Posts
1,414
Location
London
Exactly. Something else people need to realize... even without FG it's ALL fake frames. In real life there's no such thing as a frame.
You may think you're making an insightful point, but you're really not.

Whilst rendering techniques have always been a complex illusion tailored to create the best visual fidelity at the best performance, the number of frames a given engine can generate on a hardware platform and the associated input latency are very real and very measurable.

Just because an engine might render shadows at a low resolution and filter them, or calculate ambient occlusion at a low fidelity and mask it with textures and normal maps, it doesn't diminish the fact that an increase in frames-per-second has always resulted in a corresponding improvement in input latency. Higher framerates have typically been desired for both of those qualities, of which frame generation, or 'fake frames', provides only the former.

Now, personally I don’t care much, since I’ve only tinkered with DLSS-FG and don’t presently have much use for it – I get 90+ (real) fps in all the games I play at the resolution and quality settings I play at.

I do think it’s important though to separate what frame generation does (improve apparent visual smoothness) from the claims (mostly by Nvidia) that it offers improved ‘performance’ (which it doesn’t – genuine performance gains = more frames and less latency).
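
To put rough numbers on that, here's a back-of-the-envelope sketch (purely illustrative; it assumes 2x FG that buffers roughly one base frame of input delay, which is a ballpark rather than a measured figure for any particular game):

```python
# Illustrative arithmetic only: assumes 2x frame generation that buffers
# roughly one base frame. Real pipelines (and Reflex) vary per game.
def fg_estimate(base_fps: float, overhead_frames: float = 1.0):
    base_frame_ms = 1000.0 / base_fps
    presented_fps = base_fps * 2                       # smoothness roughly doubles
    added_delay_ms = overhead_frames * base_frame_ms   # responsiveness doesn't improve
    return presented_fps, added_delay_ms

for fps in (30, 60, 120):
    presented, extra = fg_estimate(fps)
    print(f"base {fps:>3} fps -> ~{presented:.0f} fps presented, ~{extra:.1f} ms extra delay")
```

The smoothness doubles either way, but the lower your base framerate, the bigger the added delay, which is exactly why 'more frames' from FG isn't the same thing as 'more performance'.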

Does it matter? To some people yes, and to some no, but to dismiss the specifics of the discussion with the trite ‘it’s all fake anyway’ argument is swallowing Nvidia’s marketing BS hook, line and sinker.
 
Associate
Joined
19 Oct 2002
Posts
1,460
At the end of the day the gaming experience matters, and if frame generation helps with that experience then so be it. Honestly, I have no problem with this technology or future technologies that enhance my gameplay experience. If it's a fake frame, who cares? Are you able to identify a "fake frame" that passes faster than you can visually perceive, without artificially slowing down the frames? Of course you can't, but you probably would notice the jump in smoothness. This is one of those technologies, like FreeSync/G-Sync, that just takes time to be adopted and then becomes so mainstream that we don't actually notice or even care.

If enabling frame generation decreases quality, and you can see that to the point that it detracts from your gaming experience, then you can choose not to use it. I don't see any negatives to having the option to use frame generation if you so choose.
 
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
You may think you're making an insightful point, but you're really not.

Whilst rendering techniques have always been a complex illusion tailored to create the best visual fidelity at the best performance, the number of frames a given engine can generate on a hardware platform and the associated input latency are very real and very measurable.

Just because an engine might render shadows at a low resolution and filter them, or calculate ambient occlusion at a low fidelity and mask it with textures and normal maps, it doesn't diminish the fact that an increase in frames-per-second has always resulted in a corresponding improvement in input latency. Higher framerates have typically been desired for both of those qualities, of which frame generation, or 'fake frames', provides only the former.

Now, personally I don’t care much, since I’ve only tinkered with DLSS-FG and don’t presently have much use for it – I get 90+ (real) fps in all the games I play at the resolution and quality settings I play at.

I do think it’s important though to separate what frame generation does (improve apparent visual smoothness) from the claims (mostly by Nvidia) that it offers improved ‘performance’ (which it doesn’t – genuine performance gains = more frames and less latency).

Does it matter? To some people yes, and to some no, but to dismiss the specifics of the discussion with the trite ‘it’s all fake anyway’ argument is swallowing Nvidia’s marketing BS hook, line and sinker.

Blur Busters describes it best:


Yesterday, "fake frames" was meant to refer to classical black-box TV interpolation. It is funny how the mainstream calls them "fake frames";
But, truth be told, GPUs are currently metaphorically "faking" photorealistic scenes by drawing polygons/triangles, textures, and shaders. Reprojection-based workflows are just another method of "faking" frames, much like an MPEG/H.26x video standard "fakes it" via I-frames, B-frames and P-frames.
That's why video goes "kablooey" and turns into garbage with artifacts if a mere 1 bit gets corrupted in a predicted/interpolated frame of an MPEGx/H.26x video stream, until the next full non-predicted/interpolated frame comes in (1-2 seconds later).
Over the long term, 3D rendering is transitioning to a multitiered workflow too (just like digital video did over 30 years ago, out of the sheer necessity of bandwidth budgets). Now our sheer necessity is the Moore's Law slowdown bottleneck: traditional "faking-it-via-polygons" methods can no longer deliver much extra performance, so we need a shortcut around Moore's Law.
The litmus test is going lagless and artifactless, much like the various interpolated frame subtypes built into your streaming habits: Netflix, Disney, Blu-ray, E-Cinema, and other current video standards that use prediction in their compression.
Just as compressors have knowledge of the original material, modern GPU reprojection can gain knowledge via z-buffers and between-frame input reads, and "fake it" perceptually flawlessly, unlike 1993's artifacty MPEG-1. Even the reprojection-based double-image artifacts disappear too!
TL;DR: Faking frames isn’t bad anymore if you remove the “black box” factor, and make it perceptually lagless and lossless relative to other methods of “faking frames” like drawing triangles and textures
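
For intuition on what an "interpolated frame" even is: the crudest possible version is just a blend of its two neighbours. A toy sketch only; this is nothing like the motion-vector-guided reprojection DLSS FG/FSR 3 actually use, which is the whole point Blur Busters is making:

```python
import numpy as np

# Toy illustration only: the crudest "generated" frame is a plain blend of
# its neighbours. Real FG warps pixels along motion vectors (plus depth),
# which is why it artifacts far less than this naive blend would.
def naive_midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return mid.astype(frame_a.dtype)

a = np.zeros((4, 4, 3), dtype=np.uint8)      # black frame
b = np.full((4, 4, 3), 255, dtype=np.uint8)  # white frame
print(naive_midpoint_frame(a, b)[0, 0])      # [127 127 127] - a grey "fake" frame
```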

At the end of the day the gaming experience matters, and if frame generation helps with that experience then so be it. Honestly, I have no problem with this technology or future technologies that enhance my gameplay experience. If it's a fake frame, who cares? Are you able to identify a "fake frame" that passes faster than you can visually perceive, without artificially slowing down the frames? Of course you can't, but you probably would notice the jump in smoothness. This is one of those technologies, like FreeSync/G-Sync, that just takes time to be adopted and then becomes so mainstream that we don't actually notice or even care.

If enabling frame generation decreases quality, and you can see that to the point that it detracts from your gaming experience, then you can choose not to use it. I don't see any negatives to having the option to use frame generation if you so choose.

Exactly.

Even with AMD's FSR 3 injected into CP2077, the ghosting/trailing on the car's tail and so on is obvious, and whilst it is somewhat distracting/annoying, it makes path tracing playable on older/weaker GPUs where it otherwise would not be, unless you like 10-50 fps depending on your GPU, your res and what upscaling you're using.
 
Last edited:
Associate
Joined
15 Sep 2009
Posts
1,414
Location
London
Even with AMD's FSR 3 injected into CP2077, the ghosting/trailing on the car's tail and so on is obvious, and whilst it is somewhat distracting/annoying, it makes path tracing playable on older/weaker GPUs where it otherwise would not be, unless you like 10-50 fps depending on your GPU, your res and what upscaling you're using.
Not saying that it isn't a useful or worthwhile feature - just that the 'fake frames' moniker is actually an appropriate shorthand given its shortcomings.

It's also handy to keep AMD and Nvidia honest - don't forget Nvidia advertised the 4070 Ti as having double the 'performance' of the 3090 Ti.

*Edit* BTW - I agree with Blur Busters' statement:

TL;DR: Faking frames isn’t bad anymore if you remove the “black box” factor, and make it perceptually lagless and lossless relative to other methods of “faking frames” like drawing triangles and textures

The devil is in the details though - current methods of frame generation aren't perceptually lagless relative to the latency improvements you'd get by simply drawing triangles and textures faster ('lossless' is somewhat more subjective).
 
Last edited:
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
Not saying that it isn't a useful or worthwhile feature - just that the 'fake frames' moniker is actually an appropriate shorthand given its shortcomings.

It's also handy to keep AMD and Nvidia honest - don't forget Nvidia advertised the 4070 Ti as having double the 'performance' of the 3090 Ti.

*Edit* BTW - I agree with Blur Busters' statement:



The devil is in the details though - current methods of frame generation aren't perceptually lagless relative to the latency improvements you'd get by simply drawing triangles and textures faster ('lossless' is somewhat more subjective).

Aren't there shortcomings to everything though? Even native has its shortcomings, and imo arguably more than this "fakery" stuff. It is always a case of pick your poison.

Latency can be, and definitely is, an issue to an extent, and there are various factors which can impact it, i.e.:

- whether you use M+K or a controller
- what your base fps is; if your base is 60 fps, it's not a huge issue imo
- what type of games you play; obviously anything FPS/PvP will be a no-go, but for SP, especially games like MSFS and AW 2, I wouldn't worry about latency personally
- Nvidia does lead in this regard, which makes for a better experience, although I found Avatar's FSR 3 latency to be very good and it felt snappier than without (but I had to stop using it for other reasons)

The one thing I really like about frame gen, which I don't see many people mention, is how it raises the 0.1% and 1% lows; this is something that can bug me a lot of the time, e.g. in Hogwarts, where even though there's no need to use FG, the lows are much better with it.
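
For anyone wondering how those figures are derived, here's a minimal sketch of the usual percentile method (my own illustration, not any particular benchmarking tool's implementation):

```python
# Minimal sketch of the usual "1% low" metric: take the 99th-percentile
# frame time from a capture and express it as an FPS figure.
def percentile_low_fps(frame_times_ms: list[float], pct: float = 99.0) -> float:
    times = sorted(frame_times_ms)
    idx = min(int(len(times) * pct / 100.0), len(times) - 1)
    return 1000.0 / times[idx]

capture = [16.7] * 990 + [33.3] * 10  # mostly 60 fps, with a few 30 fps stutters
print(f"1% low: {percentile_low_fps(capture):.0f} fps")  # ~30 fps despite a ~60 fps average
```

Because FG inserts frames at even intervals, it smooths out exactly those worst frame times, which is why the lows improve.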
 
Last edited:
Associate
Joined
15 Sep 2009
Posts
1,414
Location
London
Aren't there shortcomings to everything though? Even native has its shortcomings, and imo arguably more than this "fakery" stuff. It is always a case of pick your poison.

Latency can be, and definitely is, an issue to an extent, and there are various factors which can impact it, i.e.:

- whether you use M+K or a controller
- what your base fps is; if your base is 60 fps, it's not a huge issue imo
- what type of games you play; obviously anything FPS/PvP will be a no-go, but for SP, especially games like MSFS and AW 2, I wouldn't worry about latency personally
- Nvidia does lead in this regard, which makes for a better experience, although I found Avatar's FSR 3 latency to be very good and it felt snappier than without (but I had to stop using it for other reasons)

The one thing I really like about frame gen, which I don't see many people mention, is how it raises the 0.1% and 1% lows; this is something that can bug me a lot of the time, e.g. in Hogwarts, where even though there's no need to use FG, the lows are much better with it.
As an additional feature (like upscaling, Reflex etc.) I think frame generation is an obvious win for PC gamers: if you like it, use it; if you don't, you don't have to.

But as we've seen, Nvidia wants to blur the line between what's a genuine generational performance uplift (say, 4090 vs 3090) and what's a feature improvement (3060 Ti vs 4060 Ti). They want to sell you less actual hardware and make up the performance deficit with software features that may or may not be supported by the games you play.

Had they been completely transparent about what frame generation was in their marketing, I think they'd have gotten a lot less backlash regarding the feature. Instead, they attempted to obfuscate the lack of any real generational improvement in the low-to-midrange stack of the 40-series product line, and their customers were rightly ticked off about it.

AMD's own FG technology has also highlighted that Nvidia clearly could've shipped a software-only version of DLSS-FG for 20/30-series owners and chose not to. Is it any wonder the technology is regarded with derision and scepticism by some now?

One positive I'll note, though, is that whether you use DLSS-FG or not, it's sped up the adoption of Reflex in games that otherwise wouldn't have it, and that's a big win for all Nvidia users.
 
Soldato
Joined
4 Feb 2006
Posts
3,226
That article looks like it's written by an Nvidia fanboy. Calling people 'whiny little kids' is immature and poor journalism. Maybe he should wake up and look at the reality. People were shown DLSS3 FG as a great new feature, but the caveat was that you needed to purchase a very expensive 4000-series GPU. The majority of gamers do not have the money to spend a large amount of cash just to experience what is essentially a fake-frame interpolation technique similar to the smooth-frames feature on countless TVs.

Now that AMD has let people experience it for FREE, many will have changed their opinion, and that is absolutely fine and a sign of maturity. Only a butthurt fanboy would complain that gamers are appreciating AMD and its open-source FSR3 FG. I'm pretty sure everyone would have been very happy if Nvidia had done the same and allowed everyone to experience frame generation instead of locking the majority out.

I suspect the article author was having a dig at AMD users, but the reality is that the free FSR3 mod by Nukem only works on RTX cards, so it is actually Nvidia owners who make up the majority of people voicing their positive opinions of FSR3 frame gen.
 
Last edited:
Associate
Joined
13 Jul 2020
Posts
501
That article looks like it's written by an Nvidia fanboy. Calling people 'whiny little kids' is immature and poor journalism. Maybe he should wake up and look at the reality. People were shown DLSS3 FG as a great new feature, but the caveat was that you needed to purchase a very expensive 4000-series GPU. The majority of gamers do not have the money to spend a large amount of cash just to experience what is essentially a fake-frame interpolation technique similar to the smooth-frames feature on countless TVs.

Now that AMD has let people experience it for FREE, many will have changed their opinion, and that is absolutely fine and a sign of maturity. Only a butthurt fanboy would complain that gamers are appreciating AMD and its open-source FSR3 FG. I'm pretty sure everyone would have been very happy if Nvidia had done the same and allowed everyone to experience frame generation instead of locking the majority out.
Oh yeah, because complaining about and bashing something they hadn't tried yet (DLSS FG) showed tons of maturity back when the 4xxx series launched. Or the part where they claim FSR3 or AFMF is the same as DLSS FG, when it's clearly not the case; another sign of maturity. So mature, these AMD and Nvidia (2x/3x) users that complained…
 
Last edited:
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
As an additional feature (like upscaling, Reflex etc.) I think frame generation is an obvious win for PC gamers: if you like it, use it; if you don't, you don't have to.

But as we've seen, Nvidia wants to blur the line between what's a genuine generational performance uplift (say, 4090 vs 3090) and what's a feature improvement (3060 Ti vs 4060 Ti). They want to sell you less actual hardware and make up the performance deficit with software features that may or may not be supported by the games you play.

Had they been completely transparent about what frame generation was in their marketing, I think they'd have gotten a lot less backlash regarding the feature. Instead, they attempted to obfuscate the lack of any real generational improvement in the low-to-midrange stack of the 40-series product line, and their customers were rightly ticked off about it.

AMD's own FG technology has also highlighted that Nvidia clearly could've shipped a software-only version of DLSS-FG for 20/30-series owners and chose not to. Is it any wonder the technology is regarded with derision and scepticism by some now?

One positive I'll note, though, is that whether you use DLSS-FG or not, it's sped up the adoption of Reflex in games that otherwise wouldn't have it, and that's a big win for all Nvidia users.

Their PowerPoint slides have always shown which DLSS preset is used, as well as whether FG is on or off, so I wouldn't say they were entirely misleading; AMD's, on the other hand, were beyond misleading. Nvidia's PR/marketing material also describes the technology extremely well, and the DF promotional video outlined it too, so they were very transparent here about how the performance improvement was being achieved:

And also, again, making software solutions is not cheap/free either; it probably costs more than improving hardware, especially when you factor in the ongoing cost of developer time (it's closed source, so only Nvidia can improve it). Nvidia could very well do what AMD did (although, as shown, FSR 3 still isn't quite as good as DLSS 3, not to mention the lack of uptake), but it would then mean maintaining another solution that works completely differently to their current version, unless they did a basic version based off FSR 3, which I doubt they'd want to, as there's not much point when people can enable FSR 3 or just mod it in themselves. From Nvidia's POV it doesn't make much sense; AMD, on the other hand, has no choice but to go about it the way they do. You can't be last to market with an inferior version and expect to lock it down.

I agree it's not ideal, but at the end of the day, MLID is somewhat right: these companies have to look at other, more efficient ways of getting that much better performance. Without them, we wouldn't be able to enjoy "real-time" ray tracing right now, for example (unless you gimp it, which then makes it not worthwhile for gamers).

Agree on the Reflex part, although I believe the quicker/wider adoption is because of Streamline, so devs are able to integrate all the RTX tech in one swoop now.

That article looks like it's written by a Nvidia fanboy. Calling people 'whiny little kids' is immature and poor journalism. Maybe he should wake up and look at the reality. People were shown DLSS3 FG as a great new feature but the caveat was that you needed to purchase a very expensive 4000 series gpu. Majority of games do not have the money to spend a large amount of cash just to experience what is essentially a fake frame interpolation technique similar to the smooth frames feature on countless TV's.

Now that AMD's has let people experience it for FREE, many will have changed their opinion and that is absolutely fine and a sign of maturity. Only a butthurt fanboy would complain that gamers are appreciating AMD and it's open source FSR3 FG. I'm pretty sure everyone would have been very happy if Nvidia had done the same and allowed everyone to experience frame generation instead of locking the majority out.

So you don't like capitalism then?

The article is entirely about the irony and hypocrisy on display.
 
Associate
Joined
15 Sep 2009
Posts
1,414
Location
London
Last edited:
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
Thanks Steve!


How do you measure performance in the grand scheme of "PC gaming"? Is it not FPS related?

Those charts are always exaggerated too; this is nothing new (from any brand).

I didn't realise they had that chart where it didn't show FG etc., which is bad, unless they followed up in the video by specifically saying DLSS 3 was used?
 
Last edited:
Associate
Joined
15 Sep 2009
Posts
1,414
Location
London
How do you measure performance in the grand scheme of "PC gaming"? Is it not FPS related?

Those charts are always exaggerated too; this is nothing new (from any brand).

I didn't realise they had that chart where it didn't show FG etc., which is bad, unless they followed up in the video by specifically saying DLSS 3 was used?
Every legit tech-tuber at the time called them out for their deceptive marketing, and that '3x' they quoted was only for a single game: the (at that time unreleased) path-traced version of CP2077 (in averaged performance the 4070 Ti is slower than the 3090 Ti).

I'd also add that an asterisk on a PowerPoint slide isn't being transparent with your customers; it's the bare minimum you do to avoid being sued for misrepresenting your products. The entire Ada launch was slimy.
 
Last edited:
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
Every legit tech-tuber at the time called them out for their deceptive marketing, and that '3x' they quoted was only for a single game: the (at that time unreleased) path-traced version of CP2077.

I'd also add that an asterisk on a PowerPoint slide isn't being transparent with your customers; it's the bare minimum you do to avoid being sued for misrepresenting your products. The entire Ada launch was slimy.

Just watched the video from Nvidia and yeah, it's much more deceptive than the original 4090/4080/DLSS 3 video, but he did specifically say "with DLSS 3" when talking about the increased performance. Alas, they should at least have shown the specs at the bottom, with FG/DLSS 3 specifically stated as being on.

I don't disagree it was pretty poor; AMD's was even worse, especially where "power efficiency" claims were boasted about, and that was a complete and utter lie. Same for Intel and their CPUs when compared to AMD CPUs.
 

mrk

Man of Honour
Joined
18 Oct 2002
Posts
101,105
Location
South Coast
My 3-step take on what's good and what's not:

1: Does upscaling/FG enhance the performance without sacrificing image quality?
2: Can I use path tracing with the rest of the GFX settings maxed out (or use the values that look best, since some games ship with HIGH looking better than ULTRA for various options, for example)?
3: Can all of the above net me 100fps or more for a 3440x1440 output resolution?

If all 3 are a yes, then nothing more needs to be considered. Most games hit all 3 fairly easily thanks to FG/DLSS. And as of late I've not actually seen any ghosting on the car/V when moving around, so the latest 2.1 update (ReSTIR GI being the key driver for this) has certainly helped that. Obviously that sort of tech advance will only be seen by RTX card users; all the more reason to lobby AMD/Intel to put resources into supporting the same techniques.
 
Soldato
Joined
6 Feb 2019
Posts
17,875
That article looks like it's written by an Nvidia fanboy. Calling people 'whiny little kids' is immature and poor journalism. Maybe he should wake up and look at the reality. People were shown DLSS3 FG as a great new feature, but the caveat was that you needed to purchase a very expensive 4000-series GPU. The majority of gamers do not have the money to spend a large amount of cash just to experience what is essentially a fake-frame interpolation technique similar to the smooth-frames feature on countless TVs.

Now that AMD has let people experience it for FREE, many will have changed their opinion, and that is absolutely fine and a sign of maturity. Only a butthurt fanboy would complain that gamers are appreciating AMD and its open-source FSR3 FG. I'm pretty sure everyone would have been very happy if Nvidia had done the same and allowed everyone to experience frame generation instead of locking the majority out.


Hypocritical fanboys deserved to be called out; the article author was spot on.
 
Soldato
Joined
16 Sep 2018
Posts
12,712
I pretty much stopped reading when the author began by trying to support their argument with fallacies...
Most PC gamers are whiny little kids with zero knowledge who pretend to know things.
Does John have evidence that 'most' gamers are what he claims? Or is he just projecting his personal opinion onto the whole?
When NVIDIA released DLSS 3, everyone was calling Frame Generation a total flop.
Again, 'everyone' was not calling it a total flop; in fact, IIRC the OP in particular praised FG.

From that point on I pretty much gave up reading, because the author is clearly being disingenuous: if he had said some "PC gamers are whiny little kids with zero knowledge who pretend to know things" and some people "were calling Frame Generation a total flop", his entire argument would fall apart.
 