AMD FSR 3.0 has exposed the ugly truth about most PC gamers

Oh yeah, because complaining about and bashing something they hadn't tried yet (DLSS FG) showed tons of maturity back when the 4000 series launched. Or the part where they claim FSR3 or AFMF is the same as DLSS FG, when that's clearly not the case, is another sign of maturity. So mature, these AMD and Nvidia (2x, 3x) users who complained…

You are assuming (like the article's writer) that those who were bashing DLSS 3 are the same ones praising FSR3 FG. That is a fallacy. The article also seems to blame AMD users, when in reality the free FSR3 mod on Nexus only works on RTX cards.
It's quite clear from the article comments that the guy who wrote it is Nvidia-biased. He received a FREE RTX 4090 a month ago. Perhaps it was payment to a shill, or he simply feels the need to defend them for giving him a freebie.
 
I’ve tried both, and I think both Nvidia's and AMD's FG is buggy and poor in its current state. It's kind of like DLSS 1.0 and needs a lot of improvements before it'll be any good.
 

We will see when they implement FSR3 in CP2077 and Starfield; then we can make a proper comparison. Mods are not really going to be perfect representations. I tested in CP2077, The Witcher 3 and The Last of Us, and it appears very smooth. Perfectly acceptable.
 
You may think you’re making an insightful point, but you’re really not.

Whilst rendering techniques have always been a complex illusion tailored to create the best visual fidelity at the best performance, the number of frames a given engine can generate on a hardware platform and the associated input latency are very real and very measurable.

Just because an engine might render shadows at a low resolution and filter them, or calculate ambient occlusion at low fidelity and mask it with textures and normal maps, doesn’t diminish the fact that an increase in frames per second has always brought a corresponding improvement in input latency. Higher framerates have typically been desired for both of those qualities, of which frame generation, or ‘fake frames’, only provides the former.

Now, personally I don’t care much, since I’ve only tinkered with DLSS-FG and don’t presently have much use for it – I get 90+ (real) fps in all the games I play at the resolution and quality settings I play at.

I do think it’s important though to separate what frame generation does (improve apparent visual smoothness) from the claims (mostly by Nvidia) that it offers improved ‘performance’ (which it doesn’t – genuine performance gains = more frames and less latency).

Does it matter? To some people yes, and to some no, but to dismiss the specifics of the discussion with the trite ‘it’s all fake anyway’ argument is swallowing Nvidia’s marketing BS hook, line and sinker.
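To make the real-frames-vs-generated-frames distinction concrete, here's a rough back-of-the-envelope sketch. These are illustrative numbers only, not measurements: real frames shrink the displayed interval and the input-sampling interval together, while interpolated frames only shrink the displayed interval.

```python
def frame_interval_ms(fps: float) -> float:
    """Time between displayed frames, in milliseconds."""
    return 1000.0 / fps

# Real rendering: going from 60 to 120 fps halves the frame interval
# AND halves the input-sampling interval, so smoothness and latency
# improve together.
real_60 = frame_interval_ms(60)     # ~16.7 ms per frame
real_120 = frame_interval_ms(120)   # ~8.3 ms per frame

# Frame generation from a 60 fps base: the displayed interval halves,
# but input is still only sampled on the 60 real frames per second,
# so input latency does not improve (and in practice rises slightly,
# because a real frame is held back to interpolate towards).
fg_displayed = frame_interval_ms(120)     # ~8.3 ms, looks smoother
fg_input_sampled = frame_interval_ms(60)  # still ~16.7 ms

print(round(real_120, 1), round(fg_input_sampled, 1))  # 8.3 16.7
```

Same 120 fps on the counter in both cases, but only the genuinely rendered 120 fps moves the input-latency needle.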

The amount of backhanded mean girl comments on this forum by what are meant to be grown men is amazing :cry:
 
Bitches PLEASE!!!!! :D
 
My 3-step take on what's good and what's not:

1: Does upscaling/FG enhance the performance without sacrificing image quality?
2: Can I use path tracing with the rest of the GFX settings maxed out (or set to the values that look best, since some games ship with HIGH looking better than ULTRA for various options)?
3: Can all of the above net me 100fps or more for a 3440x1440 output resolution?

If all 3 are a yes, then nothing more needs to be considered. Most games hit all 3 fairly easily thanks to FG/DLSS. And as of late I've not actually seen any ghosting on the car/V when moving around, so the latest 2.1 update (ReSTIR GI being the key driver here) has certainly helped. Obviously that sort of tech advance will only be seen by RTX card users, all the more reason to lobby AMD/Intel to put resources into supporting the same techniques.
I am surprised you suggested that upscaling/FG doesn't sacrifice image quality, considering how much emphasis you put on it, because that statement is not a yes at all!
I still also don't understand why the ability to give 100fps is good if the latency is so much higher. It doesn't feel any better, which is one of the key things higher FPS should give you, and it has interpolation issues with hit registration in games because the generated frame contains data points that are incorrect. The games that don't have such issues are not games where you need the higher FPS anyway, so it really doesn't add anything good at this point.

On top of that, you can't get AMD/Intel to support the same techniques because Nvidia owns them, but that's a secondary issue.

If they can get the latency cost down to zero, so you actually feel the gain from the FPS number it's posting, awesome. Same if they lose the ghosting (it's definitely still there in CP2077 2.1; I was playing it yesterday on the work computer and it still looked better with FG off, with reduced ghosting/shimmering).
 
I am surprised you suggested that upscaling/FG doesn't sacrifice image quality, considering how much emphasis you put on it, because that statement is not a yes at all!
In the games I play, image quality isn't really sacrificed at all; generally it's enhanced, with sharper details and in some games even more detail because of the image-reconstruction stage of DLSS, something you don't get at native or with DLAA. I've already posted countless comparisons of native vs DLAA vs DLSS in various game threads showing there isn't an image-quality sacrifice.

But for a more recent reference point, here's Cyberpunk at native 3440x1440, then DLSS Quality, then DLSS Quality + Frame Gen. Keep in mind screenshots only tell half the story; it has to be seen in motion in person to get the full benefit, but the imgsli gives you a good idea of static image quality:


^ Look at the average PC latency shown lower left for native vs DLSS vs DLSS+FG; this is what HWUB were on about in their new video. If the raster performance were better, the native-res latency would be much lower, so upscaling is needed to greatly reduce the latency by increasing the fps. Whilst adding FG into the mix increases the PC latency by 10ms, Reflex then goes in and reduces that so it feels much better.

The plants in the distance and other surface details are more defined in the DLSS+FG screen too.

Edit: LOL, I just noticed that at native you are actually missing things that should be there, which DLSS is reconstructing correctly; zoom into the road surface:

[Screenshot: KFSs4na.png, road-surface comparison]


I think that even if raster performance were high, upscaling would still be used because of its AI benefits, which increase surface-texture detail and sharpness. DLAA is only an AA method; native + DLAA would not get those same results, and would only reduce fps even further, as shown in basically every game right now. DLSS is what increases detail/sharpness as well as framerate without a sacrifice to image quality.

As for the latency: as long as the baseline fps is over 60fps before enabling FG (whether upscaling is on or off doesn't matter), which on a 4090 is the case, it's all fine there. Reflex will do its thing to reduce any latency increase and bring it back down to normal levels, and a delta of 10ms isn't going to make much difference when you're getting an output framerate of 100fps+ anyway.

I can't speak for what the experience is like on a mid-range card though, so I fully expect latency to be an issue on, say, a 4070, because the baseline framerate is not going to hit 60fps before FG is enabled in games like Cyberpunk at 1440p with path tracing on.
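That rule of thumb can be sketched in a few lines. The 60fps floor and the ~10ms FG latency cost are this thread's figures, not official numbers, and the "roughly doubles fps" assumption is a simplification:

```python
def fg_advisable(baseline_fps: float,
                 min_baseline_fps: float = 60.0,
                 fg_latency_cost_ms: float = 10.0) -> dict:
    """Rule-of-thumb check for enabling frame generation.

    baseline_fps is the framerate BEFORE FG (after any upscaling).
    Assumes FG roughly doubles displayed fps and adds ~10 ms of PC
    latency (the figure quoted in this thread) before Reflex
    recovers some of it.
    """
    return {
        "advisable": baseline_fps >= min_baseline_fps,
        "approx_displayed_fps": baseline_fps * 2,
        "added_latency_ms": fg_latency_cost_ms,
    }

# A 4090 comfortably above the 60 fps floor in path-traced Cyberpunk:
print(fg_advisable(70))   # advisable, ~140 fps displayed
# A hypothetical mid-range card below the floor at the same settings:
print(fg_advisable(45))   # not advisable by this rule of thumb
```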
 
When it comes to input lag, tftcentral figures are pretty accurate in my experience:

  • Class 1) Less than 16ms / 1 frame lag – should be fine for gamers, even at high levels
  • Class 2) A lag of 16 – 32ms / One to two frames – moderate lag but should be fine for many gamers
  • Class 3) A lag of more than 32ms / more than 2 frames – Some noticeable lag in daily usage, not suitable for high end gaming
I would personally still take 40-50ms with 80+ fps over, say, 20-30ms at 40fps though, as to me the experience of playing at the higher FPS is much more enjoyable. Obviously for PvP it would be a no-go, but then you'd be reducing settings and/or only using upscaling there.
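Those bands are simple enough to express as a small helper. The thresholds come straight from the list above; one frame is ~16.7ms at 60Hz, which the list rounds to 16ms:

```python
def lag_class(lag_ms: float, refresh_hz: float = 60.0) -> int:
    """Bucket display input lag into the three classes listed above.

    Class 1: under one frame; Class 2: one to two frames;
    Class 3: more than two frames.
    """
    frame_ms = 1000.0 / refresh_hz  # ~16.7 ms at 60 Hz
    if lag_ms < frame_ms:
        return 1
    if lag_ms <= 2 * frame_ms:
        return 2
    return 3

print(lag_class(10))  # 1: fine even at high levels
print(lag_class(25))  # 2: moderate, fine for many gamers
print(lag_class(45))  # 3: noticeable, not for high-end gaming
```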
 
? It's a comparison and relevant to the exact discussion, so I am not seeing what you are getting at. Do some of you have tunnel vision or something?
 
Edit: LOL, I just noticed that at native you are actually missing things that should be there, which DLSS is reconstructing correctly; zoom into the road surface:
By that logic, if I start seeing pink elephants, the alcohol is letting me see things that are missing in real life.
 