AMD's FSR3 possibly next month?

If we can see a 10-15% uplift in all games, I’d welcome that.

Launching with Starfield would be a major plus for anyone owning an AMD card right now. Imagine a 10-15% uplift on a 7900 XT(X); it could be game-changing! Need to see the benchmarks for this game first; I'm holding out until I see reviews before deciding whether to swap out my trusty 3090.

Fake frames uplift?

More like your on-screen FPS counter says one thing, and your eyes tell you something completely different.

Lol, I forgot about this fake frame FSR nonsense. I’ve literally just installed CP2077 to test DLSS fake frames and my mind has not changed. All it does is add excessive lag and it feels crap. My eyes say smooth but my brain says SLOOOWWW, and that’s with the reduced input lag setting in NVCP. It reminds me of turning vsync on with a non-FreeSync monitor.

On a side note, CP 2077 really is a crap-looking (not to mention just crap) game even with full RT. So much pop-in it’s hard to believe this is the poster child for RT.
Not only that, it had a $360* million budget (not the near-$500 million I first thought), and it looks pretty poor. And it's not just the lighting where the RT sponsorship seems to have messed up the artistic direction - making it look poor whether raster or RT, just in different ways - but the characters don't look great either for such a huge budget.

* Yes, I had to look it up.
"only" $360 and $142 of that was marketing. Maybe the huge marketing budget explains the hype?
 

FSR3 is frame generation

Frame generation doesn't boost frame rate by 15%; it boosts it by 50% to 100%.
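
Back-of-the-envelope (the generation cost below is an assumption, not a measurement): interpolation slots one generated frame between each pair of real frames, so the displayed rate approaches 2x the render rate minus whatever the generation pass costs. A quick Python sketch:

```python
# Back-of-the-envelope model of frame interpolation throughput.
# Assumption (not a measurement): one generated frame is inserted between
# each pair of real frames, and generating it costs ~1.5 ms of GPU time.

def displayed_fps(render_fps: float, gen_cost_ms: float = 1.5) -> float:
    """Displayed FPS when every real frame is followed by one generated frame."""
    frame_time_ms = 1000.0 / render_fps  # time to render one real frame
    # Two displayed frames (1 real + 1 generated) per real-frame interval,
    # stretched slightly by the generation pass.
    return 2000.0 / (frame_time_ms + gen_cost_ms)

for fps in (40, 60, 80):
    print(f"{fps} FPS rendered -> ~{displayed_fps(fps):.0f} FPS displayed")
```

With those assumed numbers you land in the 80-90% uplift range, which is where the "50% to 100%" figure comes from.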
 

Nvidia's FG, yes. AMD's may not be as powerful, but even 15% is better than a kick in the unmentionables.

But fake frames, why on earth would you use that?! It's cheating!



:D

Once AMD can copy what Nvidia has (and this goes for any tech), all the naysayers come out of the woodwork to praise it. Happens every single time.

Just watch the people on this forum who are screaming, crying, frothing at the mouth and seething with pure hatred for Nvidia's frame generation do a complete 180 once AMD's version is out.
 

"DLSS killer!!!!"

For the 3rd time :D
 
I agree. My biggest regret buying a game in 2020: a £30 loss on a game I never played and still can’t bring myself to play. I thought the graphics were a bit meh and I'm not sure what the fuss was about. Jedi: Fallen Order or Survivor graphics on Unreal 4 are so much better than Cyberpunk's engine.
Why didn't you just refund the game then? I played for about 5 hours and tried to overlook all the bugs, but soon realised it was nearly impossible, so got a refund :)
 
Since lag and artifacting seem to be fundamental problems with frame gen, I can't see it being much good for most users with slow AMD GPUs; that's where it's needed most, but it seems the worst there, at least to me. It might be OK in slow-paced games where lag isn't much of a problem, so long as there aren't glitches all over the picture.
 
Think about it for more than two seconds and it is obvious that fake frames will always carry a latency penalty, and there is nothing that can be done about that.

Well, not until we have not just general AI but a precognitive one.

Until then, fake frames can do one of two things:

1) Make a game that is already running fast appear a bit smoother, i.e. take 80-120 real FPS and add some interpolation to that.

2) Make a slow-running game appear to have smoother frames but with far worse latency, so it feels like you are puppeting a marionette.

In both cases, I fail to see the point.

What is next? Benchmarking all games at max FPS, when benchmarking should be moving away from average FPS towards minimums? But hey, far too many people think bigger numbers are better, so why not fool them?
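
To put rough numbers on that latency point (a simplified model; the one-frame buffer and the generation cost are assumptions, not measurements): to interpolate between real frames N and N+1, frame N has to be held back until N+1 has rendered, so input-to-photon latency grows by about one real frame time, whatever the displayed FPS counter says.

```python
# Simplified latency model for frame interpolation (a sketch, not a measurement).
# Assumption: to interpolate between real frames N and N+1, frame N is held
# back until N+1 finishes rendering, so display lags by ~one real frame time.

def added_latency_ms(render_fps: float, gen_cost_ms: float = 1.5) -> float:
    """Extra input-to-photon latency from buffering one real frame."""
    real_frame_time_ms = 1000.0 / render_fps
    return real_frame_time_ms + gen_cost_ms  # one buffered frame + gen pass

for fps in (30, 60, 120):
    print(f"{fps} real FPS: ~{added_latency_ms(fps):.1f} ms of extra latency")
```

Which is also why it stings most at low base framerates, exactly where the extra frames would be most welcome.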
 
We already have it with Nvidia? You do realise it increases latency? Plus fake frames? What makes you think AMD will do it better?

Never tried it, tbh; I don't have a 4000-series card to test it with. I've only ever heard about it and never experienced it. Curious as to how bad it is.
 
It’s not bad at all, as all games that include frame generation also include Reflex, which is mandatory.
If your base framerate is relatively high, let’s say 60fps, then upping that to 120fps using frame generation feels really good.
Obviously it’s not as good as real 120fps, but somewhere in the middle.
Responsiveness feels like 60 with the visual fluidity of 120.
Just like with DLSS upscaling, the implementation can vary from very good to pretty bad depending on the game.
 
Look at all these people complaining about latency with FG. All I can say is ‘lol’. It doesn’t deserve any other answer.

There is no perceivable latency as long as your ‘real’ fps is close to 60. And that is in FPS (first-person shooters), never mind other genres which are inherently less prone to latency differences. I tested 60 + FG vs 120 native on an OLED monitor with instant pixel response. Same ****. But keep making false claims out of your behinds, as long as it makes you feel better. I swear tech forums are the worst in some cases; people pretending to be knowledgeable intellectuals… while only pretending.

:rolleyes:

PS: that reminds me. Even better: casual gamers who simply ‘game’ and don’t bother with min-maxing / enthusiast **** won’t even notice the latency difference… ever. They’ll look at anyone who says otherwise like they’ve lost their marbles.

I mean, think about it. There are ‘gamers’ out there who used TV motion smoothing. Do I need to say more?
 

A mate of mine is one of these people who is always complaining and screaming about fake frames and latency, the frothing-at-the-mouth type like we see on these forums... I sat him down for a blind A/B test in The Witcher 3.

Everything he picked as having "felt horrendous and was the worst ever" was 60+ FPS gameplay without FG enabled. With FG enabled it was, and I quote, "really buttery smooth, so it can't be fake frames" <insert frothing at the mouth and screaming>.

For whatever weird reason, frame gen seems to have brought the loons out of the woodwork, or just highlighted some people's desperate need to go to a therapist.
 
I tried The Witcher 3 with FG on my mate's 4090 and there was noticeable artifacting, especially when moving the view around fast.
 
But but but, fake frames!!!! Surely the same people who slate Nvidia's FG will have the same outlook on AMD's FSR 3 :p ;) :D

It's going to be very interesting to see how AMD's solution looks, though, given Nvidia are relying on hardware to solve the latency issues etc., not to mention the quality. We are already seeing how the lack of AI/machine learning is holding back FSR 2 big time compared to DLSS, and I can't see it being any better here. If anything, it will probably be an even bigger gap, at least for the initial phase...
As a current owner of an all-AMD rig, my feelings about FSR are exactly the same as my feelings about DLSS: I'm not going to use it unless I have no other option for tolerable framerates. I don't mind new features, but some are just not my cup of tea.

I'm sure AMD's solution will get around 90-95% of the way to DLSS 3, but it will be deemed worthless by the usual suspects because "OMG I need that last 5-10% image quality, staring at fences, while I'm supposed to go zoom zoom around the map in my competitive game of choice". Here's the thing though: if you're using FG (FSR/DLSS), then you're already compromising somewhere else, so why bother?

Now that people actually care about framerates (remember the good old "30fps is enough" days?) but for some reason don't understand frame pacing, FG just seems like a tool made purely to give the marketing department something shiny, aka high FPS numbers, to show the masses. But eehhhh, what do I know, I'm just a pleb with a very expensive hobby.
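
On the frame pacing point: average FPS can look identical while the experience is completely different, which is why 1% lows matter more than averages. A quick illustration with made-up frame times (purely synthetic numbers, not from any real game):

```python
# Why average FPS hides stutter (synthetic frame times, purely illustrative).
smooth  = [16.7] * 100            # steady ~60 FPS
stutter = [15.0] * 99 + [180.0]   # similar average, but with one huge hitch

def fps_stats(frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    low_fps = 1000.0 / max(frame_times_ms)  # crude 1% low over 100 samples
    return avg_fps, low_fps

for name, ft in (("smooth", smooth), ("stutter", stutter)):
    avg, low = fps_stats(ft)
    print(f"{name}: avg {avg:.0f} FPS, 1% low ~{low:.0f} FPS")
```

Both traces average ~60 FPS, but one of them has a 1% low of about 6 FPS, and that hitch is what you actually feel.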
 