
FSR 3.0 + frame generation looks like it could become a big deal

It will be interesting to see Nvidia squirm over their earlier statement that FG isn't supported on older cards due to hardware requirements, IF AMD manages to make it work on pre-7000-series cards :D

I hope so, as it means I can keep my 3080 for longer. Just hope it's the start of 2023 and not the end of 2023 though :p :D :cry:

Nvidia have stated they could bring their version to older RTX GPUs, but that it would result in a poor experience in terms of "latency and IQ", i.e. they don't want to put in the effort/time to optimise it for anything except Ada and disrupt Ada sales.
 
I hope so, as it means I can keep my 3080 for longer. Just hope it's the start of 2023 and not the end of 2023 though :p :D :cry:

Nvidia have stated they could bring their version to older RTX GPUs, but that it would result in a poor experience in terms of "latency and IQ", i.e. they don't want to put in the effort/time to optimise it for anything except Ada and disrupt Ada sales.

I read a random Reddit thread from a software dev working with FG. Whether he really was a software dev I'll never know, but the way he explained it sounded plausible: with the way the optical flow accelerator works on Ada, FG may actually end up causing a performance regression rather than adding "frames" on 2000/3000 series cards. If that's true, only people working intimately with the hardware would know and be able to clarify it.
 
I suspect so. I would expect the consoles coming next year to be RDNA3 in order to reach 8K using FSR3. That would mean an extensive shift in most new games that want to hit that resolution without running at 1 fps.
8K????????????

LMFAO.
 
There will be no games it can run at 8K. Unless they start pushing out PS2 games.
 
I read a random Reddit thread from a software dev working with FG. Whether he really was a software dev I'll never know, but the way he explained it sounded plausible: with the way the optical flow accelerator works on Ada, FG may actually end up causing a performance regression rather than adding "frames" on 2000/3000 series cards. If that's true, only people working intimately with the hardware would know and be able to clarify it.
IF AMD gets it working on older hardware, then NV's just taking the **** at this point.
 
Is frame gen like BFI (inserting a black frame during v-blank to smooth motion out)? I'd love to have that as a feature for watching anime or shows with fast motion. Though for FG I presume it's a fake but full-colour frame that predicts motion and tries, with AI, to look like the next real frame.
 
120fps with 16ms latency looks better than 60fps with 16ms latency

Don't dismiss it until you try it. Yes, the latency isn't changing, but the motion on screen looks smoother to your eyes, so there is still some benefit.
Except Nvidia's figures showed 56 ms latency. I don't need to try 56 ms latency to know it will feel bad, as I had a TV that was slightly better than that and it still felt awful.
 
It's good that Nvidia are leading the way with these features and setting an example for others to follow, tbh. This is where and why Nvidia can cash in on things like this, i.e. being first and having the market to themselves for a good few months/years. Those who don't want to spend the extra to get said features ASAP in the games they play can either wait or just reduce settings to hit their FPS target.

Is frame gen like BFI (inserting a black frame during v-blank to smooth motion out)? I'd love to have that as a feature for watching anime or shows with fast motion. Though for FG I presume it's a fake but full-colour frame that predicts motion and tries, with AI, to look like the next real frame.

Copying my post from another thread:

But yes, when we look past the over-analytical testing scenarios by the likes of HUB and DF, and when it's used in the appropriate scenarios, i.e. your base fps is already somewhat decent (so you're getting good input lag) and you're just playing the game, frame generation is very good.

Also, it doesn't reduce load on CPUs; frame generation does nothing to help CPU issues here. It simply adds "fake" frames to increase overall fps and give smoother gameplay, essentially bypassing CPU limits. Basically, it takes the 1st and 3rd frames and inserts a fake, AI-generated one in the middle of the 2 real frames, e.g.

[image: diagram of two real frames with an AI-generated frame inserted between them]
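Just to illustrate the idea, here's my own toy sketch in Python (nothing to do with Nvidia's or AMD's actual implementations, which use dedicated hardware, game motion vectors and trained models): a naive interpolator estimates per-pixel motion between the two real frames and blends two half-warped copies to make the middle frame.

```python
# Toy frame interpolation: make a "fake" middle frame from two real frames.
# Purely illustrative - real FG (DLSS 3 / FSR 3) does NOT work like this.
import cv2
import numpy as np

def fake_middle_frame(frame1, frame3):
    """frame1/frame3: HxWx3 uint8 BGR images (the two real frames)."""
    g1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)
    g3 = cv2.cvtColor(frame3, cv2.COLOR_BGR2GRAY)

    # Dense optical flow: per-pixel motion from frame1 towards frame3.
    flow = cv2.calcOpticalFlowFarneback(g1, g3, None, 0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = g1.shape
    gx, gy = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))

    # Sample frame1 half a step back along the flow and frame3 half a step
    # forward, so both warped copies land roughly at the in-between moment.
    mid_from_1 = cv2.remap(frame1, gx - 0.5 * flow[..., 0],
                           gy - 0.5 * flow[..., 1], cv2.INTER_LINEAR)
    mid_from_3 = cv2.remap(frame3, gx + 0.5 * flow[..., 0],
                           gy + 0.5 * flow[..., 1], cv2.INTER_LINEAR)

    # Blend the two warped candidates into the generated frame.
    return cv2.addWeighted(mid_from_1, 0.5, mid_from_3, 0.5, 0)
```

Even this toy version shows why a hard cut breaks it: there's no sensible "halfway" between two unrelated images, so the blended middle frame turns to mush.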


This is why, when you switch to a completely different frame/scene/angle, the fake frame looks considerably worse. That's where HUB's clickbait thumbnail for their video came from; look at what they did in order to get it:

The 1st real frame:

[image: screenshot of the 1st real frame]

The fake 2nd frame, which is created from the 1st and 3rd frames:

[image: screenshot of the generated 2nd frame]

The 3rd real frame:

[image: screenshot of the 3rd real frame]

But as has been stated by both HUB and DF, you wouldn't notice the "fake" frames without slowing footage down and/or even pausing to pick them out; I believe Tim said even he had a hard time doing this.
:cry:

Except Nvidia's figures showed 56 ms latency. I don't need to try 56 ms latency to know it will feel bad, as I had a TV that was slightly better than that and it still felt awful.

It depends entirely on the game and what your base fps is. Nvidia have reduced the latency since launch, as shown in DF's Miles Morales video compared to the first Spider-Man. It also largely comes down to the engine used, i.e. just because the fps is the same in two games doesn't mean the latency is the same; it varies (rough illustrative numbers after the list below). Essentially, the only times you won't want to be using FG are when:

- your base fps is < 50/60 fps (subjective)
- MP/PVP games
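To put some back-of-the-envelope numbers on that (my own assumptions, not Nvidia's figures): FG has to hold back one real frame so it can interpolate between it and the next one, so the added delay is roughly one real frame time, and that penalty shrinks as the base fps rises:

```python
# Rough, illustrative latency model for frame generation.
# Assumption (not a measurement): FG buffers one real frame to interpolate
# against, so it adds roughly one real frame time of input-to-photon latency.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (30, 60, 120):
    real_ft = frame_time_ms(base_fps)   # real frame time in ms
    displayed = 2 * base_fps            # fps roughly doubles on screen
    extra = real_ft                     # ~one buffered real frame
    print(f"base {base_fps:>3} fps -> ~{displayed} fps shown, "
          f"~{extra:.0f} ms extra latency on top of the game's own")
```

Which is the whole reason for the "don't bother below ~50/60 base fps" rule of thumb above: at a 30 fps base you're adding ~33 ms on top of whatever latency the game/engine already has, while at a 120 fps base it's only ~8 ms.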
 
Except Nvidia's figures showed 56 ms latency. I don't need to try 56 ms latency to know it will feel bad, as I had a TV that was slightly better than that and it still felt awful.
It's going to feel a whole lot better than playing at 30FPS with RT enabled (in games like the Witcher 3).

The whole point is to produce more frames to get a smoother framerate. It doesn't have to look perfect, as long as the minimum framerate is decent.

Speaking as someone who's been playing games for decades, I've never really been bothered by delays in input. There probably are some games where you definitely wouldn't want any increased delay, though.

Many older games had delays built in (slow animations or other delays); now it just seems like part of the charm.
 
I think frame gen has a lot of potential for improving the framerate of poorly optimised older games too, like many Assassin's Creed games. Games like these were optimised for 30 FPS on consoles (and had poor PC ports), but frame gen could smooth out the performance here as well, if it can be implemented as an option in display drivers.
 
I've been playing a lot of WH40K: Darktide, which supports frame-gen. It's definitely a mixed bag - a big performance increase and most of the time it's fine, but you get the occasional blast of obvious artifacting.
 
Speaking as someone who's been playing games for decades, I've never really been bothered by delays in input. There probably are some games where you definitely wouldn't want any increased delay, though.

I'm not sure what the amount of time spent playing games has to do with the perceptibility of input lag, unless you're just saying "I'm old, so I can't perceive input lag anyway, so I'm a lost cause".

As I said, I've tried gaming on TVs enough to know that I can't stand input lag. If it's a choice between RTX with 56 ms of input lag, or RTX off with actual real rendered frames and no added input lag, I will pick real refresh rate every single time. They should concentrate on developing effects that the card is actually capable of displaying, rather than faking frames to kid people into buying a poor substitute.
 
They should concentrate on developing effects that the card is actually capable of displaying, rather than faking frames to kid people into buying a poor substitute.
Well, that would be nice. All of the effort seems to have gone into making ray tracing look nice, then coming up with workarounds afterwards to make the performance more acceptable. This is true even at 1080p for many games, but gets much worse as the resolution is increased.

Making a big deal about ray tracing is the key to them selling new and expensive cards, so it's not particularly surprising. Otherwise, people would stick with a card that can handle their favourite games at 1440/4k resolution (at 60 or 120 FPS), and call it a day.

The push isn't really coming from AMD though, they are just desperately trying to keep up.

Being bothered by input lag is something relatively new, I'd say (the last 10 years, though longer for competitive games like Quake). In the past, input lag was just part of the game, thinking back to the PS1, and even further back to much simpler games on the Atari and Amiga. To some extent, players had to learn to compensate for delays; sometimes this was part of what made the game more difficult.

Latency over the Internet in multiplayer games has pretty much always been an issue ofc, but that's quite a different problem.

It's different for everyone. Input lag wasn't something that 'took me out of games' generally, but poor framerate was, especially if there was a heavy bottleneck somewhere (too many units on screen in a strategy game, for example).
 
Then let's build some delays into everything to get the charm back?

There's a difference between nostalgia and good gameplay.
The mind learns to compensate for any lag over time; people can and will play fine if they have no choice, but with the advent of choice, they prefer screaming like banshees on forums instead of playing games.

And to the point here, the games with higher latency tend to be the ones played with a controller, where it isn't noticed.
 
The mind learns to compensate for any lag over time; people can and will play fine if they have no choice, but with the advent of choice, they prefer screaming like banshees on forums instead of playing games.

And to the point here, the games with higher latency tend to be the ones played with a controller, where it isn't noticed.

Yep. I used to be fine playing my single-player games at 30 fps. You get used to it. These days I target 60 fps, and in some games 90 fps. I see the difference with every single jump, but it's not a must-have thing for me.

I would rather stick to low fps in a few new games if need be than pay the silly money AMD & Nvidia are asking.
 
The mind learns to compensate for any lag over time; people can and will play fine if they have no choice, but with the advent of choice, they prefer screaming like banshees on forums instead of playing games.
I remember playing Sonic the Hedgehog when I was little and finding out that if he stood next to a ledge, the animation would change depending on how close he was to falling off.

Went to show my brother and tried to creep Sonic closer to the edge, but every time I tapped right, either nothing happened or I'd hold it too long and fall off. The charm of the game had nothing to do with lag; it really irritated me.

Of course, this is a fairly extreme example, but it's one I remember well. Controller latency is indeed noticeable; I'm no esports player, but it is still frustrating when you are trying to be extremely precise and can't, because your inputs don't quite match what you are seeing on screen. Anything that increases that concerns me - it might be fine when I try it, but I'm not keen on the idea.
It's different for everyone. Input lag wasn't something that 'took me out of games' generally, but poor framerate was, especially if there was a heavy bottleneck somewhere (too many units on screen in a strategy game, for example).
Input lag did bother me, but this is also very true. The difference is you can do something about framerate with hardware, to a degree, whereas with input lag you are just trying to get back to an acceptable level, not solve the issue.
 
I might be an outlier: I play CS:GO competitively and have moved to ESEA. Latency means everything to me in games that require it, because I'm up against skilled people. A game like Battlefield, for example, is not competitive; you are a single player against 32 opponents, and while you can deal a lot of damage and maybe squad wipe, in the majority of matches you are ineffective and almost always on imbalanced teams, so I would rather just focus on enjoying the game than care too much about latency. It's context dependent.

A game that is truly built around competitive play, like CS, is a different ball game, and I can see low latency being needed for anything that requires split-second reflexes, and for driving games.

If that is all you play then sure, I get it.

I mostly play single player games though.

Here is my look back.


Also, ignore the huge gamut of games I played; I refunded a lot of games I wanted to try out, like the Alone in the Dark games, which don't work properly. I managed to get AITD: The New Nightmare for my PS2 instead.
 
Lol, this is such a weird little chart:

[image: relative GPU performance chart]


With the RTX 3080 keeping up with the previously named 'RTX 4080 12GB' without using any performance-boosting software...

This is the best you will get from the now-retitled RTX 4070 Ti 12GB.

No reason to upgrade at all, unless you want frame generation, which AMD is offering fairly soon too...

So the RTX 3080 10GB has aged pretty well (still in the top 10 performing cards on TechPowerUp). If it could do frame generation as well, that would be hilariously bad for Nvidia's sales.

Even more so for the RTX 3090 Ti, which keeps up with the AD103 die quite well.

I hate the way Nvidia combined the DLSS upscaling results with frame generation on the 4000 series cards, when they are really 2 different things.

Gotta wonder if there will eventually be a cheaper variant of the AD103 die; they can't be producing that many RTX 4080 16GB cards...
 