FSR 3.0 + frame generation looks like it could become a big deal

Maybe frame gen will only be a thing on new hardware like the RTX 4000 series and RDNA3 cards?

RDNA3 cards like the 7900 XT and XTX include AI accelerators now, which suggests that FSR 3.0 may take advantage of them.
 
I think the killer feature would be frame generation that can be enabled at the driver level, and therefore works in all games...
 
...runs on all graphics cards.
Unfortunately, I think the frame gen part is likely to be limited to RDNA3 cards only, as these cards include AI accelerators. With a bit of luck though, the hardware in RTX 4000 cards might support FSR3 as well.
 
I suspect so. I'd expect the consoles coming next year to use RDNA3 in order to reach 8K using FSR3. That would mean an extensive shift for most new games that want to hit that resolution without running at 1 FPS.
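
Just to illustrate the scale of the problem (back-of-the-envelope only; the internal resolution here is an assumption based on how FSR 2's Performance mode scales, not anything confirmed for FSR3):

```python
# Pixel-count comparison: why 8K output would lean heavily on upscaling.
# Assumption: the upscaler renders internally at half the output resolution
# per axis (a "Performance"-style mode, as FSR 2 offers today).

output_w, output_h = 7680, 4320                         # 8K UHD
internal_w, internal_h = output_w // 2, output_h // 2   # assumed internal render: 3840x2160

output_pixels = output_w * output_h        # ~33.2 million
internal_pixels = internal_w * internal_h  # ~8.3 million

print(f"8K output pixels:        {output_pixels / 1e6:.1f} M")
print(f"Assumed internal pixels: {internal_pixels / 1e6:.1f} M")
print(f"Shading cost reduction:  ~{output_pixels / internal_pixels:.0f}x fewer pixels rendered")
```

Even with a 4x saving in shaded pixels, 8K is still a big ask, which is presumably where frame generation would come in on top of the upscaling.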
8K for consoles? Seems a bit overkill when they often struggle with 4K @ 30fps.

Maybe for PS6 / next gen Xbox.
 
I agree with this dude:

I hate the way Nvidia marketed frame generation and DLSS 3; it made me think FG was just going to be some pointless, illusory performance gimmick.
 
I think both companies will want people to upgrade, unfortunately. It's a big selling point.

It might be worthwhile for AMD to offer it on RDNA2 cards, if possible, but it seems unlikely.
 
Except Nvidia's figures showed 56 ms latency. I don't need to try 56 ms latency to know that it will feel bad, as I had a TV that was only slightly better than that and it still felt awful.
It's going to feel a whole lot better than playing at 30 FPS with RT enabled (in games like The Witcher 3).

The whole point is to produce more frames to get a smoother framerate. It doesn't have to look perfect, as long as the minimum framerate is decent.
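
To put some rough numbers on that (just a back-of-the-envelope sketch; the render rate and the one-frame-of-buffering assumption are mine, not measured figures):

```python
# Back-of-the-envelope frame-time / latency arithmetic for frame generation.
# Assumptions (illustrative only): 30 real frames per second with RT enabled,
# interpolation inserts one generated frame between each pair of real frames,
# and holding a frame for interpolation costs roughly one real frame of delay.

native_fps = 30                             # assumed native render rate
native_frame_time_ms = 1000 / native_fps    # ~33.3 ms per real frame

displayed_fps = native_fps * 2              # one generated frame per real frame -> ~60 FPS shown

# Input is only sampled on real frames, and buffering a frame for interpolation
# adds roughly one native frame of delay on top of the existing pipeline.
approx_added_latency_ms = native_frame_time_ms

print(f"Native frame time:                          {native_frame_time_ms:.1f} ms")
print(f"Displayed framerate with frame gen:         ~{displayed_fps} FPS")
print(f"Approx. extra latency from holding a frame: ~{approx_added_latency_ms:.1f} ms")
```

That's roughly how you can end up with 60 FPS smoothness on screen while the 'feel' stays closer to 30 FPS.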

Speaking as someone who's been playing games for decades, I've never really been bothered by delays in input. There are probably some games where you definitely wouldn't want any increased delay, though.

Many older games had delays built in (slow animations or other delays); now it just seems like part of the charm.
 
I think frame gen has a lot of potential for improving the framerate of older, poorly optimised games too, like many Assassin's Creed titles. Games like these were optimised for 30 FPS on consoles (and had poor PC ports), but I think frame gen could smooth out the performance here as well, if it can be implemented as an option in the display drivers.
 
They should concentrate on developing effects that the card is actually capable of displaying, rather than faking frames to kid people into buying a poor substitute.
Well, that would be nice. All of the effort seems to have gone into making ray tracing look nice, then coming up with workarounds afterwards to make the performance more acceptable. This is true even at 1080p in many games, and it gets much worse as the resolution is increased.

Making a big deal about ray tracing is the key to them selling new and expensive cards, so it's not particularly surprising. Otherwise, people would stick with a card that can handle their favourite games at 1440p/4K (at 60 or 120 FPS) and call it a day.

The push isn't really coming from AMD though, they are just desperately trying to keep up.

Being bothered by input lag is something relatively new, I'd say (the last 10 years or so, though longer for competitive games like Quake). In the past, input lag was just part of the game, thinking back to the PS1, and even further back to much simpler games on the Atari and Amiga. To some extent, players had to learn to compensate for delays; sometimes that was part of what made a game more difficult.

Latency over the Internet in multiplayer games has pretty much always been an issue, of course, but that's quite a different problem.

It's different for everyone, but input lag wasn't something that 'took me out of' games generally; poor framerate was, especially if there was a heavy bottleneck somewhere (too many units on the screen in a strategy game, for example).
 
Lol, this is such a weird little chart:

[chart image]


With the RTX 3080 keeping up with the previously named 'RTX 4080 12GB' without using any performance-boosting software...

This is the best you will get from the now retitled RTX 4070 TI 12GB.

No reason to upgrade at all, unless you want frame generation. Which AMD will be offering fairly soon too...

So the RTX 3080 10GB has aged pretty well (still in the top 10 performing cards on TechPowerUp). If it could do frame generation as well, that would be hilariously bad for Nvidia's sales.

Even more so for the RTX 3090 TI, which keeps up with the AD103 die quite well.

I hate the way Nvidia combined the DLSS upscaling results with frame generation on the 4000 series cards, when they are really 2 different things.

Gotta wonder if there will eventually be a cheaper variant of the AD103 die; they can't be producing that many RTX 4080 16GB cards...
 
But this all raises the question: why bother with a very expensive high-end 4000 series card, when the RTX 4070 and TI (and lower tiers) will have frame generation anyway?

The only exception would be if Nvidia released a much cheaper AD103-based graphics card... but they are already charging ~800 dollars (not including tax) for the RTX 4070 TI, so it won't happen.
 
Because the faster the card is to start with, the less artifacting you'll see and the better the input lag will be.
Like it matters? Isn't frame gen coming to consoles too? The visual issues can't be that bad.

Presumably, AMD will introduce frame gen to consoles to get framerates of around 60 FPS at 4K resolution. But it's not clear yet whether that will need new GFX hardware...

With the GPU market as it is, most simply will be concerned with cost and performance (framerate).
 
Does capping the max framerate mitigate any impact on input latency when frame generation is enabled?

So, if the max framerate is set to 60, for example?
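
To put rough numbers on what I'm asking (this is purely an assumed model of how a cap and frame gen would interact, not how any particular driver actually implements it):

```python
# Sketch of how a 60 FPS cap might interact with 2x frame generation.
# Assumption: the cap applies to displayed frames and interpolation produces
# one generated frame per real frame, so real frames = cap / 2.

fps_cap = 60
real_fps_under_cap = fps_cap / 2               # assumed: half the displayed frames are real
real_frame_time_ms = 1000 / real_fps_under_cap # time between input-sampling frames

print(f"With a {fps_cap} FPS cap and 2x frame gen (assumed):")
print(f"  real frames per second: {real_fps_under_cap:.0f}")
print(f"  time between real (input-sampling) frames: {real_frame_time_ms:.1f} ms")
```

Under those assumptions, the game would only be sampling input every ~33 ms, so a cap on its own wouldn't obviously help with latency.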
 
Dunno, but it seems that whatever card I try, the framerate is never quite good enough, and the graphical features get increasingly intensive and numerous. Also, adoption of DX12 has not been quite as fast as I'd hoped, and the differences are obvious in certain titles like Assassin's Creed Valhalla.

So, the ability to render more frames is welcome as an option, and these features are always nice to have (as long as they're easily configured), particularly in a few years when GPUs start to show their age again.
 
Looking at the prices of used RTX 3080s, I might just try to get another one from CEX this year, if an equivalent card doesn't become available for ~£500, which is where I think these cards will fall to.

I did this before, and got a refund on a used £620 EVGA RTX 3080 (10GB) after the fans started failing completely within just 3 months.

I wonder if EVGA was getting a lot of expensive cards returned, and so decided it wasn't worth selling them anymore?

Nvidia and AMD need to realise that they have to offer something that is at least as good as the last generation*, in raw performance, and at an equivalent or lower price, if they actually want to sell new gfx cards.

No doubt, they will keep whining about low demand (i.e. poor sales).

Also, I would never buy a gfx card from a place that doesn't allow easy returns (e.g. not eBay)...

*This includes the numerous used cards like the RTX 3080/3070 that will be available to buy.
 