
FSR 3.0 + frame generation looks like it could become a big deal

But this does all raise the question: why bother with a very expensive high-end 4000 series card when the RTX 4070 and Ti (and lower tiers) will have frame generation anyway?

The only exception to that is if Nvidia released a much cheaper AD103-based graphics card... but they are already charging ~$800 (not including tax) for the RTX 4070 Ti, so it won't happen.
 
Because the faster the card is to start with, the less artifacting you'll see and the lower the input lag will be.
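Rough numbers, just to show why the base framerate matters so much for frame gen. This is only a back-of-the-envelope model (my own assumption, not how Nvidia describe the pipeline) where interpolation holds the newest rendered frame back by about one base frame time, plus a few ms to generate the in-between frame:

# Very rough model: interpolation-based frame gen roughly doubles the displayed
# framerate, but holds the newest rendered frame back by ~1 base frame time.
def frame_gen_estimate(base_fps, gen_cost_ms=3.0):   # gen_cost_ms is an illustrative guess
    base_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * 2                      # optimistic 2x from interpolation
    added_latency_ms = base_frame_ms + gen_cost_ms
    return displayed_fps, added_latency_ms

for fps in (30, 60, 120):
    shown, extra = frame_gen_estimate(fps)
    print(f"{fps:>3} fps base -> ~{shown} fps shown, ~{extra:.0f} ms extra latency")

On those made-up numbers, a 30 fps base pays roughly 36 ms extra while a 120 fps base pays roughly 11 ms, which is the whole point: the faster the card renders on its own, the smaller the frame-gen penalty.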
 
Like it matters? Isn't frame gen coming to consoles too? The visual issues can't be that bad.

Presumably, AMD will introduce frame gen to consoles to get framerates of around 60 FPS at 4K resolution, but it's not clear yet if that will need new GFX hardware...

With the GPU market as it is, most people will simply be concerned with cost and performance (framerate).
 
But this does all raise the question: why bother with a very expensive high-end 4000 series card when the RTX 4070 and Ti (and lower tiers) will have frame generation anyway?

When Far Cry 7 comes out needing 13GB of VRAM for the HD texture pack, you'll be screwed! :D

In all seriousness though, that could be Nvidia's hope: that people will be duped by the FG numbers. But it's in very few games at the moment, and personally I have been annoyed by input lag stretching back to the QuakeWorld and UT days. I sometimes feel like it's getting worse and games need higher and higher framerates to make up for it. :)
 
Does capping the max framerate mitigate any impact on input latency when frame generation is enabled?

So, if the max framerate is set to 60, for example?
 

I don't know what the best setup is when using FG, but I think Nvidia have fixed the issue where, on G-Sync/FreeSync screens, hitting or exceeding your monitor's refresh rate significantly increased latency, because vsync (which is recommended to be turned on in the Nvidia control panel when using G-Sync/FreeSync) was kicking in and a frame cap didn't help. DF were the only ones to note this issue. This is from launch, so it's not quite as relevant now since Nvidia have reduced the latency further:

[image: DF latency chart from launch]

Problem is, every game and game engine has different latency too:

[image: per-game latency comparison]
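For what it's worth, the usual VRR advice is to keep the output framerate a few fps below the monitor's refresh so vsync never actually engages. A quick sketch of that rule of thumb (the 2x frame-gen multiplier and the 3 fps margin are just assumptions for illustration, not official guidance):

# Sketch of the "stay inside the VRR window" rule of thumb with frame gen on.
def suggested_cap(refresh_hz, margin_fps=3):
    # e.g. cap at 141 fps on a 144 Hz monitor
    return refresh_hz - margin_fps

def fg_exceeds_refresh(base_fps, refresh_hz, fg_multiplier=2):
    # would the frame-generated output overrun the monitor's refresh?
    return base_fps * fg_multiplier > refresh_hz

refresh, base = 144, 80
print("Suggested cap:", suggested_cap(refresh), "fps")
print("FG output exceeds refresh?", fg_exceeds_refresh(base, refresh))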
 
Yeah, I think 100ms or more of input delay would bother my senses too :D

People adapt though...
I don't want to adapt to higher input latency. For the prices they are asking, I want lower latency and higher frame rates, not a software compromise where I have to sacrifice one or the other.
 
Frame gen coming to AMD soon? Source? How soon? This month? Next month? End of the year?

How good is it? Will it have massive input lag? Artifacting?

Oh right, no one knows anything.

I'm certainly rooting for it since I'm on a 3080, but let's be honest here.
 
Dunno, but it seems that whatever card I try, the framerate is never quite good enough, and the graphical features get increasingly intensive and numerous. Also, adoption of DX12 has not been quite as fast as I'd hoped, and the differences are obvious in certain titles like Assassin's Creed Valhalla.

So the ability to render more frames is welcome as an option, and these features are always nice to have (they're easily toggled on when needed), particularly in a few years when GPUs start to show their age again.
 
Looking at the prices of used RTX 3080s, I might just try to get another one from CEX this year, if an equivalent card doesn't become available for ~£500, which I think these cards will fall to.

I did this before and got a refund on a used £620 EVGA RTX 3080 (10GB) after the fans completely failed within just 3 months.

I wonder if EVGA was getting a lot of expensive cards returned, and so decided it wasn't worth selling them anymore.

Nvidia and AMD need to realise that they have to offer something at least as good as the last generation* in raw performance, at an equivalent or lower price, if they actually want to sell new gfx cards.

No doubt, they will keep whining about low demand (i.e. low sales).

Also, I would never buy a gfx card from a place that doesn't allow easy returns (e.g. not eBay)...

*This includes the numerous used cards like the RTX 3080/3070 that will be available to buy.
 
I'm all for DLSS3/FSR3. Sure, for competitive gamers the fact that it won't reduce latency is an issue, but if it makes single-player games smoother then all the better. For example, I can run most games comfortably at 4K 60fps, so if I upgrade my TV to a 120Hz panel then I can enjoy some benefits without necessarily having to fork out for a new card too. As long as the IQ is there, of course.
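Back-of-the-envelope for that, assuming frame gen gives roughly a 2x (optimistic, and the panel caps what you can actually see):

# Illustrative only: rough 2x from frame gen, capped by the panel's refresh rate.
base_fps = 60     # what the card already manages at 4K
panel_hz = 120    # the 120Hz TV I'd be upgrading to
displayed = min(base_fps * 2, panel_hz)
print(f"~{displayed} fps displayed on a {panel_hz} Hz panel, from a {base_fps} fps base")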
 