
NVIDIA 5000 SERIES

The main problem I have with Frame Gen isn't even the technology itself, it's how it's implemented in games.

In the two most recent AAA games I've played, Dragon Age and Indiana Jones, FG was just straight up broken at launch.

Imagine buying a game and the main selling point of your GPU just doesn't work for 1-2 weeks until they patch it.
 
I am super latency sensitive and play mostly first-person games. The latency increase needs to be reduced by a factor of 3-5 before I'd consider using it, and this is on a 4090, so I already have the highest possible base frame rate. The only game I've used it in is MSFS 2024. It's fantastic there and has very few artefacts.
Digital Foundry showed some latency figures:

 
The few times I've enabled FG, it was like turning on motion smoothing on my TV. Hated it.

It needs to be 90% the same across the board as traditionally rendered frames for me to want to use it. And even then, my 3440x1440 monitor only goes up to 175 Hz.

I'm not against the technology in principle, but as it stands I've no interest in it.
I think this is how it should be discussed: as a motion smoothing tool and nothing else. It is certainly not extra performance.
 
Digital Foundry showed some latency figures:

I saw this video. Pretty interesting. My two takeaways were: 1) DLSS 4 frame gen doesn't add much of a latency penalty for those extra two fake frames, and 2) the improvements to DLSS upscaling look pretty nice with the new model, less ghosting being the main one. I look forward to that, even if I don't end up upgrading to the 5090.
 
The main problem I have with Frame Gen isn't even the technology itself, it's how it's implemented in games.

In the two most recent AAA games I've played, Dragon Age and Indiana Jones, FG was just straight up broken at launch.

Imagine buying a game and the main selling point of your GPU just doesn't work for 1-2 weeks until they patch it.
You buy games at launch? They're pretty much guaranteed to be buggy at release these days.
 
Motion smoothing is exactly what it is. If your base framerate is 30fps then you're still playing at 30fps, with all the drawbacks apart from the perceived framerate.

I always thought it was just how the high framerate looked that appealed to me, and wasn't bothered about input lag etc until I actually tried it.
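That "still playing at 30fps" point can be sketched as simple arithmetic. This is an illustrative Python sketch, not NVIDIA's actual pipeline; the one-base-frame latency cost is an assumption (interpolation has to wait for the next real frame before it can show anything in between):

```python
def framegen_feel(base_fps: float, multiplier: int) -> dict:
    """Rough model: frame gen multiplies the *displayed* framerate,
    but input is still sampled once per *base* frame, and buffering
    adds roughly one base frame of extra delay (assumption)."""
    base_frame_ms = 1000.0 / base_fps
    return {
        "displayed_fps": base_fps * multiplier,   # what the fps counter shows
        "input_rate_fps": base_fps,               # how often the game reads input
        "added_latency_ms": base_frame_ms,        # ~1 base frame of buffering
    }

print(framegen_feel(30, 2))   # smooth 60 on screen, still a 30fps "feel"
print(framegen_feel(120, 2))  # high base rate: the extra latency is small
```

The second call shows why people with a high base framerate mind it less: at 120fps base, the assumed extra delay is ~8 ms rather than ~33 ms.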
 
Hell yeah. You're not a real gamer unless you're playing games in the worst state they'll ever be in, at their most expensive price point.
Got to love us guinea pigs who buy games at full price, beta test them, report the bugs, and wait for the fixes, then everyone else buys them at 50% off once they're fixed :cry::cry:
 
Hell yeah. You're not a real gamer unless you're playing games in the worst state they'll ever be in, at their most expensive price point.
God bless Game Pass. I played an hour of Indiana Jones, looked at the lovely graphics and path tracing, then uninstalled it and carried on my way. It helps having gigabit internet so I can try any old game on a whim.
 
You're really missing out on a great game if you only played for an hour; it's brilliant.
I may revisit it later on, but it was just such a slow start that I lost interest. Seems I'm not the only one, based on what I read online. Granted, my attention span is that of a goldfish and I'm more at home in online FPS games. Having said that, I recently played all the former PlayStation exclusives (TLOU, GoT, Spider-Man, etc.) and really enjoyed those.
 
most people probably don't realise it kills IQ

Got any proof of this? That line assumes the difference is night and day, when in fact, unless it's badly implemented, that's rarely the case, and hasn't been since pre-DLSS 3.x-era upscaling.

Edit:
Your statement is only partially valid if someone is gaming at 1080p and using DLSS, or any upscaling for that matter. The tech needs a decent baseline internal resolution to work with to produce good results. Generally speaking, nothing in this thread or forum is in the context of 1080p gaming.
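To see why 1080p output is the hard case, compare internal render resolutions per mode. A quick Python sketch using the commonly cited per-axis scale factors for DLSS quality modes (the factors are approximations from public coverage, not taken from NVIDIA documentation):

```python
# Approximate per-axis render scale for each DLSS mode (assumed values)
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    """Internal resolution the upscaler actually renders at."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): plenty of signal
print(internal_res(1920, 1080, "Performance"))  # (960, 540): far less to work with
```

4K Performance still renders a full 1080p's worth of pixels internally, while 1080p Performance is reconstructing from 960x540, which is where the "kills IQ" complaints tend to come from.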


Someone on Reddit just yesterday was having a fit about DLSS still ghosting to this day and pointed to Ratchet & Clank. Either their system was running at low fps or something, because the reality is that the latest DLSS version has very little ghosting and near-perfect IQ, just like when using DLAA. I installed the game and checked for myself using 4K DLSS Performance:


The game's confetti particles in this scene are nearly perfectly rendered. You might see one or two ghost trails for a very brief moment if you look closely, but otherwise it's basically perfect. The moiré is not a DLSS artefact; it's just the pattern of the carpet, as seen in other render modes and in other games with materials like this.
 
I may revisit it later on, but it was just such a slow start that I lost interest. Seems I'm not the only one, based on what I read online. Granted, my attention span is that of a goldfish and I'm more at home in online FPS games. Having said that, I recently played all the former PlayStation exclusives (TLOU, GoT, Spider-Man, etc.) and really enjoyed those.
It does take a bit of time to get going; honestly, I feel the first level wasn't a great idea, but once you're a couple of hours in it starts to flow well. Anyway, let's not derail this into game talk. I'm sure it plays very well on your 4090!
 