
NVIDIA 4000 Series

Which one though? FG exists in many games, and its implementation by the game devs dictates how good the IQ is, just like DLSS, just like FSR etc.

Not to mention they have all stated that you really need to be slowing down footage or stopping it entirely to capture these awful issues.
 
First off, DLSS 3 on the 4090 isn't indicative of what you get on the rest of the 40 series stack; it will present the fewest artifacts and the least temporal instability due to the sheer speed/VRAM of the GPU.

In games I've seen in the flesh on a 4070: in DL2 there can be about 4 ropes in view, in SM MM multiple waypoints/labs etc. flash randomly all over the screen at the same time, and you can get parts of skyscrapers appearing in the sky.
 
Ok, that's fair, I have not seen what FG is like on the lower-end 40 series so cannot comment on that. It makes sense though, because the sheer power of the 4090 has shown me what even just normal RT looks like versus how it looked on the 3080 Ti in games like Cyberpunk, where the latency response was much slower and thus the resulting IQ was lower due to the slower GPU.

I bet it is something that will continually evolve too though, just like how DLSS upscaling did from v1 to v2.
 

I've been mega impressed with FG on the 4080 in GeForce Now. Pretty much what the YouTubers etc. say is true though: your base fps needs to be 60+ in order to get a good experience where you don't notice the latency. That's why FSR 3 for consoles will not be a big deal, as they would be playing from a base of 30 fps, unless AMD have somehow managed to beat Nvidia's latency workaround purely through their magical drivers.

It really is game-changing tech though; people who say it's just TV motion interpolation, and the usual nonsense about it not being anything new, have clearly not used/seen it for themselves.

Can't wait for dlss 4 and the 50xx series now :p :p :D
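To put some rough back-of-envelope numbers on the 60+ fps point (a sketch only, assuming FG roughly doubles the presented frame rate while input is still sampled on rendered frames, and that the frame FG holds back for interpolation adds roughly one base frame time of latency; the render queue, Reflex and the display all shift these figures, and fg_estimate is just an illustrative helper):

```python
# Back-of-envelope only: assumes FG inserts one generated frame between each
# pair of rendered frames (so presented fps roughly doubles) and holds one
# rendered frame back to interpolate (adding roughly one base frame time of
# latency). Render queue, Reflex and display latency are ignored.

def fg_estimate(base_fps: float) -> dict:
    base_frame_ms = 1000.0 / base_fps
    return {
        "base_fps": base_fps,
        "presented_fps": base_fps * 2,                        # one fake frame per real frame
        "base_frame_time_ms": round(base_frame_ms, 1),
        "approx_added_latency_ms": round(base_frame_ms, 1),   # ~one held-back frame
    }

for fps in (30, 60, 90):
    print(fg_estimate(fps))
# 30 fps base: ~33 ms frames and ~33 ms of extra latency -- hard to hide.
# 60 fps base: ~17 ms frames and ~17 ms of extra latency -- much easier to mask.
```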
 
First off, DLSS 3 on the 4090 isn't indicative of what you get on the rest of the 40 series stack; it will present the fewest artifacts and the least temporal instability due to the sheer speed/VRAM of the GPU.

In games I've seen in the flesh on a 4070: in DL2 there can be about 4 ropes in view, in SM MM multiple waypoints/labs etc. flash randomly all over the screen at the same time, and you can get parts of skyscrapers appearing in the sky.

What are you blabbing on about? At 60+ fps, FG shows minimal artefacts whether on a 4070 or 4090.
 
I've been mega impressed with FG on the 4080 in GeForce Now. Pretty much what the YouTubers etc. say is true though: your base fps needs to be 60+ in order to get a good experience where you don't notice the latency. That's why FSR 3 for consoles will not be a big deal, as they would be playing from a base of 30 fps, unless AMD have somehow managed to beat Nvidia's latency workaround purely through their magical drivers.

It really is game-changing tech though; people who say it's just TV motion interpolation, and the usual nonsense about it not being anything new, have clearly not used/seen it for themselves.

Can't wait for dlss 4 and the 50xx series now :p :p :D

That's what I'll be spending my pension money on! :D
 

Got quite a long wait: Q1 2025 going by Nvidia's own info and other info coming out; best case scenario is October next year.

Imagine the 6000 series with DLSS 5, the game will play itself so you can then go outside and experience real-world ray tracing, which is DLSS 9000+! :D
 
Ok, that's fair, I have not seen what FG is like on the lower-end 40 series so cannot comment on that. It makes sense though, because the sheer power of the 4090 has shown me what even just normal RT looks like versus how it looked on the 3080 Ti in games like Cyberpunk, where the latency response was much slower and thus the resulting IQ was lower due to the slower GPU.

I bet it is something that will continually evolve too though, just like how DLSS upscaling did from v1 to v2.

Balanced reply gets a reply.

Forgot latency: when I tried FG on CP with the 4070, although it looked very very smooth it felt far from it, so it got switched back off and I haven't tried it since.

There are a whole lot of variables involved, and I'm waiting to see how the 79XTX fares on FG, but I'm not holding my breath on a 65" QD-OLED.
 
He is right tho, frame generation artifacts are framerate dependent and not GPU tier dependent.
He isn't.

Artifacts are worse at lower fps.

FG works above a minimum threshold: if a 4090 can pump out 150 fps at X settings and the 4070 only manages 100 fps at lower-than-X settings, it's artifacting more the slower the fps gets.

Go and look at a 4060 trying to do FG; it's comical at times.
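To put a rough, purely illustrative number on why that scales with fps: the generated frame has to bridge whatever happened between two real frames, and that gap grows as the base fps drops. The 150 fps and 100 fps figures are the ones above; the 3840-pixel screen width and the 1-second crossing time are made-up values just to express the gap in pixels:

```python
# Illustrative only: how much on-screen motion a generated frame has to
# bridge at different base frame rates. The screen width and crossing time
# are arbitrary; only the ratio between the fps figures matters.

SCREEN_WIDTH_PX = 3840   # assumed 4K-wide pan
CROSS_TIME_S = 1.0       # object crosses the screen in one second
speed_px_per_s = SCREEN_WIDTH_PX / CROSS_TIME_S

for base_fps in (150, 100, 60, 40):
    gap_ms = 1000.0 / base_fps               # time between two real frames
    motion_px = speed_px_per_s / base_fps    # movement the fake frame must guess over
    print(f"{base_fps:>3} fps base: {gap_ms:5.1f} ms gap, ~{motion_px:3.0f} px to interpolate")
# 150 fps (the 4090 example above) asks FG to guess over ~26 px of motion;
# 40 fps asks it to guess over ~96 px, so disocclusions, HUD markers and thin
# geometry like ropes smear far more.
```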
 
He is right tho, frame generation artifacts are framerate dependent and not GPU tier dependent.

Exactly. Pretty much everyone has said that frame generation needs 60 fps to be good, much like how FreeSync/G-Sync isn't a silver bullet for anything below 60 fps either.
 
Forgot latency: when I tried FG on CP with the 4070, although it looked very very smooth it felt far from it, so it got switched back off and I haven't tried it since.
That will likely be because path tracing in CP is super GPU-heavy, and with FG enabled the baseline fps, even if it's 60 fps, results in high render and/or input latency which Reflex can only do so much to improve. Having a much higher baseline on, say, a 4080 or 4090 means the then only small increase in latency when FG is enabled goes unnoticed.

Sadly the only way around that is having a 4080 at minimum; a 4090 obviously has no issue as it path traces without FG at 80 fps on average at 1440p.
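That's also the "looks smooth but doesn't feel smooth" mismatch described earlier in the thread. A tiny sketch, again assuming FG roughly doubles the presented rate while responsiveness tracks the rendered rate; the 80 fps path-tracing average is from this post, while the 45 fps case is a hypothetical slower card:

```python
# Sketch of presented smoothness vs responsiveness under the same doubling
# assumption as above. The 80 fps figure is the quoted path-tracing average;
# the 45 fps figure is a hypothetical slower card.

def describe(base_fps: float, label: str) -> None:
    presented = base_fps * 2
    print(f"{label}: presents like ~{presented:.0f} fps, "
          f"responds like ~{base_fps:.0f} fps "
          f"(~{1000.0 / base_fps:.0f} ms between real frames)")

describe(80, "4090-class path-tracing base")    # smooth and still responsive
describe(45, "hypothetical slower-card base")   # smooth on screen, laggy in hand
```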
 
It was just using ultra RT at 1080p at the time; hopefully it's improved since then.

Agree with you on the 4080/90 baseline, that's where the best results will be IMO.
 
The new tech is exciting but I'm still concerned about how much they'll use it to manipulate every GPU series going forward.

We're now at a stage where it's longer between each gen, and when it does come you're expected to drop £1500 for the card that offers the most value for money.

If you step down a model, you're still paying over a grand and the performance drop-off is gigantic. Hopefully the 5 series doesn't follow the same pattern but...
 