
AMD's FSR3 possibly next month?

Frame gen + 4K native resolution is definitely a step up from FSR2 in terms of image quality. I've tested it with a mod in Cyberpunk that adds frame gen, and it looks the same as native. Forcing Vsync off in the control panel is a good idea as well.

I used to have an RTX 3070, then an RTX 3080, and the quality of DLSS, particularly for distant objects, was never as good as running at native 4K. There was always some aliasing that DLSS Quality didn't manage to catch. Good temporal AA can help, but resolution is the main factor.

There are some issues with Cyberpunk and frame gen: I found that I had to run the benchmark first with 'DLSS' mode enabled in game before the frame generation mod would activate (I can easily tell, because I set an in-game FPS limit of 120, and my RivaTuner FPS counter suddenly jumps to 240). You can also just launch the game with DLSS upscaling enabled, then switch it off after the game launches.
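As a trivial sanity check of that observation (my own sketch, nothing to do with the mod or RivaTuner's internals): if the presented frame rate reads close to double the in-game cap, the mod is injecting frames.

```python
# Hypothetical helper, not part of any mod or tool: infer frame gen activity
# by comparing the measured frame rate against the in-game FPS cap.
def framegen_active(measured_fps: float, ingame_cap: float, tolerance: float = 0.1) -> bool:
    return abs(measured_fps - 2 * ingame_cap) <= tolerance * 2 * ingame_cap

print(framegen_active(240, 120))  # True  - cap of 120, counter reads ~240
print(framegen_active(120, 120))  # False - mod not active, frames not doubled
```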

One thing I noticed in Cyberpunk is that enabling anti-lag seemed to reduce 1% low framerates by quite a bit, so it may not be worth enabling.
 
Finally found out what the issue in Cyberpunk is with Nukem9's frame gen mod! Credit goes to a random post I found on the Nexus.

As I mentioned in several previous posts, although this mod appears to 'work', it does absolutely nothing for motion clarity in Cyberpunk, i.e. there is exactly the same amount of motion ghosting with the mod on as off. The FPS counter may show a higher figure, but it has zero effect. This is because all that's happening is frame doubling; there is no interpolation going on at all, which is why it looks and feels no different.
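To illustrate the difference (a toy sketch of my own, not how the mod actually works internally): frame doubling just presents each rendered frame twice, so the counter goes up but every displayed image already existed, whereas interpolation synthesises a genuinely new in-between image for each pair of rendered frames.

```python
# Toy illustration only (not the mod's actual code): frame doubling vs interpolation.
import numpy as np

def frame_double(frames):
    """Present every rendered frame twice: the FPS counter doubles, motion clarity is unchanged."""
    doubled = []
    for f in frames:
        doubled.extend([f, f])
    return doubled

def interpolate(frames):
    """Insert a synthesised in-between image for each pair of rendered frames.

    A naive 50/50 blend stands in for the real thing here; FSR3/DLSS3 frame
    generation uses motion vectors and optical flow rather than a plain average.
    """
    out = [frames[0]]
    for a, b in zip(frames, frames[1:]):
        out.extend([0.5 * a + 0.5 * b, b])
    return out
```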

THE PROBLEM/ THE FIX?
It's still HDR related! Turn off HDR and the mod works as it should.
With HDR off there is definite interpolation going on, and the better motion clarity you would expect from going from ~60fps to ~100fps. The downside is that, now that the mod's working as it should, I see more artefacts and the frame pacing isn't great.

So even though the latest versions of Nukem9's mod 'work' with HDR for this game (in the sense that it doesn't crash anymore!), the mod does not work with HDR on at all. I bloody knew there was something going on.
I presume therefore that all those who said #worksfineonmyPC were playing without HDR...
 
Has anyone tried frame generation (via modding) with Alan Wake II?

The performance in this video seems very good, more than 2x the frame rate.

I might try it, but would like to know what people's impressions are.

Case in point. Sounds like just frame doubling (or more).
 

I'm using HDR.

EDIT:

If it is HDR related, it might be worth trying out your G-Sync Ultimate monitor (if it has true HDR support), as the G-Sync module handles HDR better/differently than adaptive sync displays (depending on the model).
 

Bah, there's always one that ruins the theory isn't there :p

The old monitor I've got with a G-Sync module isn't HDR unfortunately (Acer X34 ultrawide). There's definitely something going on then, and I'm not going mad, as this other guy has found and reported exactly the same findings I observed. Presumably then it's either the mod itself, the role a G-Sync module plays in conjunction with HDR and this mod, a combination of those, something else, or who knows.

The annoying thing is that the only game I really wanted to use frame gen with is Cyberpunk when using path tracing, so it's a bit of a ****** as I don't really want to sacrifice HDR for Cyberpunk. And to be honest it still feels better and smoother without the mod at all; I can still get ~60fps most of the time.
 

Yeah, this whole FSR 3 thing is definitely very weird. In Avatar, for example, some sites report it as being flawless, yet as per Alex/DF and my own testing it still has frame pacing issues, although I imagine that's more down to those sites/people not being as observant and/or only testing in the opening area of the game, where the issues weren't present.

One other thing worth trying is disabling the vignette (you need a mod from Nexus Mods to do this), as it has been shown to cause issues with upscaling and, more importantly, frame gen.

To me it's definitely not as good as DLSS 3/FG on the 4080 in pretty much every way, but it is definitely an improvement over no FG on my setup. I have a feeling the G-Sync module could be a big factor here, but that's just an assumption and nothing more. Also, perhaps dithering? For HDR I use true 10-bit instead of 8-bit + FRC; it probably won't have any impact but may be worth looking into.
 
@Nexus18 It is all a bit weird; early days I guess. It's a bit of a mess really at the moment, with a few official implementations and then the FSR3 frame gen mods, and in some cases the mods being better for some games!

I do actually use a mod for Cyberpunk to disable the permanent vignette, and have tried with and without it. No difference. I've also tried true 10-bit vs 8-bit with dithering for HDR, and it makes no difference to this issue of the frame doubling/interpolation not really being active in Cyberpunk. The only fix is to disable HDR. For Cyberpunk I'll wait to see if later versions of Nukem9's mod sort it, or see what the official FSR3 frame gen patch brings, which will probably be soon. But that won't allow the use of DLSS, so it will be a bit ***** really for Nvidia 20/30 series users.
 

Indeed, official FSR 3 will always be a no-go for me for as long as it is locked to FSR 2 upscaling, so thankfully we at least now have this mod.

Three other things:

- Have you used the Windows HDR Calibration tool? (presuming you're on Windows 11)
- Have you disabled Auto HDR if it's enabled? (it's pretty poor most of the time anyway)
- Have you tried DisplayPort instead of HDMI, or vice versa?
 


Yep, in my thorough testing across various games now, FSR2 upscaling is just rubbish compared to DLSS upscaling. I hope FSR upscaling improves, but I don't see how, given the difference in the basic starting points of the tech.

To answer your Qs:
- Yes, my screens are calibrated using the Win11 HDR cal tool
- Just tried disabling Auto HDR. No difference
- Not tried that; I'm kind of stuck with DisplayPort for my current setup. I very much doubt it would have an effect anyway.

I've just done some really thorough testing with HDR on vs off for Cyberpunk, and it's definitely, 100%, what's causing the issue. Interestingly, you don't even have to disable HDR at the system/Windows level to see the difference: you can just turn it on and off in the game menu (which doesn't really turn HDR off at the display/system level) and observe the interpolation switching on and off.
 

Very strange, that. HDMI/DP probably won't make a difference, but they can use different methods of sending HDR/the image (especially on TVs, which IIRC you're using). Also, I noticed you said "screens"; it may be worth disconnecting one display here. Again, it probably won't have any impact, but I have noticed issues before with HDR when both of my displays are connected at the same time.

I can only assume there are some combination issues at play here. I do remember AMD's injected FSR 3/FMF (or whatever it was called) not officially supporting HDR, and that may still be the case in some way, perhaps in combination with adaptive sync displays.

I tried capturing screenshots in motion, but alas it doesn't show well. I might grab a desk phone holder so the phone camera can capture footage directly off the display, as that is far better for showing motion fluidity etc. than a recording uploaded to YouTube.
 
It seems like some of you guys are really struggling to enjoy frame gen, plenty of hand wringing :D.

I question whether it's worth it on very high end, premium-priced cards like the RTX 4090, and the upcoming 4090 Super. I imagine it could help these kinds of cards (and future GPUs) to run at very high resolutions like 6K and 8K, once this tech becomes more widely used. For now, it's possible to get 8K TVs for around £1000 or more.

Also, if the increased delay or input lag is gonna bother you, it's probably not worth using. Still, it beats 20 FPS.

If you need extra performance to reach 60-70 FPS at 4K, I think it works pretty well. It looks better than DLSS or FSR upscaling to me, and you don't have to use either if you prefer native res. People with mid and low-end cards can benefit as well, and so will consoles at some point.

If you have a card with 8 or 10GB VRAM, and want to run at 4K native, it's probably not gonna be enough, so nothing new there.

RT needs so much GPU power that I can't really comment (certainly a good way to tank game performance!), but some have had good results with frame gen on.
 

As shown, it depends on the game. When it works, it works very well; when it doesn't, be that down to the game, the mod or the person's setup, it's not worthwhile using.

Given that a 4090 ***** the bed in some games now, especially launch-day titles, yes, it also needs FG, especially if you are a high refresh rate gamer. Also, in my case, when I game at 175Hz 3440x1440, I would need it in most games to get close to that refresh rate.

Running any FG with a base of 20fps is awful; if you can tolerate it, fair enough. The absolute lowest native/base fps I would go is 40.
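As a rough back-of-envelope (my own numbers, not from the thread): interpolation has to hold the newest real frame back until the generated one has been shown, so the added delay is on the order of one base frame time, which is why a low base frame rate feels so poor.

```python
# Back-of-envelope only: the extra latency from interpolation-based frame gen is
# roughly one base frame time, since the latest real frame is buffered while the
# in-between frame is generated and displayed.
for base_fps in (20, 30, 40, 60):
    frame_time_ms = 1000 / base_fps
    print(f"{base_fps:>2} fps base -> {frame_time_ms:5.1f} ms frame time, "
          f"~{frame_time_ms:5.1f} ms added delay before accounting for overhead")
```

At a 20fps base that is ~50ms of extra delay on top of an already sluggish feel, while a 40fps base roughly halves it, which lines up with 40 being a sensible floor.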
 
Good read, this:



Yesterday, "fake frames" referred to classical black-box TV interpolation. It is funny how the mainstream calls them "fake frames".
But, truth be told, GPUs are already metaphorically "faking" photorealistic scenes by drawing polygons/triangles, textures, and shaders. Reprojection-based workflows are just another method of "faking" frames, much like MPEG/H.26x video standards "fake it" via I-frames, B-frames and P-frames.
That's why video goes "kablooey" and turns into garbage with artifacts during a bit of data loss: if a mere 1 bit gets corrupted in a predicted/interpolated frame in an MPEGx/H.26x video stream, it stays broken until the next full non-predicted/interpolated frame comes in (1-2 seconds later).
Over the long term, 3D rendering is transitioning to a multitiered workflow too (just as digital video did over 30 years ago out of the sheer necessity of bandwidth budgets). Now our sheer necessity is the Moore's Law slowdown bottleneck: we can no longer get much extra performance via the traditional "faking-it-via-polygons" methods, so this is the shortcut around them.
The litmus test is going lagless and artifactless, much like the various interpolated frame subtypes already built into your streaming habits: Netflix, Disney, Blu-ray, E-Cinema, and other current video compression standards that use prediction in their compression systems.
Just as compressors have knowledge of the original material, modern GPU reprojection can gain knowledge via z-buffers and between-frame input reads, and "fake it" perceptually flawlessly, unlike 1993's artifacty MPEG-1. Even the reprojection-based double-image artifacts disappear too!
TL;DR: Faking frames isn't bad anymore if you remove the "black box" factor and make it perceptually lagless and lossless relative to the other methods of "faking frames", like drawing triangles and textures.
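For a concrete sense of the reprojection idea mentioned in that quote, here is a minimal, illustrative sketch (my own, not from the thread or any specific frame gen implementation): it warps a rendered colour frame to a new camera pose using its depth buffer, which is roughly the geometric core that reprojection-based approaches build on before motion vectors, hole filling and disocclusion handling are layered on top.

```python
# Illustrative sketch only (my assumptions, not any shipping implementation):
# warp a rendered frame to a new camera pose using its depth buffer.
import numpy as np

def reproject(color, depth, inv_viewproj_src, viewproj_dst):
    """Forward-warp `color` from the rendered camera to a target camera.

    color: (H, W, 3) float image of the rendered frame.
    depth: (H, W) depth buffer in [0, 1] (OpenGL-style convention assumed).
    inv_viewproj_src: inverse view-projection matrix of the rendered frame.
    viewproj_dst: view-projection matrix of the frame being synthesised.
    """
    h, w = depth.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")

    # Pixel centres -> normalised device coordinates of the source frame.
    ndc = np.stack([
        (xs + 0.5) / w * 2.0 - 1.0,
        1.0 - (ys + 0.5) / h * 2.0,
        depth * 2.0 - 1.0,
        np.ones_like(depth, dtype=float),
    ], axis=-1)

    # Unproject to world space, then project with the target camera.
    world = ndc @ inv_viewproj_src.T
    world /= world[..., 3:4]
    target = world @ viewproj_dst.T
    target /= target[..., 3:4]

    # Back to pixel coordinates; nearest-neighbour scatter, no depth test,
    # no hole filling or disocclusion handling (real frame gen adds all of that).
    u = np.clip(((target[..., 0] + 1.0) * 0.5 * w).astype(int), 0, w - 1)
    v = np.clip(((1.0 - target[..., 1]) * 0.5 * h).astype(int), 0, h - 1)
    out = np.zeros_like(color)
    out[v, u] = color
    return out
```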
 
Running any FG with a base of 20fps is awful; if you can tolerate it, fair enough. The absolute lowest native/base fps I would go is 40.
You might be right; 40 is probably a good baseline. Even at 30, frame generation by itself might not double the frame rate to 60 anyway.

If 60 can't be maintained in some games, I'd be looking at using some upscaling as well.
 
So, my framerate is generally 70+ in Cyberpunk at 4K Ultra (no upscaling). It's impressive tbh.

But when I enable ray tracing (all settings on ultra), it becomes a blurry mess with huge amounts of ghosting constantly.

So, it's pretty framerate-dependent. EDIT - Also, riding around on a bike at high speed, you can see some ghosting if you move the camera left to right quickly, but it looks quite similar to (post-processing) motion blur.
 
What are people's impressions of Intel's XeSS upscaling with the Ultra quality preset? How does it compare to native resolution in your opinion?

I noticed that the base renderer in Cyberpunk 2077 has some visual artifacts on things like billboards; these are not present when either XeSS or FSR is enabled (I only tested at the top presets). So, in this game, the overall visual quality of XeSS might be better in some situations. Maybe there are issues with the temporal AA at native, I'm not sure.
 