• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD's FSR3 possibly next month?

I do recall that a 3080 with FSR3 (via mod) beats a 7900 XTX also using FSR3 in the same ray-traced games, does it not?
I don't have a clue, as I don't enable RT; it's horrendous value for the outlay.

Loving Avatar: everything RT set to low and highest textures for high FPS. The 3080 would flat out break at these settings.

I did have a 4090 too, but sent it back and got the 7900 XTX instead.

Thanks for the progress, AMD, in helping older NV card users not pick a new AMD card.
Think that's hitting NV's pocket more than AMD's, tbh, as DLSS3 was mostly the only reason to upgrade.
 
Loving Avatar: everything RT set to low and highest textures for high FPS. The 3080 would flat out break at these settings.

Got evidence for that? I didn't see a 3080 flat out breaking at all here with most things on max (including textures), so with low RT it should be doing pretty well. Most GPUs struggle to run this maxed out without upscaling at 4K, so I'm not sure why you would expect a 3-4 year old 3080 to be doing better than the recent greatest GPUs?

[attached screenshot]
 
[Image: Frank Azor post]



[Image: FSR3 available and upcoming games]
 
I've only tested FSR 3 a little, but so far the upscaling has been working very well and has been a positive experience. FG, though, is another matter. I've tried it in Starfield and Cyberpunk, and no matter what my base FPS is, the experience absolutely stinks to high heaven. In Cyberpunk, for example, I can have a base FPS of 90-100, enable FG in Radeon Settings, and the FPS shoots to 200, but it feels so stuttery and unplayable (it also gets worse for some reason if my mouse is set to a polling rate of 8000 Hz vs 1000 Hz). Starfield is more of the same, using the in-game FG option: FPS goes from 70 to whatever triple digits, but the feel and look is just stuttery and horrible. Might be my hardware combo, might be something else, or it might just be that FG stinks; I don't know. Btw, Starfield also seems to hate it when my mouse is set to 8000 Hz and FG is on.
 
The hitching you see in this is the game shader caching. This part of the game is the oldest: this planet and this city are 7 years old, and they don't have any of the later optimisation techniques developed since. It's actually in the process of being redone.

To try it I went to the worst-case scenario: Loreville with the clouds on Very High.

The colour is weird because I had HDR on.

So, it works. It's not problem-free, but better than I expected. It's a strange experience, because you can feel the lag of 30 FPS, but it looks like 60 FPS. I did have AMD's overlay on the right-hand side, which shows you the FPS with AFMF on, but it didn't record its own overlay.

The only thing I noticed was that when panning back over the city, the yellow neon was slightly stuttery, but that was at a 25 FPS render.

And that's the thing: this is not the proper frame gen stuff. AMD have that too, but like Nvidia's, the game needs to support it. This is just something AMD did so you can have an admittedly poorer version of it in any game. Even AMD say it's not problem-free and that you shouldn't use it unless you're running 60 FPS or more render frames.

Ultimately I think it's good that AMD put the work in to give us this. Even if it's not perfect, it does work, even at 30 FPS. Now, I'm sure some tech journo could spend a couple of hours analysing that video frame by frame to point out all sorts of problems and call it _____, because as Linus and others have admitted, negative videos are the ones that get views.

But I'm not doing that; I'm just playing the game, and it is an improvement vs not running it, even at 30 FPS.

So would I run it daily? In short: no, not as things are. The long version is that while it does look smoother, it still feels like 30 FPS, and there is some weirdness about it in Star Citizen because of the low FPS. So until the frame rates in SC improve, I'd rather keep it as is. I don't know why, but I feel that if it feels 30 FPS laggy, then it should look 30 FPS laggy.
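
Some rough numbers for why it can look like 60 FPS but still feel like 30. This is my own simplified model, not AMD's actual pipeline: frame interpolation doubles the displayed frames, but input is still only sampled once per rendered frame, so the input cadence never improves.

```python
# Simplified model of frame interpolation (e.g. AFMF-style):
# one generated frame is inserted between each pair of rendered frames,
# doubling displayed FPS, while input latency still tracks the render rate.

def perceived_stats(render_fps: float, interpolated: bool):
    displayed_fps = render_fps * 2 if interpolated else render_fps
    # Input is sampled per rendered frame, so the input cadence is
    # bounded by the render frame time regardless of displayed FPS.
    render_frame_ms = 1000.0 / render_fps
    return displayed_fps, render_frame_ms

fps, cadence = perceived_stats(30, interpolated=True)
print(f"Displayed: {fps:.0f} FPS, input cadence: {cadence:.1f} ms (30 FPS feel)")
```

Which is exactly the described experience: smoother motion on screen, unchanged responsiveness in the hands.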

However, I am going to try it in other games that are 60 FPS+.

So, against all the negative tech videos collecting clicks from AMD's work, I'd like to say: good job AMD for giving us this option and putting the work in. Because why bother, if everyone is just going to hate on it for clicks when you yourself say it's not perfect, but it's there if you want to try it? Thank you!

Hopefully SC will get proper Frame Gen.


AFMF is not perfect, but at least it exists, and with that it can be improved on.
 
That channel fell off quickly. I've been watching his vids for ages; a few years ago they got tens of thousands to hundreds of thousands of views, and his newest stuff gets like 500 views now.
 
Alex going hard on FSR 2 upscaling. Here's hoping AMD take notice and actually focus on resolving the core issues, which are the main thing holding it back from competing with DLSS:

[attached screenshots of the analysis]
 
The only takeaway I ever get now from Alex's opinion is that DLSS is champ, but it's a broken record at this point. Unless DF is receiving funding, Alex needs to read the room more, because more Nvidia users prefer native than upscaling, never mind AMD users, as shown here and everywhere else.

Even Alex's co-host John managed to slip out on a live stream that DLSS often breaks; you could even notice his discomfort after saying it.

The problem Alex has is that his audience is mostly running 1080p, and most will only use upscaling because they either need to, or their screen is small enough to help hide the artifacts. It certainly isn't for IQ improvements at 1080p.

Upscaling's main benefit is providing longevity. NV has identified that now by paywalling FG, and more paywalls are sure to arrive.

If you're reading this and you enjoy/prefer upscaling, good for you, but personally I don't have any need/want/preference for upscaling currently, as my hardware is fast enough. I'll only reassess my current preferences if/when upscaling tech becomes mandatory to run games, or I stop buying new hardware.
 
The problem Alex has is that his audience is mostly running 1080p, and most will only use upscaling because they either need to, or their screen is small enough to help hide the artifacts. It certainly isn't for IQ improvements at 1080p.

I use 3x 1080p screens, which at 27" per screen is definitely not small, and I should see issues sooner than someone at 1440p or 4K.
From my experience, FSR is decent at higher res. DLSS is good overall.
 
Loving Avatar: everything RT set to low and highest textures for high FPS. The 3080 would flat out break at these settings.
Got evidence for that? I didn't see a 3080 flat out breaking at all here with most things on max (including textures)
Forgot all about this. My previous reply was deleted because I just laughed and didn't provide any context, but yes, here you go.

[Screenshot: Avatar: Frontiers of Pandora at 20 GB+ VRAM usage]


Previously the highest I'd seen was 18 GB+, but after 4+ hours of gameplay I noticed it hit 20 GB+ VRAM usage, which is why I said my old 3080 would flat out break at these settings. CBase explains the reasons why:

Avatar: Frontiers of Pandora shows extremely detailed surfaces in places, which also require a good portion of graphics card memory. 8 GB is no longer recommended for maximum graphic details in Full HD including upsampling; it should be at least 10 GB. The editors recommend 12 GB for WQHD and 16 GB for Ultra HD.

The information is no guarantee that it will be sufficient for a longer playing time.
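
The quoted recommendations boil down to a simple lookup per output resolution. A quick sketch of that mapping (the numbers come from the quote above; the function and names are mine, for illustration only):

```python
# VRAM recommendations for Avatar: Frontiers of Pandora at maximum
# details, per the figures quoted above.
RECOMMENDED_VRAM_GB = {
    "1080p": 10,  # Full HD: 8 GB no longer recommended, even with upsampling
    "1440p": 12,  # WQHD
    "2160p": 16,  # Ultra HD
}

def meets_recommendation(resolution: str, card_vram_gb: int) -> bool:
    """Check whether a card's VRAM meets the quoted recommendation."""
    return card_vram_gb >= RECOMMENDED_VRAM_GB[resolution]

# A 10 GB 3080 meets the 1080p bar but falls well short at Ultra HD.
print(meets_recommendation("1080p", 10))
print(meets_recommendation("2160p", 10))
```

And as the quote itself warns, even these figures are no guarantee over a longer play session.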

FWIW, if you're basing your knowledge of how new-gen GPUs perform entirely off maxed-out game settings in AFOP, you're doing it wrong. They are simply showing comparative performance at maximised settings, which most users won't run because performance is generally too low. :)

Edit: that VRAM screenshot is from The Plains area, which is more demanding than Kinglor.

I've seen 4090 screenshots using over 19 GB in The Plains too.
 
I use 3x 1080p screens, which at 27" per screen is definitely not small, and I should see issues sooner than someone at 1440p or 4K.
From my experience, FSR is decent at higher res. DLSS is good overall.
You're running DLSS/FSR from a higher base output than a single 1080p screen.

For example, 1080p DLSS Performance upscales from 540p; therefore a 15" laptop panel will look better than a 27" panel, and a 27" panel will look better than a 42" panel. The bigger the screen, the worse upscaling looks.
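
The maths behind that 540p figure: each quality mode renders at a fixed fraction of the output resolution per axis. A small sketch using the commonly published per-axis factors (these are the widely cited DLSS mode scales, not taken from this thread; FSR 2's are very close):

```python
# Commonly published per-axis render scales for upscaler quality modes.
SCALE = {
    "Quality": 2 / 3,          # ~0.667
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(render_resolution(1920, 1080, "Performance"))  # (960, 540)
print(render_resolution(5760, 1080, "Performance"))  # triple-1080p surround
```

So a triple-1080p surround setup still upscales each axis from a much larger base than a single 1080p panel, which is the point being made above.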
 
You're running DLSS/FSR from a higher base output than a single 1080p screen.

For example, 1080p DLSS Performance upscales from 540p; therefore a 15" laptop panel will look better than a 27" panel, and a 27" panel will look better than a 42" panel. The bigger the screen, the worse upscaling looks.

Might explain why I can see DLSS is not perfect: 32-inch screen at 1440p.
 
The only takeaway I ever get now from Alex's opinion is that DLSS is champ, but it's a broken record at this point. Unless DF is receiving funding, Alex needs to read the room more, because more Nvidia users prefer native than upscaling, never mind AMD users, as shown here and everywhere else.

It's not just DF. Every time upscaling is mentioned, be the video about XeSS, FSR, or Microsoft AI Super Resolution (yes, that's a thing coming soon), or just about almost any GPU at all, they have to go into a 5 or 10 minute diatribe about how DLSS is better than anything.

It's like those annoying activists: they can't help it. Whenever there is something where the virtues of DLSS can be shoehorned in, they will, and it doesn't just get shoehorned in, it takes over as the dominant theme of the video. It is tiring. You click on a video about the H323 Windows patch, and the first thing they talk about, and for the majority of the video, is DLSS.

What about the Windows patch you mentioned in your Video title?

It's like Nvidia have a pay-per-minute-of-talking-about-DLSS program; it feels like that.
 
Last edited: