
AMD's FSR3 possibly next month?

It's not just DF. Every time upscaling is mentioned, whether the video is about XeSS, FSR or Microsoft AI Super Resolution (yes, that's a thing coming soon), or plainly almost anything about any GPU, they have to go into a 5 or 10 minute diatribe about how DLSS is better than anything else.

It's like those annoying activists: they can't help it. Whenever there is something where the virtues of DLSS can be shoehorned in, they will shoehorn it in, and it doesn't stop there: it takes over as the dominant theme of the video. It is tiring. You click on a video about the H323 Windows patch and the first thing they talk about, and for the majority of the video, is DLSS.

What about the Windows patch you mentioned in your Video title?

It's like Nvidia have a pay-per-minute-of-talking-about-DLSS program; it feels like that.
Lol, I get what you're saying, but it's like a weekly show now on DF.

Hub and Nexus have mentioned a few times over the years how vendors will try to guide them into talking about features in a way that puts them in the best possible light, and will ask (pressure :p) why they're not if they don't.

The target viewing audience is predominantly running NV, so they are probably more than aware of what they're doing anyway.

Just wait for DF to do a deep dive on Switch 2 DLSS* and show us why it's better than FSR on the OG Switch, with 100,000x magnification comparisons on an 8" panel. :p
 
Lol, I get what you're saying, but it's like a weekly show now on DF.

Hub and Nexus have mentioned a few times over the years how vendors will try to guide them into talking about features in a way that puts them in the best possible light, and will ask (pressure :p) why they're not if they don't.

The target viewing audience is predominantly running NV, so they are probably more than aware of what they're doing anyway.

Just wait for DF to do a deep dive on Switch 2 DLSS* and show us why it's better than FSR on the OG Switch, with 100,000x magnification comparisons on an 8" panel. :p

I think even their target audience would get tired of it when every other video they click on is a digression from the subject they wanted to see into a long, rambling DLSS advert.

I like the card that I have; if every other video I clicked on, no matter what the GPU-related subject, worked its way round to a 5-minute shill piece on Sapphire GPUs, I would start to lose my ______ sanity. No one needs that much confirmation bias, FFS...

I think this is part of the reason the hate for Nvidia as a company keeps growing: everything related to this subject ends up being about Nvidia. It's like a horrible smell that seeps into everything and you can't get rid of it.

As for DF, we all know they actually are Nvidia shills, don't we? Hasn't that been pretty much established? If not... HOW?
Everything Nvidia-related they put out should be taken in that context: they are an advertising arm for Nvidia.

With an AMD GPU now: yes, there is a difference between FSR and DLSS, but it's a slight difference that you can only see when every video on the internet is constantly telling you there is a difference. Why do you think there is so much of it? Nvidia are afraid people will forget and then stop noticing. Gotta keep the reasoning up for a £550 GPU that's equivalent to a £450 AMD GPU. The moment you stop thinking DLSS is, in this case, worth £100, Nvidia have to start dropping to only 70% margins.
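To put rough numbers on that price point, here's a minimal sketch; the unit cost is a purely hypothetical figure chosen for illustration, and only the £550/£450 prices and the 70% margin figure come from the post above.

```python
# Hypothetical margin arithmetic for the pricing argument above.
# The unit cost is an assumption for illustration only; the real BOM is unknown.
UNIT_COST = 135.0          # assumed cost to build/ship the card (GBP) - made up
PRICE_WITH_PREMIUM = 550.0 # NV card price from the post
PRICE_WITHOUT = 450.0      # comparable AMD card price from the post

def gross_margin(price: float, cost: float) -> float:
    """Gross margin as a fraction of the selling price."""
    return (price - cost) / price

for label, price in [("at £550", PRICE_WITH_PREMIUM), ("at £450", PRICE_WITHOUT)]:
    print(f"{label}: margin = {gross_margin(price, UNIT_COST):.0%}")
# at £550: margin = 75%
# at £450: margin = 70%  <- the "only 70% margins" scenario once the £100 premium goes
```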
 
The only takeaway I ever get now from Alex's opinion is that DLSS is champ, but it's a broken record at this point. Unless DF is receiving funding, Alex needs to read the room more, because more Nvidia users prefer native over upscaling, never mind AMD users, as shown here and everywhere else.

Even Alex's co-host John managed to let slip on a live stream that DLSS often breaks; you could even notice his discomfort after saying it.

The problem Alex has is that his audience is mostly running 1080p, and most will only use upscaling because they either need to or their screen is small enough to help hide the artifacts; it certainly isn't for IQ improvements at 1080p.

Upscaling's main benefit is providing longevity; NV has identified that now by paywalling FG, and more paywalls are sure to arrive.

If you're reading this and you enjoy/prefer upscaling, good for you, but personally I don't have any need/want/preference for upscaling currently, as my hardware is fast enough. I'll only reassess my current preferences if/when upscaling tech becomes mandatory to run games, or I stop buying new hardware.

The difference is, there are people actually using DLSS, unlike FSR. Despite FSR being made in a way that makes it accessible to everyone, it turns out that, beyond a handful on here and elsewhere, hardly anyone wants to use it because of how bad it is, more so at lower resolutions and lower presets. With DLSS at least, as evidenced by various sites, it can offer as good as or better than native (which no one has debunked with their own evidence yet [unless cherry-picked scenes], other than "trust me bro").

I believe you stated this before about John, and it turns out he was referring to a specific game IIRC, i.e. the same John who also often states this:



That is true, you ideally don't want to use it at 1080p, although as shown in HUB's latest video, the results can sometimes be better than native. And again, they don't seek to resolve some of their main issues, which could easily be solved by updating to the latest DLSS version and switching to preset C. Which is fine, but this is PC, where you tweak, and with DLSSTweaks and DLSS Swapper it's not exactly a hardship to do.
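For anyone curious what "updating DLSS" actually involves, here's a minimal sketch of the manual version of what a tool like DLSS Swapper automates: backing up and replacing the game's nvngx_dlss.dll with a newer build. Both paths are hypothetical placeholders, and forcing preset C is done separately through DLSSTweaks' config, which isn't shown here.

```python
# Minimal sketch: manually swapping a game's DLSS DLL for a newer build.
# This is the manual version of what DLSS Swapper automates; both paths below
# are hypothetical examples.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeGame")           # hypothetical game install folder
NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL to drop in

def swap_dlss_dll(game_dir: Path, new_dll: Path) -> Path:
    """Back up the game's existing nvngx_dlss.dll and replace it with new_dll."""
    matches = list(game_dir.rglob("nvngx_dlss.dll"))  # location varies per game
    if not matches:
        raise FileNotFoundError(f"No nvngx_dlss.dll found under {game_dir}")
    target = matches[0]
    backup = target.with_name(target.name + ".bak")
    if not backup.exists():                           # keep the original safe
        shutil.copy2(target, backup)
    shutil.copy2(new_dll, target)                     # drop in the newer DLL
    return target

if __name__ == "__main__":
    print("replaced", swap_dlss_dll(GAME_DIR, NEW_DLL))
```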

Nvidia's current FG is not able to work on older hardware because it uses the optical flow accelerator, so they aren't technically paywalling that. What they are stopping short of is providing a software solution like AMD's, but again, from a company with 85+% market share that is in this to make profit, what do you expect.....

Based on the games I play, a 7900 XTX/4070 would not be delivering the FPS I desire at 4K or 3440x1440 175 Hz, therefore it's either use upscaling or reduce settings drastically. I know which I'd rather have, but each to their own.

Forgot all about this; my previous reply was deleted because I just laughed and didn't provide any context, but yes, here you go.



Previously the highest I'd seen was 18 GB+, but after a 4+ hour play session I noticed it hit 20 GB+ VRAM usage, which is why I said my old 3080 would flat out break at these settings. CBase explains the reasons why:



FWIW, if you're basing your knowledge of how new-gen GPUs perform off entirely maxed-out game settings in AFOP, you're doing it wrong; they are simply showing comparative performance at maximised settings, which most users won't run because performance is generally too low. :)

Edit: that VRAM screenshot is from The Plains area, which is more demanding than Kinglor.

Seen 4090 screenshots using over 19 GB in The Plains too.

It seems you haven't watched Alex's video and/or read up on how the Snowdrop engine works:


The VRAM meter allows the user to see how the GPU memory is used by Avatar: Frontiers of Pandora. The VRAM meter does not estimate GPU memory usage, but shows the current values tracked by the game. This is why the VRAM meter is only available when changing graphics settings while already loaded into the game. It is not shown when changing graphics settings in the Main Menu as the values would not be representative. When graphics settings have been changed, but not yet applied a warning will indicate that the VRAM allocation only changes after settings have been applied. Some settings will only have a noticeable effect on the VRAM allocation, after the player moves through the world.


The VRAM meter is split into 4 distinct parts, in order from left to right:


  • The saturated blue part represents the GPU memory used by the game excluding the memory used by the texture streamer.
  • The light blue part represents the memory allocated by the texture streamer.
  • The purple part represents currently unused GPU memory.
  • The grey part represents GPU memory that is not available to the game, as it is used by other apps or the operating system.

The texture streamer in Snowdrop will always try to allocate most of the remaining GPU memory. It will try to keep a certain amount of VRAM (around 350 Mb) free, to account for spikes in memory allocation patterns. When other systems require more VRAM the texture streamer will adjust and stream out more detailed texture mips. On systems with lots of VRAM available, the texture streamer might not fill up the remaining available GPU memory, as all textures are already streamed in with the desired detail mip.


The VRAM meter will turn the Texture streamer segment yellow once it drops below 1000 Mb and red once it drops below 500 Mb. The visual quality of the game will suffer greatly if the texture streamer is restricted to such low values. Users are encouraged to lower their output resolution or Scaling Quality setting (both found in the Video Settings) if the VRAM meter turns yellow or red.

i.e. if the VRAM is there, it will be allocated, but the game/engine is smart enough to know where/what to allocate. Obviously, as shown, if you are going to dial everything up to max, you will face issues; VRAM will of course limit performance to a certain extent, especially at native high resolutions, but as shown, grunt etc. is also a major factor, i.e. you need both the grunt and the VRAM if you wish to whack everything up.
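To make the streamer behaviour above concrete, here is a minimal sketch of the budget logic the quoted docs describe. Only the ~350 MB headroom and the 1000 MB / 500 MB warning thresholds come from the documentation; every input value is a made-up example.

```python
# Simplified model of the Snowdrop VRAM meter described above.
STREAMER_HEADROOM_MB = 350   # VRAM the streamer tries to keep free (from the docs)
WARN_YELLOW_MB = 1000        # meter turns yellow below this streamer budget (docs)
WARN_RED_MB = 500            # meter turns red below this streamer budget (docs)

def streamer_budget(total_vram_mb: int, other_apps_mb: int, game_base_mb: int) -> int:
    """VRAM left for the texture streamer after the OS/other apps and the game's
    non-streaming allocations, minus the headroom it keeps free."""
    remaining = total_vram_mb - other_apps_mb - game_base_mb
    return max(0, remaining - STREAMER_HEADROOM_MB)

def meter_colour(budget_mb: int) -> str:
    if budget_mb < WARN_RED_MB:
        return "red"
    if budget_mb < WARN_YELLOW_MB:
        return "yellow"
    return "ok"

# Example: a 10 GB card vs a 24 GB card with the same (made-up) game settings.
for total in (10_240, 24_576):
    budget = streamer_budget(total, other_apps_mb=1_500, game_base_mb=8_000)
    print(f"{total} MB card -> streamer budget {budget} MB ({meter_colour(budget)})")
# 10240 MB card -> streamer budget 390 MB (red)
# 24576 MB card -> streamer budget 14726 MB (ok)
```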

I've put 100 hours in and have not seen a single issue down to VRAM, and it's running mostly max settings with DLSS Balanced (because there's not enough grunt). This game shows what is possible with good/proper VRAM management, unlike some other games.

The Plains area is definitely harder on performance though; it's more CPU bound.
 
I moved from a 3080 to a 7900 XT, my first AMD card in years, and I must say I miss DLSS; FSR is nowhere near as good for me. If FSR is not improved then I will be going back to Nvidia.
Seems like an expensive semi-sidegrade. Have you tried FSR 3 yet? It seems FSR 2.* is very hit-and-miss depending on the implementation, which is unfortunate.
 
Right now Helldivers 2 is the biggest game there is; there is a lot of video content on it and I have seen a lot of it. Not one, not a single one, has mentioned DLSS at all. The game does not have it, and everyone seems completely oblivious to that.

Because tech journos aren't making a huge fuss about it, to remind you it's something you can't live without.

Won't be long though, I suspect, before Nvidia catch on. This game is a total surprise.
 
I moved from a 3080 to a 7900 XT, my first AMD card in years, and I must say I miss DLSS; FSR is nowhere near as good for me. If FSR is not improved then I will be going back to Nvidia.

It is the main reason I won't consider AMD at all, but now there's also DLDSR and, the most important one, even more so than DLSS, RTX HDR.

Right now Helldivers 2 is the biggest game there is; there is a lot of video content on it and I have seen a lot of it. Not one, not a single one, has mentioned DLSS at all. The game does not have it, and everyone seems completely oblivious to that.

Because tech journos aren't making a huge fuss about it, to remind you it's something you can't live without.

Won't be long though, I suspect, before Nvidia catch on. This game is a total surprise.

Upscaling does not make or break a game, so I'm not sure what relevance it has whether a game has DLSS or not?

Also, it's probably not talked about because the game runs pretty well as it is. Well, supposedly for AMD the performance is pretty poor:


Here's my 4090 and 7900 XTX comparison video. Nvidia definitely seem to have an advantage in this game compared to Radeon. I'm working on some RTX 3080 Ti benchmarking at the moment and am seeing similar advantages between it and the 6800 XT. Normally the 3080 Ti is a bit faster than the 6800 XT; in Helldivers 2 it's around 35% when comparing my current notes, and that sort of advantage adds up when looking at this video comparison.

The devs have said they will be adding DLSS though, and I'll definitely be using it, not just for the extra performance but because the game's AA is so bad: without AA it shimmers like mad, and with AA it still shimmers and then has blur in motion. There is upscaling too, but it is FSR 1, so it looks awful.
 
It is the main reason I won't consider AMD at all, but now there's also DLDSR and, the most important one, even more so than DLSS, RTX HDR.



Upscaling does not make or break a game, so I'm not sure what relevance it has whether a game has DLSS or not?

Also, it's probably not talked about because the game runs pretty well as it is. Well, supposedly for AMD the performance is pretty poor:




The devs have said they will be adding DLSS though, and I'll definitely be using it, not just for the extra performance but because the game's AA is so bad: without AA it shimmers like mad, and with AA it still shimmers and then has blur in motion. There is upscaling too, but it is FSR 1, so it looks awful.
It’s an outlier for sure. Devs need to concentrate on the game not crashing first. Both friends on Nvidia cards were crashing as well as myself. Shame really as it’s my game of the year so far.
 
It’s an outlier for sure. Devs need to concentrate on the game not crashing first. Both friends on Nvidia cards were crashing as well as myself. Shame really as it’s my game of the year so far.

Helldivers 2?

A friend of mine with a 7900 XTX finds it very unstable, constantly crashing unless running the lowest settings.

I have a 7800 XT; I run everything maxed out at 1440p and it's rock solid. I can play for hours without a single slight hiccup, let alone a crash. Weird... :confused:
 
It’s an outlier for sure. Devs need to concentrate on the game not crashing first. Both friends on Nvidia cards were crashing as well as myself. Shame really as it’s my game of the year so far.

So far no crashes here. The game runs very well for all that is happening on screen; very pretty on the whole, but also pretty bland/awful-looking at times, especially with raster methods, e.g. SSR.



The game is great fun though, which is what counts, although it will need more content soon.
 
You're running DLSS/FSR from a higher base output than a single 1080p screen.

For example, 1080p DLSS Performance upscales from 540p, therefore a 15" laptop panel will look better than a 27" panel, and a 27" panel will look better than a 42" panel: the bigger the screen, the worse upscaling looks.
It's still 1080p per screen; overall resolution doesn't count the way it does for 4K, for instance. It's higher res than 1080p overall, but with a MUCH wider field of view. 1440p is a lower resolution overall, but will offer higher image quality per screen than 5760x1080. Ergo, FSR is less than ideal in my case and DLSS is king.

Also, DLSS is an extremely useful AA tool for vegetation and other small assets, making it perfect for the likes of RDR2.

Other than the above, there is also the distance from the screen to the player, just like any other situation where the resolution of the "looked at" object is involved vs "the one who looks". :p
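To put numbers on the points above, here's a minimal sketch that works out the internal render resolution for the usual DLSS/FSR quality modes and the pixel density of a few panel sizes; the scale factors are the commonly documented ones, and the panel sizes are just illustrative examples.

```python
# Internal render resolution for common upscaler quality modes, plus pixel density
# per panel size, to illustrate why upscaling artifacts show more on big screens.
import math

# Commonly documented per-axis scale factors for DLSS/FSR quality modes.
SCALE = {"Quality": 1 / 1.5, "Balanced": 1 / 1.724,
         "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before upscaling to the output."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(internal_res(1920, 1080, "Performance"))  # (960, 540) -> the 540p quoted above
print(internal_res(5760, 1080, "Performance"))  # triple-1080p surround: still 540 lines, 960x540 per screen

for inches in (15.6, 27, 42):                   # illustrative panel sizes
    print(f'{inches}" 1080p panel: {ppi(1920, 1080, inches):.0f} PPI')
```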
 
It's still 1080p per screen; overall resolution doesn't count the way it does for 4K, for instance. It's higher res than 1080p overall, but with a MUCH wider field of view. 1440p is a lower resolution overall, but will offer higher image quality per screen than 5760x1080. Ergo, FSR is less than ideal in my case and DLSS is king.

Also, DLSS is an extremely useful AA tool for vegetation and other small assets, making it perfect for the likes of RDR2.

Other than the above, there is also the distance from the screen to the player, just like any other situation where the resolution of the "looked at" object is involved vs "the one who looks". :p

Native is better than any upscaling. Can we not have GPUs that are powerful enough not to need upscaling? How did we get to the point where we are rewarding these GPU vendors for selling us weak, overpriced GPUs that need upscaling, which Nvidia then also use as a marketing tool for "added value" price hikes? What, because they do that better than AMD?

We are doomed... The Nvidia fan base has a lot in common with the Apple fan base.
 
It's still 1080p per screen; overall resolution doesn't count the way it does for 4K, for instance. It's higher res than 1080p overall, but with a MUCH wider field of view. 1440p is a lower resolution overall, but will offer higher image quality per screen than 5760x1080. Ergo, FSR is less than ideal in my case and DLSS is king.

Also, DLSS is an extremely useful AA tool for vegetation and other small assets, making it perfect for the likes of RDR2.

Other than the above, there is also the distance from the screen to the player, just like any other situation where the resolution of the "looked at" object is involved vs "the one who looks". :p

RDR2 is one of the worst games for its native image, with or without TAA-based scaling. The only acceptable ways to play that game, regardless of your resolution, are:

- with TAA on medium
- with DLSS (making sure to update to a later version of DLSS, due to the issues the game's original DLSS implementation had)
- ideally with DLSS on Performance and DLDSR (see the sketch after this list)
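As a rough illustration of that last combo, here's a minimal sketch of the resolution chain, assuming a 1440p monitor, Nvidia's standard 1.78x/2.25x DLDSR factors and the usual 50% per-axis DLSS Performance scale; the monitor choice is just an example.

```python
# Resolution chain for the "DLDSR + DLSS Performance" combo on an assumed 1440p monitor.
import math

MONITOR = (2560, 1440)     # example monitor, not from the post
DLDSR_FACTOR = 2.25        # total pixel-count multiplier (the other option is 1.78x)
DLSS_PERF_SCALE = 0.5      # per-axis render scale for DLSS Performance

def dldsr_output(native: tuple[int, int], factor: float) -> tuple[int, int]:
    """Resolution the game is told to output at (then downsampled back to the monitor)."""
    axis = math.sqrt(factor)                       # pixel-count factor -> per-axis factor
    return round(native[0] * axis), round(native[1] * axis)

out = dldsr_output(MONITOR, DLDSR_FACTOR)                                     # (3840, 2160)
render = (round(out[0] * DLSS_PERF_SCALE), round(out[1] * DLSS_PERF_SCALE))   # (1920, 1080)

print(f"monitor {MONITOR} -> DLDSR target {out} -> DLSS renders {render}")
# So the GPU renders ~1080p, DLSS reconstructs to 4K, DLDSR downsamples that to 1440p.
```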

Without a TAA-based anti-aliaser, the game looks downright broken, especially with fur/hair:


DLSS is superior in motion too (and that's using the old version of DLSS; with the new version it is even better, and with DLDSR it's in a league of its own):



TAA is a blurry mess:



As Alex/DF's recent video on TAA demonstrated, we aren't going to get away from any kind of upsampling any time soon, and sadly, as shown, without some kind of TAA-based method games can look just awful. If you aren't sensitive to aliasing, shimmering and so on, well then, more power to those people; it's utterly immersion-breaking for me. It's somewhat ruining Helldivers 2 for me at the moment, as the game's shimmering is very distracting and the grunt isn't there to use DLDSR.

Native is better than any upscaling. Can we not have GPUs that are powerful enough not to need upscaling? How did we get to the point where we are rewarding these GPU vendors for selling us weak, overpriced GPUs that need upscaling, which Nvidia then also use as a marketing tool for "added value" price hikes? What, because they do that better than AMD?

We are doomed... The Nvidia fan base has a lot in common with the Apple fan base.

In which games? Even HUB have concluded DLSS/upscaling can at times be better than native. Gamers Nexus, TPU, ComputerBase, PCGamesHardware and OC3D have all stated similar things; surely they aren't all Nvidia shills? And if so, again, the onus is on you and others to debunk their evidence with yours showing otherwise. Surely, if this really is the case, it would be pretty easy to show?

For sheer IQ, DLDSR combined with DLSS is trumping everything though.

If you can create hardware that achieves the visuals we see nowadays without having to sacrifice overall visual fidelity, then I would be contacting AMD, Nvidia and Intel for a job...

You also still seem to think that making such software solutions is free..... it most definitely is not; software engineers' time to come up with these features and continually improve upon them is not free. This is why AMD like open source: they can throw their solutions over the fence and let others improve them for them, hence why the uptake for FSR has been so poor, including Xbox and PS5 still lacking in uptake, which is probably why Microsoft and Sony are looking into doing their own solutions now.
 
Native is better than any upscaling. Can we not have GPUs that are powerful enough not to need upscaling? How did we get to the point where we are rewarding these GPU vendors for selling us weak, overpriced GPUs that need upscaling, which Nvidia then also use as a marketing tool for "added value" price hikes? What, because they do that better than AMD?

We are doomed... The Nvidia fan base has a lot in common with the Apple fan base.
For me, not always, because like I've said, it solves some aliasing issues that regular AA or resolution either doesn't solve at all or doesn't solve as well.

Just for argument's sake, let's assume performance scales linearly. If 10% of the die is dedicated to hardware responsible for proper upscaling, while offering up to 100% more performance (or more) than you'd get from that die area otherwise, I'm all for the upscalers. They are simply the next "trick" to get more with less. Don't forget that even regular rasterization (which is still all about "fakery"), where you have "pure" rendering, doesn't scale everything with resolution; plenty of assets are scaled to 4K or whatever resolution you're using, and (from experience) I think some games were actually internally upscaling to "native" anyway, even before DLSS, as they seem very washed out. Mafia 3 easily comes to mind; it only started to look better when downscaled from 1440p or above.
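As a back-of-the-envelope version of that trade-off, here's a minimal sketch under the same "performance scales linearly with die area" assumption; the 10% area figure and the ~2x upscaling gain are the illustrative numbers from the post, not measurements.

```python
# Back-of-the-envelope die-area trade-off, assuming raster performance scales
# linearly with die area. The 10% area and ~2x upscaling gain are illustrative
# figures from the argument above, not measurements.

BASE_FPS = 60.0            # arbitrary baseline with upscaling hardware present but unused
UPSCALER_DIE_SHARE = 0.10  # fraction of the die spent on upscaling hardware
UPSCALING_SPEEDUP = 2.0    # ~"up to 100% more performance" from using the upscaler

# Option A: reclaim that 10% of die area for more raster units (linear scaling).
raster_only_fps = BASE_FPS / (1.0 - UPSCALER_DIE_SHARE)   # 60 / 0.9 ≈ 66.7 fps

# Option B: keep the upscaling hardware and actually use it.
upscaled_fps = BASE_FPS * UPSCALING_SPEEDUP                # 120 fps

print(f"raster-only die:   {raster_only_fps:.1f} fps (~+11%)")
print(f"with upscaling on: {upscaled_fps:.1f} fps (+100%)")
```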
 
I played a bit of RE3 yesterday because it got added to Game Pass, and in this game TAA > native. Native gaming often sucks unless you have a fetish for jaggies.
 
For me, not always, because like I've said, it solves some aliasing issues that regular AA or resolution either doesn't solve at all or doesn't solve as well.

Just for argument's sake, let's assume performance scales linearly. If 10% of the die is dedicated to hardware responsible for proper upscaling, while offering up to 100% more performance (or more) than you'd get from that die area otherwise, I'm all for the upscalers. They are simply the next "trick" to get more with less. Don't forget that even regular rasterization (which is still all about "fakery"), where you have "pure" rendering, doesn't scale everything with resolution; plenty of assets are scaled to 4K or whatever resolution you're using, and (from experience) I think some games were actually internally upscaling to "native" anyway, even before DLSS, as they seem very washed out. Mafia 3 easily comes to mind; it only started to look better when downscaled from 1440p or above.

DLSS can be applied as AA; it's actually where the tech originates from. Instead, it's used to boost the weak rasterization performance of modern GPUs for a $ premium.
 
DLSS can be applied as AA; it's actually where the tech originates from. Instead, it's used to boost the weak rasterization performance of modern GPUs for a $ premium.

Oh, come on... it's basically level with AMD, which doesn't have dedicated HW for it, when it comes to the 4080 vs the 7900 XTX, and the 4090 "destroys" everything.

I highly doubt any of the GPU players are holding back "true" performance in favour of upscaling tech. As per the TPU GPU database: the 980 Ti to the 1080 Ti is around 67% extra. The 1080 Ti to the 2080 Ti is only 31%, indeed (but you get some new tech to play with), and then you go from 55% faster (3090) to 80% faster (3090 Ti), both vs the 2080 Ti. The 4090 vs the 3090 is 64%, and 41% against the 3090 Ti, but they could probably claw back some extra performance with a "fuller" chip, and basically all of them are within the same ballpark +/-, or with a bigger difference in RT/PT, I'd guess.

Upscalers just give you some extra headroom before you need to drop settings significantly. Again, how much of the chip is dedicated to that, and how much could you realistically gain by using that HW for raster instead?
 