I can already answer that: you just need a Ryzen 5000 series CPU and an RDNA2 GPU for the best SAM performance across multiple games. As for why some games show big gains and others show none, that's above my pay grade. What I do know is that there's no logical reason to disable SAM on the configuration above, which is why it's automatically enabled during driver installation on compatible systems (there's a quick way to check it's actually active in the sketch below). It's not enabled globally because there's a long list of games where performance suffers. That's why the excuses used to justify keeping it off don't wash with me.
If HUB's testing had been done on older CPUs, sure, I'd go along with keeping it off. They don't test on older CPUs though, not for their hardware/game benchmark tests for the most part. Typically you don't want to artificially limit system performance when testing graphics cards, as you want to show the best possible performance each card can offer.
If you want to show limited performance on older CPUs where bottlenecks may be present, fair enough. That's a different video though.
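For anyone who wants to confirm Resizable BAR/SAM really is active rather than just trusting the installer, here's a rough sketch of my own (an illustration, not an official tool). It assumes an Nvidia card with nvidia-smi on the PATH; on the AMD side you'd simply check the Smart Access Memory toggle in Adrenalin. The tell-tale is a BAR1 aperture the size of the whole VRAM pool instead of the legacy 256 MiB window:

```python
# Rough check for Resizable BAR on an Nvidia card: if the BAR1 aperture
# reported by nvidia-smi is as big as the whole VRAM pool (rather than
# the legacy 256 MiB window), ReBAR is active.
# Assumes nvidia-smi is on the PATH; this is just an illustration.
import re
import subprocess

def bar1_total_mib():
    out = subprocess.check_output(["nvidia-smi", "-q", "-d", "MEMORY"], text=True)
    # The "BAR1 Memory Usage" section contains a "Total : N MiB" line.
    bar1_section = out.split("BAR1 Memory Usage", 1)[1]
    match = re.search(r"Total\s*:\s*(\d+)\s*MiB", bar1_section)
    return int(match.group(1))

if __name__ == "__main__":
    total = bar1_total_mib()
    print(f"BAR1 aperture: {total} MiB")
    print("Resizable BAR looks", "enabled" if total > 256 else "disabled")
```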
Again, I'm not disputing any of those bits......

I just want to know why some games see no benefit, or arguably in some cases a decline. Despite what you and a couple of others are trying to have us believe, it doesn't come across as a simple on/off switch when the results are all over the place depending on game and hardware...
Surely it's in AMD's best interest to ensure that game developers know how to get the best from SAM/rebar? So isn't it worth bringing this to the attention of whoever it is at AMD? It would be a good PR piece at the very least, given you've made clear many times now how much superior SAM is to Resizable BAR, even though they are technically the same thing, or to put it better, achieve the same objective, i.e. this:
If he's not going to believe actual users with the issue and reputable sites such as PCGH and ComputerBase, he's not going to believe that bloke.
And there we go again with this point... Where have I or others stated that we don't believe the likes of Tommy, Gerard and those two tech sites?
Again, see this point:
Except that's the thing... I and others have "noted" it and offered our input as to "potential" reasons why certain people are getting that issue; no one is saying it isn't happening for "certain" systems/users...
It's a certain band of people who don't want to look at the bigger picture or "acknowledge" some other "facts", i.e.:
- Why are some 3090 users experiencing fps drops?
- Why does it seem to be a widespread problem across a whole range of Nvidia GPUs and not AMD?
- Why did the likes of Tommy experience this issue even at 1440p, using FSR, with NO HD texture pack? (Although he recently stated that no longer happens, he didn't answer my question as to whether that was from upgrading his CPU and/or adding 16GB more RAM.)
- Why is my 3080 not showing the issue (unless I enable rebar and don't use FSR)?
- Why have the developers stated they are looking into the issues if it's purely just VRAM amount and nothing else (even when people are experiencing the issues without the HD texture pack)?
- Why do HUB etc. not note the same experience with the 3080 in their recent 3080 vs 6800 XT video? Perhaps because they didn't enable rebar in that testing?
- Why does the FPS remain at single digits and not return to higher FPS like it does in every other game when there is a legit shortage of VRAM, i.e. as per my CP2077 video? Maybe a VRAM management issue? Hence why the developers might be looking into it...
These are the questions that no one can or will come back on after all this time... So it's not the clear-cut "VRAM and nothing else" situation that a small minority are making it out to be when you look at the whole picture, hence the:
Rather than just taking a valid observation and knowing this is indeed a thing to consider
Being very true, but it's one side who aren't doing that...
Time for a popcorn.gif trying to watch Wrinkly explain something he does not understand.
Pretty sure Wrinkly works in development, in which case he will be a lot more clued up than the majority here.

I 100% said multiple times that this happened ONLY while trying to recover the game by reducing settings after the arse falls out of it!
It ONLY happened AFTER the game ran out of VRAM on 4K HD texture pack/RT/max settings, and then trying in-game to recover it.
(Remember, trying to recover a slideshow here.) It would not recover no matter which settings I reduced, all the way down to 1440p with FSR.
If it runs out, forget recovering WITHOUT a restart.
If you reload the game with reduced settings, of course it runs at 1440p with FSR!
I gave up replying to you about it because you don't listen!
The TPU FC6 performance slides state "Max Details, RT off", and I don't see any mention of running the HD pack either:
https://www.techpowerup.com/review/far-cry-6-benchmark-test-performance/4.html
Well, maybe you should have made that clearer the previous times I asked you to explain.


OK, but you do realise that in that "same" article, your very own quote, he stated this in the summary right here:
What's really interesting is VRAM usage. I measured well over 10 GB with the 4K, HD Texture pack, and ray tracing combo, which does stutter sometimes on the 10 GB GeForce RTX 3080, but runs perfectly fine with cards offering 12 GB VRAM or higher. I guess I was wrong when I said that the RTX 3080's 10 GB will suffice for the foreseeable future.
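If anyone wants to sanity-check VRAM usage on their own card while the game is running, a quick poll of nvidia-smi does the job. This is just a rough sketch of my own, not how TPU measured it; it assumes an Nvidia card with nvidia-smi on the PATH, and the one-second interval is arbitrary:

```python
# Rough sketch: poll nvidia-smi once a second and print VRAM usage,
# so you can watch what happens when the HD texture pack loads in.
# Assumes an Nvidia GPU with nvidia-smi available on the PATH.
import subprocess
import time

def vram_usage_mib():
    """Return (used, total) VRAM in MiB as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = out.strip().splitlines()[0].split(",")
    return int(used), int(total)

if __name__ == "__main__":
    while True:
        used, total = vram_usage_mib()
        print(f"{time.strftime('%H:%M:%S')}  {used} / {total} MiB")
        time.sleep(1)
```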
I'm amazed at your patience @Nexus18.
Bottom line: Microsoft, Nvidia and Sony all plumped for 10GB this generation, a 25% increase over the previous generation for PC. Two highly unoptimised AMD-sponsored titles require more on PC, therefore Microsoft, Nvidia and Sony all got it wrong according to our resident knitting circle. Elden Ring is a good example of how to stream data: at 1440p it uses 4-5GB of VRAM, with very little CPU/GPU overhead.
I suspect Nvidia just isn't putting the effort into RBAR at the moment, as they have no incentive to do so while developing both Lovelace and Hopper. But as you say, it would be good to get some low-level input.
Someone has to educate these folks




