
Radeon Resizable Bar Benchmark, AMD & Intel Platform Performance

Neutrality? Just call me Swiss Turnip!

"Selecting a graphics card... is very much like making love to a beautiful woman. You have to carefully choose one, considering its features and longevity, before smoothly inserting her into your system. Your RAM must be sufficient, or you could end up being disappointed, and sometimes for that extra 10% it can be helpful to get SAM involved..."
 
Basically this Bill:



Rather than just taking a valid observation and knowing this is indeed a thing to consider (see the many forum posters who said they didn't buy a 3070/3080 card because they were worried about its longevity), we had to wait for tech experts (but not really experts, as they have as much knowledge as most of us and are regular folk who need no worshipping).

I appreciate people trying to be neutral though. :)

Except that's the thing... I and others have "noted" it and offered our input as to "potential" reasons why certain people are getting that issue; no one is saying it isn't happening for "certain" systems/users...

It's a certain band of people who don't want to look at the bigger picture or "acknowledge" some other "facts", i.e. why are some 3090 users experiencing fps drops? Why does it seem to be a widespread problem across a whole range of nvidia gpus and not amd? Why did the likes of Tommy experience this issue even at 1440P using FSR with NO HD texture pack? (Although he recently stated that no longer happens, he didn't answer my question as to whether that was from upgrading his cpu and/or adding 16GB more RAM...) Why is my 3080 not showing the issue (unless I enable rebar and don't use FSR)? Why have the developers stated they are looking into the issues if it's purely just vram amount and nothing else (even when people are experiencing the issues without the HD texture pack)? Why did HUB etc. not note the same experience with the 3080 in their recent 3080 vs 6800xt video? Perhaps because they didn't enable rebar in that testing? Why does the FPS remain at single digits and not return to higher FPS like it does in every other game when there is a legit shortage of vram, i.e. as per my cp2077 video... Maybe a vram management issue? Hence why the developers might be looking into it...

These are the questions that no one can or will come back on after all this time... So it's not the clear cut "vram and nothing else" that a small minority are making it out to be when you look at the whole picture, hence the:

Rather than just taking a valid observation and knowing this is indeed a thing to consider

Being very true but it's one side who aren't doing that.....
 
why are some 3090 users experiencing fps drops? Why does it seem to be a widespread problem across a whole range of nvidia gpus and not amd? Why did likes of tommy experience this issue even at 1440P and using FSR with NO HD texture pack? Why does it seem to be a widespread problem across a whole range of nvidia gpus and not amd?
That's anecdotal as we have no information to go on about those users you found from the UBI forum. The reason we put weight behind the likes of Tommy/Gerard is both are savvy users with high end PCs running 3080s. These are not novices from the internet. And their findings are validated by multiple tech media outlets, that's the thing.

Tommy can speak to the questions asked, but as I understand it he only solved the issue by disabling the HD Texture pack. Certainly for Gerard that was the case and he confirmed as much already in the Far Cry 6 thread.

When I had a 3090 I did not experience any issues with Far Cry 6, with or without ReBar and that was a month or so ago. Sure it was 10-15% slower when overclocked than my overclocked Toxic in FC6, but it didn't have any FPS drops. As I recall @gpuerrilla is using a 3090 and completed Far Cry 6 at 4K max settings without FPS drops too. So, if there was a big issue with FPS drops not related to VRAM, we'd have seen it too, right?

Most likely those users you reference on the UBI forums have local system issues, certainly there is not enough evidence to suggest it's an issue with the game that the developers are trying to fix. I think that ship sailed when you used that excuse previously when the game launched in the FC6 thread.

As for this question - Why does it seem to be a widespread problem across a whole range of nvidia gpus and not amd? Widespread problem seems a bit of a stretch, but that can likely be answered by looking at the computerbase chart I posted earlier. Only a few Nvidia GPUs actually meet the minimum requirements of 12-16GB to run the 4K Texture pack yet we have every man and his dog trying to run it and then wondering why they have problems.
 
You're rather forgiving @LtMatt, if you can recall, on your point about the minimum requirements; I'm sure I mentioned it was basically pebkac due to the instructions stating do not install unless you have x vram. I remember someone lost their mind at that, as they have to be correct about everything and, like a dog with a bone, won't drop it. It was reiterated by yourself and a couple of others, yet still it was the broken game's fault, with other people all over the internet shouting about it (ubiiiiiii forum!! [godfall]), yet they don't think to look at the developers' suggestion to, again, not install the HD texture pack as it won't run well at all. Tommy needs to invest in more system RAM amongst other things..

A bit rich demanding evidence from 'reputable sources' yet when it comes to ubisoft forum happy to take their issues as gospel. :rolleyes: The ship most definitely sailed and has circumnavigated the globe thousands of times since. HUB Steve thinks so too. :cry:
 
That's anecdotal as we have no information to go on about those users you found from the UBI forum. The reason we put weight behind the likes of Tommy/Gerard is both are savvy users with high end PCs running 3080s. These are not novices from the internet. And their findings are validated by multiple tech media outlets, that's the thing.

Tommy can speak to the questions asked, but as I understand it he only solved the issue by disabling the HD Texture pack. Certainly for Gerard that was the case and he confirmed as much already in the Far Cry 6 thread.

When I had a 3090 I did not experience any issues with Far Cry 6, with or without ReBar and that was a month or so ago. Sure it was 10-15% slower when overclocked than my overclocked Toxic in FC6, but it didn't have any FPS drops. As I recall @gpuerrilla is using a 3090 and completed Far Cry 6 at 4K max settings without FPS drops too. So, if there was a big issue with FPS drops not related to VRAM, we'd have seen it too, right?

Most likely those users you reference on the UBI forums have local system issues, certainly there is not enough evidence to suggest it's an issue with the game that the developers are trying to fix. I think that ship sailed when you used that excuse previously when the game launched in the FC6 thread.

As for this question - Why does it seem to be a widespread problem across a whole range of nvidia gpus and not amd? Widespread problem seems a bit of a stretch, but that can likely be answered by looking at the computerbase chart I posted earlier. Only a few Nvidia GPUs actually meet the minimum requirements of 12-16GB to run the 4K Texture pack yet we have every man and his dog trying to run it and then wondering why they have problems.

How? They posted their PC specs, which were no slouches, iirc, with various setups across Intel and AMD CPUs.

One of them said they had modified their page file, and you then put their issue purely down to that, which it could very well be... but again, no one can confirm that except the ubi developers.

Again, you say "multiple" tech media outlets as if it were several sites, when it is just 2 to my knowledge? Meanwhile we have TPU and HUB who haven't shown/stated the same issues with FPS dropping to single digits? No idea on other sites, but I can't say they have stated anything, as, if they had, I'm sure we would be hearing about it... In terms of users, I have uploaded my footage and stated that when I too enable rebar I get the same issue (with no FSR), but without rebar, no issues... Why is it that when you can clearly see my 10GB of vram being exceeded, fps doesn't plummet and the frame latency is a smooth line? Surely if it were "vram and nothing else", in theory, my 3080 should have completely locked up and dropped to single digit fps or, at the very least, shown huge frame latency stuttering? PS. I was meaning to post this a while back, but I remember you saying the graphical corruption was down to running out of vram and nothing else; well, that was fixed by an nvidia driver update:

Windows 10/11 Issues
- [Windows11][Far Cry 6]: Geometric corruption occurs in the benchmark and in gameplay. [3441540]

Can't be bothered going back through the threads, but Tommy 100% stated the issue was also happening at 1440p with fsr and with the hd texture pack disabled. How do you explain that? Bearing in mind, he exceeds the spec requirements without the hd texture pack, and that was at 1440p with fsr too... iirc, Gerard stated the same except he also mentioned he was getting a lot of CTDs, again, something I and many others have not noted...

Again, you're kind of proving my point with the whole "just because 1-2 people experience it or don't experience it doesn't mean you can apply it across the board" thing... until it is unanimous, you can't just take 1-2 people and 2 sites and say "vram and nothing else"...

Also, if you look at the ubi forums, you would see it is happening on a wide range of nvidia cards even with >=12GB vram.

Except it didn't. Do you not remember all my posts with screenshots showing texture issues etc., something you were adamant was 100% down to vram and nothing else? The developers announced they had found the issue and were looking into it; fast forward 2-3 months (?) and the developers released a patch, which fixed that issue for several users on here, and it just so happens they dropped their 16GB vram requirement down to a 12GB vram requirement...

Like I said all along in the FC thread as per the texture loading issue, maybe there is something on nvidia's driver end they need to sort for those seeing fps drops, and/or maybe it is a game side thing too...
 
You're rather forgiving @LtMatt, if you can recall, on your point about the minimum requirements; I'm sure I mentioned it was basically pebkac due to the instructions stating do not install unless you have x vram. I remember someone lost their mind at that, as they have to be correct about everything and, like a dog with a bone, won't drop it. It was reiterated by yourself and a couple of others, yet still it was the broken game's fault, with other people all over the internet shouting about it (ubiiiiiii forum!! [godfall]), yet they don't think to look at the developers' suggestion to, again, not install the HD texture pack as it won't run well at all. Tommy needs to invest in more system RAM amongst other things..

A bit rich demanding evidence from 'reputable sources' yet when it comes to ubisoft forum happy to take their issues as gospel. :rolleyes: The ship most definitely sailed and has circumnavigated the globe thousands of times since. HUB Steve thinks so too. :cry:

zzzzz, as per usual no substance and going purely on other people's experiences/opinions I see:

I would go further and use my own experience, I mean if you are an enthusiast or have been working in the field for long enough you should already know what's a turd or mediocre. If you base your whole purchase on someone else's opinion you would have to be a novice or plain lazy.

:cry:

Clearly your memory has gone again, or perhaps, as per usual, you don't read the links that are presented right in front of you and instead want to be lazy and/or a novice by "echoing" other people's posts to suit your narrative... as you don't seem to recall the "texture loading/rendering" issue being fixed by a patch, after the developers stated many times that they had found the issue and would be rolling out a fix... Shock horror how the vram requirement dropped from 16gb to 12gb too; who knows, if they do find the issue, which seems to be affecting "some" 3090 owners, maybe that requirement will drop from 12 to 10gb :cry:

PS.

Thanks for once again further proving this post of mine or rather proving your own point :cry:

Except that's the thing... I and others have "noted" it and offered our input as to "potential" reasons why certain people are getting that issue; no one is saying it isn't happening for "certain" systems/users...

It's a certain band of people who don't want to look at the bigger picture or "acknowledge" some other "facts", i.e. why are some 3090 users experiencing fps drops? Why does it seem to be a widespread problem across a whole range of nvidia gpus and not amd? Why did the likes of Tommy experience this issue even at 1440P using FSR with NO HD texture pack? (Although he recently stated that no longer happens, he didn't answer my question as to whether that was from upgrading his cpu and/or adding 16GB more RAM...) Why is my 3080 not showing the issue (unless I enable rebar and don't use FSR)? Why have the developers stated they are looking into the issues if it's purely just vram amount and nothing else (even when people are experiencing the issues without the HD texture pack)? Why did HUB etc. not note the same experience with the 3080 in their recent 3080 vs 6800xt video? Perhaps because they didn't enable rebar in that testing? Why does the FPS remain at single digits and not return to higher FPS like it does in every other game when there is a legit shortage of vram, i.e. as per my cp2077 video... Maybe a vram management issue? Hence why the developers might be looking into it...

These are the questions that no one can or will come back on after all this time... So it's not the clear cut "vram and nothing else" that a small minority are making it out to be when you look at the whole picture, hence the:

Rather than just taking a valid observation and knowing this is indeed a thing to consider

Being very true but it's one side who aren't doing that.....

PS. PS. I see you still haven't figured out that Steve is referring to the 3070 8GB and not the 3080 10GB, oh lordy :cry:


@Woodsta888 @TNA got any popcorn left? :D
 
You're rather forgiving @LtMatt, if you can recall, on your point about the minimum requirements; I'm sure I mentioned it was basically pebkac due to the instructions stating do not install unless you have x vram. I remember someone lost their mind at that, as they have to be correct about everything and, like a dog with a bone, won't drop it. It was reiterated by yourself and a couple of others, yet still it was the broken game's fault, with other people all over the internet shouting about it (ubiiiiiii forum!! [godfall]), yet they don't think to look at the developers' suggestion to, again, not install the HD texture pack as it won't run well at all. Tommy needs to invest in more system RAM amongst other things..

A bit rich demanding evidence from 'reputable sources' yet when it comes to ubisoft forum happy to take their issues as gospel. :rolleyes: The ship most definitely sailed and has circumnavigated the globe thousands of times since. HUB Steve thinks so too. :cry:
I know I'm banging my head against a brick wall, I should give up. :cry:
 
I know I'm banging my head against a brick wall, I should give up. :cry:

It's pointless going on about it anymore; we've been through the exact same points many times, and most of the questions are ignored or not answered sufficiently, so it's best to just agree to disagree and all that until something of concrete evidence/substance comes up...

If you want to continue this, keep it to one of the many other vram related threads or the FC 6 thread.




Matt, can you maybe get someone in AMD to do an in-depth look at SAM/rebar and what's really required to get the best from it? I would love to know why some games see no difference; is there something game developers need to do on their side? I would have expected FC 6 to have seen a big benefit from SAM/rebar, it being AMD sponsored and vram heavy...
 
Matt, can you maybe get someone in AMD to do an in-depth look at SAM/rebar and what's really required to get the best from it? I would love to know why some games see no difference; is there something game developers need to do on their side? I would have expected FC 6 to have seen a big benefit from SAM/rebar, it being AMD sponsored and vram heavy...
I can already answer that: you just need a Ryzen 5000 series CPU and an RDNA2 GPU for the best SAM performance across multiple games. As to why some games show big gains and some show none, that's above my pay grade. What I do know is that there is no logical reason to disable SAM using the configuration above, which is why it's automatically enabled during driver installation on compatible systems. It wouldn't be enabled globally if there were a long list of games where performance suffers. That's why the excuses used to justify keeping it off don't wash with me.
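For anyone wanting to sanity-check whether SAM/ReBAR is actually active on their own system before comparing numbers, one quick way on Linux is to read the GPU's BAR size from lspci: with ReBAR in effect the BAR covers (roughly) the whole VRAM instead of the legacy 256MB window. A minimal sketch, assuming a Linux box with lspci installed (the capability listing may need root; GPU-Z and the Radeon settings panel report the same information on Windows):

# Minimal sketch, assuming Linux with lspci available (capabilities may need root).
# Reports whether a GPU exposes a BAR larger than the legacy 256MB window,
# i.e. whether Resizable BAR / SAM is actually in effect.
import re
import subprocess

def rebar_status() -> str:
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    for block in out.split("\n\n"):                  # one block per PCI device
        if "VGA compatible controller" not in block:
            continue
        # The Resizable BAR capability prints lines such as:
        #   BAR 0: current size: 16GB, supported: 256MB 512MB ... 16GB
        for size, unit in re.findall(r"current size: (\d+)([MG])B", block):
            size_mb = int(size) * (1024 if unit == "G" else 1)
            if size_mb > 256:
                gpu = block.splitlines()[0]
                return f"Resizable BAR active ({size}{unit}B aperture) on: {gpu}"
    return "No BAR larger than 256MB found - ReBAR/SAM likely disabled"

if __name__ == "__main__":
    print(rebar_status())

Nothing authoritative, just a quick way to rule out the "was rebar even on?" question when comparing results.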

If the HUB testing was done on older CPUs, sure, I'd go along with keeping it off. They don't test on older CPUs though, not for their hardware/game benchmark tests for the most part. Typically you don't want to artificially limit system performance when testing graphics cards, as you want to show the best possible performance each card can offer.

If you want to show limited performance on older CPUs where bottlenecks may be present, fair enough. That's a different video though.
This guy's channel isn't bad @LtMatt, although he might not be on the official reputable sources list ;)

He does have both AMD and nvidia cards (6800XT; 3080 12GB), so he might have some snippets from his comparisons/charts etc.
If he's not going to believe actual users with the issue and reputable sites such as pcgh and computerbase, he's not going to believe that bloke. :p

I like his channel though and do watch his content sometimes. :)
 
Can't be bothered going back through the threads, but Tommy 100% stated the issue was also happening at 1440p with fsr and with the hd texture pack disabled. How do you explain that? Bearing in mind, he exceeds the spec requirements without the hd texture pack, and that was at 1440p with fsr too

I 100% said multiple times that happened ONLY while trying to recover the game by reducing settings after the arse falls out of it!

It ONLY happened AFTER the game ran out of vram on 4K HD texture pack/RT/max settings, and then while trying in-game to recover it.

(Remember, we're trying to recover a slideshow here.) It would not recover no matter what settings were reduced, all the way down to 1440p with FSR.

If it runs out, forget recovering WITHOUT a restart.

If you reload the game with reduced settings, of course it runs at 1440p with FSR!



I gave up replying to you about it because you don't listen!

The TPU FC6 performance slides state: Max Details, RT off; I don't see any mention of running the HD pack either:

https://www.techpowerup.com/review/far-cry-6-benchmark-test-performance/4.html
 
I'm amazed at your patience @Nexus18.

Bottom line, Microsoft, Nvidia and Sony all plumped for 10GB this generation, a 25% increase over the previous generation for PC. Two highly unoptimised AMD sponsored titles require more on PC, therefore Microsoft, Nvidia and Sony all got it wrong according to our resident knitting circle. Elden Ring is a good example of how to stream data; at 1440p it uses 4-5GB of VRAM, with very little CPU/GPU overhead.

I suspect Nvidia just isn't putting the effort into ReBAR at the moment as they have no incentive to do so while developing both Lovelace and Hopper. But as you say, it would be good to get some low level input.
 
I'm amazed at your patience @Nexus18.

Bottom line, Microsoft, Nvidia and Sony all plumped for 10GB this generation, a 25% increase over the previous generation for PC. Two highly unoptimised AMD sponsored titles require more on PC, therefore Microsoft, Nvidia and Sony all got it wrong according to our resident knitting circle. Elden Ring is a good example of how to stream data; at 1440p it uses 4-5GB of VRAM, with very little CPU/GPU overhead.

I suspect Nvidia just isn't putting the effort into ReBAR at the moment as they have no incentive to do so while developing both Lovelace and Hopper. But as you say, it would be good to get some low level input.

You're comparing Game Consoles to high end Graphics Cards.
 
Time for a popcorn.gif trying to watch Wrinkly explain something he does not understand. :cry:

Right exactly... he heard somewhere that the PS5 SSD is really fast, so there is no need for VRAM capacity because of PCIe 4/5 SSDs...
 
Ideally from system RAM at the moment before falling back to slower drives. Where did you think?


Your system RAM is an order of magnitude slower than your VRAM; any slowdown in texture streaming is a slowdown in performance.

The system will not do it unless it really needs to, but if your texture files are too large they end up shunted to system RAM; what you get then is a bottleneck in streaming, and you end up with the effective memory bandwidth of an integrated GPU.
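To put rough numbers on that (nominal spec-sheet figures for one illustrative pairing, not measurements): an RTX 3080 10GB has around 760GB/s of local VRAM bandwidth, while anything spilled to system RAM has to be streamed back over PCIe 4.0 x16 (~32GB/s) from dual-channel DDR4-3200 (~51GB/s), which lands it in roughly the bandwidth class an integrated GPU works with. A small sketch of that arithmetic:

# Back-of-the-envelope sketch using nominal spec-sheet numbers (illustrative,
# not measured): the bandwidth a GPU sees when a texture has to be streamed
# from system RAM over PCIe instead of sitting in local VRAM.
vram_bw_3080 = 760.0        # GB/s - RTX 3080 10GB GDDR6X, nominal
sysram_bw_ddr4_3200 = 51.2  # GB/s - dual-channel DDR4-3200, nominal
pcie4_x16_bw = 31.5         # GB/s - PCIe 4.0 x16, one direction, theoretical

# Spilled textures are limited by the slowest link in the chain.
spill_bw = min(sysram_bw_ddr4_3200, pcie4_x16_bw)

print(f"Local VRAM:     {vram_bw_3080:6.1f} GB/s")
print(f"Spilled to RAM: {spill_bw:6.1f} GB/s (~{vram_bw_3080 / spill_bw:.0f}x slower)")
# Roughly the bandwidth class an integrated GPU works with, which is the
# streaming bottleneck described above.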
 
Time for a popcorn.gif trying to watch Wrinkly explain something he does not understand. :cry:

I'm a self-taught developer who started with the 1K ZX81 and a copy of Programming the Z80 by Rodnay Zaks. I also did some 6502 and 68000 before landing on x86 at 16, where I started professionally developing bespoke systems. At 21 I started my own little company which still, at 53, affords me a somewhat relaxed lifestyle. I mention my early experience with software as it taught me to be efficient with both code and data.
 