Far Cry 6 GPU performance not bad at all but is severely bottlenecked by CPU



I ran the benchmark myself just to see what I'd get, as I don't notice any FPS dips like that when driving around, so I'd have to concur with LtMatt. Do you actually notice stutters while driving, with FPS dipping below 60?

I'm not seeing any severe dips to 31 FPS at 4K or 49 FPS at 3440x1440 (21:9) like my benchmark shows. It was highlighted by various sites and people here when the game first came out that the benchmark isn't a great reflection of actual gameplay performance.

Just thinking, I didn't even try playing at 4K with no FSR, I only attempted the benchmark, silly me :) I'll give that a go later/tomorrow and report back, although it's a bit pointless really given that at 4k, no GPU has the grunt if you want max settings and high FPS so you'll want to be using FSR/DLSS/NIS tech.
 
I stopped using MSI Afterburner because the performance delta with it enabled wasn't worth it for me. I fully trust the Radeon software now (not sure what they call the OC'ing feature these days). Doing so has provided both lower latency and higher FPS. Performance results in this game have been fabulous, to be frank. I haven't tried FSR as of yet though; that is still disabled. Hmm, you're giving me ideas... I'll try it on Ultra only at 2K/4K just to see what it does... BRB...
 

Can't say I've ever had any issues with RT and MSI AB tbf, though I do miss Radeon's drivers big time and having one go-to panel/software for everything!

Personally I wouldn't advise it for 1440p: it amplifies TAA's artifacts and over-sharpens far too much IMO, i.e. worse aliasing/shimmering and ghosting/trailing (switch between SMAA and TAA to really see the difference in motion clarity). At 4K it isn't as noticeable though, and the performance increase you get is worth it.
 


OK, here are the updated results using FSR set to Ultra Quality. I really couldn't tell the difference while in motion, though this was only a quick run. I will keep FSR off as I really don't need it for now, tehe.
 
"...at 4k, no GPU has the grunt if you want max settings and high FPS so you'll want to be using FSR/DLSS/NIS tech."
This is not true; high-end GPUs can run Far Cry 6 at 4K maximum settings with high FPS just fine, with RT + HD Textures.

Looking at your 4K benchmark, I have higher minimum FPS than your average FPS. No need for image reconstruction to get good FPS.

6900 XT, 4K, HD Textures, RT + Max Settings: [benchmark screenshot]

3080, 4K, HD Textures, RT + Max Settings: [benchmark screenshot]

37% higher average FPS and 103% higher minimum FPS. Guess that extra video memory comes in handy as per the requirements.

I am fairly sure a 3090 would produce similar numbers to the 6900 XT also.
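
For anyone who wants to sanity check the maths, the deltas are just simple ratios. A quick sketch in Python with stand-in values (the actual per-run figures are in the screenshots, so the numbers below are hypothetical):

```python
def pct_higher(a: float, b: float) -> float:
    """How much higher a is than b, as a percentage of b."""
    return (a - b) / b * 100.0

# Hypothetical stand-in values, NOT the actual benchmark figures:
# a card averaging 96 FPS vs one averaging 70 is ~37% higher.
print(f"{pct_higher(96, 70):.0f}% higher average FPS")   # -> 37%
print(f"{pct_higher(61, 30):.0f}% higher minimum FPS")   # -> 103%
```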
 
This is not true; high-end GPUs can run Far Cry 6 at 4K maximum settings with high FPS just fine, with RT + HD Textures
....
I am fairly sure a 3090 would produce similar numbers to the 6900 XT also.

Don't be silly Matthew!!!

It only seems to be certain people with a certain bias/loyalty to one company, plus another person trying to justify his extra-expensive purchase, who are defending the game :D

:cry::cry: Not one to sound like a broken record, is he..
 

Again, go and see guru3d, DF etc.'s comments comparing the benchmark and gameplay performance....
And again, where have I said a 3080 should be matching or even beating a 3090/6900 XT??? :confused: Isn't it well established that a 3090/6900 XT is about 15% better than a 3080/6800 XT (comparing stock for stock)? I've noticed people comparing the 3080 to a 6900 XT quite a bit, and I'm not sure why, given the completely different price ranges and the different tiers those cards fall into. Is the 3080 that good in some people's eyes? :cry:

That depends on the individual though, doesn't it? If you have a high refresh rate display, you want as much FPS as possible. I think it's safe to say 99% of people on this forum would take the extra performance at 4K if they had the option to, and as per your comment earlier, it seems you agree:

I could easily play at 4K with FSR UQ since it looks good. I can notice the small drop in image quality, but all those extra FPS make it worth it IMO if you need the extra frames.

As an example, my 6900 XT, overclocked, can handle 4K max settings easily, with FPS generally in the 60-90 range, averaging well over 60.

However, my 6800 XT is around 20%+ slower, which brings the FPS down, so with that card I would choose to use FSR UQ at 4K max settings for the small hit to image quality, which is largely unnoticeable when playing normally, and I'd happily take those extra FPS.

Given that my frame latency is nice and smooth AND the HD textures are working as they should be, the 3080 10GB looks to be working without issue.


PS.

More and more users are reporting that the texture issues are fixed; guess fingers shall remain in ears though :cry:

https://discussions.ubisoft.com/top...rry-in-game-post-here/1366?lang=en-US&page=69
 
You said a high end GPU cannot handle 4K max settings, I showed you it can.

My FPS numbers in game are similar to the benchmark, and it's an accurate representation of the performance seen in game across various parts of the islands.

My comments on FSR still apply, but there's no need to use FSR if the grunt is available, which was clearly mentioned.

Looking at your FPS, you would need to use FSR as the experience would not be great. You can pretend it is to try and save face, but your numbers tell a different story.

Since you earlier said FSR is terrible in this game, it looks like you have a bit of a dilemma on your hands.
 

Important point in bold:

"no GPU has the grunt if you want max settings and high FPS so you'll want to be using FSR/DLSS/NIS tech"

Again, it's a subjective thing: to some 60 FPS is considered high, to some 80 FPS, and to some only 100+ FPS.

Once again, I have already said FSR UQ is "required" on a 3080 if I want max settings @ 4K.... although I have yet to try it without FSR in actual gameplay; I'll report back on that tomorrow.

Given you still doubt that the texture issue is fixed for cards with <=10GB VRAM despite there now being numerous posts on it, you won't believe me regardless of what I post, even if a hundred videos are posted proving you wrong, as per usual :cry:

And again, just in case you missed it or chose to ignore it..... I have already stated I don't use FSR at 1440p because it harms IQ "overall" considerably, but at 4K I would, because it is "required" and the reduction in IQ with FSR isn't as noticeable at 4K. Saying that, since NIS provides better quality than FSR, I would probably use that instead if it works; the perks of having access to all kinds of goodies ;) However, for my normal gameplay sessions I reduce a few settings to High that make little to no visual difference but cost a lot, i.e. volumetric clouds and shadows, because the 3080 doesn't achieve the FPS I like/require on my 3440x1440 144Hz display due to the lack of "grunt", and it's better to turn those down than to use FSR at that res.....
 

I unsubbed and completely forgot about this game. Too busy enjoying Age of Empires 4 :D

Waiting for the new Super cards to come out so I can sell my 3070 to CEX for a nice juicy profit, which will pay for the Super card :p
 
Well done on completing it 100% mate.

Please do try what I suggested above though and see if the driver has anything to do with it. I'm curious, because it ran fine on a 6700 XT long before this patch and the ones before it, and that also has 12GB.

Haha. I don't think TNA cares either way. :p

It's just fingers in ears Nexus singing his lonesome song as usual. :cry:

I will be doing some detailed testing tonight, but I noticed something new with the update yesterday. Using the launch-day NVIDIA drivers enables Resizable BAR for the game: VRAM usage is increased and the game barely uses any system RAM for caching. After 20 minutes of playing, the game slowed to a crawl at 30 FPS as VRAM usage exceeded 12GB (with 400MB of system RAM used for textures), and there were still no blurry textures. Seems the game now saturates the VRAM completely and starts stuttering when it exceeds 12GB.

With the newest drivers, which disable Resizable BAR, I noticed that the game doesn't cross 11.5GB of VRAM usage, but the shared system RAM allocation is huge at 2.5GB, and the game is now using 15GB of the 32GB of system RAM I now have. With Resizable BAR enabled, RAM usage doesn't cross 13GB.

I think that's why NVIDIA disabled ReBAR support for this game in the newer drivers.
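
If anyone wants to log this themselves rather than eyeballing overlays, here's a rough sketch (assuming an Nvidia card with nvidia-smi on the PATH and the psutil package installed; HWiNFO/Afterburner logging would do the same job):

```python
import subprocess
import time

import psutil  # third-party package, used here for system RAM stats

def vram_used_mib() -> int:
    # Query dedicated VRAM usage in MiB for the first GPU via nvidia-smi.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

while True:
    ram_gib = psutil.virtual_memory().used / 2**30
    print(f"VRAM: {vram_used_mib()} MiB | system RAM: {ram_gib:.1f} GiB")
    time.sleep(5)  # sample every 5 seconds while the game runs
```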
Seems like a couple of people are having that (shadow flickering) looking at the Nvidia Reddit. Can't say I have noticed it yet, but I'm stuck in night time atm, so it might be more noticeable during the day (given that, IIRC, only the sun casts ray traced shadows?)

How about max settings with RT and HD Textures + FSR UQ @ 4K? Does it play fine with no texture issues?

EDIT:

My RAM usage is defo much higher now; before the patch it was around 10GB..... Wonder if they are doing some caching to system RAM as a fix for the texture issue?

RAM usage has increased and it's peaking at 14.9GB for me, while it never crossed 11GB pre-patch. 16GB of RAM is right on the edge if you are using the HD pack.

This is not true; high-end GPUs can run Far Cry 6 at 4K maximum settings with high FPS just fine, with RT + HD Textures.
....
I am fairly sure a 3090 would produce similar numbers to the 6900 XT also.

I haven't run the benchmark post-patch yet, but I think your 6900 XT is way faster than the 3090 based on those results. I get 70 FPS average with 64 minimum, and a 3090 would be ~2% faster. The minimums depend entirely on the CPU and not VRAM in my experience, as they shot up from 45 to 64 FPS just from upgrading from a 9900K to a 12700K.
 
Cheers, so it was a combination of a driver update and a game update which ultimately resolved the issue.

I see your minimum FPS is also 100% higher than the 10GB 3080's, which is what I see too, and that's the most important metric here other than the average FPS.

I think this is why the developer recommends a 12GB minimum for 4K using maximum settings (which is what the debate has been about all this time, despite the goalposts being moved recently), and why the 3080 chugs according to Nexus's numbers when trying to run it.
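
Since minimum FPS keeps coming up: a minimal sketch of how the usual metrics fall out of a frametime log (assuming a plain-text file with one frametime in milliseconds per line, e.g. exported from RTSS/Afterburner; the filename is just a placeholder):

```python
def fps_metrics(frametimes_ms: list[float]) -> dict[str, float]:
    # Average FPS = total frames / total time, not the mean of per-frame FPS.
    avg = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    fps = sorted(1000.0 / ft for ft in frametimes_ms)  # ascending
    worst = fps[: max(1, len(fps) // 100)]             # slowest 1% of frames
    return {
        "avg": avg,
        "min": fps[0],                                 # single worst frame
        "1% low": sum(worst) / len(worst),             # mean FPS of slowest 1%
    }

with open("frametimes.csv") as f:  # placeholder filename
    times = [float(line) for line in f if line.strip()]
print(fps_metrics(times))
```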
 
Seems like some (at least when we look at old posts ;)) are intentionally forgetting about the CPU optimisation in this game and the potential driver overhead from Nvidia. I would like to think a 12700K and a highly tweaked 5950X (which no doubt had the AC unit beside it along with the windows open for the benchmark run :cry:) did better than an undervolted 5600X :D

Yes, you make a good point. I did not consider CPU overhead when I was comparing minimum FPS, so that probably explains it.

I think you hit the nail on the head. Look at the comment in the video you posted: a user with a 3090 and 5950X only getting 73 FPS average at 1440p.
[screenshot of the quoted comment]

Yet I get 107 FPS average at 1440p, max settings, with the same CPU but a different GPU, so surely there has to be some CPU driver overhead there.

The benchmark is okay but should not be used as a true measure of actual performance in my opinion unless you run at 4K max settings.

I don't see how it makes any difference being at 4K.... If the benchmark is flawed at other resolutions, then chances are it isn't going to be a 100% accurate representation of gameplay at 4K either..... Maybe because it fits a certain narrative?

It looks more like the benchmark is just "buggy" on Nvidia hardware; there's something very wrong when the stated minimum FPS doesn't bear any resemblance to the graph right next to it, and it can't be VRAM-related if it's also happening on an RTX 3090.

To be fair, if you watch the full video, even my utilisation drops halfway through, but at the start it's up over 90%. Our GPUs are just too fast. ;)

Goalposts time...

You said a high end GPU cannot handle 4K max settings, I showed you it can.
....

I will be doing some detailed testing tonight, but I noticed something new with the update yesterday.
....
I think that's why NVIDIA disabled ReBAR support for this game in the newer drivers.

Yeah, sadly Nvidia seem to have a lot of work to do with ReBAR, though I'm not sure how much they can really do given they don't have anything to do with the chipset/CPU side of things....

It has been stated a few times recently that Nvidia handle VRAM optimisation much better than AMD, so perhaps that is also why Nvidia users don't see as much of a boost from ReBAR as AMD GPU users do with SAM?

Would be interesting to see a 3090 owner do some ReBAR on vs. off comparisons.
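
On the ReBAR on/off point: a rough way to check whether it's actually active on an Nvidia card (assuming nvidia-smi is installed) is the BAR1 aperture size it reports, which to my understanding is the full VRAM size with ReBAR on, but only 256 MiB without:

```python
import subprocess

# Print the memory section nvidia-smi reports; compare the BAR1 total to
# the card's VRAM size (e.g. ~24576 MiB on a 3090 with ReBAR on, vs a
# legacy 256 MiB aperture with it off).
out = subprocess.check_output(["nvidia-smi", "-q", "-d", "MEMORY"], text=True)
print(out)
```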
 
That's not going to explain your 1 FPS, or even your 30 FPS minimum. You are using a 5600X too, I forgot about that, so it's not like you can blame your CPU for your bad numbers.

That's your 3080 not meeting the minimum requirements for 4K + HD Textures with maximum settings. ;)

Nothing magical was used to generate those numbers either; it's not like much effort is required to topple your minimum FPS, tbh. Even my Radeon VII probably gets better minimum FPS than your 3080, but that would be because it also has 16GB. :D
FTFY :p ;) Also don't forget #CP77fanz
Out of Context Nexus would be a good Twitter account. Everything's out of context, goalposts moving on a daily basis to suit the narrative. :cry:
 

Again, where have I said the 1 FPS is down to anything other than the VRAM? I even said it in my very post after trying the new patch:

- texture issues are "completely" gone with max settings at both 3440x1440 (I don't use FSR at this res. as it's crap in this game) and 4K. However, upon starting the benchmark at 4K with no FSR, it starts at 1 FPS, so obviously the VRAM is completely choked (which I didn't have before), but..... once FSR UQ is enabled, it runs without issues and again, no texture issues at all. Not sure if it's just me, but I'm noticing a lot more RAM being used.

I'm going to give FC 6 a go in gameplay in a bit to see how it does. Also, a constant 1-3 FPS is not "normal" for VRAM issues, unless you can show me other games where this happens too? I'll post some Icarus screenshots later with the texture shading maxed right up, to show what an "expected" VRAM issue should really look like.

Remind me again, what do you have your highly tweaked 5950X at? ;) Also, a 12700K is considerably better for its single-core perf.

Again, actually read the posts and technical pieces with evidence, unlike your empty claims....

https://www.dsogaming.com/news/far-cry-6-suffers-from-major-cpu-single-threaded-issues/

And remind me again, why exactly are you comparing a 6900 XT to a 3080? And tell me where I have stated that a 3080 should be matching a 6900 XT/3090??? :confused:

Out of context, yet it is right there in black and white.... "benchmark not really a great representation of actual gameplay".... and the next moment, "benchmark is a very accurate representation of gameplay". @TNA, pass me the popcorn :cry:
 
One of the tiny minority on the forums that quote-stalk you in threads. There has to be a medical condition explaining the behaviour, as it's definitely a new phenomenon now that people are online-addicted keyboard warriors.

If people post misinformed **** without backing up said posts, or with a hidden motive, I'll call them out and prove them wrong, as I have done with every other claim so far; my favourite is still the "G-Sync adds input lag" one :cry:
 