Far Cry 6 GPU performance not bad at all but is severely bottlenecked by CPU

One of the tiny minority on the forums that quote-stalks you on threads. There has to be a medical condition explaining the behaviour, as it's definitely a new phenomenon now that people are keyboard warriors addicted to being online.
He's precious. :cry:

It runs at stock settings with PBO and curve optimiser enabled. It will be at most a few percent faster than your 5600x in Far Cry 6 since the game only uses a handful of CPU threads. It won't make any difference at 4K though.

You didn't want to read technical pieces when numerous outlets (PcGamesHardware, Computerbase, TechPowerup) outlined the issues regarding the 3080 and video memory saturation in this game at 4K max settings.

In fact, you outright ignored them and the minimum requirements to run 4K with everything at max settings, even when the developer came out, clarified, and released an updated requirements KB article reiterating them.
tZ7rlbd.png

The game is CPU limited at 1080P, a little at 1440P. It is not at all limited at 4K. Since this whole discussion is about 4K max settings and video memory usage requirements, it's irrelevant. Your bad numbers are at 4K, not 1080P.

Video memory saturation can show in multiple ways, and it comes down to how the game engine handles memory: FPS can fall off a cliff (down to single digits), there can be stuttering and bad 1% lows/drops, textures can load in at low quality, or the application can crash. It depends on the game, and not all engines handle it the same, as we see across various games.
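Symptoms like bad 1% lows are easy to quantify if you log frame times. A minimal sketch in plain Python, using hypothetical sample data rather than anything captured from this game, of how benchmarks derive average FPS and the 1% low from a list of frame times in milliseconds:

```python
def fps_stats(frame_times_ms):
    """Compute (average FPS, 1% low FPS) from frame times in milliseconds.

    The 1% low is the average FPS over the slowest 1% of frames, a common
    metric that exposes stutter the plain average hides.
    """
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # slowest 1% (at least one frame)
    one_pct_low = 1000 * n / sum(worst[:n])
    return avg_fps, one_pct_low

# Hypothetical capture: mostly ~16.7 ms frames (60 FPS) with two 100 ms
# stutter spikes, the kind of pattern VRAM saturation can produce.
samples = [16.7] * 198 + [100.0] * 2
avg, low = fps_stats(samples)
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS")  # avg ~57 FPS, 1% low 10 FPS
```

Note how two bad frames out of 200 barely dent the average but drag the 1% low into single digits, which is why the lows are the number to watch here.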

The comparison is made since you obviously forgot what you posted on the previous pages: declaring that you are right and 10GB is sufficient with an 'I am right' gif and patting yourself on the back for being right all along. Declaring that performance is great and then posting those 4K numbers shows your definition of great can go in the bin. :p

I think maybe you get confused about what you have previously posted at times. :cry:
 
Had a 30-minute play there, just running through the jungle, shooting any patrols about

- At 4K with no FSR, max settings, RT and HD textures, the lowest dip I saw was 49, but as you can see from the frame latency graph, perfectly playable "if" you're happy with that FPS
- At 4K with FSR UQ, max settings, RT and HD textures, the lowest dip I saw was 60, but as you can see from the frame latency graph, perfectly playable too "if" you're happy with that FPS
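For reference, a frame latency graph just plots the time each frame took in milliseconds, so a dip to a given FPS maps to a frame time of 1000/FPS ms. A quick sketch of that conversion, using the dip numbers quoted above:

```python
def fps_to_frame_time_ms(fps):
    """Frame time in milliseconds for a given FPS (what latency graphs plot)."""
    return 1000.0 / fps

# The dips above: 49 FPS without FSR, 60 FPS with FSR UQ.
for fps in (49, 60):
    print(f"{fps} FPS -> {fps_to_frame_time_ms(fps):.1f} ms per frame")
# 49 FPS -> 20.4 ms per frame
# 60 FPS -> 16.7 ms per frame
```

A smooth line around 17-20 ms is what "perfectly playable" looks like on the graph; VRAM trouble shows up as spikes well above that band, not as a steady line.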

vOyGXP5.jpg

QDwEIjG.jpg

Will play for longer periods later on and get to the daytime setting to see FPS then.




Emmmm what.....

I never disputed the issues; I posted screenshots showing my issues, and other threads with 70+ pages where people were posting issues too...

It was you who was adamant that it was "only" because of VRAM and nothing else, despite the developers acknowledging there was an issue with VRAM/texture loading/management, and here we are, it's fixed and a non-issue now :cry: Shall I go and quote some more posts showing said fingers in ears? :D

Again, re-read your posts where you have acknowledged there is an issue with the min fps even at 4k in the benchmark :D

My "i told you so" was in reference to the following:

- saying there was an issue with the game and its texture/VRAM management (again, this shouldn't have come as a surprise given the developers did say there was an issue....)
- perf., which going by my play testing is a noticeable step up, whilst not having texture issues now

Again, list the games where VRAM causes FPS to drop to 1-3 FPS and stay there regardless?

I was going to post an Icarus comparison there to show the frame latency difference (not the best game given its well-documented optimisation issues), but it appears they have fixed/improved the perf. in the last patch, as it isn't all over the place like before when the texture shading was set to 10GB. FPS is a little lower with max texture shading, but it is an improvement on before, where the frame latency was all over the place:

AsQqPmQ.jpg

S2AI2f0.jpg

Given how horrible the game's optimisation is, still no consistent FPS drop to 1-3 FPS...

EDIT:

Also, for my own sake, can you post where I said performance is "great" now? Again, my main point here is:

- texture issues fixed (at all res. and settings)
- perf. improved whilst retaining HD textures throughout (at all res. and settings, unless we stick to your goalpost of the benchmark being "super duper accurate and the only representation for perf." now... :cry:)
 
2021 graphics :)
1.jpg


2.jpg
 

Yup, not the best game for things in the distance. Also, those screenshots were taken in motion, i.e. whilst moving, so there will be some blurring/rendering happening, which won't help matters.

It's very pretty when admiring the scenery in certain parts though, and it has by far the best fire physics I've seen; watching a wild forest fire spread is very cool. But defo a lot of work to be done on the optimisation side!

I'll post some better screenshots later.
 
It can't be that bad. This is how the mountains looked on DX8. :)
Really, it looks awful. If we compare it with the mountains from HZD, then Guerrilla's game is from another planet.
Sony's studios really take good care of their textures, I have to give it to them. It's mind-blowing considering how limited the memory pool is on those horrible consoles :D God of War was similar, the texture quality was out of this world, specifically on Kratos/the main characters, but the environment looked really good too. I was also blown away by Uncharted 4 in 2016.
 
Maybe it has good-looking parts, but the edges on that mountain are an insult for a game made in 2021. It really looks like something made 20 years ago. If they were sponsored somehow by Nvidia, that is even worse, since one of Nvidia's strongest features is tessellation.
 

You disputed the requirements, the posts are still there. You disputed the tech press findings, the posts are still there. I guess these are the issues you apparently never disputed.

All that happened was a driver and later game update addressed the issue for Nvidia GPUs with textures being blurry.

The benchmark has a glitch at the end where the min FPS drops a bit lower. My min FPS was around 72 in the run I posted, compared to your 30 FPS with a 10GB card, but at the last second of the benchmark it drops to 63. It's not an optimisation issue, you just don't meet the minimum requirements, so the experience is not the same.

You said you run the benchmark and it starts at 1 FPS; that is not normal, and not what I was referring to in the post you mentioned.

Your minimum FPS is 30 according to your results, less than half that of 12GB cards, so there's clearly an issue there, wouldn't you agree? What do you think that issue is?

Do you think it's because your GPU falls below the requirements or is it just an 'optimisation issue' only affecting GPUs with less than 12GB bearing in mind the requirements?
 
Looks back at thread title.

I must be a unicorn then that uses an overpriced card that I need to justify. Nothing to do with that mammoth thread started ages back about whether the 3080 will suffer from stingy VRAM beyond 2020?

Display is 4K. Capped to stay within the spec of the monitor's sync range. Loaded the HD texture pack from day one (read the instructions). Had one crash playing the game until I completed it. Graphics were good; the only gripe was in the cut scenes early on, where a line scrolled down the screen (can't explain the actual name for this), but it was on the interweb with other users too, so I ignored it. It seemed to go away after one of the patches, I guess, but the cut scenes calmed down in the second half, with lots of other distractions to do.

It will be my fault for enjoying this game, especially with the label that comes from owning a 3090. It will be my fault that I have not bought, installed and played CP77, as it should have been what this thread was about in niche arguments.

Did I need a new CPU to play this? No. 3600 seemed to cope fine. :)

See you all in the next thread where we debate VRAM and current-gen computational bullhorses! :cry:
 
Is that from Far Cry 6? Where did those images come from?

I've just checked the rocks; whilst not perfect, they do look better than those screenshots suggest.






:cry::cry:

I think you are right, probably time to agree to disagree and bow out of this one since we are not getting anywhere anyway.
 
I generally find that with a lot of games nowadays, tbh: draw distance and LOD being very poor. Ubi's games are the worst for this, especially Assassin's Creed Odyssey; it literally looks like PS2 textures in the distance, but up close they look great.

The rocks and mountains in Icarus in particular are very poor though.

Not sure if the game is "sponsored" by Nvidia; it does have RTX GI (with infinite bounces too), DLSS and FSR though.


My "disputed claims" were around not seeing why the game requires so much VRAM, given how New Dawn's textures looked in comparison to other games' textures, i.e. nothing worthy of needing the original "16GB" estimate. I also posted evidence showing that the texture issue happened on consoles and on some AMD cards too, whereas you were adamant it was only an issue on <=10GB GPUs because of a VRAM limitation and nothing else. There were even people with 3090s saying the issue occurred after 3-hour sessions, which you also ignored, just like when Joker encountered the drop to 1 FPS in his gameplay with the 3080 Ti, yet taking a screenshot brought it right back up. But no.... VRAM issue and nothing else.

Until someone comes along with the same CPU and RAM and posts their results on a 3080 Ti or/and a 3080 paired with a 12700K, it's a silly comparison. It seems to be only you who is insisting on making this comparison, and again, as pointed out by several people/sites, the benchmark is not an "accurate representation" of gameplay, especially since the benchmark doesn't even run for me with no FSR yet in game it is fine.... so how is that an "accurate representation of the gameplay"?

Did you see my screenshots above showing 4K gameplay? Does that frame latency line look like VRAM issues or show any FPS drops to 1? Looks like a nice smooth line to me....


Not sure why you have posted all that; I don't think anyone is questioning your enjoyment or your experience. The only thing this whole thread was about was:

- texture issues because "vram"
- perf. issues because "vram"

If certain people had just accepted and acknowledged that there was an issue with the game and its VRAM/texture management, this thread would have been very short, but nope....

It would be interesting to see a benchmark from your setup.

Given the overhead issues which Hardware Unboxed have shown when using weaker/older CPUs with Nvidia GPUs, you are probably being held back a good bit, especially when you look at Shaz's massive improvement from his CPU upgrade.

Couldn't agree more on that point. This thread is done now, since the "main" issues are fixed and the GPUs appear to be performing as they should.

EDIT:

Anyone with a 3070? If so how is the experience after the patch?
 
I can't get higher than 5-8% CPU usage on my 5900X with this game. Running a 3080 Ti at 1440p and tried from low to ultra settings... getting between 15-35 FPS. Can't figure this out! None of my CPU threads are getting high usage, so it doesn't look like that single-core issue.
 
But don't you see the flaw in that statement..... even if the GPU is not capable of achieving "good" FPS in the first place regardless of VRAM, i.e. see the 6800 XT: all the VRAM but not enough grunt either, hence why its FPS doesn't look to be "much" better than my 3080 (again, taking into account that recording costs a few FPS).... As we have all said and witnessed, cards not having enough grunt will be the problem first, so settings either need to be reduced or/and FSR/DLSS needs to be turned on, all of which reduce VRAM usage too, which essentially makes this whole thread and its "concerns" completely pointless.


Performance from my 3080 looks to be on par with a 6800 XT, unless we look at Matt's special 6800 XT, which somehow seems to get a solid 20 FPS over PC Games Hardware's benchmark/results (and apparently their results are the best/most trustworthy.... ignoring the fact they are from release day....)

Had some spare time today, so I wanted to follow up with a video similar to what you posted above, only this is a 30-minute recording. Have a look and see what you think of the performance and FPS numbers in comparison.

Far Cry 6, 4k + Max Settings + HD Textures + Ray Tracing.
 