Far Cry 6 GPU performance not bad at all but is severely bottlenecked by CPU

I am just thankful the game runs fine with the stock textures. The situation with the 8GB RTX 3070 is dire even at 1440p ultra settings without the HD Pack enabled. Look at these PS2 textures lol. I am seeing reports that even the 3080 is falling back to low-res assets at 4K without the pack.

Far-Cry-6-Screenshot-2021-10-29-12-20-55-24.png
 

Red and green ghosting can be seen clearly on objects towards the left and right edges of the screen, e.g. tree trunks. What's causing that?
 
I had put the game on hold until they fixed this texture issue, so I'm still in Madrugada. I think I'll just wait until I get that 12900K: the HD Textures are playable for me with FSR UQ, but the stutters just ruin the experience, which may be down to the game pushing my 9900K to its limits.

My main gripe with the game is its VRAM usage, and lowering settings to High doesn't really do much for that. The only notable gains I see come from disabling RT or enabling FSR.


I don't think it's utilizing VRAM efficiently on NVIDIA cards. If you check several YT benchmarks, the peak usage on an RTX 3090 is just about 13GB, and even that's rare. Most of the time it's below 12GB. I have tested this multiple times and found that when memory usage goes past 10.8GB, those low-resolution assets start streaming in.
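For anyone who wants to check this on their own card, here's a rough Python sketch that reads per-GPU VRAM usage via nvidia-smi (assuming an NVIDIA card with the standard driver tools installed; the ~10.8GB threshold is just my observation from testing, not an official figure):

```python
import subprocess

# ~10.8 GB expressed in MiB; the point past which I saw low-res assets
# stream in. This is an observed value, not an official limit.
DEGRADE_THRESHOLD_MIB = int(10.8 * 1024)

def parse_used_mib(csv_output: str) -> list[int]:
    """Parse the output of:
    nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits
    (one integer MiB value per GPU, one per line)."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def query_used_mib() -> list[int]:
    """Ask the driver for current VRAM usage per GPU, in MiB."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_used_mib(out)

def past_degrade_point(used_mib: int) -> bool:
    """True once usage crosses the level where I saw assets degrade."""
    return used_mib > DEGRADE_THRESHOLD_MIB
```

Polling query_used_mib() in a loop while playing and flagging past_degrade_point() should show whether the low-res swap really lines up with that ~10.8GB mark on your setup.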

On top of the NVIDIA issues, the game is running horribly on Intel processors. Just to see how bad it really is, I ran the benchmark with FSR Quality at 4K and averaged 69 FPS with GPU usage around 70%. A 3090 paired with a 5900X using the same settings at 4K is averaging 95 FPS. That's how bad it is.
It's not that it's bad with Intel CPUs; low-cache Intel CPUs are just obsolete for next-gen games.

Even the i5 12000 series is coming with around 30 MB of cache. The time of 8-10 MB of cache is over. You can only do so much with so little cache, even if it's effective. Zen 3's huge cache demolishes Intel in situations like this, when a game hammers a single core (like in CS:GO and Valorant. Check Valorant: Zen 3 demolishes any Intel CPU).
 
They changed the minimum requirements for the HD texture pack to include the 3080 Ti lol :D

https://www.ubisoft.com/en-au/help/far-cry-6/article/system-requirements-for-far-cry-6/000098943?isSso=true&connectSsoId=KQ/mf9yYLfIsy8c73FMaUpANkuqNvMUyViWuTPBn/ks=&refreshStatus=ok

"Please note that the minimum requirement for running the HD Texture Pack for Far Cry 6 is 12GB of VRAM. For 4K configurations, you will need 16GB of VRAM."

Using FSR, we can upscale from 1440p to 4K, so 12GB should be usable at 4K with it.
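Quick sanity check on that: AMD's published FSR 1.0 per-axis scale factors put Quality mode at 1.5x, so a 4K output really is rendered internally at 1440p. A minimal sketch (mode names and factors taken from AMD's FSR 1.0 documentation):

```python
# FSR 1.0 per-axis scale factors, per AMD's published quality presets.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution FSR renders at before upscaling to (out_w, out_h)."""
    scale = FSR_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

# e.g. fsr_render_resolution(3840, 2160, "Quality") -> (2560, 1440)
```

So at 4K output, Quality mode only has to hold 1440p render targets in VRAM, which is why the 12GB figure should stretch to a 4K display.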
 
If you download and run the HD Texture Pack with lower VRAM capabilities, you will encounter performance issues while playing.

It's a crazy old world: we can swap out and install new graphics cards no problem, even DIY-build PCs ourselves, but some of us can't work out that 16 is a bigger number than 10 :p
 

They were told a year ago that 10GB was a rubbish amount of VRAM for a flagship card but argued otherwise. Not even a year after launch, it's proven to not be enough.
 
Oh dear, seems people are still keeping their fingers in their ears, lordy lordy :cry: :o

Hello there!

Thank you for continuing to share your reports with us regarding the blurry textures experienced in-game. I'd like to reach out to you all with an update from the investigation. The development team have been looking at this issue further, and they will be deploying a fix in a future update! We don't yet have a confirmed ETA for when this fix will be released. Rest assured that once we have more details to share, they will be posted within this thread and via patch notes within the News & Announcements forum as soon as they are made available.

Please note that if you are using a GPU with anything under 12GB VRAM, you will not be able to use the pack correctly. This is part of our minimum requirements. If you have less than 12GB of VRAM available, we would recommend uninstalling the HD texture pack. You can do this by checking the "Owned DLC" section for an uninstall button. If this option isn't there, you will need to upload your save file to the Cloud in New Game, then uninstall the game. Afterwards, reinstall the game, making sure not to select the HD Texture Pack, and then download your game save from the Cloud.

If you have over 12GB VRAM but under 16GB VRAM, please lower your graphics settings to 'High.'

Have a read of the most popular/viewed Ubi thread and see it's not just a clear-cut "zOMG 10GB not enough"....

https://discussions.ubisoft.com/top...urry-in-game-post-here/280?lang=en-US&page=14

Funnily enough, the game is playing pretty much flawlessly for me on my measly 10GB 3080 now though :p Both at 3440x1440 and 4K.

- Settings are High (I want more FPS when playing on my 3440x1440 144Hz display and no GPU has enough grunt to provide the FPS that satisfies my needs), except texture filtering at max and the HD texture pack + RT on (motion blur and camera effects off)
- FSR turned off as it's awful, though UQ is "usable" when playing at 4K (and if FSR is on, CAS goes off, as that's even more over-sharpening galore, which further exacerbates the TAA issues)

The new patch isn't out on PC yet, and there's no mention of a VRAM/memory-management fix, so I doubt this patch will fix anything.


Looking at how a 3080 performs compared to the competition in 99.8% of games, I'm sure the majority of 3080 owners aren't regretting their purchase, especially the ones who got an FE for £650 ;)

https://www.techpowerup.com/review/geforce-rtx-3080-vs-radeon-rx-6800-xt-megabench/

While all our benchmarks are tested with ray tracing disabled and only a few games today support the technology, ray tracing is here to stay. Comparing the Radeon RX 6800 XT with the RTX 3080 makes it crystal clear that NVIDIA has the upper hand in ray tracing performance. The differences are significant, and if you believe ray tracing is the future, you should definitely consider the RTX 3080. On the other hand, some recent video game releases have only minimal ray tracing support, just good enough to tick the "i haz ray tracing" checkbox without achieving the stunning fidelity improvements we were promised. In those titles, the RT performance hit is minimal, and both cards are surprisingly similar in FPS.

When the GeForce RTX 3080 released with 10 GB VRAM I was convinced it would be sufficient for the lifetime of the card. Recently, we have for the first time seen titles reach that amount at the highest settings in 4K, but only with ray tracing enabled. Examples of that are DOOM Eternal and Far Cry 6. While this shouldn't be a deal breaker, it's still something worth mentioning. AMD's RX 6800 XT does have 16 GB VRAM, which provides a small advantage here. With DirectStorage around the corner, memory requirements of games might actually go down because assets can be streamed in from the disk in a much more efficient manner.

And they didn't even factor ray tracing in because we all know how that flips the table entirely for one side :p ;)

PS: Wasn't the 3080 released in September? So it's been more than a year, and we only have one game so far, one which has an acknowledged VRAM/texture-management issue, just so happens to be sponsored by AMD, and whose requirement just so happens to match said sponsor's hardware specs :cry:
 

These people are right in the sense that it's ONE game with an OPTIONAL texture pack, but hey, given time, the people screaming the loudest (who, I'm sure, aren't even 3080 owners, so it doesn't even affect them) will be right eventually.


Well, some people can't work out that not being able to run one game at max settings does not mean the card is bad.

In before "well, Jensen said iT wAs ThE fLaGsHiP cArD."

Which will never happen if Nvidia has any influence! ;)

Nvidia (which would never happen on an AMD game) or a modder could shut this down and release an 8/16K texture pack that could only run on cards with 16GB+ VRAM, and going by some posters' logic, all cards with 16GB and below would now be rubbish. ;)
 
Funny how naive people are to think AMD can do no foul play either :cry: Just buy whatever is best and suits your needs; I really don't get the loyalty people have to certain companies. I detest Nvidia's practices (although they have gotten better/less anti-consumer since the PhysX and tessellation days), but sadly they have the money, thus they get more games/developers behind them, and usually much better quality games that people actually care about. Sorry, forgot..... godfallllllllllllllllllll! :cry: Thankfully AMD are extremely competitive in the CPU space, so hopefully I'll never have to go back to Intel any time soon.

No doubt it is only a matter of time until 10GB, and even 12GB, of VRAM is not enough for "4K". But by that time, whether people like it or not, the lack of rasterization and ray tracing grunt (especially for AMD cards) will be the "main" reason people upgrade, not the lack of VRAM. Who here has seriously upgraded their GPU just because they didn't have enough VRAM? Unless you bought a Fury X :o Come the next gen of GPUs (for the sake of argument), do you think people are going to pick the likes of a 3090 24GB over a 4070/4080 12/16GB for "VRAM reasons", which will only affect what, 1% of games in the next 1-2 years? If that?

If what TechPowerUp says with regards to DirectStorage is true, it will be interesting to see how that impacts VRAM usage.
 