Far Cry 6 GPU performance not bad at all but is severely bottlenecked by CPU

The game is using 6.8GB of my VRAM with no HD textures, which is merely 500MB less than what Cyberpunk uses, and that's with all the ray tracing set to ultra at near-max settings, and I am seeing PS3-era textures in several areas. They should just junk this engine, as it's terribly inefficient at using VRAM.
 
We can't be playing the same games. I still have 5 installed (with the HD Texture pack) and I honestly can't see how you think it is better graphically than 6. :confused:

Gonna have to put up some screenshots, I think, for some side-by-side comparisons.
I never said 5 looked better, what are you talking about?
 
The game is using 6.8GB of my VRAM with no HD textures, which is merely 500MB less than what Cyberpunk uses, and that's with all the ray tracing set to ultra at near-max settings, and I am seeing PS3-era textures in several areas. They should just junk this engine, as it's terribly inefficient at using VRAM.
It has more to do with art direction than game engine.
 
The textures look great with the pack, but without it they look PS3-era in so many areas. Some of the textures without the pack are comparable to the original Crysis from 2007. It's clear they put no effort into the stock textures, seeing as the consoles don't have a build without the pack.
 
Has there been a comparison between the console high-res textures and the PC's high-res texture pack? Someone mentioned that the console ones are half the size, so are they a lot worse?
 
So I can now confirm, after a complete wipe and fresh Win 10 install,

with a 3090 FE: ReBAR on or off has no effect on the stuttering at 1440p with the HD pack,

but with the HD pack off it is smooth as silk.

It’s the HD pack or the drivers that must be at fault.

Shame, as the HD pack makes such a difference.
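
For anyone wanting to double-check that the ReBAR toggle actually took effect after the BIOS change, here's a rough Python sketch (assuming nvidia-smi is on your PATH; the exact output layout can vary by driver version, so treat it as a sketch, not gospel). The tell is the BAR1 aperture: roughly the full VRAM size with ReBAR on, typically only 256 MiB with it off.

```python
# Rough check of the BAR1 aperture via nvidia-smi (must be on PATH).
# With Resizable BAR enabled, BAR1 is typically the full VRAM size;
# with it disabled, it is usually only 256 MiB. Output format may
# vary slightly between driver versions.
import re
import subprocess

out = subprocess.run(
    ["nvidia-smi", "-q", "-d", "MEMORY"],
    capture_output=True, text=True, check=True,
).stdout

# Find the "Total" line inside the "BAR1 Memory Usage" section.
m = re.search(r"BAR1 Memory Usage\s*\n\s*Total\s*:\s*(\d+)\s*MiB", out)
if m:
    bar1 = int(m.group(1))
    state = "likely ON" if bar1 > 256 else "likely OFF"
    print(f"BAR1 aperture: {bar1} MiB -> ReBAR {state}")
else:
    print("Couldn't find BAR1 info in nvidia-smi output.")
```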
 
If you think 5 was better (or that there has been no improvement between versions), then I really have no words.

I made no distinction between the two, but since you're wondering: from what I watched, 6 looks to have minor graphical improvements.

From what I gather, people are complaining about what it looks like without HD textures and about poor performance. If it did look low-res and performance was significantly worse than 5, then I would be upset.
 
Has there been a comparison between the console high-res textures and the PC's high-res texture pack? Someone mentioned that the console ones are half the size, so are they a lot worse?
File size doesn’t mean anything on its own. The PS5 has its own proprietary compression technique which is very good, and almost everything on that platform is half the size of its PC version. Ubisoft claims it’s the same pack.
 
So I can now confirm, after a complete wipe and fresh Win 10 install,

with a 3090 FE: ReBAR on or off has no effect on the stuttering at 1440p with the HD pack,

but with the HD pack off it is smooth as silk.

It’s the HD pack or the drivers that must be at fault.

Shame, as the HD pack makes such a difference.
I don’t think it’s the drivers, as disabling the HD textures fixes it. It’s far more likely the game developers didn’t optimise for Ampere at all. Since the game is a console port, it was fully optimised for the consoles, and those optimisations carried over to AMD GPUs on PC.

The VRAM allocation on Nvidia cards is completely broken and they need to patch it ASAP. Those on high-end GPUs like the 3080 and above can brute-force through that messed-up allocation with the pack disabled, but users with the 3070 and below report low-res textures and stuttering even with the pack disabled at 1440p. On the other hand, there are zero complaints from the mid-range AMD users in the long thread over on the Ubisoft forums. This is also clear when you see FC6 run on any RDNA 2 GPU: the game uses much more VRAM on the AMD cards relative to Nvidia and does not stutter or show any issues with textures. The 6700 XT actually provides a smoother experience than the 3080 Ti at 1440p, albeit at a lower frame rate, simply because the game uses more of the former’s 12GB of VRAM and there is no stuttering.

If they do not fix this allocation in a patch, it’s clear this is a deliberate crippling of Nvidia cards in this game. Based on their past track record, Nvidia will never release a 16GB flagship card, so the developers should have made a lite version of the texture pack with a 10GB VRAM requirement, like they did in the Nvidia-sponsored Watch Dogs Legion. At the moment, it’s either enjoy detailed textures on an AMD card or put up with PS3-era textures on Nvidia.
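
If anyone wants hard numbers rather than overlay screenshots, here's a quick way to log actual VRAM use with the HD pack on vs off. Just a sketch on my part: it assumes the nvidia-ml-py bindings (pip install nvidia-ml-py) and that the game sits on GPU index 0; AMD owners would need a different tool.

```python
# Minimal VRAM logger to compare allocation with the HD pack on vs off.
# Sketch only: assumes the nvidia-ml-py package (pip install nvidia-ml-py)
# and that the game is running on GPU index 0.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {info.used / 2**20:6.0f} MiB "
              f"/ {info.total / 2**20:6.0f} MiB")
        time.sleep(5)  # sample every 5 seconds while the game runs
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

MSI Afterburner's memory-usage graph will show you much the same thing if you'd rather not script it.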
 
I don’t think it’s the drivers, as disabling the HD textures fixes it. It’s far more likely the game developers didn’t optimise for Ampere at all. Since the game is a console port, it was fully optimised for the consoles, and those optimisations carried over to AMD GPUs on PC.

The VRAM allocation on Nvidia cards is completely broken and they need to patch it ASAP. Those on high-end GPUs like the 3080 and above can brute-force through that messed-up allocation with the pack disabled, but users with the 3070 and below report low-res textures and stuttering even with the pack disabled at 1440p. On the other hand, there are zero complaints from the mid-range AMD users in the long thread over on the Ubisoft forums. This is also clear when you see FC6 run on any RDNA 2 GPU: the game uses much more VRAM on the AMD cards relative to Nvidia and does not stutter or show any issues with textures. The 6700 XT actually provides a smoother experience than the 3080 Ti at 1440p, albeit at a lower frame rate, simply because the game uses more of the former’s 12GB of VRAM and there is no stuttering.

If they do not fix this allocation in a patch, it’s clear this is a deliberate crippling of Nvidia cards in this game.
Have you ever considered that the Nvidia driver might be handling memory management differently and that is causing the issue? Just a thought. :)
 
I don’t think it’s the drivers, as disabling the HD textures fixes it. It’s far more likely the game developers didn’t optimise for Ampere at all. Since the game is a console port, it was fully optimised for the consoles, and those optimisations carried over to AMD GPUs on PC.

The VRAM allocation on Nvidia cards is completely broken and they need to patch it ASAP. Those on high-end GPUs like the 3080 and above can brute-force through that messed-up allocation with the pack disabled, but users with the 3070 and below report low-res textures and stuttering even with the pack disabled at 1440p. On the other hand, there are zero complaints from the mid-range AMD users in the long thread over on the Ubisoft forums. This is also clear when you see FC6 run on any RDNA 2 GPU: the game uses much more VRAM on the AMD cards relative to Nvidia and does not stutter or show any issues with textures. The 6700 XT actually provides a smoother experience than the 3080 Ti at 1440p, albeit at a lower frame rate, simply because the game uses more of the former’s 12GB of VRAM and there is no stuttering.

If they do not fix this allocation in a patch, it’s clear this is a deliberate crippling of Nvidia cards in this game. Based on their past track record, Nvidia will never release a 16GB flagship card, so the developers should have made a lite version of the texture pack with a 10GB VRAM requirement, like they did in the Nvidia-sponsored Watch Dogs Legion. At the moment, it’s either enjoy detailed textures on an AMD card or put up with PS3-era textures on Nvidia.

This summarises some issues AMD owners used to experience a few gens back. However, as the market was, by purported figures, an 80/20 split (so maybe fewer than 20% of PC gamers on AMD GPUs), those gamers had to lump it. This is where the 'AMD cards are not as good, are they?' mindshare stereotype formed, out of that kind of game-development oversight.

This is unfortunately going to happen when you have sponsored titles and console ports built around one company's hardware.
 
This summarises some issues AMD owners used to experience a few gens back. However, as the market was, by purported figures, an 80/20 split (so maybe fewer than 20% of PC gamers on AMD GPUs), those gamers had to lump it. This is where the 'AMD cards are not as good, are they?' mindshare stereotype formed, out of that kind of game-development oversight.

This is unfortunately going to happen when you have sponsored titles and console ports built around one company's hardware.

But then surely we should have seen far more cases of this, since AMD has been in the consoles since the PS4 and Xbox One? I don't recall a game having issues like this for just one GPU brand, and where they did, they were fixed/improved at some point because it was treated as an issue/bug. In fact, AMD-sponsored games have historically been very good, e.g. Alien Isolation.

That Ubisoft thread is now 22 pages long, with multiple posts all over Reddit and other forums where people are having issues, more so on Nvidia cards, regardless of resolution and settings.

https://discussions.ubisoft.com/top...-in-the-game-is-blurry/422?lang=en-US&page=22

For the person asking about texture problems on consoles: they have been reported there too, although not as severe as what is happening on Nvidia GPUs. You can even see the difference when DF compared the PS5 to the Xbox:

[image: DF comparison of PS5 vs Xbox textures]

So it's one of these, or more likely a bit of both (imo more so on Ubisoft's side):

- Nvidia drivers need work
- Ubisoft needs to optimise/fix the game

Will be interesting to see if someone/a modder can fix/improve these before Ubisoft does, as this happened with FC 3 as well, although iirc FC 3 was a bit simpler: it was just a case of changing a few lines in the backend .ini files (roughly the shape of the sketch below).
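
Purely illustrative, to show the shape of that kind of tweak: the path and key below are hypothetical placeholders, not real FC6 settings, and whatever a modder eventually finds will differ.

```python
# Illustrative config patch, in the spirit of the old FC3 tweaks.
# The path and key below are hypothetical placeholders, not real FC6
# settings; always keep a backup before editing game configs.
import shutil
from pathlib import Path

# Hypothetical config location and file name.
cfg = Path.home() / "Documents" / "My Games" / "FarCry6" / "render.ini"
shutil.copy(cfg, str(cfg) + ".bak")  # back up the original first

text = cfg.read_text()
# Bump a hypothetical texture streaming budget from 4 GB to 8 GB.
text = text.replace("TexturePoolBudgetMB=4096", "TexturePoolBudgetMB=8192")
cfg.write_text(text)
print("Patched", cfg)
```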
 
Typically NVIDIA never fixes issues related to AMD-sponsored games. In RE Village, there is a severe performance drop associated with high volumetric lighting in several areas of the game, only on NVIDIA GPUs. Dropping to medium fixes it, but it's an NVIDIA-exclusive issue, as AMD cards don't have it, and it still hasn't been fixed to this day. Hopefully that's not the cause of this.
 
I have uninstalled the game completely and reinstalled. It works better now, in that HD textures are working more often, yet FPS tanks to single digits until I enable FSR, and I do still get random texture-loading issues. These are all indicative of a VRAM limitation and not drivers.

Though having finally got it working and seeing the differences between HD on vs off, I think normal ultra with FSR looks almost as good as HD with FSR on. With FSR off you can see a bigger difference. With HD off and FSR on I get good FPS, sharp textures and no texture-loading bugs.

So enable FSR; it adds some sharpening and looks great, to the point where HD on is a case of pixel-peeping to see the differences. I also recommend that anyone having texture issues verify the game is installed OK.
 