Far Cry 6 GPU performance not bad at all but is severely bottlenecked by CPU

I will seriously consider an AMD GPU next gen if I manage to get one at MSRP. For the 3000 series, it was relatively easy to land NVIDIA cards somewhat close to MSRP compared to AMD. The NVIDIA driver overhead means I have to keep buying the latest Intel CPUs along with my GPU purchases to get the most out of the system.

To be fair, the 9900K is a 3-year-old CPU, and in the past 3 years we've seen more than just clock speed increases as architectures evolve, specifically around latency between the various sub-components. Sure, the overhead doesn't help, but old tech is old tech; even Alder Lake is built with a mix of new and legacy architecture.

I feel that AMD's 3D chip stacking is going to be another important leap.
 
Yep, looking forward to the new AMD CPUs; we could see a decent uplift.
 
While that's true, upgrading a CPU every 3 years when playing at 4K shouldn't really be necessary. Ideally, CPU bottlenecking at this resolution shouldn't be apparent until at least 5 years from release. The fact that I was seeing bottlenecking in so many games after just 3 years indicates that GPUs are far outpacing CPUs in generational performance improvements. Apparently, Ada Lovelace is going to deliver RTX 3090 performance in the RTX 4070, while the 4090 is 2x as fast as the 3090. If this is true, even the 12700K/12900K would be the bare minimum for running these GPUs. This is the reason I opted for the 12700K over the 12900K: if I have to upgrade frequently, it makes sense to get the best-value part. I had gone with the 9900K intending to run it for at least 5 years, which didn't really work out.
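A toy frame-time model makes that dynamic concrete: a frame can't finish faster than its slowest stage, so once GPU frame time drops below CPU frame time, extra GPU speed buys nothing. The numbers below are made up purely for illustration:

```python
# Toy model: per-frame cost is capped by the slower of CPU and GPU.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0  # hypothetical CPU frame time; roughly fixed per CPU and game
for gpu, gpu_ms in [("current GPU", 12.0), ("2x faster GPU", 6.0)]:
    bound = "GPU" if gpu_ms > cpu_ms else "CPU"
    print(f"{gpu}: {fps(cpu_ms, gpu_ms):.0f} FPS ({bound}-bound)")
# current GPU: 83 FPS (GPU-bound)
# 2x faster GPU: 125 FPS (CPU-bound) -- doubling the GPU gained only ~50%
```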
 
Fair, but like I said, I think it’s a case of CPU architecture remaining relatively stagnant until recently.

Newer engines and technologies are looking to shift CPU workloads to the GPU so this may become less of an issue in the near future.
 
It's not bottlenecking at 4K when you use DLSS Performance to drop down to 1080p. You're practically deceiving yourself if you really think you're playing a native 4K heavy game with DLSS Performance.

If you'd used DLSS Quality in Cyberpunk, the bottleneck would still shift to the GPU. In an ideal world, you want a GPU that can handle DLSS Quality at 4K for the best visual quality, especially in such games and on such hardware. You're making huge sacrifices to image quality by rendering 2 million pixels instead of 8 million pixels and calling it a 4K bottleneck. Simply put, it's not.
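For reference, here's the pixel math behind that comparison, a quick sketch using the standard DLSS 2.x scale factors (the mode-to-resolution mapping is documented; the script itself is just illustration):

```python
# Internal render resolutions for DLSS 2.x modes at a 4K output.
OUTPUT_W, OUTPUT_H = 3840, 2160
SCALES = {
    "Native":           1.0,
    "DLSS Quality":     2 / 3,
    "DLSS Balanced":    0.58,
    "DLSS Performance": 0.5,
}

for mode, s in SCALES.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    print(f"{mode:17s} {w}x{h}  ({w * h / 1e6:.1f}M pixels)")
# Native 4K is ~8.3M pixels; DLSS Performance renders 1920x1080, ~2.1M --
# the "2 million vs 8 million" comparison above.
```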
 
But I am also running RT Reflections, RT Shadows and RT Lighting (which is maxed out at Ultra), which should shift the bottleneck to the GPU even with DLSS Performance.
I could easily run DLSS Quality by turning off RT Lighting or Reflections, but IMO the slightly blurrier image is a worthwhile trade for those effects, as the end result looks better.

I do notice a bottleneck more in RT titles though, as it seems the calculations are heavy on the CPU as well.
 
Not quite sure why that is in some games - possibly due to having to mix RT and non-RT effects. Quake 2's path tracing is insanely GPU-heavy and barely touches the CPU.
 
It's mostly in open-world games, as they make up most of the games I play. If it's a game like Metro Exodus or Control, I hardly see much improvement except in 1% lows. In games like Cyberpunk and Watch Dogs Legion, the RT effects are already heavy on the CPU, as it's involved in building the BVH structure, but on top of that you have a lot of AI NPCs constantly spawning, in addition to the LOD management. Watch Dogs Legion and Cyberpunk are the worst in this regard.

In the city center district, for instance, on the 9900K @ 5GHz, just moving the mouse by 180 degrees saw GPU usage go down to 86% for 1-2 seconds as the NPCs spawned in, and it goes back up to 98% after some time. With the 12700K, CPU usage spikes up to 86%, but only for a split second, and there is no drop in GPU usage. I can say that these 3-year-old CPUs will not be able to handle ray tracing along with high NPC densities.

People keep making memes about Cyberpunk on consoles, but looking at how high the CPU usage is on even a 12700K, the devs were mad to try to get it running on the 2013 consoles. They literally had to empty the whole city to do it.
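The utilisation dip follows directly from the CPU briefly becoming the frame-time limiter. A minimal sketch of that relationship, with hypothetical numbers chosen to mirror the 98% -> 86% dip described above:

```python
# Toy model: when an NPC spawn burst pushes CPU frame time past the GPU's,
# the GPU sits partly idle and reported utilisation dips.
def gpu_utilisation(gpu_ms: float, cpu_ms: float) -> float:
    """GPU busy fraction when frame time is max(CPU, GPU)."""
    return gpu_ms / max(gpu_ms, cpu_ms)

gpu_ms = 12.0  # hypothetical steady GPU work per frame
for phase, cpu_ms in [("steady state", 11.0), ("NPC spawn burst", 14.0)]:
    print(f"{phase}: GPU ~{gpu_utilisation(gpu_ms, cpu_ms):.0%} busy")
# steady state: GPU ~100% busy
# NPC spawn burst: GPU ~86% busy
```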
 
just moving the mouse by 180 degrees saw GPU usage go down to 86% for 1-2 seconds as the NPCs spawned in, and it goes back up to 98% after some time.

oh my gawd how can one live with that, kill the CPU with fire, quickly! :cry:

Have fun having 1% lows below 70 FPS in BF 2042 with 100+ FPS averages; the next AMD/Intel CPU is waiting for you.

Easy game for devs and manufacturers ;) keep buying it ;)
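For anyone unfamiliar with the metric, "1% lows" are typically derived by taking the slowest 1% of frames in a capture and reporting the FPS they correspond to. A sketch with synthetic frame times (not real BF 2042 data):

```python
import random

# Synthetic frame-time capture: ~9.5 ms average, i.e. roughly 105 FPS.
random.seed(0)
frame_times_ms = [random.gauss(9.5, 1.5) for _ in range(10_000)]

frame_times_ms.sort(reverse=True)              # slowest frames first
worst_1pct = frame_times_ms[: len(frame_times_ms) // 100]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
low_1pct_fps = 1000 / (sum(worst_1pct) / len(worst_1pct))
print(f"average: {avg_fps:.0f} FPS, 1% low: {low_1pct_fps:.0f} FPS")
# A 100+ FPS average can still hide 1% lows around 70 FPS, as above.
```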
 
Far Cry 6 performance seems to have increased for RDNA2 by around 5% from the launch driver, 21.10.1 > 21.11.3, in two months. There have been a few game updates during that time as well, so some improvements could have come from those too.

21.11.3
[benchmark screenshot]

Vs
21.10.1
[benchmark screenshot]

The developers still need to fix the minimum FPS bug in the benchmark sequence though, where the FPS drops right at the end of the bench but the drop is not registered on the FPS graph.

As you can see in the 21.11.3 run, the lowest minimum FPS on the graph was 71 at the end, but the final screen shows 54.
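Two quick checks related to this post, with placeholder numbers since the real figures are only in the screenshots: the uplift arithmetic, and why taking the minimum straight from a frame-time trace would sidestep the final-screen discrepancy:

```python
# Percentage uplift between two benchmark runs (hypothetical averages).
old_avg, new_avg = 100.0, 105.0       # stand-ins for 21.10.1 vs 21.11.3
print(f"uplift: {new_avg / old_avg - 1:.1%}")        # 5.0%

# Minimum FPS derived from per-frame times, so it can't disagree with
# whatever the graph or the final screen chooses to display.
frame_times_ms = [13.9, 13.7, 14.1, 14.0, 13.8]      # hypothetical trace
print(f"min FPS: {1000 / max(frame_times_ms):.0f}")  # 71
```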
 
With FSR Ultra Quality at 4K, does your VRAM usage cross 12GB with the 6900XT? Although my stutter issue is resolved, after playing for an hour or more, when my VRAM usage nears 10GB, the game is switching to the low-resolution assets.
 
Yes, was playing for a few hours earlier and saw it go up to and beyond 13GB at points. Generally between 12-13GB.

EDIT

@Shaz12 - Sorry, I misunderstood; that was using native, not FSR UQ.
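For context on the VRAM question: FSR 1.0's published per-axis scale factors give these internal render resolutions at a 4K output (the factors are AMD's; the script is just a sketch), and VRAM use scales with render resolution as well as texture quality:

```python
# FSR 1.0 internal render resolutions at a 4K output.
OUTPUT_W, OUTPUT_H = 3840, 2160
FSR_SCALE = {"Ultra Quality": 1.3, "Quality": 1.5,
             "Balanced": 1.7, "Performance": 2.0}

for mode, s in FSR_SCALE.items():
    w, h = round(OUTPUT_W / s), round(OUTPUT_H / s)
    print(f"FSR {mode:13s} renders at {w}x{h}")
# Ultra Quality at 4K renders ~2954x1662 -- between native 4K and the
# DLSS Quality pixel count, so its VRAM demand sits closer to native
# than the lower FSR modes.
```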
 
New patch for Far Cry 6. Title Update 3 Deploying December 7th (ubisoft.com)

I bolded the important part. Let us never forget the 'bug' that 'will be fixed.' :cry: @gpuerrilla
HD Texture Pack -- Some assets appearing blurry

Developer comment: We have made some changes for the HD Texture Pack on PC that should decrease the blurriness that appeared for some players when using the HD Texture Pack. When looking into these reports, we are seeing players using graphics cards with less than 12 GB of VRAM available. When using the HD Texture Pack with less than the minimum required VRAM available, the performance and the look of the game can be worse than without the pack.
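What the developer comment describes is a standard texture-streaming fallback: when a texture's mip chain won't fit in the remaining VRAM budget, the streamer keeps a lower-resolution mip resident rather than overcommitting. A generic sketch of that heuristic, not the actual Far Cry 6 / Dunia engine logic:

```python
# Generic texture-streaming fallback sketch (illustrative only).
def mip_chain_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Approximate size of a full mip chain (~4/3 of the top mip)."""
    return width * height * bytes_per_pixel * 4 // 3

def resident_mip(width: int, height: int, budget_bytes: int) -> tuple[int, int]:
    """Halve resolution until the mip chain fits the remaining budget."""
    while mip_chain_bytes(width, height) > budget_bytes and width > 1:
        width, height = width // 2, height // 2   # drop one mip level
    return width, height

# A 4096x4096 HD-pack texture needs ~85 MiB with mips; with only 30 MiB of
# budget left, the streamer falls back to a blurrier 2048x2048 (~21 MiB).
print(resident_mip(4096, 4096, budget_bytes=30 * 1024**2))  # (2048, 2048)
```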
 
Let us never forget the fingers-in-ears approach, which is still going on, it seems :D :cry:

Think my Ubi Connect is still active; if so, I'll give it a whirl to see what is what. As I mentioned earlier, once I moved away from the starter area, there were no issues with low-res textures being stuck on my guns or trees, car hood and dashboard etc. (note the important point in bold: regardless of res and settings).

Not sure "blurriness" is the word I would use to describe the problem either; it was outright low-res textures being stuck and not swapping in the HD ones...
 