Far Cry 6 GPU performance not bad at all but is severely bottlenecked by CPU

Interesting stuff all around, really.

I can't believe the 1080 Ti is way below the RTX 2060, and that the 5700 XT can perform like a 3060 Ti at 1080p.

Nvidia cards perform very strangely from one game to the next; weird.
 
We may as well just cancel PC gaming at this stage; Far Cry 6 and AMD have ruined it all.

Not even an RTX-branded goat simulator could make up for all these inferior console-budget legacy games.

Nvidia needs to make a new RTX Quake to redeem PC gaming.
 
OK, so I have installed it and played for a while. My settings are 1440p Ultra with ray-traced reflections turned on and HD textures. My 5950X never went above 53% overall CPU usage, but peak core usage was 100%. My 3080 TUF OC also hit 100%, with VRAM usage at 7.3 GB.

Max FPS was 101, minimum 80, average 92. The game looks lovely and I think it's going to be a lot of fun. FYI, this was approximately one hour of gameplay.
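A side note on that 53% overall / 100% peak core reading: overall CPU usage can look low while the game's main thread is completely saturated. Below is a minimal sketch of how you could log that yourself with Python's psutil package (my choice of tool, not anything the game requires; Task Manager's logical-processor view shows the same thing).

```python
# Minimal sketch: compare overall vs per-core CPU usage to spot a single-core bottleneck.
# Assumes the psutil package is installed (pip install psutil); run it while the game is running.
import psutil

for _ in range(60):  # one sample per second for roughly a minute
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    overall = sum(per_core) / len(per_core)
    hottest = max(per_core)
    print(f"overall {overall:5.1f}%  hottest core {hottest:5.1f}%")
    # A hottest core pinned near 100% with low overall usage points to a
    # main-thread (single-core) limit rather than an all-core CPU limit.
```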
 
Nice to see the usual Nvidia trolls out to claim it's crap based purely on the fact it is AMD sponsored. These are the same people who claim Cybercrap 2077 is "next gen".

At least Far Cry 6 has water physics, and from what I can see it actually looks very nice graphically. I'm seeing plenty of reviews calling it visually stunning. Some others are saying it is mediocre, but all that shows is that people have subjective views on this. I personally found CP2077 very mediocre and not once did I think "wow, this is next level graphically". I found Watch Dogs: Legion much more realistic graphically.

Guru3D
"It's important to remember that visual quality is something you want and prefer while playing on a computer rather than a console, else you'd be playing on a console. Far Cry 6 really delivers in this regard; it's a visually appealing game, especially at high-quality and ultra settings."

TechRadar
"Far Cry 6 is a visually stunning title. As mentioned, Yara is an incredibly beautiful place to explore. No matter where you are on the island paradise, you can easily turn your immediate surroundings into a gorgeous screenshot".

I for one am glad it is not relying on ray tracing to become just acceptable graphically, like we see with CP2077. I don't find people saying "I can barely tell when RT is on or off" to be a negative.
It's the same few with every AMD-sponsored title/thread; that's why I have some people on ignore, because the whole discussion gets sidetracked with nonsense due to fanboyism.

I don't really understand what the big deal is. There are a few games where the 3080 has to turn down a few settings here and there, or performance suffers due to lack of video memory.

Even 6900 XT and 3090 users have to turn settings down occasionally, just for different reasons. Life goes on and it's not the end of the world.

Anyway, here's the 6900 XT running at 4K true maximum settings, averaging 75 FPS with a minimum of 60. It seems most videos on YouTube are not truly running max settings with FidelityFX CAS on.

It runs very well indeed, with none of the FPS drops down to single digits or texture issues that have been documented elsewhere on GPUs with less than 12 GB of memory at 4K maximum settings.


Here is 1440P.

There actually aren't many decent comparison videos on YouTube yet; most people are turning certain settings off (texture pack, ray tracing and/or FidelityFX CAS), but here is Joker's 3080 Ti running at what appears to be 4K maximum settings.


I will put up some actual gameplay next, which should be a bit more demanding than the short built-in benchmark sequence.
 
From what I've seen so far, FSR on Ultra Quality looks better than plain native 4K. It's probably just the sharpening pass making it look nicer than TAA, but either way it looks better to me, so I'll be playing at 4K + FSR UQ + Ultra settings + RT on + HD texture pack.
 
Try enabling CAS when running native; that should give the best IQ.
 
This appears to back up your findings somewhat; it looks like the FSR implementation is excellent in Far Cry 6.
Far Cry 6 tech analysis: AMD’s FSR Evolved – Coreteks
 
I get about 60 fps in the benchmark at 4K, so I put FSR on Ultra; it looks pretty good IMO. The only problem is that in the opening section of the game I am seeing heavy CPU bottlenecks at only 60 fps: over 90% on a single core while barely hitting 30% overall :|
 
If you are truly CPU limited, pay more attention to your GPU utilisation rather than CPU utilisation, whether per core or spread over multiple cores. GPU utilisation should be consistently above 95% if you are not CPU/API/driver/game-engine limited.

I see a limitation at 1440p, but at 4K GPU utilisation is at 99% for the most part in actual gameplay.
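If you want numbers rather than an overlay, here is a rough sketch of polling GPU utilisation from Python; it assumes an Nvidia card and the nvidia-ml-py (pynvml) bindings, which is just one way to read the counter, not anything specific to Far Cry 6.

```python
# Rough sketch: poll GPU utilisation to check whether the GPU is the limiting factor.
# Assumes an Nvidia GPU and the nvidia-ml-py package (pip install nvidia-ml-py);
# on AMD you would read the equivalent counter from the vendor's own tooling.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(60):  # one sample per second for roughly a minute
    util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
    verdict = "GPU-bound" if util >= 95 else "possibly CPU/API/driver/engine limited"
    print(f"GPU utilisation {util:3d}%  -> {verdict}")
    time.sleep(1)

pynvml.nvmlShutdown()
```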
 
I have FSR Ultra Quality on, so I'm not sure what internal resolution it would be using for 4K, but it's very clearly CPU limited. I guess I could bang it on 4K native...

Here's a particularly bad spot (washed out look due to HDR):
(Screenshot: gXGUkuV.jpg)
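For what it's worth, FSR 1.0 uses fixed per-axis scale factors for its quality modes (1.3x for Ultra Quality, 1.5x Quality, 1.7x Balanced, 2.0x Performance, per AMD's published FSR 1.0 documentation), so the internal resolution can be estimated even though the game doesn't expose it. A quick sketch of that arithmetic, assuming Far Cry 6 follows the standard factors:

```python
# Sketch: estimate the FSR 1.0 internal render resolution from the output resolution.
# Scale factors are the published FSR 1.0 per-axis values; assuming Far Cry 6 follows them.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_internal_resolution(width, height, mode):
    scale = FSR_SCALE[mode]
    return round(width / scale), round(height / scale)

for mode in FSR_SCALE:
    w, h = fsr_internal_resolution(3840, 2160, mode)
    print(f"{mode:>13}: {w} x {h}")

# Ultra Quality at 4K comes out to roughly 2954 x 1662, which is why a CPU limit can
# still show up: the CPU-side frame cost barely changes with render resolution.
```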
 
Furry muff, I was running native and Ultra settings + HD Texture pack + RT + CAS enabled.

I expect once you get past the intro bit your utilisation will climb back up. There are lots of scripted events in the intro, but that changes once you get into the open game world.

I saw some utilisation drops in that intro too, so it's normal. It pegs at 99% once that bit is over.

The game looks so much better with HDR on; I disabled it when I did my videos earlier.
 
CPU bottlenecks with a 5800X at 1440p? Wow, crazy, and near 60 fps? :confused: Why does this happen? Bad port, bad Ubisoft. Damn you, Bugsoft.

So if a Ryzen 5800X gets CPU bound at 60 fps, does that mean a Ryzen 3700X gets CPU bound at around 45 fps? 45 fps in an FPS game must be rough. Damn you, Bugsoft, damn you.
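For what it's worth, that 45 fps figure only follows if you assume the CPU-bound frame time scales inversely with single-thread performance and that a 3700X delivers roughly 75% of a 5800X's single-thread throughput; both are illustrative assumptions, not measurements, but the arithmetic looks like this.

```python
# Back-of-the-envelope sketch: estimate CPU-bound FPS on a slower CPU by assuming the
# CPU-side frame time scales inversely with single-thread performance.
# The 0.75 ratio is purely illustrative, not a measured 3700X-vs-5800X figure.
measured_fps = 60          # CPU-bound FPS observed on the faster CPU (5800X)
relative_st_perf = 0.75    # assumed single-thread performance of the slower CPU

frame_time_ms = 1000 / measured_fps                       # ~16.7 ms per frame
scaled_frame_time_ms = frame_time_ms / relative_st_perf   # ~22.2 ms per frame
estimated_fps = 1000 / scaled_frame_time_ms               # ~45 fps

print(f"Estimated CPU-bound FPS on the slower CPU: {estimated_fps:.0f}")
```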
 
Nvidia trolls...... ah yes, let's resort to the usual nonsense :D

- There is nothing wrong with the game itself; if you like FC games in general you'll like this, since it's the same formula with a different skin
- Graphics are good on the whole, but the problem is: does it really look much better than New Dawn, and how long ago did that come out? Then again, that's no surprise since the game was also made for last-gen consoles
- AMD might not be holding back "gaming", but they are most definitely holding back "next-gen graphical features" such as ray tracing; if you can't understand/see why, then oh dear... thankfully every technical site understands this
- Ubisoft have acknowledged the problems with the game and are looking into them, and we have plenty of cases/footage showing multiple cards having texture issues, as well as cases where the FPS completely drops regardless of VRAM (again, see Joker's video with a 3080 Ti). So until more in-depth analysis comes, no one can say it is purely because the game needs more VRAM. It may well play smoother on cards with more memory (TechPowerUp did say it felt smoother on cards with 11 GB of VRAM when not using FSR at max settings @ 4K), but there is no good reason FPS should be dropping to 5 fps on GPUs, even ones with more than 10 GB of VRAM

I wonder if one potential issue could be Nvidia's Resizable BAR setting; perhaps they have it enabled for this game, but it might actually work better turned off here.

It's the same few with every AMD-sponsored title/thread; that's why I have some people on ignore, because the whole discussion gets sidetracked with nonsense due to fanboyism.

I don't really understand what the big deal is. There are a few games where the 3080 has to turn down a few settings here and there, or performance suffers due to lack of video memory.

The problem is when people like yourself keep insisting there are problems when there aren't, and even when proved wrong time and time again, you keep your fingers in your ears and post one-liners without substantial evidence, e.g. saying the 3080 couldn't perform as well in Godfall and RE Village due to its 10 GB of VRAM in the 10 GB VRAM thread. However, when presented with various pieces of evidence, it turned out that the 3080 had no performance issues and took the lead in comparisons with ray tracing turned on, with the 6800 XT/6900 XT struggling once ray tracing was enabled in Godfall (at max settings, no FSR, 4K).
 
Why is there no DLSS in this game??? DLSS builds on TAA, no? These games always have TAA, and I'm pretty sure it's a day's work for Ubisoft to add DLSS; they even added DLSS to a stupid old competitive shooter like R6 Siege. Is AMD blocking DLSS implementation in games? Or is it a conspiracy, or just marketing deals? Why can't I use something I paid for?
 

AMD sponsored....
 
Have Nvidia even launched a Game Ready driver for this yet? Mine shows 20th Sept as the last update. Looking forward to trying this, but I'll probably wait for a patch or two. Didn't even realise it was released already :)
 
I hope we can see more Nvidia-sponsored games then. I look forward to Battlefield 2042; I think it will have DLSS at full launch, but no ray tracing... weird, considering BFV was the first big game to incorporate ray tracing.

BTW, I see reports of huge problems on GeForce Now. I think the game's CPU demands hurt those rigs badly, since they use 3.5 GHz CPUs.
 
If the FSR implementation is good, does it really need DLSS? Both would be nice ofc, but I can't see it happening :p
 