Far Cry 6 GPU performance is not bad at all, but it is severely bottlenecked by the CPU

AMD were cheaper while Intel were better, at least until Zen 2; once Zen 3 came out, AMD were better but not always cheaper.

If no one buys Intel and everyone just buys AMD, then it won't be long before AMD start gimping their CPUs and charging extra for things like overclocking or SMT.



You pay more for Intel boards as the pins are on the board, so with AM5 the boards won't be as cheap as you're used to.

Is that all it is? How much does the socket cost?
 
I buy whatever is better/cheaper at the time. Intel have been behind for the last few years, especially when it comes to low/mid-budget CPUs; even CPUs like the 9600K aged badly compared to their AMD counterpart, the 3600. The Ryzen 2600 aged fairly decently as well, and it could already be found for under 100 quid years ago. Two years ago you needed to spend almost 400 quid on the Intel side just for the "privilege" of having a CPU with Hyper-Threading, while you could get that on an 80 quid AMD CPU.
 
The 10400F/11400F have been very solid budget options over the past couple of years.
 
I have had a mixture when it comes to GPUs. For a long while it was only AMD for me, but then they stopped being as competitive. There was a time when AMD was always cheaper for the same performance, so I never bothered with Nvidia.

As for CPUs, I started with AMD and again was exclusively AMD for five or six CPUs, all the way until the Core 2 Duo came out. That was the time when AMD had a big market share and were charging silly money for their CPUs (this is why I know that as soon as they are in a position to do so, they will pound you from behind just as hard as Nvidia). They properly milked the FX line as I recall, so I never got an FX CPU and went Intel. With Intel I had the Core 2 Duo, then the Q6600 I think it was, then a Sandy Bridge for a short while, then Haswell with the 4770K, which I had overclocked to 4.7GHz and which lasted me around six or so years!

Back to AMD now again with the R5 3600 and now the R9 5900X, which I think I will keep for a long time and then hand down. I may potentially sell it and get the 3D Cache version at some point, since it's on the same socket, if I need the performance.

Always been an AMD GPU person, but the last AMD GPUs that were any good/competitive were the Vegas IMO; the Vega 56 was solid when undervolted and overclocked and is still a great card today. RDNA 2 is very good too, and if the stock and pricing situation had been better I probably would have picked one up at the time, but it wasn't, so a £650 3080 was too good to refuse. That, and ray tracing and DLSS have come a long way since then, so no regrets with the green side for now. My next GPU purchase will be based on price to performance as usual, followed by ray tracing performance and features such as DLSS/FSR/Intel's equivalent. Unless RDNA 3/40xx is substantially better and ray tracing is pushed further in games, I'll probably give them a miss; it will also depend largely on how much I could sell the 3080 for.

CPU-wise, AMD have had me since the Ryzen 2600 launched, and I can't see myself going back to Intel any time soon.
 
What I have been finding is that it is better to just sell and get the latest stuff on release. People are happy to pay near full price for old GPUs (even before the shortage), so I say you might as well sell, add a hundred or so on top and get the latest gear. It is always fun having and testing a shiny new GPU, so the joy from that alone is worth the cost, I find. That's my plan anyway. More than likely I will end up getting a 4070 or 4080 FE, unless AMD get their **** together and offer me an option to buy at MSRP.
 
Something is definitely wrong with the VRAM allocation in this game. In the video below, running a 3090, the game is using 10GB right off the bat, never goes below that for the entirety of his run, and peaks at 11GB several times. It didn't cross 12GB at all.


On my 3080 Ti, despite having 12GB of VRAM, the game refuses to use over 9.5GB with FSR UQ and there is constant stuttering. It's like the headroom reserved for VRAM is set too aggressively. I think a proper solution would be for a modder to find a way to trick the game into thinking the user has more VRAM, so it can use the entirety of the GPU's VRAM instead of streaming from disk while almost 3GB is still free.
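For anyone who wants to check this on their own card, here is a minimal logging sketch, assuming the pynvml package (NVIDIA's NVML bindings for Python) is installed; an overlay such as Afterburner/RTSS reports the same counters. Note that NVML reports whole-GPU memory in use, not just the game's own allocation.

```python
# Minimal sketch: log GPU memory in use once a second while the game runs.
# Assumes the pynvml package is installed and an NVIDIA GPU is at index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM in use: {mem.used / 1024**3:.1f} GB / {mem.total / 1024**3:.1f} GB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

The allocated-versus-actually-needed distinction may also be part of why different cards show such different numbers at the same settings: a title that sizes its streaming pool from total VRAM will simply allocate more on a 24GB card than on a 12GB one.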
 

[Screenshot: zqINlhY.jpg]

My observations:

1. He's not running maximum settings, motion blur is off.
2. He has FSR enabled, which lowers power draw and video memory usage, as has been shown here vs here (rough numbers below this list).
3. He has a streaming PC, so this is the best-case scenario for FPS and lower video memory usage, as recording is not done on the same PC.
4. I can put up a video (if required) at the same settings, recorded on the same PC, and I will get higher video memory usage and higher FPS.
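On the FSR point, a rough back-of-the-envelope calculation of the internal render resolutions (using FSR 1.0's published per-axis scale factors; treat these as approximate) shows why upscaling trims VRAM use somewhat:

```python
# Approximate FSR 1.0 internal render resolutions for 4K output.
# Per-axis scale factors: Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x.
out_w, out_h = 3840, 2160
modes = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

for name, scale in modes.items():
    w, h = round(out_w / scale), round(out_h / scale)
    pct = 100 * (w * h) / (out_w * out_h)
    print(f"{name:13s}: {w}x{h} (~{pct:.0f}% of the 4K pixel count)")
```

Only the render targets shrink, though; textures and geometry stay the same size, which is why the saving is noticeable but not huge.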
 
^^

It's been a long day, but I'm not sure what relevance most, if any, of those points have with regard to the point shaz is making. They aren't going to hamper VRAM usage "that" much, and FSR does not make much of a dent in VRAM usage either, unless you're using the lowest setting.

1. He's not running maximum settings, motion blur is off.

Really................ :o Talk about clutching at straws now :cry:
 
The game uses 13GB on my 3090.
 
I replicated the same settings. Turned motion blur off and turned on FSR Ultra Quality.

I am not disputing that he may be getting lower VRAM usage. What I am wondering is why my VRAM usage is much lower than his at the same settings. Why doesn’t the game use at least 10GB out of my VRAM? I played 2 hours with these same settings and the highest my VRAM usage got was 9.5GB.

What I actually want is for the game to use more of my VRAM.
 
From a quick skim of the video, it seems like they did a really good job with the textures in certain areas. Overall it is a good-looking game. The bump map on the water works really well. (I assume they are not using an actual displacement map, to save on triangle count.)

What about in game? Were you playing on the same part of the map? Did you have access to the same guns, equipment, clothing, vehicles, etc.?

I believe that models and effects for guns and equipment that a player has access to will be cached.

Edit: I once heard that modern GPUs could actually push crazy high triangle counts if they wanted to. Does anyone know if this is true?
 
Not the same area or the same equipment. But it doesn't matter, because VRAM usage is above 10GB in pretty much all scenarios for these 16GB cards.

The video below is from the starting area of the game, where usage is 11.6GB on the 3090. In the same spot, with the same equipment and the same graphics settings but with FSR disabled, I only get 9.8GB of usage.

 
Nothing official, but some people have said the FPS is up to 10% higher. If this is a sign of things to come, it means it will take two years before Far Cry 6 runs as well on Nvidia cards as it does on Radeon cards. But one day it will.

I pop on for the daily missions for the opals. I wondered why my FPS overlay was looking strong and thought it was just gradual improvement from patches. I did also update the chipset driver because of the recent vulnerability warning, but it makes sense that it was down to the Nvidia driver, which I updated before playing FC6.
 