Far Cry 6 GPU performance not bad at all but is severely bottlenecked by CPU

I've come to the conclusion after a few hours on Far Cry 6 that ray tracing in these action-packed games is kinda pointless. I don't spend much time looking at ray-traced shadows and reflections when I'm out gunning down the enemy; what is important to me is FPS! Whereas in a game with a lot of puzzle solving or looking around areas, like an RPG, I would spend more time looking at the environment.

Wouldn't be forming opinions on RT based on FC 6 :p Probably the worst implementation to date imo.

It works and looks lovely in plenty of action games, my favourites being:

- Cyberpunk
- The Ascent
- Metro
- Control

Exploring the Cyberpunk world without RT on just wouldn't be the same.

To be fair, devs are probably being told/forced to shoehorn it in so that Nvidia can showcase it as another RT title. Like you say, in reality it should only be used if the setting needs it. Too much mirror/shine/rain can end up spoiling a game.

A lot of developers want to use it where they can, given how much easier and quicker it makes a developer's life, not to mention it is much more "next-gen" in terms of visuals. Even in a lot of non-Nvidia/AMD-sponsored titles, ray tracing is being added where possible. With FC 6, though, it definitely comes across like AMD forced Ubi to add it just so that AMD could say that they also "can do RT".

Also thought that to myself with some areas of Cyberpunk, but tbh, when you're out and about and look at how puddles and glass-building reflections really are in real life, it's extremely accurate.
 
To be fair, devs are probably being told/forced to shoehorn it in so that Nvidia can showcase it as another RT title. Like you say, in reality it should only be used if the setting needs it. Too much mirror/shine/rain can end up spoiling a game.

With this release AMD is also doing the same thing though. What is the purpose behind the textures being so high resolution that effectively only the AMD cards can use them, while those of us with Nvidia hardware and a very respectable 10-12GB are having to look at potato textures with only 6GB of utilisation? A 10-12GB requirement for the HD pack would have ensured compatibility with both AMD and Nvidia hardware, and I am sure the 4GB worth of detail being lost isn't going to result in any meaningful differences outside of pixel peeping.

They could at least have released a lite version of the HD pack with slightly lower resolution textures, like the modders are doing right now.
 
With this release AMD is also doing the same thing though. What is the purpose behind the textures being so high resolution that effectively only the AMD cards can use them, while those of us with Nvidia hardware and a very respectable 10-12GB are having to look at potato textures with only 6GB of utilisation? A 10-12GB requirement for the HD pack would have ensured compatibility with both AMD and Nvidia hardware, and I am sure the 4GB worth of detail being lost isn't going to result in any meaningful differences outside of pixel peeping.

They could at least have released a lite version of the HD pack with slightly lower resolution textures, like the modders are doing right now.
AMD would not be happy if they did though ;)
 
With this release AMD is also doing the same thing though. What is the purpose behind the textures being so high resolution that effectively only the AMD cards can use them, while those of us with Nvidia hardware and a very respectable 10-12GB are having to look at potato textures with only 6GB of utilisation? A 10-12GB requirement for the HD pack would have ensured compatibility with both AMD and Nvidia hardware, and I am sure the 4GB worth of detail being lost isn't going to result in any meaningful differences outside of pixel peeping.

They could at least have released a lite version of the HD pack with slightly lower resolution textures, like the modders are doing right now.


The game looks "fine" without the HD pack. Any low-resolution textures right now are simply a bug that will hopefully be fixed at some point. You are basically asking for a non-HD version of an HD pack, which honestly doesn't make any sense; this texture pack was clearly made with high-end GPUs in mind, and everyone else can simply not install it.
 
The game looks "fine" without the HD pack. Any low-resolution textures right now are simply a bug that will hopefully be fixed at some point. You are basically asking for a non-HD version of an HD pack, which honestly doesn't make any sense; this texture pack was clearly made with high-end GPUs in mind, and everyone else can simply not install it.
Yup, optional.
 
AMD would not be happy if they did though ;)
:D:D

The game looks "fine" without the HD pack. Any low-resolution textures right now are simply a bug that will hopefully be fixed at some point. You are basically asking for a non-HD version of an HD pack, which honestly doesn't make any sense; this texture pack was clearly made with high-end GPUs in mind, and everyone else can simply not install it.

Watch Dogs Legion, another Ubisoft title, also had a similar HD pack, and that only needed 10GB of VRAM because it was an NVIDIA-sponsored game.

System requirements for Watch Dogs: Legion | Ubisoft Help

This is just AMD/NVIDIA sniping at each other through sponsored titles. Even the PS5 and XSX are running this same pack, so it doesn't make sense that you need a high-end GPU to run it. The XSX is actually running the game at near-native 4K. Does it seem normal to you that NVIDIA hardware is the only hardware incapable of running this pack?
 
:D:D



Watch Dogs Legion, another Ubisoft title, also had a similar HD pack, and that only needed 10GB of VRAM because it was an NVIDIA-sponsored game.

System requirements for Watch Dogs: Legion | Ubisoft Help

This is just AMD/NVIDIA sniping at each other through sponsored titles. Even the PS5 and XSX are running this same pack, so it doesn't make sense that you need a high-end GPU to run it. The XSX is actually running the game at near-native 4K. Does it seem normal to you that NVIDIA hardware is the only hardware incapable of running this pack?


Nvidia hardware is running the pack just fine going by 3090 benchmarks; other Nvidia cards simply lack VRAM (which loads of people on this very forum had been predicting would happen for over a year). If there is anyone to blame it would be Nvidia for gimping their own cards. The HD texture pack is literally OPTIONAL, so I genuinely can't understand the issue.
 
Nvidia hardware is running the pack just fine going by 3090 benchmarks; other Nvidia cards simply lack VRAM (which loads of people on this very forum had been predicting would happen for over a year). If there is anyone to blame it would be Nvidia for gimping their own cards. The HD texture pack is literally OPTIONAL, so I genuinely can't understand the issue.
The 3090 is not really a pure gaming card. I don't think 12GB of VRAM is low at all. Also, the way NVIDIA's GDDR6X memory configurations work, they cannot have 16GB on these cards; they would need to go from 12GB to 20GB, which would significantly push up the MSRP of their cards and in turn the scalped prices.
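
(As a side note, here is a rough back-of-the-envelope sketch of why those capacity steps are so coarse: each memory chip hangs off a 32-bit slice of the bus, so a given bus width only allows multiples of the per-chip capacity, or double that in a clamshell layout. The 320/384-bit bus widths and the 1GB-per-chip figure below are my own illustrative assumptions, not something from this thread.)

```python
# Rough sketch: VRAM capacity options for a given memory bus width.
# Assumptions (illustrative): one chip per 32-bit channel, 1 GB (8 Gb)
# GDDR6X chips, with an optional clamshell layout doubling capacity.

def vram_options(bus_width_bits, chip_gb=1):
    chips = bus_width_bits // 32                   # one memory chip per 32-bit channel
    return chips * chip_gb, chips * chip_gb * 2    # single-sided, clamshell

for bus in (320, 384):
    single, clamshell = vram_options(bus)
    print(f"{bus}-bit bus -> {single} GB or {clamshell} GB")

# 320-bit bus -> 10 GB or 20 GB
# 384-bit bus -> 12 GB or 24 GB
# i.e. nothing in between, which is why a 16GB option never lines up.
```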

I am playing the game with the pack off but I don’t understand what’s preventing the game from running on 12GB cards when the PS5 and XSX can run it.

The XSX has 16GB of GDDR6 memory, of which only 10GB is fast memory on a 320-bit interface, while 6GB operates over a 192-bit interface. In other words, the XSX uses 10GB of fast memory for VRAM, and 6GB of slower memory for OS purposes plus game operations. The game will avoid going over 10GB of VRAM, because if that happened, memory accesses would start hitting the slower pool. So when the game is running on this setup with no blurry textures or stuttering, why can't it run on NVIDIA hardware with 12GB of VRAM?
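
For reference, a quick worked calculation of what that split bus means in bandwidth terms. The 14 Gbps GDDR6 data rate is the publicly quoted Series X figure; the arithmetic itself is just bus width × data rate ÷ 8.

```python
# Bandwidth of the Xbox Series X split memory pools.
# Assumption: 14 Gbps GDDR6 (the publicly quoted figure for Series X).

DATA_RATE_GBPS = 14  # per-pin data rate in Gbps

def pool_bandwidth(bus_width_bits):
    # bus width (bits) * data rate (Gbps) / 8 bits-per-byte = GB/s
    return bus_width_bits * DATA_RATE_GBPS / 8

print(f"10 GB pool (320-bit): {pool_bandwidth(320):.0f} GB/s")  # 560 GB/s
print(f" 6 GB pool (192-bit): {pool_bandwidth(192):.0f} GB/s")  # 336 GB/s
```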

I am all for forward-looking graphics settings. Psycho lighting in CP2077 is unachievable at 4K even on a 3090. But the high requirements should have some solid basis, which is missing here for the high-res pack.
 
The 3090 is not really a pure gaming card. I don't think 12GB of VRAM is low at all. Also, the way NVIDIA's GDDR6X memory configurations work, they cannot have 16GB on these cards; they would need to go from 12GB to 20GB, which would significantly push up the MSRP of their cards and in turn the scalped prices.

I am playing the game with the pack off but I don’t understand what’s preventing the game from running on 12GB cards when the PS5 and XSX can run it.

The XSX has 16GB of GDDR6 memory, of which only 10GB is fast memory on a 320-bit interface, while 6GB operates over a 192-bit interface. In other words, the XSX uses 10GB of fast memory for VRAM, and 6GB of slower memory for OS purposes plus game operations. The game will avoid going over 10GB of VRAM, because if that happened, memory accesses would start hitting the slower pool. So when the game is running on this setup with no blurry textures or stuttering, why can't it run on NVIDIA hardware with 12GB of VRAM?

I am all for forward-looking graphics settings. Psycho lighting in CP2077 is unachievable at 4K even on a 3090. But the high requirements should have some solid basis, which is missing here for the high-res pack.
You can't compare the consoles, since they are not using a native resolution and they are not using Ultra settings with all the extras those add.
 
Isn't the game using dynamic resolution on the consoles to begin with, making any direct comparison a bit silly? A PC will be running the game at a locked 4K resolution, so it will always be more demanding. Also, consoles don't have ray tracing, which I am assuming uses a decent amount of extra VRAM on the PC.
What makes you think a 3090 is not a "pure gaming card" but a 3080 Ti is? At the end of the day they are both overpriced GPUs that any consumer (gamer in this case) can buy if they want to.
I still think Nvidia is the only one to blame for not putting more VRAM on the 3080 to begin with, especially when you compare it directly with the 6800 XT, and then messing up again with the 3080 Ti. You brought up the 3080 Ti MSRP as if it's cheap to begin with; it's like 80% more expensive than a 3080...
 
3080 Ti MSRP as if it's cheap to begin with; it's like 80% more expensive than a 3080...

Well said. That's exactly why I think it is good they went with 10GB for the 3080, as they would have charged a lot more if they designed a card that had more. It's not like Nvidia don't provide options for those who want more VRAM. As far as I am concerned this is the first game that has had issues with VRAM (no, I don't count Godfall, Matt :p) and that is because it is AMD-sponsored. Had it been Nvidia-sponsored it would have worked just fine with 10GB, as the HD texture pack would have been optimised for that and not 16GB.

Let's see how many more games like this we will get between now and the next-gen cards. I remember estimating a handful of games that I would want to play back when the card came out and the debate started.
 
You can't compare the consoles, since they are not using a native resolution and they are not using Ultra settings with all the extras those add.
DF has confirmed that the Xbox version is using High settings and operates at 1872p-2160p. But I can still see blurry textures at 1440p High settings with the HD pack, even with ray tracing off; they just take longer to appear. The only way to get rid of them is to turn on FSR Ultra Quality, but even at 1440p the game stutters with FSR.
Isn't the game using dynamic resolution on the consoles to begin with, making any direct comparison a bit silly? A PC will be running the game at a locked 4K resolution, so it will always be more demanding. Also, consoles don't have ray tracing, which I am assuming uses a decent amount of extra VRAM on the PC.
What makes you think a 3090 is not a "pure gaming card" but a 3080 Ti is? At the end of the day they are both overpriced GPUs that any consumer (gamer in this case) can buy if they want to.
I still think Nvidia is the only one to blame for not putting more VRAM on the 3080 to begin with, especially when you compare it directly with the 6800 XT, and then messing up again with the 3080 Ti. You brought up the 3080 Ti MSRP as if it's cheap to begin with; it's like 80% more expensive than a 3080...

I can go down to 1440p, well below the consoles' render resolution of a minimum 1872p, turn off ray tracing, and it still stutters/shows blurry textures with the HD pack.

The 3080 MSRP was just a myth, set in a market where Nvidia was clueless about market conditions. I tried to get one for months on end and it never worked, no matter how many Discord groups I joined or Twitter alerts I signed up for. Almost all of the 3080s are selling for double the MSRP in the real world and are close to 3080 Ti pricing; the 3080 Ti is realistically about 10-20% more expensive. Had they increased the MSRP by giving the 3080 more RAM, it would have gotten even worse than it is now. Watch how Nvidia does precisely what you are saying with the 3080 Super in early 2022: give it 12GB of RAM, price it at £999, and make all these cards seem worse value relative to the 3080 Ti.

The 3090 is just obscenely priced. Not a single card below £2,500, and you are paying that simply for the 24GB of VRAM, which is ludicrous for gaming. By the time you get close to saturating that VRAM, the card would already need settings turned down anyway. The 3080 Ti provides virtually the same performance for £1,600. It doesn't make any sense to buy the 3090 for gaming at all.

Looking at current market conditions, the 3080 Ti is actually the only card at the top end which is reasonably close to MSRP and provides the most balanced performance. Using GDDR6X, 12GB is the best Nvidia can sell for gaming without jacking up the 3080 pricing to 3090 levels in the real world, and that extra 2GB isn't material enough at all, as proven by this game. All it will be doing is proving how the 3080 Ti is a better buy than the 3080 Super.
 
DF has confirmed that the Xbox version is using High settings and operates at 1872p-2160p. But I can still see blurry textures at 1440p High settings with the HD pack, even with ray tracing off; they just take longer to appear. The only way to get rid of them is to turn on FSR Ultra Quality, but even at 1440p the game stutters with FSR.


I can go down to 1440p, well below the consoles' render resolution of a minimum 1872p, turn off ray tracing, and it still stutters/shows blurry textures with the HD pack.

The 3080 MSRP was just a myth, set in a market where Nvidia was clueless about market conditions. I tried to get one for months on end and it never worked, no matter how many Discord groups I joined or Twitter alerts I signed up for. Almost all of the 3080s are selling for double the MSRP in the real world and are close to 3080 Ti pricing; the 3080 Ti is realistically about 10-20% more expensive. Had they increased the MSRP by giving the 3080 more RAM, it would have gotten even worse than it is now. Watch how Nvidia does precisely what you are saying with the 3080 Super in early 2022: give it 12GB of RAM, price it at £999, and make all these cards seem worse value relative to the 3080 Ti.

The 3090 is just obscenely priced. Not a single card below £2,500, and you are paying that simply for the 24GB of VRAM, which is ludicrous for gaming. By the time you get close to saturating that VRAM, the card would already need settings turned down anyway. The 3080 Ti provides virtually the same performance for £1,600. It doesn't make any sense to buy the 3090 for gaming at all.

Looking at current market conditions, the 3080 Ti is actually the only card at the top end which is reasonably close to MSRP and provides the most balanced performance. Using GDDR6X, 12GB is the best Nvidia can sell for gaming without jacking up the 3080 pricing to 3090 levels in the real world, and that extra 2GB isn't material enough at all, as proven by this game. All it will be doing is proving how the 3080 Ti is a better buy than the 3080 Super.
Do you have VRAM-consuming background applications?

The Xbox Series X / PS5 can probably allocate a full 10-11GB to the game alone. Check your idle VRAM usage without entering the game. Is it a clean, empty 10GB?
 
you brought up the 3080 Ti MSRP as if it's cheap to begin with; it's like 80% more expensive than a 3080...

Well said.

The 3080 Ti was always a terrible option. It lagged the originals by months, comes LHR-only so mining is less profitable, and is still stingy on the VRAM, but they could have nailed it with a slightly-above-regular-3080 price of £799 for the FE. We all know the MSRP is a myth because the AIB prices are nowhere near it.

If you think the 3090 is obscene (which it is) then you cannot justify the Ti, as it's exactly that: obscene. The only reason these cards sell well is because the regular 3080 FE is so hard to get. Purposely so, my guess, by Nvidia.
 
The 3090 FE sometimes stays in stock for HOURS; I would rather buy that one instead of a scalped 3080 (I think 1,400 is a LOT of money, mind you, but compared to the other options on the market it's not so bad).
 
The 3080 Ti was always a terrible option. It lagged the originals by months, comes LHR-only so mining is less profitable, and is still stingy on the VRAM, but they could have nailed it with a slightly-above-regular-3080 price of £799 for the FE. We all know the MSRP is a myth because the AIB prices are nowhere near it.

If you think the 3090 is obscene (which it is) then you cannot justify the Ti, as it's exactly that: obscene. The only reason these cards sell well is because the regular 3080 FE is so hard to get. Purposely so, my guess, by Nvidia.
Can't argue with that. The 3080 Ti is a terrible option. Much better off getting a 3090 on release than that card imo.
 
DF has confirmed that the Xbox version is using High settings and operates at 1872p-2160p. But I can still see blurry textures at 1440p High settings with the HD pack, even with ray tracing off; they just take longer to appear. The only way to get rid of them is to turn on FSR Ultra Quality, but even at 1440p the game stutters with FSR.
Right, so you can't compare to the console, since it is only using High settings and no ray tracing.
 