
Far Cry 6 GPU performance not bad at all, but severely bottlenecked by CPU

Caporegime · Joined 12 Jul 2007 · 40,435 posts · United Kingdom
Consoles do have the HD Texture pack, but they don't have RT as far as I know.

AMD was selling in the UK up until January. That's how me and others got the 6800 XT for £580.

Outside the UK, people are getting 6800 XTs for that price, 6900 XTs for £875 and 6700 XTs for £400-odd every week. Prices go up and down a few quid depending on exchange rates etc., but not by much.
 
Caporegime · Joined 4 Jun 2009 · 30,926 posts
But consider: apart from the crap that was Godfall, this is the first game to have an issue, and that is only when you want the HD Textures and play at 4K, which only a very small percentage do (apparently they can't see the difference between 1440p and 4K).

It is not like games are coming any time soon where the standard textures and settings will make the game unplayable. HD Textures are optional, as LtMatt likes to keep saying. Even consoles don't have them as far as I understand, so why is it that suddenly, after this game, 10GB is not enough? Let's discuss that.

I am still of the belief that, generally speaking, one will run out of rasterisation performance and want to upgrade or lower settings before running out of 10GB of VRAM. We see this today.

Funny thing is, Godfall wasn't even an issue in the end. Watch 6900 XT/6800 XT footage with ray tracing enabled and it's fps drops galore, meanwhile the 3080 maintained a very nice frame rate :p

Personally I think we have hit our peak for rasterisation performance. Going forward, the focus will be RT performance; we can see that game developers are desperate to use RT but are sadly held back big time, even by Ampere's RT performance.

AI/ML reconstruction/upsampling will also be a focus, especially for RDNA 3.

Not disagreeing with you, but it's unfair to compare their popularity, since AMD couldn't care less about the UK market to begin with, making it nearly impossible for anyone living in the UK to grab their cards at MSRP. Plenty of AMD fans ended up jumping ship, since Nvidia was the only one providing cards at decent prices. At least grabbing an Nvidia FE card is somewhat manageable with alerts and having your phone logged in and ready, making the 3060 Ti and 3080 pretty much unbeatable in their price range.

Yup, but at the end of the day, whose fault is that?

It also didn't help matters when AMD were providing 80% of their supply to consoles.

That was largely why I changed from AMD to Nvidia. I'm glad I went this way though, as ray tracing is too good to miss out on; that plus DLSS, alongside FSR and Intel's version, makes it the best pick of the bunch IMO.
 

TNA · Caporegime · Joined 13 Mar 2008 · 27,184 posts · Greater London
Consoles do have the HD Texture pack, but they don't have RT as far as I know.

AMD was selling in the UK up until January. That's how me and others got the 6800 XT for £580.

Outside the UK, people are getting 6800 XTs for that price, 6900 XTs for £875 and 6700 XTs for £400-odd every week. Prices go up and down a few quid depending on exchange rates etc., but not by much.
Thought I read their HD Textures are not the same though? Much smaller in size or something. Don't recall where though. Having a quick Google, the whole game on consoles is the same size as just the HD Texture pack on PC. So either it is not the same, or they decided to make the PC ones unoptimised just so it uses more VRAM :p


Funny thing is, Godfall wasn't even an issue in the end. Watch 6900 XT/6800 XT footage with ray tracing enabled and it's fps drops galore, meanwhile the 3080 maintained a very nice frame rate :p

Personally I think we have hit our peak for rasterisation performance. Going forward, the focus will be RT performance; we can see that game developers are desperate to use RT but are sadly held back big time, even by Ampere's RT performance.

AI/ML reconstruction/upsampling will also be a focus, especially for RDNA 3.
Lol. So how come LtMatt keeps banging on about Godfall then? :p

Nah, I think we will see a lot more raster power yet.
 
Caporegime · Joined 12 Jul 2007 · 40,435 posts · United Kingdom
Thought I read their HD Textures are not the same though? Much smaller in size or something. Don't recall where though. Having a quick Google, the whole game on consoles is the same size as just the HD Texture pack on PC. So either it is not the same, or they decided to make the PC ones unoptimised just so it uses more VRAM :p

Lol. So how come LtMatt keeps banging on about Godfall then? :p

Nah, I think we will see a lot more raster power yet.
Okay, I've not looked into what the consoles are doing that much; I just know they use the HD Texture pack, as there are videos on YouTube showing it in action.

I don't keep banging on about it? I think if you look back through the thread, others mention Godfall a lot more than I have.

I've mentioned it in the past as, after launch, it had a 12GB Texture pack, so similar behaviour would be seen if any in-depth testing had been conducted on various GPUs, using side-by-side comparisons.

In case you can't read between the lines, I've seen it tested on certain GPUs, but sadly no tech sites have covered it after the patch, as the game was poop. :D

Sure, people like to point to random YouTube videos and proclaim X Y Z, but that's not how in-depth testing works.

Thank god it was done for Far Cry 6 though, watching the fallout has been a blast. :D
 
Caporegime · Joined 4 Jun 2009 · 30,926 posts
Lol. So how come LtMatt keeps banging on about Godfall then? :p

Nah, I think we will see a lot more raster power yet.

If evidence is provided, people go on ignore, i.e. fingers go in the ears :p :D

Will say it is very hard to compare Godfall though, due to the lack of side-by-side comparisons etc., so we have to resort to looking at different footage using similar specs. Alas, it obviously won't be the exact same frames shown, but I did link like 5 videos showing various things to back up my points, unlike some... :p

Any improvements will probably go hand in hand, tbf. I defo think RT will get the focus this time round though; wouldn't be surprised if AMD/RDNA 3 vastly surpass Nvidia even, given they are taking a bashing for their lack of RT performance/support now and IMO are paying the price... Wouldn't be the first time AMD over-addressed a weakness they had either, e.g. the tessellation improvements and, to an extent, their VRAM after the Fury X VRAM fiasco...
 
Soldato · Joined 30 Mar 2010 · 13,008 posts · Under The Stairs!
But consider: apart from the crap that was Godfall, this is the first game to have an issue, and that is only when you want the HD Textures and play at 4K, which only a very small percentage do (apparently they can't see the difference between 1440p and 4K).

It is not like games are coming any time soon where the standard textures and settings will make the game unplayable. HD Textures are optional, as LtMatt likes to keep saying. Even consoles don't have them as far as I understand, so why is it that suddenly, after this game, 10GB is not enough? Let's discuss that.

I am still of the belief that, generally speaking, one will run out of rasterisation performance and want to upgrade or lower settings before running out of 10GB of VRAM. We see this today.

But that's not how it works. It doesn't matter if titles are considered crap, be it high VRAM or heavy RT; if a card runs out of VRAM or RT performance, it's not enough.

If an RT game's crap, it doesn't negate the fact that Nvidia is better at RT than AMD.

You can't argue 'but that game and now the next one is rubbish' because it breached 10GB and expect sane people to go along with your argument.

Both vendors in that specific performance bracket have their Achilles heel: one with VRAM, the other with RT.

And as for 'by the time it's needed it'll run out of grunt': users have diverse needs. Some will run 40fps with all the bells and whistles on because it's good enough for them, whereas if you want high fps it won't supply the grunt.

Every incarnation of increased VRAM amounts always ends up with the bigger-VRAM card having longer legs, regardless of the vendor: 2GB>1GB, 3GB>2GB, 4GB>3GB, 6GB>3GB, 6GB>4GB, 8GB>6GB and now 16GB>10GB have all ended up running higher settings than the lower GPUs in their usable lifetime. It's not new.
 
Caporegime · Joined 26 Dec 2003 · 25,666 posts
Just buy a console: half the price of a decent graphics card, and you don't have the shenanigans of AMD/Nvidia-sponsored titles going out of their way to expose weaknesses in their competitor's hardware and/or play up to their own strengths. You can actually enjoy playing the game rather than seeing your £1,000 investment deliberately nobbled and spending all of your time trying to justify the purchase.

90% of the performance for 50% of the price, plus lots of future exclusives.
 

TNA · Caporegime · Joined 13 Mar 2008 · 27,184 posts · Greater London
Okay, I've not looked into what the consoles are doing that much; I just know they use the HD Texture pack, as there are videos on YouTube showing it in action.

I don't keep banging on about it? I think if you look back through the thread, others mention Godfall a lot more than I have.

I've mentioned it in the past as, after launch, it had a 12GB Texture pack, so similar behaviour would be seen if any in-depth testing had been conducted on various GPUs, using side-by-side comparisons.

In case you can't read between the lines, I've seen it tested on certain GPUs, but sadly no tech sites have covered it after the patch, as the game was poop. :D

Sure, people like to point to random YouTube videos and proclaim X Y Z, but that's not how in-depth testing works.

Thank god it was done for Far Cry 6 though, watching the fallout has been a blast. :D

Only reason I said you bang on about it is that every time the topic has come up, you tend to bring it up, from what I recall. Not saying you bring it up out of the blue :p

I agree, Far Cry 6 has provided a lot of entertainment in the VRAM space once again :D

But that's not how it works. It doesn't matter if titles are considered crap, be it high VRAM or heavy RT; if a card runs out of VRAM or RT performance, it's not enough.

If an RT game's crap, it doesn't negate the fact that Nvidia is better at RT than AMD.

You can't argue 'but that game and now the next one is rubbish' because it breached 10GB and expect sane people to go along with your argument.


Well, if it is crap no one will play said game. Like here: no one talks about it; someone made a thread and it was dead, so that is all I am saying. I am not arguing anything, as said game apparently does not even have the issue according to Nexus anyway. But again, who cares about Godfall? Find one person here (apart from LtMatt) if you can :p


Both vendors in that specific performance bracket have their Achilles heel: one with VRAM, the other with RT.
I never said otherwise though? I have said: if you are someone who likes to upgrade every 2-4 generations and running HD texture packs at 4K is a concern, then go 6800 or 3090. Or go console. For most here though, we change cards every couple of years, so you just sell it on and get the new cards, which by then even the Nvidia ones will have 16GB anyway. I always upgrade to the latest gen, and based on that and what I know, VRAM is not an issue for me this gen, which is already through half its life cycle. I am happy to not use a texture pack in the odd game between now and then if it comes to it.

And as for 'by the time it's needed it'll run out of grunt': users have diverse needs. Some will run 40fps with all the bells and whistles on because it's good enough for them, whereas if you want high fps it won't supply the grunt.
Exactly, and for those people there are all the reviews out there and they can make an informed decision on what to pick. Both the 3000 series and the 6000 series are very close in performance and great buys. Trouble is price and availability. From what I can see, it has been much easier to get an Nvidia card at MSRP than an AMD one. I have had 3 opportunities without too much effort myself. My first ASUS 3080 was pre-ordered from here, where I was like 20th-30th in the queue or something like that, but I cancelled out of frustration after just picking up a 3080 FE. Then later I sold that at a huge profit and got a 3070 FE.

If money is not an issue then one can just get a 3090 and have the best of both worlds: even more VRAM, and the RT performance to go with it.

Every incarnation of increased VRAM amounts always ends up with the bigger-VRAM card having longer legs, regardless of the vendor: 2GB>1GB, 3GB>2GB, 4GB>3GB, 6GB>3GB, 6GB>4GB, 8GB>6GB and now 16GB>10GB have all ended up running higher settings than the lower GPUs in their usable lifetime. It's not new.
I do not argue against this either. What I again reiterate is: if you are buying for the long term and are concerned about running HD packs, then go with a card with more VRAM.
 

TNA · Caporegime · Joined 13 Mar 2008 · 27,184 posts · Greater London
Just buy a console: half the price of a decent graphics card, and you don't have the shenanigans of AMD/Nvidia-sponsored titles going out of their way to expose weaknesses in their competitor's hardware and/or play up to their own strengths. You can actually enjoy playing the game rather than seeing your £1,000 investment deliberately nobbled and spending all of your time trying to justify the purchase.

90% of the performance for 50% of the price, plus lots of future exclusives.
Yep, and this works for many people. But I like having access to my Steam library and playing many games with keyboard and mouse, which I cannot on consoles. One thing I always hated about consoles is the lack of backward compatibility, which is not much of an issue on the PC.
 
Soldato · Joined 30 Mar 2010 · 13,008 posts · Under The Stairs!
Only reason I said you bang on about it is that every time the topic has come up, you tend to bring it up, from what I recall. Not saying you bring it up out of the blue :p

I agree, Far Cry 6 has provided a lot of entertainment in the VRAM space once again :D


Well, if it is crap no one will play said game. Like here: no one talks about it; someone made a thread and it was dead, so that is all I am saying. I am not arguing anything, as said game apparently does not even have the issue according to Nexus anyway. But again, who cares about Godfall? Find one person here (apart from LtMatt) if you can :p


I never said otherwise though? I have said: if you are someone who likes to upgrade every 2-4 generations and running HD texture packs at 4K is a concern, then go 6800 or 3090. Or go console. For most here though, we change cards every couple of years, so you just sell it on and get the new cards, which by then even the Nvidia ones will have 16GB anyway. I always upgrade to the latest gen, and based on that and what I know, VRAM is not an issue for me this gen, which is already through half its life cycle. I am happy to not use a texture pack in the odd game between now and then if it comes to it.


Exactly, and for those people there are all the reviews out there and they can make an informed decision on what to pick. Both the 3000 series and the 6000 series are very close in performance and great buys. Trouble is price and availability. From what I can see, it has been much easier to get an Nvidia card at MSRP than an AMD one. I have had 3 opportunities without too much effort myself. My first ASUS 3080 was pre-ordered from here, where I was like 20th-30th in the queue or something like that, but I cancelled out of frustration after just picking up a 3080 FE. Then later I sold that at a huge profit and got a 3070 FE.

If money is not an issue then one can just get a 3090 and have the best of both worlds: even more VRAM, and the RT performance to go with it.


I do not argue against this either. What I again reiterate is: if you are buying for the long term and are concerned about running HD packs, then go with a card with more VRAM.

You did say otherwise; you said/implied 'as soon as they're replaced, no one's talking about them'. They will and they do, especially now because of the high buy-in cost/actual availability. That's why I originally quoted you: I disagreed with that bit and explained why. The post above in this quote I'm much more agreeable with. :)

Should add, like everyone else I can't get a sniff of an AMD reference card; we aren't an important enough market anymore. :p

I've had 3060 Ti/3070 Ti/3080 Ti/3090 FEs ready to purchase in the basket and haven't bought any of them, but if I could actually get my hands on a 3080 FE I would be all over it, even though 10GB is not as future-proof as 16GB, with the 6800s upwards playing at higher raster IQ :p
 
Caporegime · Joined 12 Jul 2007 · 40,435 posts · United Kingdom
I am not arguing anything, as said game apparently does not even have the issue according to Nexus anyway.
Said user did nothing of the sort. That user's idea of proving something is finding a random video on YouTube, using the wrong settings or scene comparison, and saying 'there, that proved my point', which is what happened with Godfall.

I also understand that the opposite has not been proven yet. We had one tester (Keith from wccftech, great guy btw) who showed the issue (lower FPS on cards below 12GB vs cards with more memory) with benchmarks.

However, there was too much ambiguity to the testing, so it was not clear-cut; the same goes for all these random YouTube videos that are posted alone and without context.

That said, at least multiple cards were tested by Keith over the same settings/benchmark scene (we don't know what area was tested, or for how long, etc.) and using the same system, so it certainly holds more weight than Jimmy Two Times on YouTube running different settings and only one GPU.

Does it prove anything either way though? No, and I'm not the one saying different now.

The funny thing is, said user referenced a video in Far Cry 6: 'hey look guys, a 3080 can run 4K max settings just fine, so that proves my point that more than 10GB of video memory is not needed'.

Upon closer inspection, the video showed every single texture in the game running at low resolution due to the card not meeting the minimum video memory requirement. :D
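
For what it's worth, that silent fallback is how most texture streamers behave: rather than crash when the HD mips don't fit in the budget, the engine just keeps lower-resolution mips resident, so the game "runs fine" while quietly looking worse. A minimal sketch of the idea in Python (all names and numbers here are hypothetical, nothing to do with Ubisoft's actual implementation):

```python
# Illustrative sketch of a VRAM-budgeted texture streamer.
# All names and numbers are made up for the example.

MIP_COST_BYTES = [256 * 2**20, 64 * 2**20, 16 * 2**20]  # mip 0 (HD) .. mip 2 (low res), per texture set

def pick_mip_level(vram_budget_bytes: int, resident_texture_sets: int) -> int:
    """Return the highest-resolution mip level whose total cost fits the budget."""
    for mip, cost_per_set in enumerate(MIP_COST_BYTES):
        if cost_per_set * resident_texture_sets <= vram_budget_bytes:
            return mip  # 0 = full-res HD textures; higher = progressively blurrier
    return len(MIP_COST_BYTES) - 1  # never fail outright -- just keep the lowest mips

# e.g. a card with ~8GB of VRAM left after framebuffers/geometry, streaming 40 sets:
print(pick_mip_level(8 * 2**30, 40))  # -> 1: the HD mips (40 x 256MB = 10GB) don't fit
```

Which is why a capture can look like it's 'running 4K max settings just fine' while never actually displaying the HD textures.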

That is the level of incompetence we are dealing with here, and why anything said user says is not worth the paper it is written on when it comes to these things.

Not specifically meaning to call anyone out, but I am getting called out, so it's time to set the record straight once and for all.

Now, if you stop mentioning me and Godfall, it will stop getting brought up, TNA. :p
 
Caporegime · Joined 4 Jun 2009 · 30,926 posts
This is gonna be fun, time to nuke said person's posts from orbit some more! :D :cry:

So on Godfall: your post, where the "supposed" poor performance and those "awful fps drops" on certain Ampere cards compared to RDNA 2 were apparently because of "VRAM limitations", turned out to have no RT used in the provided "evidence/links". Once RT was enabled, well, the results speak for themselves:

Sadly it can't be said with 100% certainty, as there are no side-by-side videos with ray tracing turned on (at least not that I can see), only one bar chart by wccftech. And even then it's not exactly a great deal of difference going by that, and certainly not a case of the "3080 having its trousers pulled down"; and if it is, well, then so does the 6800 XT, except no lube is used for it :D :p

[wccftech Godfall benchmark chart]

3080: [embedded gameplay videos]
Plenty of footage out there for people to see the full picture ;)

Enough said, your honour.


Onto the next case. With regards to the FC6 3080 video I posted: I mainly posted that in response to the comments about "fps dropping to 5", to show that the game is behaving differently for all users, i.e. Joker's 3080 Ti fps crashing to 5fps, but somehow, when a screenshot is taken, the fps goes straight back up; must have downloaded some more VRAM by taking that screenshot :cry: Let's also ignore the multiple users/sites mentioning issues even when not using the HD texture pack, on "all" platforms, and continue with the fingers-in-the-ears approach and linking to the same 2 sites over and over again... At the end of the day, once again, Ubisoft have acknowledged the issue and are working on it; see previous Ubisoft games for how they have a history of similar issues, not rocket science... Also see other games where similar problems have been exhibited, e.g. HZD.

No one should be surprised by your stance, given that you have never owned an Nvidia card (unlike the majority here; heck, even Gregster has owned AMD cards, and that man had a site dedicated to him owning Titans...) and, secondly, you work for AMD (IIRC Roy hired you based on your love for AMD on these very forums). As per usual, nothing wrong with having loyalty to said product/brand, perfectly normal especially in such an enthusiast market, but don't post misleading PR/sales-piece rubbish insinuating that it is "100% this and not that", with your fingers in the ears, until there is 100% certainty... There's no need for that; as you know yourself, AMD have a marketing department for that. Or at the very least, look at the full picture rather than just continually ignoring perfectly good pieces/discussion on said matter all because it doesn't fit said narrative.
 
Caporegime · Joined 12 Jul 2007 · 40,435 posts · United Kingdom
Ugh, had to read that just to re-confirm, but all the videos are using the wrong settings again, as expected. Some things never change, but I'll let the thread get back on topic for Far Cry 6. :D
 
Caporegime · Joined 4 Jun 2009 · 30,926 posts
Rather than just saying "not using the right settings", say what settings should be used then... AFAIK, I can only see that FidelityFX thing not enabled (it's also not enabled on that 6800 XT, yet those fps drops are still there...); it is enabled in one of the videos for the 3080 though, and yet there's still not the same perf drop as shown on the 6800 XT... Funny that too, as the "piece" you used to back up your argument had no RT turned on :cry:
 

TNA · Caporegime · Joined 13 Mar 2008 · 27,184 posts · Greater London
You did say otherwise; you said/implied 'as soon as they're replaced, no one's talking about them'. They will and they do, especially now because of the high buy-in cost/actual availability. That's why I originally quoted you: I disagreed with that bit and explained why. The post above in this quote I'm much more agreeable with. :)

Should add, like everyone else I can't get a sniff of an AMD reference card; we aren't an important enough market anymore. :p

I've had 3060 Ti/3070 Ti/3080 Ti/3090 FEs ready to purchase in the basket and haven't bought any of them, but if I could actually get my hands on a 3080 FE I would be all over it, even though 10GB is not as future-proof as 16GB, with the 6800s upwards playing at higher raster IQ :p

Ah, fair enough. I think you took it literally, man. That is a figure of speech, like. There are people talking about even older cards, but I was speaking relatively. I should have used the word 'relatively' perhaps, to make my point: relatively speaking, not many are talking about last-gen cards. Just go look at the latest 20 threads in the Graphics Card forum :D

Damn you for making me read that nonsense again by quoting it, never again!
Right, I am quoting all his posts now :cry::p;)
 
Soldato · Joined 30 Mar 2010 · 13,008 posts · Under The Stairs!
Ah, fair enough. I think you took it literally, man. That is a figure of speech, like. There are people talking about even older cards, but I was speaking relatively. I should have used the word 'relatively' perhaps, to make my point: relatively speaking, not many are talking about last-gen cards. Just go look at the latest 20 threads in the Graphics Card forum :D

:cool: That's partly why I'm pointing out these things from time to time: so that folks reading 20-odd threads of lots of people laughing that it's plenty don't literally base a second-hand purchase on the wrong impression of how VRAM will play out over time.

(It just happened to be you that got quoted; I know you'll no take the nip :p as @Nexus18 would just go on and on anyway :p)
 