GPU Upgrade Time... is 12GB enough @ 1440p?

Managed to sell my 3070 Ti Founders Edition for a good price. It always ran hotter than I liked, and it seemed to be struggling @ 1440p in recent titles.

I'm sitting tight till the 4070 drops to see if that has any impact on the 4070 Ti or the 7900 XT. I'm leaning towards the 4070 Ti, as I'll be playing a lot of Diablo 4 and it supports DLSS 3.

However... 12GB of VRAM feels like it will start to struggle at 1440p / 165Hz, just like 8GB is now. Maybe I'm overreacting; a 50% increase in VRAM should be fine for the next 5 years at 1440p, surely?
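(Purely as a back-of-the-envelope way of framing that question, here's a tiny sketch; the current 1440p demand and the yearly growth rate are illustrative assumptions, not measured data.)

```python
# Hypothetical sketch: how long a given VRAM capacity lasts if typical
# 1440p VRAM demand grows at an assumed yearly rate.
# Both numbers below are illustrative assumptions, not measured data.
import math

current_need_gb = 8.0    # rough 1440p demand today (assumption)
annual_growth = 0.10     # assumed 10% growth per year (hypothetical)

def years_until_exceeded(capacity_gb: float) -> float:
    """Years until the assumed demand grows past the given VRAM capacity."""
    return math.log(capacity_gb / current_need_gb) / math.log(1 + annual_growth)

for capacity in (8, 12, 16):
    print(f"{capacity}GB lasts roughly {years_until_exceeded(capacity):.1f} years under these assumptions")
```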

If you're going to buy a card to last you a good few years, then you should be looking at 16GB now; 12GB will shortly become the new 8GB, and you don't want that.

:D
 
I wouldn't pay £800 for a 4070 Ti regardless of VRAM, as it's not even matching previous-gen flagship performance, which the non-Ti 70-class card has traditionally done for £500.

The best option right now is not to buy anything. However, if you have no choice and need something, the two cards I'd consider are a used 3090 for £650 or a 7900 XT, since they're available for just over £750 right now.
 
So you can confirm that 8GB cards will never be able to run this game at 1080p?
3070 running @ 1080p and a 10GB 3080 @ 4K here, so I don't get your point.
Nothing to do with the potential memory leak that is still a known issue?


Ignoring "potential" for a moment: 8GB GPUs can already run it at medium settings. If patching it gives higher playable settings but only moves the bar from medium to high texture-related settings, and the card still physically can't run ultra texture-related settings, is the game still badly optimised, or does it simply not have enough VRAM?

Since the last patch, it's crashed once in a cut scene on my end.

If it's crashing on cutscenes and they patch that out to stop a memory leak, then they've stopped a memory leak. That doesn't guarantee anything other than stopping a crash; it won't magically enable ultra texture settings on a GPU stuck on medium/high textures. Both my GPUs would 100% achieve higher texture-related settings if they had more VRAM.
 
TLOU has idiot-proof, comprehensive settings descriptions stating how GPU-intensive and/or VRAM-intensive each setting is, so you can see which part of the card takes the hit (grunt vs VRAM). It's clear as day; love it.



As said in another thread, I'm running a mixture of settings from low to ultra. The game tells you whether each setting is min/med/high GPU-intensive, and how VRAM-intensive it is, with the same min/med/high indicators.

Here are two screenshots: one using a mixture of settings running just under the in-game 10GB indicator, the other running with more non-GPU-intensive ultra settings (high VRAM usage / min GPU usage), taking it over 10GB:
[Screenshot: shot_20230329_003500.jpg]

[Screenshot: shot_20230329_003707.jpg]

Just like the dev in the vid I linked earlier says, safety features kick in and reduce the textures when you go above your VRAM limits.
 
I wouldn't pay £800 for a 4070 Ti regardless of VRAM, as it's not even matching previous-gen flagship performance, which the non-Ti 70-class card has traditionally done for £500.

The best option right now is not to buy anything. However, if you have no choice and need something, the two cards I'd consider are a used 3090 for £650 or a 7900 XT, since they're available for just over £750 right now.

Well said. This gen has been very disappointing unless you just did a YOLO and went for a 4090, imo.
 
You can only distract with excuses for so long, eh.
Considering I only comment definitively about the VRAM-heavy games I've got in my library, and I own both the 3070 AND the 10GB 3080, I'm in the unique position of probably being dismissed about running out of VRAM at 1080p, 1440p and 4K more times than anybody else on the forum. :(


Edit

@Jestahr


Just in case this thread gets locked for discussing vram

End of the day, pick your poison. A 12GB GPU isn't going to go in the bin if you run out of VRAM, just like my 70 and 80 haven't been thrown in the bin; you'll maybe have to lay off the texture settings a bit this gen, then a bit more next gen if you're keeping it 5 years.

If you go 12GB you get DLSS 3, FSR and better RT; the 7900 XT is slower in RT and only has FSR, but it's a touch faster in raster, has 8GB's worth of possible higher texture settings, and doesn't have to wait for AAA games to be patched for VRAM problems.
 
So you can confirm that 8GB cards will never be able to run this game at 1080p?


Nothing to do with the potential memory leak that is still a known issue?
Hi, I have a 3060 Ti and run the game (TLOU) on high settings, and I'm getting a very playable 60-80 fps (with no obvious jerkiness) with awesome visuals. (Using 8.4GB of VRAM, i.e. exceeding my buffer and spilling over into system RAM.)

The game crashed every time on Ultra.
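(For anyone wanting to sanity-check a figure like that 8.4GB, here's a minimal sketch using NVIDIA's NVML Python bindings, assuming the pynvml package is installed; note it reports device-wide usage across all processes, not just the game.)

```python
# Minimal VRAM usage check via NVIDIA's NVML bindings (pynvml).
# Reports device-wide usage, so run it while the game is the main GPU load.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):                     # older pynvml versions return bytes
    name = name.decode()
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

gib = 1024 ** 3
print(f"{name}: {mem.used / gib:.1f} GiB used of {mem.total / gib:.1f} GiB")
# Anything a game requests beyond the physical buffer ends up in shared
# system memory, which is the "spilling over" described above.
pynvml.nvmlShutdown()
```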
 
Considering I only comment definitively about the VRAM-heavy games I've got in my library, and I own both the 3070 AND the 10GB 3080, I'm in the unique position of probably being dismissed about running out of VRAM at 1080p, 1440p and 4K more times than anybody else on the forum. :(


Edit

@Jestahr


Just in case this thread gets locked for discussing vram

End of the day, pick your poison. A 12GB GPU isn't going to go in the bin if you run out of VRAM, just like my 70 and 80 haven't been thrown in the bin; you'll maybe have to lay off the texture settings a bit this gen, then a bit more next gen if you're keeping it 5 years.

If you go 12GB you get DLSS 3, FSR and better RT; the 7900 XT is slower in RT and only has FSR, but it's a touch faster in raster, has 8GB's worth of possible higher texture settings, and doesn't have to wait for AAA games to be patched for VRAM problems.
I get what you're saying, fella, I really do.

ALL I'm trying to say is that it's futile to use release-day performance as a yardstick for the future. There aren't many ways a card falls over: it's bottlenecked by the CPU, it crashes, it runs out of grunt, or its VRAM is saturated.

Same as when you showed us FC6 going down to 2 fps. It doesn't still do that, does it? The game also now states that to run the HD texture pack it wants 11GB of VRAM, so they've done something with the game.

And the dev in that video is talking about an engine and games that aren't out yet, and about UE5. Much of what he says is opinion rather than fact: lots of 'coulds', etc., and he doesn't look old enough to have masses of experience in game development, certainly beyond the past 5 years when he must have been active.

Like others, I just don't think the gfx in TLOU are anything ground-breaking, and I don't think it should be swamping 8GB @ 1080p. It just doesn't look that good, and I wish it did.

That dev also says consoles with 16GB only ever use between 10-12GB for the game, and that a console can easily switch between VRAM and system RAM on the fly; a console has lots of other processing options a PC doesn't have. When asked what he thinks the minimum should be going forward, the dev says 16-20GB, but he is talking UE5. Of course a dev is going to want more room to allow for poor optimization. That just means the gamer pays more for a higher-VRAM card, and those on older cards won't buy the game... it just doesn't make sales sense to make a game that struggles @ 1080p on an 8GB card when I think the most common is 6GB.

I get that us enthusiasts will have the higher end, and I see why many say that if you want an easier time, spend more on a higher-VRAM card. ALL I'm saying is: that game'll be fixed. The other option is to not pay MSRP for the game and wait for the performance issues to be optimized, like they ALL are.

Of course, VRAM requirements will increase over time. It's just that the engines these recent games have been released on, and the gfx of said games, aren't over and above other games that look as good (and sometimes better) but don't run out of VRAM. I wish they did.

UE5 will be the test, and not on the performance of the first games' release either :) Yes, we can spend more money to get an easier time on release, but all games at release have been poor lately, particularly from AAA devs.

Thanks for the vid, it was quite interesting and I learnt a lot about consoles.
 
I get what you're saying, fella, I really do.

ALL I'm trying to say is that it's futile to use release-day performance as a yardstick for the future. There aren't many ways a card falls over: it's bottlenecked by the CPU, it crashes, it runs out of grunt, or its VRAM is saturated.

Same as when you showed us FC6 going down to 2 fps. It doesn't still do that, does it? The game also now states that to run the HD texture pack it wants 11GB of VRAM, so they've done something with the game.

And the dev in that video is talking about an engine and games that aren't out yet, and about UE5. Much of what he says is opinion rather than fact: lots of 'coulds', etc., and he doesn't look old enough to have masses of experience in game development, certainly beyond the past 5 years when he must have been active.

Like others, I just don't think the gfx in TLOU are anything ground-breaking, and I don't think it should be swamping 8GB @ 1080p. It just doesn't look that good, and I wish it did.

That dev also says consoles with 16GB only ever use between 10-12GB for the game, and that a console can easily switch between VRAM and system RAM on the fly; a console has lots of other processing options a PC doesn't have. When asked what he thinks the minimum should be going forward, the dev says 16-20GB, but he is talking UE5. Of course a dev is going to want more room to allow for poor optimization. That just means the gamer pays more for a higher-VRAM card, and those on older cards won't buy the game... it just doesn't make sales sense to make a game that struggles @ 1080p on an 8GB card when I think the most common is 6GB.

I get that us enthusiasts will have the higher end, and I see why many say that if you want an easier time, spend more on a higher-VRAM card. ALL I'm saying is: that game'll be fixed. The other option is to not pay MSRP for the game and wait for the performance issues to be optimized, like they ALL are.

Of course, VRAM requirements will increase over time. It's just that the engines these recent games have been released on, and the gfx of said games, aren't over and above other games that look as good (and sometimes better) but don't run out of VRAM. I wish they did.

UE5 will be the test, and not on the performance of the first games' release either :) Yes, we can spend more money to get an easier time on release, but all games at release have been poor lately, particularly from AAA devs.

Thanks for the vid, it was quite interesting and I learnt a lot about consoles.
Exactly, I wish TLOU looked as good as the VRAM usage suggested. When A Plague Tale runs on 5GB of VRAM at 4K ultra, any game that looks worse than it while requiring three times the VRAM has no excuse but poor optimization. Period.
 
I get what you're saying, fella, I really do.

ALL I'm trying to say is that it's futile to use release-day performance as a yardstick for the future. There aren't many ways a card falls over: it's bottlenecked by the CPU, it crashes, it runs out of grunt, or its VRAM is saturated.

Same as when you showed us FC6 going down to 2 fps. It doesn't still do that, does it? The game also now states that to run the HD texture pack it wants 11GB of VRAM, so they've done something with the game.

And the dev in that video is talking about an engine and games that aren't out yet, and about UE5. Much of what he says is opinion rather than fact: lots of 'coulds', etc., and he doesn't look old enough to have masses of experience in game development, certainly beyond the past 5 years when he must have been active.

Like others, I just don't think the gfx in TLOU are anything ground-breaking, and I don't think it should be swamping 8GB @ 1080p. It just doesn't look that good, and I wish it did.

That dev also says consoles with 16GB only ever use between 10-12GB for the game, and that a console can easily switch between VRAM and system RAM on the fly; a console has lots of other processing options a PC doesn't have. When asked what he thinks the minimum should be going forward, the dev says 16-20GB, but he is talking UE5. Of course a dev is going to want more room to allow for poor optimization. That just means the gamer pays more for a higher-VRAM card, and those on older cards won't buy the game... it just doesn't make sales sense to make a game that struggles @ 1080p on an 8GB card when I think the most common is 6GB.

I get that us enthusiasts will have the higher end, and I see why many say that if you want an easier time, spend more on a higher-VRAM card. ALL I'm saying is: that game'll be fixed. The other option is to not pay MSRP for the game and wait for the performance issues to be optimized, like they ALL are.

Of course, VRAM requirements will increase over time. It's just that the engines these recent games have been released on, and the gfx of said games, aren't over and above other games that look as good (and sometimes better) but don't run out of VRAM. I wish they did.

UE5 will be the test, and not on the performance of the first games' release either :) Yes, we can spend more money to get an easier time on release, but all games at release have been poor lately, particularly from AAA devs.

Thanks for the vid, it was quite interesting and I learnt a lot about consoles.

Amazes me that people are still pointing fingers at Nvidia for what are widely regarded as broken games, which has been proven by multiple sources with heaps of evidence. But nope, just pay more to brute-force/avoid the issues; Jensen and Lisa love these ones... :o I wonder if the patch due today for The Last of Us will simply add more VRAM to GPUs, or if just maybe..... they will "optimise" the game. Unheard of in the development space, I know, but hey, what do I know, it's not like I work in the industry... :cry: I would get the sack if I told our clients to just throw more hardware power/VMs at a hungry, unoptimised app. :cry:

EDIT:

The biggest problem, as attested to by game developers and experts in the field, is that games are designed for consoles, which make proper use of direct storage and a fast NVMe along with unified system memory and VRAM. When games get ported to PC, they aren't properly configured to make use of the different setup, hence the issues witnessed in TLOU and Forspoken. DF show this perfectly in their videos; speaking of which, if a game running worse on a PC of similar spec to a console, and hitting such issues, doesn't show this well, I don't know what will. Sorry, I forgot, Nvidia shills.... :o

Considering I only comment definitively about the VRAM-heavy games I've got in my library, and I own both the 3070 AND the 10GB 3080, I'm in the unique position of probably being dismissed about running out of VRAM at 1080p, 1440p and 4K more times than anybody else on the forum. :(

Because you posted nothing to back up these so-called experiences, only FC6. The other games aren't known to have the supposed VRAM issues, and nothing else on the internet was bringing these issues up, i.e. Metro EE, MSFS (unless modding, sorry, which we all agreed on many times before), RE 7/Village.
 
I get what you're saying, fella, I really do.
ALL I'm trying to say is that it's futile to use release-day performance as a yardstick for the future. There aren't many ways a card falls over: it's bottlenecked by the CPU, it crashes, it runs out of grunt, or its VRAM is saturated.

I'm not basing this round of games not having enough VRAM on release performance; I'm judging it on 8/10GB not being able to run ULTRA texture settings, period. Whether they fix performance or not, fixing bugs won't free up more VRAM to move these cards from low/med/high texture-based settings to the higher med/high/ultra texture-based settings, which carry zero GPU grunt penalty.

I'll flip your comment round and say it's equally futile to write off 8 or 10GB cards breaking as just release-day performance, and use that as a yardstick for the future. I don't know how long you've been gaming on PC, but AAA games have had crap launches since I don't know how far back; there's a difference between crap launches due to bugs and crap launches that only break 3070s and 3080s when you surpass their VRAM, and it's not as if that hasn't already been happening many times.

Games all list recommended VRAM requirements, folks lose their **** when a 3070 or 3080 runs out of VRAM, and here we are again.

Same as when you showed us FC6 going down to 2 fps. It doesn't still do that, does it? The game also now states that to run the HD texture pack it wants 11GB of VRAM, so they've done something with the game.

FC6's system requirements with the HD pack are still the same:
Please note that the minimum requirement for running the HD Texture Pack for Far Cry 6 is 12GB of VRAM. For 4K configurations, you will need 16GB of VRAM.

If you download and run the HD Texture Pack with lower VRAM capabilities, you will encounter performance issues while playing.
Just tried it, fully patched: FC6 still can't run ultra settings @ 4K with the HD texture pack because it doesn't have enough VRAM; in that engine it slideshows.

I'll idiot-proof the proof this time to negate the 'there's no proof' excuses:

Here's my 3070 / 5800X3D / 32GB @ 1080p, yes 1080p, in Forza, which has been out for years now:
[Screenshot: forza_1.jpg]

[Screenshot: forza_2.jpg]

[Screenshot: forza_3.jpg]

It doesn't slideshow; it flushes textures, that's the fallback in that engine.


[Screenshot: shot_20230329_003500.jpg]

[Screenshot: shot_20230329_003707.jpg]

Will flushed textures like these be fixed in a patch, whether or not they optimise textures when you go past the VRAM limit like this in TLOU? I doubt it, as I can't see them fitting those textures into that allocation.

Myself and many, many 3070/80 users here have been repeating over and over again that 8 and 10GB IS choking these GPUs outwith FC6, so why is it impossible that newer games can break 8/10GB?

And the dev in that video is talking about an engine and games that aren't out yet, and about UE5. Much of what he says is opinion rather than fact: lots of 'coulds', etc., and he doesn't look old enough to have masses of experience in game development, certainly beyond the past 5 years when he must have been active.

Like others, I just don't think the gfx in TLOU are anything ground-breaking, and I don't think it should be swamping 8GB @ 1080p. It just doesn't look that good, and I wish it did.

That dev also says consoles with 16GB only ever use between 10-12GB for the game, and that a console can easily switch between VRAM and system RAM on the fly; a console has lots of other processing options a PC doesn't have. When asked what he thinks the minimum should be going forward, the dev says 16-20GB, but he is talking UE5. Of course a dev is going to want more room to allow for poor optimization. That just means the gamer pays more for a higher-VRAM card, and those on older cards won't buy the game... it just doesn't make sales sense to make a game that struggles @ 1080p on an 8GB card when I think the most common is 6GB.

I get that us enthusiasts will have the higher end, and I see why many say that if you want an easier time, spend more on a higher-VRAM card. ALL I'm saying is: that game'll be fixed. The other option is to not pay MSRP for the game and wait for the performance issues to be optimized, like they ALL are.

Of course, VRAM requirements will increase over time. It's just that the engines these recent games have been released on, and the gfx of said games, aren't over and above other games that look as good (and sometimes better) but don't run out of VRAM. I wish they did.

UE5 will be the test, and not on the performance of the first games' release either :) Yes, we can spend more money to get an easier time on release, but all games at release have been poor lately, particularly from AAA devs.

Thanks for the vid, it was quite interesting and I learnt a lot about consoles.


When he said (54 mins in) that if you want to see high-quality graphics close up, plus RT, it all has a cost and they decided on 12GB as a minimum, he's talking about ultra, not entry level. You don't get RT and close-up hair strands on low settings; it still works on 8/10GB GPUs within their lower texture budget.

The dev in that vid uses 'we' in conversation, so he's part of the team; he could take his cap off and be totally bald for all we know. As new-gen consoles are the lead platform, and as you already pointed out consoles have superior memory management, that probably explains why 8 to 10GB cards are struggling: PC can't juggle everything as efficiently as a console.

Last but not least, it doesn't matter whether you think the graphics in TLOU are ground-breaking or not; they're pretty good to me on mixed settings and only get better the higher you can go, and plenty in the game thread say they've been impressed.
 
He will be fine with 12GB… he will end up upgrading to the 5070 Ti later on, and DLSS and frame gen will help during ownership.

If you are the type of person that gets an upgrade itch, then buying the 4070 Ti/7900 XT will satisfy that itch. By buying the higher models you lose way too much money; look at how much the top-end cards from the 30 series and 6000 series are worth now.
 
He will be fine with 12GB… he will end up upgrading to the 5070 Ti later on, and DLSS and frame gen will help during ownership.

If you are the type of person that gets an upgrade itch, then buying the 4070 Ti/7900 XT will satisfy that itch. By buying the higher models you lose way too much money; look at how much the top-end cards from the 30 series and 6000 series are worth now.
If you plan to buy a 4070 Ti and then a 5070 Ti, you'll end up spending the same as if you'd just bought a 4090, yet you'll end up with less performance.

A 4070 Ti is 43% faster than a 3070 Ti, but a 5070 Ti would need to be 67% faster than a 4070 Ti just to equal a 4090. I can't see that happening, especially with a smaller node jump next time, from 5nm to 3nm.

The high-end 3000 series saw more depreciation because of cards like the 3080 at £650 offering within 12% of 3090 performance; this time, however, you've got a 4080 at £1,200 with a 33% performance gap to the 4090, almost a generational gap in itself.
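(A quick back-of-the-envelope check of those ratios; the percentages are the ones quoted above, not independent benchmark figures.)

```python
# Rough relative-performance arithmetic using the percentages quoted above
# (the poster's figures, not benchmark results).
perf_3070ti = 1.00                  # baseline
perf_4070ti = perf_3070ti * 1.43    # "43% faster than a 3070 Ti"
perf_4090   = perf_4070ti * 1.67    # "67% faster than a 4070 Ti" to equal a 4090

print(f"4070 Ti vs 3070 Ti: {perf_4070ti:.2f}x")
print(f"Implied 4090 vs 3070 Ti: {perf_4090:.2f}x")
# So a hypothetical 5070 Ti that matched the 4090 would need a bigger
# single-generation jump (~1.67x) than the 4070 Ti managed (~1.43x).
```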
 
I recently upgraded to a 4070 Ti for 1440p. Sold my 3070 for good money too, so I'm happy so far with the upgrade. I change them every 2 years, so the 12GB of VRAM is not really an issue.
Great card, cheeky undervolt too.
 
I'm not basing this round of games not having enough VRAM on release performance; I'm judging it on 8/10GB not being able to run ULTRA texture settings, period. Whether they fix performance or not, fixing bugs won't free up more VRAM to move these cards from low/med/high texture-based settings to the higher med/high/ultra texture-based settings, which carry zero GPU grunt penalty.

I'll flip your comment round and say it's equally futile to write off 8 or 10GB cards breaking as just release-day performance, and use that as a yardstick for the future. I don't know how long you've been gaming on PC, but AAA games have had crap launches since I don't know how far back; there's a difference between crap launches due to bugs and crap launches that only break 3070s and 3080s when you surpass their VRAM, and it's not as if that hasn't already been happening many times.

Games all list recommended VRAM requirements, folks lose their **** when a 3070 or 3080 runs out of VRAM, and here we are again.


FC6's system requirements with the HD pack are still the same:

Just tried it, fully patched: FC6 still can't run ultra settings @ 4K with the HD texture pack because it doesn't have enough VRAM; in that engine it slideshows.

I'll idiot-proof the proof this time to negate the 'there's no proof' excuses:

Here's my 3070 / 5800X3D / 32GB @ 1080p, yes 1080p, in Forza, which has been out for years now:
[Screenshot: forza_1.jpg]

[Screenshot: forza_2.jpg]

[Screenshot: forza_3.jpg]

It doesn't slideshow; it flushes textures, that's the fallback in that engine.


[Screenshot: shot_20230329_003500.jpg]

[Screenshot: shot_20230329_003707.jpg]

Will flushed textures like these be fixed in a patch, whether or not they optimise textures when you go past the VRAM limit like this in TLOU? I doubt it, as I can't see them fitting those textures into that allocation.

Myself and many, many 3070/80 users here have been repeating over and over again that 8 and 10GB IS choking these GPUs outwith FC6, so why is it impossible that newer games can break 8/10GB?


When he said (54 mins in) that if you want to see high-quality graphics close up, plus RT, it all has a cost and they decided on 12GB as a minimum, he's talking about ultra, not entry level. You don't get RT and close-up hair strands on low settings; it still works on 8/10GB GPUs within their lower texture budget.

The dev in that vid uses 'we' in conversation, so he's part of the team; he could take his cap off and be totally bald for all we know. As new-gen consoles are the lead platform, and as you already pointed out consoles have superior memory management, that probably explains why 8 to 10GB cards are struggling: PC can't juggle everything as efficiently as a console.

Last but not least, it doesn't matter whether you think the graphics in TLOU are ground-breaking or not; they're pretty good to me on mixed settings and only get better the higher you can go, and plenty in the game thread say they've been impressed.

Dude's having a meltdown about VRAM at 02:25 in the middle of the night. Bro, just sell the damned 3080 and get yourself a 6900 XT? It would barely cost anything to do the "upgrade" :D

My prediction? Tommyboy gets another Nvidia card like the 4070 or something with 12GB next and continues the bitching. One could call him RedRollo :cry:
 
Someone like me who plays on a 1440p screen with a 3060 Ti FE: should we wait till RX 8000 or the 50 series to get a bigger jump at around the £800 mark? It just seems even 3090/3090 Ti/7900 XT performance is still underwhelming…..

Btw, TLOU (The Last of Us) plays buttery smooth with a bit of tweaking on HIGH settings at 1440p, and since the patch it doesn't crash. Even with 8GB of VRAM!

That's making me think I'll hold off till next year (Sept 2024) before upgrading.
 