
GPU Upgrade Time... is 12GB enough @ 1440p?

Managed to sell my 3070 Ti Founders Edition for a good price. It always ran hotter than I liked, and it seemed to be struggling at 1440p in recent titles.

I'm sitting tight till the 4070 drops to see if that has any impact on the 4070 Ti or the 7900 XT. I'm leaning towards the 4070 Ti, as I'll be playing a lot of Diablo 4 and it will support DLSS 3.

However... 12GB of VRAM feels like it will start to struggle at 1440p / 165Hz, just like 8GB is now. Maybe I'm overreacting - a 50% increase in VRAM should be fine for the next 5 years at 1440p, surely?
 
As always, it depends on what you want to play and at what settings. But let's assume you want the latest games at high settings.

Right now? Sure, the 12GB 4070 is plenty.

In 5 years' time? Maybe... To put it in perspective, the 6GB 2060 launched in 2019, and while I wouldn't say it's completely fine at 1440p high 4 years later, it does just about manage 60fps in something like Hogwarts Legacy with DLSS Quality on.

At what point the 4070 will go from fine to not fine, I can't say.
 
For most games in general you might be OK getting away with 12GB; however, if the 4070 Ti's stuttering issue in Hogwarts Legacy - caused by running out of VRAM when playing with ray tracing above 1080p - is any indication of what future games with RT will be like, then no, 12GB of VRAM won't be enough. 8GB is already having performance issues at 1080p in some games, so 12GB for 1440p won't be sufficient for much longer.

IMO, if you are getting a high-end card today, not settling for anything less than 16GB of VRAM is the smarter way to go.
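For what it's worth, you can check whether a game is actually hitting the ceiling rather than guessing. A minimal sketch, assuming an NVIDIA card and the NVML Python bindings (pip install nvidia-ml-py); run it in the background while playing:

```python
# Log VRAM usage every few seconds via NVIDIA's NVML bindings.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GB "
              f"({100 * mem.used / mem.total:.0f}%)")
        time.sleep(5)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Bear in mind that allocated isn't the same as required - games cache aggressively - but sustained near-100% usage that coincides with stutter is a fair sign you're over the limit.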
 
I agree - spending 700+, there's no way I would settle for 'just' 12GB.

There's been some real generational stagnation when it comes to VRAM from Nvidia. Just think: even though it was a top-of-the-line product that arguably could never use it all, the Titan X in 2015 had 12GB of VRAM. Here we are 8 years later…
 
Many games will just use the consoles' capabilities as a baseline. The PS5 has 16GB of unified memory, with probably up to 12GB of that usable as VRAM in practice, depending on the game's other memory requirements. PC games will allow settings higher than the console equivalents, so if you want to do better than the consoles it could become limiting.

As a historical example: the PS4 had a similar setup with 8GB of memory, and I think 6GB turned out to be the comfortable minimum up until near the end of its lifespan.

I suspect that if you're sticking to 1440p, 12GB will be fine for the most part - and probably even 4K in a lot of cases - if you're running at console-equivalent settings (which for the most part will be below max on PC).
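The arithmetic behind that 12GB figure, as a rough sketch - the reservation numbers here are my assumptions for illustration, not published platform specs:

```python
# Back-of-the-envelope console VRAM budget. The reservations are
# assumed figures for illustration, not published platform specs.
TOTAL_UNIFIED_GB = 16.0  # PS5 unified GDDR6 pool
OS_RESERVED_GB = 2.5     # assumed system/OS reservation
CPU_SIDE_GB = 1.5        # assumed game logic, audio, streaming buffers

effective_vram_gb = TOTAL_UNIFIED_GB - OS_RESERVED_GB - CPU_SIDE_GB
print(f"Console-equivalent VRAM budget: ~{effective_vram_gb:.0f} GB")
# ~12 GB, which is why 12GB cards map roughly onto console-level settings.
```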


Edit: just re-read and spotted the 165Hz bit. Yeah, in that case just get the most overkill thing you can get your hands on.
 
I don't think 12GB will struggle. I think you'll need to drop textures at some point in some games, but that's not struggling - you have to drop settings with all cards currently on the market.

With that said, for the same price I'd consider the 7900 XT over the 4070 Ti.
 
You don't need DLSS 3 for Diablo 4. I ran the beta on my 6700 XT, which is slightly underclocked and undervolted, at 3440x1440 and still hit triple digits most of the time. The only Nvidia card actually worth considering in my book is the 4090, even though it's stupid money. The rest of Nvidia's lineup is just horrible price-wise, and even more so on the memory setup. Granted, AMD aren't that much better value, but they are still better IMHO, and seeing how VRAM usage is really kicking off of late, I wouldn't settle for 12GB - or even 16GB - on a brand-new-generation GPU at that price tag. My 12GB framebuffer is getting used, even by older titles.

It's kinda silly really. If Nvidia just made sure their GPUs had a proper memory setup, instead of gimping them so hard, I would actually have an easier time recommending something like a 4070 Ti. It feels like an attempt at forced obsolescence, IMHO. What's the point of having enough GPU horsepower to run something like RTX if you don't have the memory to back it up?
 
5 years?
Maybe; the main issue is UE5, which alongside console ports may require more.
It always depends what games you play.
12GB is the minimum to buy today, though.
A game normally takes 3 to 5 years to make.
Usually it's not the VRAM that's the main issue but raw performance, as 5 years means 2 to 3 generations of cards.
So an upgrade every 2 generations seems the sweet spot.
 
Couldn't agree more about Nvidia gimping the memory setup.

In the past, Nvidia cards would at least go 2 years or so before VRAM became a limitation ahead of the rest of the card (already not ideal for those who don't upgrade every 2 years), but now the 4070 Ti is pretty much hitting VRAM limitations on day 1.

Nvidia is trying so hard to push their users/fans into overspending - essentially "Why buy the 4070 Ti when you should buy the 4080? Why buy the 4080 when you should buy the 4090?"
 
If you like playing games on release, a card with more VRAM will give you a better gaming experience - but it can still stutter and crash.

All games have performance issues on release - if they didn't, they would never need patches and optimisation. Games no longer come in a physical format, and not many release a beta test version before launch, so those paying MSRP ARE the beta testers. If you like stutter, glitches, crashes and falling through game worlds, then oddly enough, you need to pay full price for the game.

All games get fixed and optimised, and they ALWAYS play fine after a few months.

Day-one game performance is marketed by techtubers to get clicks, because controversy gets clicks. Do not use release performance as a yardstick for future requirements; new engines like Unreal Engine 5 are the yardstick for that.

If a game can't run at 1080p with 8GB of VRAM, then the game is borked. I guarantee that TLOU1 will be running fine in a few months at 1080p on an 8GB card.

Being a beta tester is an expensive way to PC game. It's up to you which hype you believe: one option is expensive, the other is cheaper for both the game purchase and the hardware.
 
I know I like to pick on Bethesda, but way back, just after Daggerfall, when they had that Terminator game, it was advertised as having a ground-breaking engine.

Having fallen through the floor in many Daggerfall dungeons, I thought: let me try that Terminator demo and see...

...Sure enough, I walked up some ramp and fell through the ground!

Truly "ground breaking", but maybe not in the way marketing meant! :D
 
It seems like the 7900 XT is the only sensible choice right now: the VRAM on the 4070 Ti is gimped, the 4080 is overpriced, and it's looking like the 4070 will just be 3080 performance.
 
I understand the point about waiting for patches, but unfortunately that's just the reality of how things are, and it's still better to have more breathing room than not.

I mean, ideally I would buy trousers/pants with a waist size that fits me just right, but as I'm about as lazy about exercising or hitting the gym as the devs are about optimising their games, I buy them an inch or so bigger just in case :cry:
 

UE5 game development blows the "games are just unoptimised at launch" argument clean out of the water; there's always more to the story.

If you can't watch it...

Games are using more textures - way, way more. Instead of one texture for a whole character, the eyes alone now cost more texture memory than an entire character used to.
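To put rough numbers on that - purely illustrative assumptions: four maps per material (albedo, normal, roughness/metallic, AO), BC7 block compression at 1 byte per texel, and a full mip chain adding about a third:

```python
# Rough texture VRAM cost per PBR material - illustrative assumptions only.
MIP_OVERHEAD = 4 / 3     # a full mip chain adds ~33% on top of the base level
MAPS_PER_MATERIAL = 4    # albedo, normal, roughness/metallic, AO
BYTES_PER_TEXEL = 1      # BC7 block compression (RGBA8 uncompressed would be 4)

def material_mb(resolution):
    base = resolution * resolution * BYTES_PER_TEXEL * MIP_OVERHEAD
    return base * MAPS_PER_MATERIAL / 1024**2

for res in (1024, 2048, 4096):
    print(f"{res // 1024}K material: ~{material_mb(res):.0f} MB")
# 1K ~5 MB, 2K ~21 MB, 4K ~85 MB. Multiply by hundreds of unique
# materials in view and the jump from 8GB to 12GB stops looking generous.
```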

He recommends 12GB for 1080p going forward.

PCs are also slower than consoles at streaming assets into VRAM.

Make your own mind up after listening to it, and listen out for the explanation of how the UE5 engine downgrades textures when it runs low on VRAM.
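That downgrade behaviour is conceptually just a streaming pool evicting mip levels when it's over budget. A toy sketch of the idea (UE5's real heuristics are far more involved, and these names and numbers are made up for illustration):

```python
# Toy model of a texture streaming pool dropping mips when over budget.
# Texture names and sizes are invented for illustration.
def fit_to_budget(textures, budget_mb):
    """Drop the top mip of the largest texture until the set fits the pool."""
    resident = dict(textures)  # texture name -> resident size in MB
    while sum(resident.values()) > budget_mb:
        biggest = max(resident, key=resident.get)
        resident[biggest] /= 4  # dropping one mip level quarters the footprint
    return resident

scene = [("hero_face", 85), ("hero_body", 85), ("environment_atlas", 340)]
print(fit_to_budget(scene, budget_mb=300))
# The biggest textures get visibly blurrier first, which matches what
# players report when a card runs out of VRAM headroom.
```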
 
:cry: Oh, I know that trouser feeling - and the trouble is, the bigger sizes are the same price.

You can get breathing room for your PC by not buying games on release, and by not upgrading a graphics card just to get a slightly better experience in one game. Even with a 4090, TLOU1 was stuttering and crashing on day 1.

Having the extra VRAM does give you breathing space, I guess, if you want to play games on release day and be a beta tester.

The problem with VRAM is that Nvidia use GDDR6X, which is meant to have much higher throughput than GDDR6 (its PAM4 signalling carries twice the bits per transfer, though in shipping cards it works out to roughly 1.5x the per-pin data rate). But it seems to make little difference in games apart from a cost increase, as it is much more expensive. I don't know why, if GDDR6 is nearly as good and much cheaper, Nvidia don't just slap more of the cheap stuff on rather than using 6X, and pass the saving on to consumers.
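For context on throughput, bandwidth is just the per-pin data rate times the bus width. A quick sketch using the published figures I believe are right for the two cards being discussed here:

```python
# Memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 -> GB/s.
def bandwidth_gbs(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

print(f"4070 Ti: 21 Gbps GDDR6X x 192-bit = {bandwidth_gbs(21, 192):.0f} GB/s")
print(f"7900 XT: 20 Gbps GDDR6  x 320-bit = {bandwidth_gbs(20, 320):.0f} GB/s")
# ~504 vs ~800 GB/s: the faster memory type doesn't buy much when it
# sits on a much narrower bus.
```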

What would be better is if GPUs had VRAM sticks like system RAM, so you could take the VRAM with you - slots on the card that you populate yourself. It's probably a non-starter for engineering reasons (GDDR signal integrity depends on the chips being soldered right next to the GPU), but you'd buy the VRAM and GPU separately, and they'd be transferable in the same way DDR4 is across compatible boards. It would probably also make things more expensive, as a GPU would now be sold in two parts.

I find it funny that Nvidia are the bad guys when, to me, it's the game that's at fault - like all games on release, it's poor, and VRAM is just where the limits are reached first. Games have had memory leaks etc. since forever. TLOU1 has now moved up to 47% positive reviews (of 14,000) from 20% (of 7,000) at release - is that because everyone has since bought a new graphics card to play it, or because the game has been patched?
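The swing is bigger than it looks, because the launch reviews are baked into the new total - quick arithmetic on the figures quoted above:

```python
# Positive rate among reviews left after launch, implied by the totals above.
old_total, old_pct = 7_000, 0.20   # at release
new_total, new_pct = 14_000, 0.47  # now

later_reviews = new_total - old_total
later_positive = new_pct * new_total - old_pct * old_total
print(f"Post-launch reviews: ~{100 * later_positive / later_reviews:.0f}% positive")
# ~74% positive among reviews left since launch - patches, presumably,
# rather than everyone buying a new graphics card.
```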
 
It means they've patched it to reduce the crashing; it's still just as heavy on VRAM as it was when it launched.
So you can confirm that 8GB cards will never be able to run this game at 1080p?

Nothing to do with the potential memory leak that is still a known issue?
 