12GB VRAM enough for 4K? Discuss..

Far Cry 7. Nuff said.
Actually I do wonder about that. I'm hoping there's a major engine rework done for it. As much as I loved Far Cry 6 I think the current iteration of the Dunia Engine has pretty much reached its limits, particularly as it relates to asset streaming & handling.
Then again, wasn't the rumour it's going to go harder on being more of a multiplayer/co-op game? In which case it will be **** and so who cares. :P
 
Even if I could justify the cost of a 4070 Ti I would not buy in on those grounds. My logic is telling me that these days I need to make a graphics card purchase that will last me more than two years (because I can no longer justify replacing them every two years), which in turn means the card has to survive any system changes I make, including perhaps a higher res monitor.
 
I know you don't know what future games' VRAM usage will be, and whether or not 12 GB will prove too little. Another thing I know is that UE5 is seeing wide adoption, and from what we've seen the new tech they've developed leads to much lower VRAM usage than UE4 (and other engines) for assets of similar quality, thanks to Nanite. So thus far I see solid reasons why 12 GB will be fine even for future games, and I have yet to hear any solid reasons why it won't - besides "numbers go up because they've always gone up". Without a timeframe & performance impact it's a moot point.


That's not how games actually run. In fact we see varying VRAM usage depending on the game engine as well as per-studio philosophy (as it relates to asset budgeting). Moreover, the target around which these games are authored (which very much includes texture quality) is the consoles, and that hasn't changed since 2020.

Actually, just like what happened with SSD speed, it's not just about the hardware available to be used but also the actual authoring of said assets & how it fits with the gameplay. That's why even a game like R&C Rift Apart, which is meant to be a showcase for maximising the fancy SSD in the PS5 (and also not burdened by having to work on other hardware/platforms), fails to do that & in fact runs just as well on much slower SSDs. Just because the resources are available doesn't mean they will be fully utilised, because the main goal of the developers isn't to maximise hardware usage but to make the game they want to make, and very often those two goals are in conflict (due to requiring extra time/resources to develop) and it's the latter that ends up prevailing (as it should).

As it relates to texture quality, I distinctly remember the case of The Surge, where they actually shipped with more "optimal" textures: they went with lower quality than the max they could've shipped because they wanted the storage savings and felt the quality loss from higher compression was not noticeable. Even if we look at AAA examples (f.ex. AC Origins/Odyssey come to mind) we can see that they shipped a minimum viable quality based on the consoles, and PC had to make do with that rather than getting some fancier, higher quality version. We do also get an HD pack sometimes, but it's rare, and it's usually just slightly higher res rather than radically different (and now with the PS5 gen it will likely all be the same quality).
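
Not from the thread, just a rough back-of-envelope of the texture-size arithmetic behind that budgeting point. The bits-per-texel values are the standard ones for these GPU formats, and the ~1.33x mip-chain factor is the usual estimate; nothing here is specific to the games named above.

```python
# Rough VRAM cost of a single texture under common GPU formats.
# Bits-per-texel are the standard values for these formats; the ~1.33x
# factor is the usual estimate for a full mip chain.

FORMATS_BPT = {
    "RGBA8 (uncompressed)": 32,  # bits per texel
    "BC1/DXT1": 4,
    "BC3/DXT5": 8,
    "BC7": 8,
}

def texture_mib(width: int, height: int, bits_per_texel: int, mips: bool = True) -> float:
    """Approximate footprint of one texture in MiB."""
    size_bytes = width * height * bits_per_texel / 8
    if mips:
        size_bytes *= 4 / 3  # full mip chain adds roughly a third
    return size_bytes / (1024 ** 2)

if __name__ == "__main__":
    for name, bpt in FORMATS_BPT.items():
        print(f"4096x4096 {name:21}: {texture_mib(4096, 4096, bpt):6.1f} MiB")
```

On those figures a 4096x4096 texture with mips is roughly 21 MiB as BC7 versus roughly 85 MiB uncompressed, which is why shipping more aggressive compression (as in The Surge example) is such a cheap storage/VRAM win.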

Ultimately let's not get lost in the weeds of different conversations. The topic remains the 4070 Ti's VRAM and whether it will be enough for 4K*. Given that devs target the consoles & the increases in VRAM usage efficiency for UE5 (as well as DirectStorage, still missing on PC for now, which will also further help with VRAM), I contend that it is enough, for all the reasons I've mentioned before.

* 4K = to be understood as using FSR/DLSS so as to fit within reasonable performance windows, f.ex. 60 fps, else the vram is irrelevant as the card already buckles to low fps without DLSS.
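
As a rough illustration of that footnote: the sketch below works out what the GPU actually renders internally at each upscaler mode. The per-axis scale factors are the commonly documented DLSS 2 / FSR 2 ratios, not anything taken from this thread.

```python
# Internal render resolution for a 4K output at each upscaler quality mode.
# Per-axis scale factors are the commonly documented DLSS 2 / FSR 2 ratios.

SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU shades before upscaling to (out_w, out_h)."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    for mode in SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        share = 100 * (w * h) / (3840 * 2160)
        print(f"4K {mode:17}: {w}x{h} (~{share:.0f}% of native pixels)")
```

So "4K with DLSS Performance" is shading a 1080p image, which is also part of why the VRAM pressure ends up lower than the native-4K numbers suggest.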

Spot on, especially the last point. In any of the so-called problematic games concerning VRAM, it turns out even the best of them, i.e. the 3090/6950 XT, still needed DLSS/FSR turned on if you wanted high refresh rate 4K gaming. Essentially what @TNA and I have been saying for ages: grunt always becomes the problem first in 99.5% of cases.

DirectStorage could be a big one; according to Microsoft and Nvidia, VRAM usage should drop.

Also worth remembering: the VRAM needed doesn't appear to be clear cut across vendors now, i.e. AMD generally uses more VRAM across the board, in some cases by as much as 2GB :eek: Either AMD have an issue with VRAM management and/or Nvidia are just better at it:

[attached VRAM usage comparison screenshots]

Not an RDNA 3 thing either, as it uses similar VRAM to RDNA 2:

[attached screenshots]

I find it quite funny how the usual suspects are now saying it's not about the amount but the price of it; that argument never came into play, or rather was never acknowledged, when comparing the £650 3080 10GB and £1400 3090 :cry:

Ultimately, I wouldn't be buying a 4070 Ti for 4K gaming, as the issue pointed out by all reviewers is its memory bus width.
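
For anyone wanting to sanity-check the allocated-vs-used numbers themselves, here's a minimal monitoring sketch. It assumes Linux with the stock nvidia-smi query flags and the standard amdgpu sysfs counter; the card index and paths may differ on your system.

```python
# Poll per-vendor VRAM counters while a game runs (Ctrl+C to stop).
# Assumes Linux: nvidia-smi for GeForce, the amdgpu sysfs node for Radeon.
import subprocess
import time
from pathlib import Path

AMDGPU_NODE = Path("/sys/class/drm/card0/device/mem_info_vram_used")  # value is in bytes

def nvidia_vram_used_mib() -> int | None:
    try:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
            text=True,
        )
        return int(out.splitlines()[0])
    except (OSError, subprocess.CalledProcessError, ValueError):
        return None  # no NVIDIA GPU / driver present

def amd_vram_used_mib() -> int | None:
    try:
        return int(AMDGPU_NODE.read_text()) // (1024 * 1024)
    except (OSError, ValueError):
        return None  # no amdgpu device at that path

if __name__ == "__main__":
    while True:
        print(f"NVIDIA: {nvidia_vram_used_mib()} MiB | AMD: {amd_vram_used_mib()} MiB")
        time.sleep(2)
```

Bear in mind both counters report what the driver currently has resident, i.e. allocation, which is not the same as what the game strictly needs, so they tend to overstate the real requirement.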
 
Also worth remembering: the VRAM needed doesn't appear to be clear cut across vendors now, i.e. AMD generally uses more VRAM across the board, in some cases by as much as 2GB :eek: Either AMD have an issue with VRAM management and/or Nvidia are just better at it
Not that relevant when there's enough to compensate for it.
 
Not that relevant when there's enough to compensate for it.

True for GPUs with 20+GB, but it's still valid for the likes of the 4070 Ti and of course the Ampere line-up. Essentially you can't have people going "on my 7900 XT(X) it uses 13GB, therefore I wouldn't want anything less than 20/24GB VRAM now" when clearly it's not as clear cut as that... which has always been a silly, flawed argument anyway, given that if VRAM is there it can be used, but that doesn't necessarily mean it is going to make a difference to the performance, as we have seen time and time again.

Oof, didn't realise that. How did this one slip under the radar! :D

Probably because AMD cards don't even get 1 fps :cry: In the words of Alex:

[attached screenshot of Alex's comment]

:D
 
Also worth remembering: the VRAM needed doesn't appear to be clear cut across vendors now, i.e. AMD generally uses more VRAM across the board, in some cases by as much as 2GB :eek: Either AMD have an issue with VRAM management and/or Nvidia are just better at it:

SAM can allow the CPU to access more memory on an AMD GPU. That could be a contributor to the difference.
 
12GB should be fine, and if you look at the 4070 Ti it does OK in 4K gaming, but you should also look at memory speed: the 4070 Ti uses a 192-bit memory bus and this undermines its performance at 4K compared to the greater bus widths on the 7900 XT/4080 or the 3090 Ti. So memory bus width matters as much as, if not more than, the amount of VRAM.

If 4K is a certain upgrade, and you had to buy a card today, then a 4080 or a 7900 XTX - or, previous gen, a 3080 10GB or 12GB if you can find a good price - with a decent amount of VRAM and a wider bus, would be the minimum at present for reliable 4K gaming and longevity.
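
To put numbers on the bus-width point: peak memory bandwidth is just bus width divided by 8, times the effective data rate. A rough sketch using the commonly quoted specs for these cards (it ignores Ada's much larger L2 cache, which claws back some of the gap):

```python
# Peak memory bandwidth = bus width (bits) / 8 * effective data rate (Gbps).
# Specs are the commonly quoted ones for each card.

CARDS = {
    "RTX 4070 Ti": (192, 21.0),   # (bus width in bits, data rate in Gbps)
    "RTX 4080":    (256, 22.4),
    "RX 7900 XT":  (320, 20.0),
    "RTX 3090 Ti": (384, 21.0),
}

def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps

if __name__ == "__main__":
    for card, (bus, rate) in CARDS.items():
        print(f"{card:11}: {bus}-bit @ {rate} Gbps -> {bandwidth_gbs(bus, rate):6.1f} GB/s")
```

That works out to roughly 504 GB/s for the 4070 Ti against roughly 717-1008 GB/s for the others, which is the gap the reviewers keep pointing at.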
 
"Incredibly ridiculously laughably poorly yes AMD"

I'm sure there's more of a sentence there, but in isolation that is absolute nonsense!

Well it is true tbf, as even with TSR it doesn't improve much :D It's obvious why though.... here is the time point for that video:

[embedded video]
But again, this falls back to the point I and others have made for so long and as poneros stated too:

4K = to be understood as using FSR/DLSS so as to fit within reasonable performance windows, f.ex. 60 fps, else the vram is irrelevant as the card already buckles to low fps without DLSS.

Not even a 4090 can play it well without dlss.

SAM can allow the CPU to access more memory on an AMD GPU. That could be a contributor to the difference.

SAM is just AMD's marketing name for Resizable BAR, which is enabled on Nvidia and Intel systems too, i.e. they both remove the old 256MB window the CPU was limited to when accessing GPU VRAM. Could be down to the way AMD has tuned their chipset and GPU drivers though.
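
If anyone wants to check whether Resizable BAR/SAM is actually active on their box, here's a rough Linux-only sketch that just parses lspci -vv output; the exact formatting (and whether you need sudo for full detail) varies between lspci versions, so treat the regex as an assumption rather than gospel.

```python
# With Resizable BAR off, the CPU-visible prefetchable BAR is capped at 256M;
# with it on, the BAR typically covers (most of) the VRAM, e.g. [size=16G].
import re
import subprocess

def gpu_prefetchable_bar_sizes() -> list[str]:
    """Prefetchable BAR sizes reported by lspci for GPU devices."""
    out = subprocess.check_output(["lspci", "-vv"], text=True)
    sizes: list[str] = []
    for block in out.split("\n\n"):  # one block per PCI device
        if "VGA compatible controller" not in block and "3D controller" not in block:
            continue
        # Lines look like: "Region 1: Memory at ... (64-bit, prefetchable) [size=16G]"
        sizes += re.findall(r", prefetchable\) \[size=(\d+[KMG])\]", block)
    return sizes

if __name__ == "__main__":
    sizes = gpu_prefetchable_bar_sizes()
    print("Prefetchable BAR sizes:", sizes or "none found (try sudo)")
    print("Resizable BAR looks", "ON" if any(s.endswith("G") for s in sizes) else "OFF/limited")
```

The "ends with G" check is only a heuristic: a non-ReBAR card usually shows a 256M prefetchable BAR plus a small secondary region, whereas a ReBAR-enabled one maps gigabytes.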
 
Nexus in da houuuuuuuuse...

Now the thread will kick off. Get the popcorn ready lads :D:cry:


Nvidia's Portal RTX at 4K Ultra settings has a 16GB VRAM requirement, so canny blame AMD this time folks. :p

Does it? I played Ultra just fine at around 60fps. But that was using DLSS performance. Unless one has a 4090 you can't even consider playing that game without DLSS anyways.
 
SAM is just AMD's marketing name for Resizable BAR, which is enabled on Nvidia and Intel systems too, i.e. they both remove the old 256MB window the CPU was limited to when accessing GPU VRAM. Could be down to the way AMD has tuned their chipset and GPU drivers though.
From what I understand, the AMD one allows the CPU to access more of the VRAM than the Nvidia one does.
 
Now the thread will kick off. Get the popcorn ready lads :D:cry:
Nah, it'll just be a wall of the same video over and over again. 4K means using DLSS, which means the GPU isn't rendering 4K natively... But I'm getting off this roundabout as I'm in favour of upscaling, even if I don't agree with all the base points.

Looking forward to Spiderman's **** once it inevitably meanders to frame generation though. :p
 
I’m pretty sure you are right. It is similar to the ReBAR that Nvidia has. I think Nvidia doesn’t use it all the time, which could be a contributor in some games.

Yup, Nvidia whitelist their games for it (it can be forced on via Profile Inspector though), whereas AMD have it globally applied.
 
Well it is true tbf, as even with TSR it doesn't improve much :D It's obvious why though....

Not even a 4090 can play it well without dlss.

The 2060 Super is performing 700% faster than a 16GB 6800 XT :eek:.

Is it the game design or the card? Can't see much texture wow factor from an HD texture pack.

Like CP2077, Portal is a 'built' environment with many more reflective surfaces than games set in more natural surroundings; it's games set in these built environments, which showcase RT better, that GPUs seem to struggle with.

But don't go selecting a particular set of settings - as we'll end up like the FC6 argument.

Portal non-RTX isn't a hard game to run originally - it seems the RT impact due to the environment, as I mentioned above, cripples perf - and it's unplayable on the 'I don't care about RT' GPU choice cards - I guess you wouldn't buy Portal RTX if you see no value in RT anyway.
 