12GB VRAM enough for 4K? Discuss...

Which is a good reason NOT to buy the best you can afford. The 3090 was what, £750 more than the 3080 and is now a crusty old potato utterly outclassed by the new generation of cards - better off saving the difference and upgrading more frequently. Buying a top GPU and hanging onto it for years makes no sense to me.

Common sense has to come into play; the 3090 was always known to be poor value for gamers.
 
Did you skim my post? :cry:
Absolutely I did. I haven't got time to read the whole of your essay each time :p (tbf that one wasn't so bad, but I was about to leave work...)

I think my main point is that lower down the tree, if you need to use upscaling to hit a performance target, RT is basically the first thing to go. Certainly if you're planning to keep the card for longer than a single generation, VRAM is a better investment - RT is in its infancy, and its requirements are going to grow exponentially.

Upscaling is important, but relying on upscaling to keep weaker cards in the game is not a good idea, as the upscaling and associated tech will simply change and stop working on lower-tier cards - we're already seeing that with DLSS 3 frame generation and the 3000 series.

Having the VRAM to at least keep quality textures in play will have a greater impact at lower card tiers - if you're stuck on older upscaling tech in the future, it will probably buy you more frames than using a weak upscaler to chase increasingly intense RT the card isn't powerful enough for.
 
I had a 3090 and now a 4090; neither had VRAM issues, but I still think DLSS is the best thing that happened to PC gaming. It gives insane longevity to cards. If Nvidia was as greedy as people think, why would they work so hard on a feature that extends the lifetime of your GPU by that much?
For RT.
 
Except, 99% of the time, as shown, it is grunt which is the limitation, which then requires reducing settings and/or using upscaling - both of which reduce VRAM usage :D Just look at consoles, which have "16GB VRAM" (not really, as it is shared memory) and which, according to the man with an axe, are a better buy than a 3080 10GB.... :cry: :o But I digress....

- sacrificing raster settings
- sacrificing RT settings
- running a lower res. at times (even compared to FSR/DLSS internal res.)

All in order to hold a locked 4K60 - and even with the above, we're starting to see 4K30 again unless settings are sacrificed even further....


Frame generation is an entirely different tech to upscaling.... Upscaling tech is not going to dramatically change for the foreseeable future. Ampere and Turing are still benefiting from all the improvements to DLSS - this is the beauty of being able to swap in the DLL yourself rather than waiting on game developers to update FSR, if they ever do..... (Nvidia GPUs can also use FSR and XeSS, although with DLSS there's no reason to use either.) Frame generation, as it stands, is mostly aimed at overcoming CPU bottlenecks and full path-tracing titles such as Portal with RTX and the upcoming CP 2077 Overdrive RT mode - and even then, as shown, those will still require the DLSS Performance or Ultra Performance presets, even on the 40-series GPUs....
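On the swap-in point, here's a minimal sketch of doing the DLL swap by hand, assuming hypothetical paths - only the nvngx_dlss.dll filename is the real convention; the install and download locations are placeholders:

```python
# Minimal sketch of the manual DLSS DLL swap. The nvngx_dlss.dll name is the
# standard file games ship; both paths below are hypothetical placeholders.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install path
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS DLL you sourced

target = game_dir / "nvngx_dlss.dll"             # the DLL the game ships with
backup = target.with_name(target.name + ".bak")  # nvngx_dlss.dll.bak

if target.exists() and not backup.exists():
    shutil.copy2(target, backup)  # keep the original so you can roll back
shutil.copy2(new_dll, target)     # drop in the newer runtime
print(f"Replaced {target} (backup at {backup})")
```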
 
So why don't we just have 4GB of VRAM (because that's enough :p) and be done with it? Because needs grew. 8GB is probably not enough at 4K; 10GB is a push at times, and only gets there because of upscaling.

Do you think that as upscaling and frame generation improve, VRAM requirements will go down? I don't.

That's why I'd say 12GB is enough for 4K. But not for £1k - I'd like more.
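For what it's worth, a back-of-the-envelope budget shows why 10GB can be a push at 4K while 12GB still fits - every figure below is an assumption picked purely for illustration, not a measurement from any real game:

```python
# Entirely illustrative 4K VRAM budget - every number is an assumption for
# the sake of the arithmetic, not a measured figure from any real title.
budget_mb = {
    "swapchain + post-processing":  400,
    "G-buffer / depth / shadows":  1200,
    "streamed textures (high)":    6000,
    "meshes / geometry":           1500,
    "RT acceleration structures":  1500,
    "driver / OS overhead":         800,
}

for item, mb in budget_mb.items():
    print(f"{item:<30} {mb:>5} MB")

total_gb = sum(budget_mb.values()) / 1024
print(f"total: ~{total_gb:.1f} GB")  # ~11.1 GB: a push on 10GB, fine on 12GB
```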
 
Getting deja vu in here, thought the 4070ti thread was locked
:D

TBH the thread should be locked; it was obvious troll bait from the get-go. If people want a proper discussion on "vram" in general, they'd be better off creating a thread aimed at VRAM rather than singling out one brand/GPU - but then we wouldn't get as much activity, because no "nvidia bad".... At the end of the day, the thread exists because, once again, it's the only thing AMD have over Nvidia as of now - a bit like the case with RDNA 2 vs Ampere.

That's fair, I'll stick to the question. 12GB for £1k is probably not enough, regardless of resolution.

12GB is enough for 4K - I'm still happy with 10GB. But for a grand? No.

A grand? The 4070 Ti costs about £800-850.

Either way, this is the point worth highlighting. But then again, are any of the GPUs which are £1k+ worth it? With maybe the exception of the 4090.... hell no.
 
This is where, if you read my posts, you would see I have stated "within reason" when it comes to VRAM amounts :D :p Remember back in the day when 4GB was enough for 4K because "it was HBM", despite the Nvidia counterpart having 6GB of VRAM ;)

Upscaling - why would VRAM usage go up when the render resolution is lower? Keep in mind that the lower presets, such as Balanced, Performance and even Ultra Performance, are now being required for the more demanding titles because of a lack of grunt..... which further reduces VRAM usage again.
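To put rough numbers on that, a quick sketch - the per-axis render scales are the published DLSS preset ratios, while the single RGBA16F target and 8 bytes per pixel are purely illustrative assumptions, since a real engine keeps many buffers of varying formats:

```python
# Internal render resolution per DLSS preset at 4K output. Scale factors are
# the published per-axis ratios; 8 bytes/pixel assumes one RGBA16F render
# target and is illustrative only - real engines keep many such buffers.
OUT_W, OUT_H = 3840, 2160

presets = {
    "Native":            1.0,
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.5,
    "Ultra Performance": 1 / 3,
}

for name, scale in presets.items():
    w, h = int(OUT_W * scale), int(OUT_H * scale)
    mb = w * h * 8 / 2**20  # one RGBA16F target, 8 bytes per pixel
    print(f"{name:>17}: {w}x{h} internal, ~{mb:.0f} MB per target")
```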
Frame generation - this does seem to increase VRAM usage, although in some titles, such as A Plague Tale: Requiem, it doesn't, so possibly there is some optimisation still to be done. It's quite new/early tech, so it could go either way. I haven't seen any signs of frame generation being enabled on the 4070 Ti causing issues either? Links if so.

I don't disagree, but then with the 7900 XT(X) and 4080 costing £1k+, despite having more VRAM, I would still like more from them too.....
 
Think you missed my point. If upscaling technology becomes the norm (as it should), then, like you say, VRAM requirements should decrease. In which case it should be possible (and cheaper) for manufacturers to produce cards that hit the performance targets of the day.

Do you think an Nvidia 6090 will have 12GB, but trumpet the fact that upscaling means it's all you need?

More to the point, is 12GB at £1k (or £800-£850, granted!) at 4K enough for you? In one sentence...
 
But the reason we need upscaling is because we don't have the grunt...
Again you really should read my posts :D :p

Given I still haven't run into any issues with my 3080 10GB at 4K (sorry, forgot - unless I force ReBAR on in Far Cry 6 and refuse to use FSR [even though the GPU doesn't have the grunt for it, same as with the 6800 XT, as shown in reviews] :cry:) and 3440x1440, then of course 12GB would be enough for me :D

You and TNA are the top posters in this thread, and you try to play some kind of victim - just let it go :cry:

Adding a lot to the discussion here, 10/10, would read again :cry:

Yes, pretty toxic in here now too... Tired of seeing threads get locked thanks to the same people spamming them with the same cut-and-paste replies.

Given the title of the thread, did we really expect any different? I wonder what would happen if a thread titled "Is RDNA 3 RT enough for £1000 @ 4K? Discuss..." got created.....
 
That would make sense if you could only use DLSS with RT. But that is not the case - you can use it without RT - so why would they work so hard on extending the lifetime of the card?
I'm not sure it's about extending the life of a card; that wouldn't really be in the interest of the manufacturer. It's mainly a selling point, and a good thing to have, but it seems the real gain for the manufacturer is producing bar charts for otherwise impossible-to-run games, with huge comparisons promising up to 3x performance, when the reality is that 98% of games see a small uplift at best.
 