
Is 8GB of VRAM enough for the 3070?

I have been monitoring GPU memory while gaming on my Asus TUF OC 3080 and a Samsung G9 (5120 x 1440).

This was after a couple of hours playing Immortals Fenyx Rising; Fallout 76 showed similar results.

Current/Min/Max/Average

 
That's allocated memory, though, not how much the application actually needs.
It's the same way some apps will allocate more RAM than they need but run just fine with less.
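The allocated-vs-used distinction can be illustrated with a quick sketch (plain Python as an analogy, not actual GPU code; the sizes here are made-up numbers):

```python
# A program can reserve a big buffer up front ("allocated") while only
# ever touching a fraction of it ("used") - same idea as a game that
# grabs a large VRAM pool but doesn't need all of it every frame.
MB = 1024 * 1024

pool = bytearray(100 * MB)       # "allocated": 100 MB reserved
payload = b"\x01" * (30 * MB)    # data actually needed this "frame"
pool[:len(payload)] = payload    # "used": only 30 MB of the pool is touched

print(f"allocated: {len(pool) // MB} MB")  # prints "allocated: 100 MB"
print(f"used:      {len(payload) // MB} MB")  # prints "used:      30 MB"
```

A monitoring tool that only sees the reservation would report 100 MB, even though the app would run fine with a third of that.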

However, if you take Doom Eternal at 4K Ultra, you will get stuttering on an 8GB card even though the 3070 has enough GPU power to get a good framerate at those settings. The 2080 Ti, with much the same GPU power but more VRAM, does just fine in that game at the same settings.

IMO 8GB is enough for 1440p in the short term and not a bad purchase if you upgrade every generation. I only buy every couple of generations, so 8GB is not enough for me.
 
Considering the exact same one I have in my basket on OCUK says £799.99, I'm not too bothered ;)

At least it means I'll have my 3080, which is practically sold out everywhere. :p

You should have been patient and kept an eye on Discord/Telegram. I tried on Sept 17th and failed, and failed again, and again; about 4 weeks later I got the FE for 650 quid. Worth the wait.
 
Bad news guys... the reseller posted my ASUS 3080 TUF OC to the wrong address, and it was refused by a neighbor. I can now either wait for it to be returned to him so he can resend it, or ask for a refund.

I'm really ****** off because I wanted to play games on New Year's Eve! :mad:

The question now is: should I continue with the purchase, or wait an indefinite amount of time for a 3080 Ti?
 
Quake 2 RTX has a VRAM bug where it slowly fills until it goes over 8GB, then it's choppy as hell; save the game, close, reload, and it's fine. That's just a tech demo anyway, but a fun one :). 8GB of VRAM is enough, because when it isn't I can sell this card and get one that has more VRAM :) like a, erm, err, 3070 16GB* or a 4070 16GB*!! (*waiting for invention)
 

First world problems, eh?

Yeah, wait for the 3080 Ti.
 
3060 12GB... more VRAM than the 3070 8GB and the 3080 10GB on PC, while the laptop 3080 gets 16GB. Absolute joke! I honestly think Nvidia had some issues acquiring enough memory supply to make all those cards, which is why they skimped on it. I can't see another reason. They can't be so bold as to cut corners that hard and then just add the extra VRAM to the 3060. I am just trying to understand the logic here. Is the 3060 supposed to be the future-proof card? Or is it just the card that's bought most often, so they cater more to the needs of the average consumer?

What is the logic behind having more VRAM on a weaker card? The weaker card can't push as much graphical detail, so it needs less VRAM; a stronger card can make a game load more into VRAM because there's more detail involved. Why would you put more in the 3060 than in the 3070... or the 3080? The 1060 had a 6GB version, which was great, but both the 1070 and 1080 had 8GB. Of course, that was a tad overkill. But still...?

I am just disappointed by Nvidia this generation. On one hand, the performance gains of the new cards are awesome, but then they somehow managed to bottleneck their own cards and add more memory to the lowest-tier card in the line-up.

If someone has in-depth knowledge, please do chime in, as I am genuinely confused... I understand the 192-bit vs 256-bit bus difference... But I would much rather have more memory, so as not to cap out and stutter in game, than have faster memory but less of it... Makes zero sense!

Also, for VRAM monitoring, use Afterburner and enable "GPU1 Dedicated Memory usage, MB". That will show the dedicated VRAM used by the card for an accurate reading. As I already mentioned in this thread, I am not even playing at 4K and I've already managed to bottleneck the 3070 via its memory.

I will be getting a VR headset and will test more on the topic; VR titles inherently require more VRAM, so we'll see how much we can bottleneck in that department.
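If you'd rather script this than watch the Afterburner overlay, a hedged sketch using NVML is below. It reads the dedicated-memory counter for the first GPU; the third-party `pynvml` package and an NVIDIA driver are assumptions, so it falls back to `None` when either is missing.

```python
# Query dedicated VRAM usage via NVML (the driver-level counter that
# tools like Afterburner's "GPU1 Dedicated Memory usage, MB" expose).
def dedicated_vram_used_mb(gpu_index=0):
    try:
        import pynvml  # third-party package, an assumption here
        pynvml.nvmlInit()
        try:
            handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .total/.used/.free in bytes
            return mem.used // (1024 * 1024)
        finally:
            pynvml.nvmlShutdown()
    except Exception:
        return None  # pynvml not installed, or no NVIDIA GPU/driver

usage = dedicated_vram_used_mb()
print(f"dedicated VRAM used: {usage} MB" if usage is not None
      else "NVML unavailable on this machine")
```

Polling this in a loop while a game runs gives a log of dedicated usage over time, which is handy for spotting the slow-fill behaviour described above.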
 
Thinking the same; there was rumoured to be a GDDR6X shortage when the first run of the 30 series was being manufactured, so it looks like they just went with what they had to get the cards to market. I'd be surprised if 10 gigs for the "flagship" 3080 was the plan all along.
 
The 3060 still uses the same memory as the 3070... and they skimped on the 3070 the most of them all. The mid-to-high-range card literally has the lowest VRAM of the lot, marketed as a "1440p" card. Yikes that I own this, but it will have to do until the next upgrade comes along, which will probably be next gen. Gone are the days when we were future-proofed; the blessed 1060 6GB lasted me 4 years.
 
3060 12GB is extremely weird. I think it is related to AMD cards somehow. Nvidia probably wanted to say: "we're not worse than AMD in any segment".

Anyway, I am almost sure 8GB cards are not bad at all. If VRAM ever becomes a problem (2-3 years from now at least), just drop the texture quality from Ultra to High; the practical difference will be negligible. These "ultra" modes and "texture packs" are created primarily to push hardware and justify high-end GPUs rather than to bring real, obvious value.
 
2-3 years from now at least...?

What are you talking about? You can already max out the VRAM on a 3070, and even a 3080, in current-gen titles. It's already happening. And I haven't even tested VR yet.

My example where it was bottlenecking in Cyberpunk (you can find it a page or so back in this thread) wasn't even max settings or some insane resolution. It was ultrawide 1440p and the settings were optimized.

Most people are playing at 1080p anyway, so it doesn't matter much; at 1080p, VRAM is not an issue. In that sense, not a lot of people will be shafted by this decision, and they probably have more to gain than to lose; people will buy up the cards like crazy anyway. But if you want to game beyond that, if a game is unoptimized, or you go VR, VRAM becomes an issue, and that's exactly the critical flaw of these cards.
 
I think this Resizable BAR/SAM thing could be pretty beneficial for games, as the consoles, whilst stated as having 16GB of VRAM, don't actually have all of it dedicated purely to the GPU; it is overall system RAM shared between the CPU and GPU...
 