
Is 8GB of VRAM enough for the 3070?

[screenshot: in-game dedicated VRAM readout]

Here's a direct example of what I'm talking about. Note these aren't max settings; they're adjusted for comfortable FPS (mostly Digital Foundry's "optimized/performance" suggested settings, with a few others tuned down). I normally get 60 FPS in this area. Ray tracing is on and DLSS is set to Performance. Without ray tracing this would probably be avoidable, as RT does eat up more VRAM, but that would defeat the purpose of having an RTX card, would it not?

Roughly 200-300MB of that is probably reserved for system/app/OS purposes. This is using MSI Afterburner's dedicated VRAM monitor (not allocated).

So, going over ~7.8GB of VRAM tanks the FPS and causes some stuttering. It isn't a big issue since it's a single-player game. The game is also unoptimized and caches VRAM like it's free real estate, so I doubt this will still happen once it's optimized in future patches. But to anyone wondering whether VRAM is an issue for the 3070, especially at 3440x1440, or anyone who wants to bring up the allocated-VRAM argument, here's a clear example. I can get a video if need be; I just thought I'd raise some awareness.
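For anyone who wants to sanity-check the dedicated vs allocated distinction outside of Afterburner, here's a rough Python sketch using NVIDIA's NVML bindings (the nvidia-ml-py package, pip install nvidia-ml-py). As far as I can tell the per-process figure is the same thing Afterburner's "per process" graph reports; treat this as a sketch, not gospel:

```python
# Rough sketch: board-wide VRAM in use vs. per-process dedicated usage.
# Assumes an NVIDIA GPU and the nvidia-ml-py package (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Board-wide view: includes every process plus driver/OS reservations.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Total: {mem.total / 2**20:.0f} MB, in use: {mem.used / 2**20:.0f} MB")

# Per-process view: dedicated VRAM actually held by each graphics process,
# which is what the "dedicated memory usage \ process" graph is charting.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = p.usedGpuMemory
    mb = "n/a" if used is None else f"{used / 2**20:.0f} MB"
    print(f"PID {p.pid}: {mb}")

pynvml.nvmlShutdown()
```

The gap between the board-wide figure and the game's own per-process figure is roughly the driver/OS overhead plus whatever other apps are holding, which is the bit the allocated-VRAM argument tends to blur.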
 

Do you have "GPU1 dedicated memory usage \ process" turned on in MSI Afterburner?
 
[screenshot: MSI Afterburner monitoring setup]

Yep, those are the two graphs I have turned on. I'm honestly not sure anymore what people mean when they say it isn't showing actual VRAM usage. I've tested this in multiple games and it correctly shows the real-time VRAM in use; for example, I was really surprised by how little VRAM Destiny 2 uses for such stunning visuals.
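If anyone wants to check the graph against the driver directly while playing, here's a quick logging sketch along the same lines (again assuming nvidia-ml-py; Ctrl+C to stop):

```python
# Quick sketch: log board-wide VRAM use once per second so it can be
# compared against Afterburner's graph. Assumes nvidia-ml-py is installed.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{time.strftime('%H:%M:%S')}  {mem.used / 2**20:.0f} MB in use")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```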

Either way, it's sad Nvidia skimped on VRAM with the 3080/3070 this generation. I understand I may have bought an underpowered card for my resolution (3440x1440), but that doesn't change the fact that I can bottleneck it via the memory without much effort. The two arguments I fell for initially were, first, that the card simply wouldn't perform well on ultra/maxed-out settings before the VRAM became a problem, and second, the allocated-VRAM one. Both are technically incorrect (at least regarding the 3070).

From other people I've heard that CoD Warzone has a similar issue with RT on for these cards, and Watch Dogs: Legion as well, but no one seriously plays Watch Dogs, so who cares I guess.

I was really considering getting a 3080 and selling this one, but it's way too much effort just for a measly 2GB extra and some FPS gains. I'll hopefully power through this generation and see what AMD has to offer in the future, or whether Nvidia will stop shortchanging consumers who don't know better (and let's face it, the average consumer doesn't think about these things). It's just sad that I had faith in Nvidia all this time and saw them as the superior GPU maker, seeing how AMD couldn't offer equivalents for so long (I did have an HD 7870 at one point, before going to a GTX 1060 6GB).

A final note: from the benchmarks I've looked up over the past few days, it seemed the 6800 XT was handling 4K gaming better than the 3080. Whether that's down to 10GB vs 16GB of VRAM I have no clue, but it was interesting to say the least.
 
Looks like it's using 7329MB according to that graph. 2GB of extra VRAM can make a big difference, especially when it's the higher-bandwidth GDDR6X variant.

That's why I went for the 3080 over the 3070. If I needed a purely 1440p card and wasn't bothered about ray tracing, my pick would be the 6800. My current monitor is 3440x1440, but I plan to dip my toes into some 4K gaming soon.

Of course the 3080 Ti is coming by summer, but I hope I can also power through this generation with my 3080 TUF OC, which I paid just over £1000 for.
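For context on the bandwidth point, using the public spec-sheet numbers for both cards: bandwidth is just the bus width in bytes times the per-pin data rate, so the difference is bigger than the 2GB alone suggests.

```python
# Memory bandwidth from public spec-sheet numbers:
# bandwidth (GB/s) = (bus width in bits / 8) * data rate (Gbps per pin)
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 14))  # RTX 3070, 8GB GDDR6   -> 448.0 GB/s
print(bandwidth_gbs(320, 19))  # RTX 3080, 10GB GDDR6X -> 760.0 GB/s
```

That works out to roughly 70% more memory bandwidth for the 3080 on top of the extra 2GB.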
 

Warzone/Cold War just caches as much VRAM as it can grab. It uses 22GB of VRAM on my 3090.

However, they've implemented a VRAM slider in the options menu that lets you hard-cap how much VRAM the game can use. It will tank performance if you set it too low, though.
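If that slider works the way the VideoMemoryScale entry in Warzone's adv_options.ini suggests (a 0-1 fraction of total VRAM; that mapping is my assumption, the game doesn't document it), the budget maths is trivial:

```python
# Hedged sketch: if the in-game slider is a fraction of total VRAM (as the
# VideoMemoryScale config entry suggests), the game's budget is total * scale.
# The scale values here are illustrative, not the game's defaults.
def vram_budget_gb(total_gb: float, scale: float) -> float:
    return total_gb * scale

print(vram_budget_gb(24, 0.9))  # 3090 at a 0.9 cap -> 21.6 GB, close to the ~22GB above
print(vram_budget_gb(8, 0.8))   # 3070 at a 0.8 cap -> 6.4 GB, leaving OS headroom
```

A 0.9 cap on a 24GB 3090 lands right around the ~22GB figure above, which fits the caching behaviour described.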
 
So what's the conclusion to this thread? Is 8GB of VRAM enough for the next few years, or is it better to opt for a higher amount for 1440p gaming at high refresh rates?
Depends on who you ask. As far as I'm concerned, no, it isn't enough on a card worth more than £300.
+1
If you want to use all the bells and whistles of the card, then no, it's certainly not enough for 1440p or higher.
 
It's just sad that I had faith in Nvidia all this time and saw them as the superior GPU maker, seeing how AMD couldn't offer equivalents for so long (I did have an HD 7870 at one point, before going to a GTX 1060 6GB).
AMD always had better cards, but they released them 3 years later than they should have. :D
 
I have a 3840x1600 monitor and 8GB of memory on the GPU. I get warnings all the time that memory is dangerously short. I'm hanging on for the 3080 Ti. In my opinion this is only going to get worse, so it just seems crazy to me to get even 10GB, let alone 8GB.
 
My 3070 is doing 30 FPS in Flight Simulator on mostly ultra settings, which I'm happy with and which is what I bought it for. It's also overclocked to +80 core / +1050 memory, undervolted, and running brilliantly on my Corsair HX 520W PSU.

My ceiling for a GPU is £500, so I didn't have much choice in all honesty; even if I'd wanted to stretch to a 3080 FE, they're not actually available, and I'd have had to spend another £100 on a PSU.

Therefore my conclusion is that no, 8GB isn't ideal, but I had no real choice.
 
No way I would buy an 8GB card now... even if it's super duper quick.

Cyberpunk was eating my VRAM (most likely a memory leak, though), to the point that I enabled HBCC in the AMD drivers, so the GPU in theory has 11GB of VRAM now. First time I've ever used it, and I'm pleasantly surprised: it got rid of the hitching/micro-stuttering I was having.
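For anyone unfamiliar: HBCC (the High Bandwidth Cache Controller on AMD's Vega-class cards) extends the GPU's memory pool into a reserved slice of system RAM, so that 11GB is 8GB of real VRAM plus ~3GB borrowed from system memory. The trade-off in rough numbers (sizes illustrative; 16GB of system RAM assumed):

```python
# Illustrative sketch of the HBCC trade-off: the extra "VRAM" is a slice of
# system RAM the GPU can page into, so the OS and games lose that much RAM.
def hbcc_borrowed_gb(hbcc_segment_gb: float, physical_vram_gb: float) -> float:
    return hbcc_segment_gb - physical_vram_gb

borrowed = hbcc_borrowed_gb(11, 8)
print(f"Borrowed from system RAM: {borrowed} GB")        # -> 3.0 GB
print(f"System RAM left over (16GB box): {16 - borrowed} GB")  # -> 13.0 GB
```

Paging over PCIe is far slower than real VRAM, but as a safety valve against hitching when a game overcommits, it clearly does the job here.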
 