
Is 8GB of VRAM enough for the 3070?

It will probably not be a 4K card, so yes, but it should have been at least 10/12GB, and the 3080 16/20GB, as we have had 8GB cards for 5 years already.
 
8 GB is already not enough in many instances, unless you're ok with gimping textures or dropping to 1080p. It's a pathetic amount tbh & why the 3080 is such a better deal than this dumb 3070. 10 GB is just at the limit of enough, so any extra GB is gonna help in the long run. Nvidia ofc doesn't care, because if you have to upgrade again in 2 years because of it, then so much the better for them.

edit: this is how the future will look with 8GB, more & more

[Attached image: LKYnlBt.png]

Just for context, this chart shows frametimes achieved @ 3840x2160 using a texture streaming setting that ComputerBase themselves said had no practical advantage.

This is what it looks like @ 5K without any daft setting:

[Attached image: SR7fmBy.png]
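To read those charts: frametime and frame rate are just reciprocals, so a flat 16.7 ms line is 60 fps while spikes toward 33 ms mean momentary drops to 30 fps. A trivial sketch of the conversion (the sample values are illustrative, not taken from the charts):

```python
# Frametime (ms) and frame rate (fps) are reciprocals; the sample values
# below are illustrative, not figures from the charts above.
def frametime_ms_to_fps(ms: float) -> float:
    return 1000.0 / ms

for ms in (8.3, 16.7, 33.3, 50.0):
    print(f"{ms:5.1f} ms  ->  {frametime_ms_to_fps(ms):5.1f} fps")
# 8.3 ms ~ 120 fps, 16.7 ms ~ 60 fps, 33.3 ms ~ 30 fps, 50 ms = 20 fps
```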
 
Like I said, it's fine if you want to gimp the settings. I can definitely notice streaming & texture differences. But I was using that as an illustrative example; I don't particularly care about Wolfenstein. Same as below.

[Attached image: Et4Rram.png]
 
I wouldn't buy one for 4K. How long have people been trying to run 4K well? Seems like about 10 years. The terrible thing is the goalposts keep moving. As soon as the hardware improves, the software gets more demanding. This has been accelerated by ray tracing. I think even 1080p could be demanding for years to come as proper ray tracing comes in.
 
'Gimp' the settings? Give me a break. You're not gimping settings if you choose not to use the heaviest texture streaming setting that's beyond what the game even calls maximum. You're making a mountain out of a molehill to prove a point and we're not falling for it. Are you the sort of guy who'd run nothing but 64x AA if it was an option, and consider anything else a 'gimp'?
 

So go ahead and buy the 8 GB card, don't be mad at me for pointing out the facts. And your comparison to 64x AA is disingenuous, because while that kind of workload would be near impossible for any hardware, present or future, to run, all these settings require is a bit more memory. THAT'S IT! You don't even need a more 'powerful' (processing) card.
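To put "it just needs a bit more memory" in perspective, here's a back-of-the-envelope sketch of how a texture pool scales with resolution. The texture count, block-compression ratio, and mip overhead are illustrative assumptions, not figures from Wolfenstein or any particular engine:

```python
# Back-of-the-envelope texture memory arithmetic. All figures are illustrative
# assumptions (RGBA8 base, ~4:1 block compression, +1/3 for a full mip chain),
# not numbers from any specific game.
def texture_mib(size_px: int, bytes_per_pixel: int = 4,
                compression_ratio: float = 0.25, mip_overhead: float = 4 / 3) -> float:
    """Approximate VRAM for one compressed square texture with mips, in MiB."""
    return size_px * size_px * bytes_per_pixel * compression_ratio * mip_overhead / 2**20

def pool_gib(texture_count: int, size_px: int) -> float:
    """Approximate pool size for texture_count square textures of size_px."""
    return texture_count * texture_mib(size_px) / 1024

for size in (1024, 2048, 4096):
    print(f"1000 x {size}x{size} textures: ~{pool_gib(1000, size):.1f} GiB")
# ~1.3 GiB at 1K, ~5.2 GiB at 2K, ~20.8 GiB at 4K: each resolution step roughly
# quadruples the memory needed, while the GPU's shading cost barely changes.
```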
 
How anyone can argue 8GB is enough is baffling.

Nvidia only give you what you need; when it runs out tomorrow they'll release its replacement.

That's part of the reason they sell more GPUs than anyone.

They make their own customers want to upgrade.

If you're buying every new 70-series card you'll be fine; if you're keeping it for 2 generations or more, it'll still run games within 8GB, but the higher-VRAM equivalent card won't have the same problem.

No one's ever, ever around when it goes wrong, either; it's always 'it's only X titles' or 'you can't notice the difference anyway'.
 
I wouldn't want an 8GB card for 1440p or higher at these price points. It smacks of planned obsolescence.

Not having the option to pay for more VRAM is frustrating. There's physical space for the ICs on the PCBs. Maybe supply is an issue. If AMD launch 16GB cards, though, it's going to leave Nvidia looking a little lacking.
 
I'm not buying any of the cards. I never said 8GB was enough; I'm saying your example isn't enough to prove it isn't. Big difference. And you're not pointing out any facts, you're being selective with them to prove a point. And it won't work.
 
Yes, likely enough.

There's a "is 10GB enough for the 3080" thread which I'd suggest you read, because it answers this question and brings to light a lot of issues with vRAM measurement, primarily that we've been measuring vRAM based on what is allocated to the game, not how much of that allocation is actually put to use. There are now updates to the MSI Afterburner beta which allow you to display these 2 values, and the bottom line is games use a lot less than we've previously been measuring. And so far all the objections of people posting "this game X uses more" have been found to be rubbish: either the game isn't really using that much vRAM, or by the time you push the card to that vRAM limit with new AAA games, the frame rate is unplayable anyway.
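To make that allocated-vs-actually-used distinction concrete, here is a minimal sketch assuming an NVIDIA card and the pynvml Python bindings. NVML only reports allocation (per device and, where the driver supports it, per process); the "how much of that allocation is really touched" figure that the newer Afterburner beta exposes isn't available through this API, which is exactly why allocation alone overstates what a game needs:

```python
# Minimal sketch: read VRAM *allocation* via NVML (pip install pynvml).
# These counters show how much memory is reserved, not how much of it the
# game's working set actually touches each frame.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"Total VRAM:     {mem.total / 2**30:.1f} GiB")
    print(f"Allocated VRAM: {mem.used / 2**30:.1f} GiB")

    # Per-process allocations for graphics clients (games), where reported.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        if proc.usedGpuMemory is not None:
            print(f"PID {proc.pid}: {proc.usedGpuMemory / 2**20:.0f} MiB allocated")
finally:
    pynvml.nvmlShutdown()
```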

The problem with last gen is that the 11GB cards were overkill: if you use the metric above to look at real usage, none of those 11GB cards ever used more than 8GB of vRAM, so the additional 3GB was pointless. The reason they used 11GB to begin with is likely that they had an awkward memory configuration which limited them to certain multiples of memory modules, and whatever the multiple below 11GB was would probably have been too low, so faced with a choice to either under-supply or over-supply vRAM they opted to go over and avoid a bottleneck.
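That "awkward multiples" point is just bus-width arithmetic: each 32-bit memory controller carries its own GDDR module, so capacity comes in steps of (bus width / 32) x module density. A quick sketch using the published bus widths of these cards, with 1 GB and 2 GB taken as the common per-module densities of the time:

```python
# Capacity = (number of 32-bit memory controllers) x (module density).
# Bus widths are the published figures for each card; 1 GB and 2 GB are the
# common per-module GDDR densities of that generation.
def capacity_options_gb(bus_width_bits: int, densities_gb=(1, 2)):
    controllers = bus_width_bits // 32
    return [controllers * d for d in densities_gb]

for card, bus in (("2080 Ti", 352), ("RTX 3070", 256), ("RTX 3080", 320)):
    print(f"{card:8s} {bus}-bit bus -> {capacity_options_gb(bus)} GB options")
# 2080 Ti  352-bit bus -> [11, 22] GB options  (hence 11 GB, not 8 or 16)
# RTX 3070 256-bit bus -> [8, 16] GB options
# RTX 3080 320-bit bus -> [10, 20] GB options
```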

The 3070 is tricky because it's right on the borderline for 4K if the perf is close to a 2080 Ti. In many games 4K is gonna be OK, but in some it'll struggle a bit to maintain 60fps, so my recommendation would be that if you're running 1440p then it'll be a great card and 8GB will be enough. For 4K I'm not so sure; for 4K I'd definitely recommend a 3080 anyway.
 
I guess if 10GB is enough for a 3080 then presumably 8GB will be ok in the 3070?

It'll be interesting to see how AMD position their new GPUs and how much VRAM they come with.
 