Is 8GB of VRAM enough for the 3070?

Permabanned
Joined
2 Sep 2017
Posts
10,490
I really doubt games will be allocating that much, but let's see...

For current-gen games at 4K, standard VRAM usage at ultra is around 6-8GB; next gen seems like it will be 8-12GB.

I don't think 16GB will be utilised in any game for at least another 6 years. That's just me looking at history and speculating; anything could happen...

Games barely touch 8GB now, and it's the end of this console era where consoles had 8GB of shared RAM.

Consoles have 16GB this time, so I don't think 16GB will be used until 2025+.

I have seen performance slowdowns appear BEFORE the whole memory capacity is occupied.
The system needs some amount of free and standby memory, and when memory is saturated you get terrible stutter.

More system memory and more VRAM should be used to offload the SSD work, too.
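That "stutter before the pool is full" effect can be checked by polling the card while playing. A minimal sketch, assuming an NVIDIA card with `nvidia-smi` on the PATH; the 512 MiB headroom threshold and the function names are arbitrary illustrations, not measured figures:

```python
import subprocess

def parse_vram_csv(line):
    """Parse one line of `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader,nounits` output, e.g. "7423, 8192" (values in MiB)."""
    used, total = (int(x) for x in line.split(","))
    return used, total

def vram_pressure(used_mib, total_mib, headroom_mib=512):
    """Heuristic: flag pressure when free VRAM drops below a headroom margin,
    i.e. before the pool is 100% occupied. 512 MiB is an arbitrary example."""
    return (total_mib - used_mib) < headroom_mib

def read_vram():
    """Query the first GPU via nvidia-smi (requires an NVIDIA driver installed)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    return parse_vram_csv(out)
```

On an 8GB card, `vram_pressure(7800, 8192)` would already flag trouble even though roughly 400 MiB still reads as free.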

 
Associate
Joined
25 Sep 2020
Posts
128
This new gen of consoles will last roughly 6-7 years. NVIDIA have given the 3070 the minimum required in late 2020 for high-quality textures imo, which 8GB is. Not looking good going forward, I think.

I agree, but I don't see any issues happening for cross-gen titles at ultra 1440p at least; maybe next-gen games coming out in 3 years might call for dropping from ultra to high textures, but that is the worst that can happen.

10GB seems like it should last a decent amount of time at 1440p, though.
 
Associate
Joined
25 Sep 2020
Posts
128
I have seen performance slowdowns appear BEFORE the whole memory capacity is occupied.
The system needs some amount of free and standby memory, and when memory is saturated you get terrible stutter.

More system memory and more VRAM should be used to offload the SSD work, too.

Yeah, I mean system memory is relevant here, but for VRAM I don't think that is the case; if the full 8GB on the 3070 is being used, it does not slow down the 3070...

Or is that 16GB of VRAM? You have the Radeon VII?
 
Associate
Joined
24 Mar 2016
Posts
44
I posted this in the 3070 thread but it seems more ideal here.

My biggest observation regarding RAM on the 3070: the GPU runs out of steam before running out of VRAM at 4K. 4K is a massive hog. Example: BFV. I turned DLSS off for these just so I could see some raw performance. It definitely fills the VRAM up; I turned off the memory limitation setting as well to see if it would demand more, but it sits around nearly full and runs smoothly. My 1600AF is keeping up as well at 4K in single player and multiplayer.

[attached screenshots: BFV 4K performance]
 
Soldato
Joined
6 Feb 2019
Posts
17,566
I posted this in the 3070 thread but it seems more ideal here.

My biggest observation regarding RAM on the 3070: the GPU runs out of steam before running out of VRAM at 4K. 4K is a massive hog. Example: BFV. I turned DLSS off for these just so I could see some raw performance. It definitely fills the VRAM up; I turned off the memory limitation setting as well to see if it would demand more, but it sits around nearly full and runs smoothly. My 1600AF is keeping up as well at 4K in single player and multiplayer.

[attached screenshots: BFV 4K performance]


Is that with ray tracing on or off? Because that framerate is low even for 4K and well behind a 2080 Ti.
 
Associate
Joined
24 Mar 2016
Posts
44
Yes, everything on ultra including ray tracing. Googled some 2080 Ti performance charts:

[attached: BFV 2080 Ti benchmark chart]

It's about right; my average would likely be around that.
 
Last edited:
Soldato
Joined
18 Feb 2015
Posts
6,484
The irony is that the exact opposite of what low-VRAM defenders said would happen is happening. Namely, in a game like Watch Dogs: Legion, the 3070's VRAM is hampering the card's performance rather than the other way around! Because the game is so CPU-limited, you have an upper cap on performance; regardless of how good your GPU is, you won't be able to push the framerate past, say, 60. That means if you want to fully use your GPU, you have to increase graphical settings such as RTX. Except, oops: RTX wants quite a bit of VRAM too. So if you have a 3070, you're stuck with worse performance AND worse effects and textures!

Nice NVIDIA 4D chess.
 
Associate
Joined
20 Sep 2020
Posts
187
The irony is that the exact opposite of what low-VRAM defenders said would happen is happening. Namely, in a game like Watch Dogs: Legion, the 3070's VRAM is hampering the card's performance rather than the other way around! Because the game is so CPU-limited, you have an upper cap on performance; regardless of how good your GPU is, you won't be able to push the framerate past, say, 60. That means if you want to fully use your GPU, you have to increase graphical settings such as RTX. Except, oops: RTX wants quite a bit of VRAM too. So if you have a 3070, you're stuck with worse performance AND worse effects and textures!

Nice NVIDIA 4D chess.
No, it's because Watch Dogs: Legion is extremely unoptimised.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
No, it's because Watch Dogs: Legion is extremely unoptimised.
Is it? And what if it isn't, and this is just how hard it is to run a wide open world with so much NPC interactivity? You think devs are going to stop adding to gameplay so you can better use a gimped card? Good luck with that.
 
Associate
Joined
20 Sep 2020
Posts
187
Is it? And what if it isn't, and this is just how hard it is to run a wide open world with so much NPC interactivity? You think devs are going to stop adding to gameplay so you can better use a gimped card? Good luck with that.
You sound like a boring hater who probably has some low-end system and is trying to make up for it with bitchy little comments online.

That's okay though. The 3070 isn't a gimped card; you're just presenting it with unrealistic, unoptimised titles. Take Ark, Horizon Zero Dawn or Monster Hunter: World, for example.

Would you call your 3090 trashy and gimped just because it doesn't get much FPS in those relative to other titles at 4K? Probably, actually, considering your hate toward the 3070. The 3090 can't even hit a stable 60 FPS in Watch Dogs: Legion; does that make it a bad card? You can use it in most other titles and it'll go above 60 FPS easily. So again, does one title make a card bad? This 3070 is as fast as two Titan RTXs in V-Ray, lmao.

But yes, the 3070 and 3080 are priced oddly against each other.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
You sound like a boring hater who probably has some low-end system and is trying to make up for it with bitchy little comments online.

That's okay though. The 3070 isn't a gimped card; you're just presenting it with unrealistic, unoptimised titles. Take Ark, Horizon Zero Dawn or Monster Hunter: World, for example.

Would you call your 3090 trashy and gimped just because it doesn't get much FPS in those relative to other titles at 4K? Probably, actually, considering your hate toward the 3070. The 3090 can't even hit a stable 60 FPS in Watch Dogs: Legion; does that make it a bad card? You can use it in most other titles and it'll go above 60 FPS easily. So again, does one title make a card bad? This 3070 is as fast as two Titan RTXs in V-Ray, lmao.

But yes, the 3070 and 3080 are priced oddly against each other.
Ahh OK, thought you'd take a different route, but suits me all the same; my ignore list is not full.
 
Soldato
Joined
21 Jul 2005
Posts
20,020
Location
Officially least sunny location -Ronskistats
8GB should be used on a GPU with the performance level of the 5700 XT as an absolute maximum, in my personal opinion.

This is with gaming at 1440p and 5120x1440 resolution. Even then, there were multiple games where, at maximum details, the game would stutter due to video memory saturation.

This was always resolved by switching to a 16GB Radeon VII.

So in my opinion, 8GB is not sufficient for a GPU like the 3070, as it is more powerful than the 5700 XT.

Yes, you're just proving it by stepping up your GPU on the AMD platform. This is why the NVIDIA guys aren't seeing the wood for the trees. It may be delayed by the speed of the faster GDDR6 improving the bandwidth somewhat, but I cannot see it bailing them out of all scenarios. This convinced me that the 3080 is only just equipped enough (I'm on the fence) for me; the 3070, however, was DOA, especially considering you can get the 6800 for the same price as an AIB 3070. Strange, but there you have it.
 
Associate
Joined
20 Sep 2020
Posts
187
Yes, you're just proving it by stepping up your GPU on the AMD platform. This is why the NVIDIA guys aren't seeing the wood for the trees. It may be delayed by the speed of the faster GDDR6 improving the bandwidth somewhat, but I cannot see it bailing them out of all scenarios. This convinced me that the 3080 is only just equipped enough (I'm on the fence) for me; the 3070, however, was DOA, especially considering you can get the 6800 for the same price as an AIB 3070. Strange, but there you have it.
Haha, you think the 3070 is DOA? I own an LG 48CX with a 3070 FE and have had no issues playing games at 4K with it. Yes, it would be nice if the 3070 had 10GB, the 3080 12GB and the 3090 24GB, but that's just not how it turned out, as unfortunate as that may be. However, it still runs absolutely fine in 99% of games out there, apart from a select few that may genuinely require over 8GB of VRAM; most of the time it's just allocation of free space, which is why a system with 32GB of RAM uses more RAM at idle than a system with 16GB, partly anyway.

And no, I'll never use an AMD product, even with the additional VRAM, because tbh CUDA support in applications is just too useful; also, I don't want to deal with the drivers again.
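That allocation-versus-need distinction is the usual caveat with VRAM readouts: many engines size their streaming/texture caches as a fraction of whatever pool they detect, so a bigger card simply reports more "used" without the working set actually needing it. A toy sketch of that behaviour; the 80% fraction and the function names are made up purely for illustration:

```python
def cache_budget_mib(total_vram_mib, fraction=0.8):
    """Toy model: an engine grabs a fixed fraction of whatever VRAM it sees
    for its streaming cache, independent of what the scene actually needs."""
    return int(total_vram_mib * fraction)

def reported_usage_mib(working_set_mib, total_vram_mib):
    """What a monitoring tool would show as 'used': the real working set,
    padded out to the cache budget whenever there is room to spare."""
    return max(working_set_mib,
               min(cache_budget_mib(total_vram_mib), total_vram_mib))
```

Under this model the same ~5000 MiB working set reads as ~6553 MiB "used" on an 8 GiB card and ~13107 MiB on a 16 GiB card, even though the game needs neither figure.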
 
Soldato
Joined
21 Jul 2005
Posts
20,020
Location
Officially least sunny location -Ronskistats
Haha, you think the 3070 is DOA? I own an LG 48CX with a 3070 FE and have had no issues playing games at 4K with it. Yes, it would be nice if the 3070 had 10GB, the 3080 12GB and the 3090 24GB, but that's just not how it turned out, as unfortunate as that may be. However, it still runs absolutely fine in 99% of games out there, apart from a select few that may genuinely require over 8GB of VRAM; most of the time it's just allocation of free space, which is why a system with 32GB of RAM uses more RAM at idle than a system with 16GB, partly anyway.

Yes. You just regurgitated what I said. You are happy with it, so that's fine; I am not here to convince anyone, it's my opinion. Post back with the 99% of games at 4K sometime soon.

Watch HU explain it at 19:08.

TL;DW: Tim: "The 2080 Ti having extra VRAM makes it the better product." Steve: "When you run out of VRAM at 4K... by this time next year we will have many more (titles affected by this)... the 2080 Ti is still the better product of the two."
 
Last edited:
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Ahh OK, thought you'd take a different route, but suits me all the same; my ignore list is not full.

I did notice he ignored your question completely. It's good to be wary of people claiming things are unoptimized, because a lot of people use this term wrongly. WDL runs slowly when maxed out because it uses next-gen graphics effects like ray-traced reflections, which are extremely expensive on the GPU; right now there's no evidence provided for a lack of optimization.

The fact that next-gen titles are hard on the GPU is a good example of how you can have all the vRAM you want, but if your GPU can't keep up then it's pointless.
 
Associate
Joined
20 Sep 2020
Posts
187
I did notice he ignored your question completely. It's good to be wary of people claiming things are unoptimized, because a lot of people use this term wrongly. WDL runs slowly when maxed out because it uses next-gen graphics effects like ray-traced reflections, which are extremely expensive on the GPU; right now there's no evidence provided for a lack of optimization.

The fact that next-gen titles are hard on the GPU is a good example of how you can have all the vRAM you want, but if your GPU can't keep up then it's pointless.

I literally have better things to do than go on forums all day. Remind me again: what was the average FPS of a 2080 Ti in HZD at 4K on release day, and what is the FPS with the same card now? Yeah, optimisations. Also, ray tracing isn't next gen; it's been around since before NVIDIA popped out a graphics card.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
I literally have better things to do than go on forums all day. Remind me again: what was the average FPS of a 2080 Ti in HZD at 4K on release day, and what is the FPS with the same card now? Yeah, optimisations. Also, ray tracing isn't next gen; it's been around since before NVIDIA popped out a graphics card.

I'm not sure what HZD has to do with WDL? The fact that some games are unoptimized is not evidence that all games are. You claimed that WDL is extremely unoptimized, but you didn't justify it; you just based it on the game running slowly. Poneros quite rightly challenged you on whether this is because the game is unoptimized or because it uses demanding visual effects.

I'm not going to get into a semantic argument about what defines next gen. Needless to say, ray tracing is well known to be computationally expensive: enabling RT effects hits your performance a lot, so low frame rates with RT effects turned all the way up shouldn't surprise anyone who knows those facts.

This is why it's so relevant to a discussion about vRAM. People keep correctly pointing out that new games will be more demanding on vRAM, while ignoring that they will also come with new effects that are demanding on the GPU; what matters is the balance of the two. WDL is a good example of why the 3070 has the amount of vRAM it does: it's right at the edge of its memory pool in game and also right at the edge of playable frame rates. A good data point, among many others, showing that the memory choice was an appropriate one for a GPU of that speed.
 