Is 8GB of VRAM enough for the 3070?

Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
We need to see what kind of VRAM usage 3090 owners get.

Thing is, we're in a bit of a pickle. Stick with Nvidia and the 3080 but potentially have VRAM issues, or go AMD for the 16GB of VRAM but then the RT performance will be less than the RTX 3000 series.

We can't win either way.

Nah, this doesn't usually work. This is why allocated memory is a bad way to measure usage: developers will often just allocate an arbitrarily high value, such as all or most of the free VRAM, whether they can use it or not. So in many games you'll see a 3080 assign 10GB or close to it, and a 3090 assign something like 20GB.

This is why you NEED to measure how much VRAM is actually in use. We can't do that with Watch Dogs yet because the DRM prevents third-party tools from inspecting the exe/process and its real memory use, but Denuvo has been cracked now, so it won't be long before we see a release we can test on, and when that happens I'll post real VRAM usage. Until my 3080 arrives I won't be able to test ray tracing, but at the very least we can prove the point again that ALLOCATED != USAGE.

I've shown that with evidence for Doom Eternal, Wolfenstein II, Resident Evil 3, etc. There's an entire thread on the ResetEra forums demonstrating this problem.
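
For anyone who wants to poke at this themselves once a game is testable, here's a minimal Python sketch using NVIDIA's NVML library (assuming the pynvml bindings are installed; note that per-process figures are generally unavailable under Windows' WDDM driver mode, which is why tools like the MSI Afterburner beta go a different route). It won't reproduce Afterburner's counter exactly, but it shows the two different kinds of numbers being argued about:

```python
# Minimal sketch: card-level vs per-process VRAM, via NVML.
# pip install pynvml
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU

    # Card-level view: everything currently allocated on the device,
    # including other apps, the desktop compositor, etc.
    mem = nvmlDeviceGetMemoryInfo(handle)
    print(f"device: {mem.used / 2**30:.2f} GiB allocated "
          f"of {mem.total / 2**30:.2f} GiB")

    # Per-process view: what each graphics process has committed.
    # usedGpuMemory is None where the driver can't report it (e.g. WDDM).
    for proc in nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory
        shown = "n/a" if used is None else f"{used / 2**30:.2f} GiB"
        print(f"pid {proc.pid}: {shown}")
finally:
    nvmlShutdown()
```

Even this per-process number is still an allocation figure from the driver's point of view, so treat it as an upper bound on what the game actually touches each frame.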
 
Caporegime
Joined
12 Jul 2007
Posts
40,422
Location
United Kingdom
Nvidia is saying that the 8GB 3070 is as fast as or even faster than the 2080 Ti, which has 11GB of VRAM.

Is having only 8GB of VRAM going to be enough?
8GB should be used on a GPU with the performance level of the 5700 XT as an absolute maximum, in my personal opinion.

This is with gaming at 1440p and 5120x1440 resolution. Even then, there were multiple games where, at maximum details, the game would stutter due to video memory saturation.

This was always resolved by switching to a 16GB Radeon VII.

So in my opinion, 8GB is not sufficient for a GPU like the 3070, as it is more powerful than the 5700 XT.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
8GB should be used on a GPU with the performance level of the 5700 XT as an absolute maximum, in my personal opinion.

This is with gaming at 1440p and 5120x1440 resolution. Even then, there were multiple games where, at maximum details, the game would stutter due to video memory saturation.

This was always resolved by switching to a 16GB Radeon VII.

So in my opinion, 8GB is not sufficient for a GPU like the 3070, as it is more powerful than the 5700 XT.

There is a reason why AMD goes with 16GB for all the Radeon RX 6800/6900 series cards.
Nvidia is just having its Radeon R9 Fury X 4GB moment with Ampere: 8/10GB on the RTX 3070/3080 is not OK unless they are put in the $200-$300 market tier.
 
Soldato
Joined
28 Oct 2011
Posts
8,356
They've cheaped out and it's bitten them in the a**, it's as simple as that; AMD have embarrassed them with their VRAM allocations for Navi. NV will be happy to sell you what you should have had in the first place, though at a hefty premium with later cards, no doubt...

2016 saw a low-to-mid range card with 8GB for $243; now, getting on for 5 years later, we're supposed to accept the same 8GB for $500-$600? Absolute BS. This is classic NV VRAM gimping.
 
Soldato
Joined
18 Feb 2015
Posts
6,480
We need to see what kind of VRAM usage 3090 owners get.

Thing is, we're in a bit of a pickle. Stick with Nvidia and the 3080 but potentially have VRAM issues, or go AMD for the 16GB of VRAM but then the RT performance will be less than the RTX 3000 series.

We can't win either way.

Imo the in-game VRAM counter is right, so around 9GB, but if you have extra it will 100% help with streaming and stuttering less. Granted, it's unplayable with RT on ultra even on a 3090 without reducing resolution (either natively or through DLSS), but if you lower that and add 100% details it's probably still 9-10GB required.

 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Imo the in-game VRAM counter is right, so around 9GB, but if you have extra it will 100% help with streaming and stuttering less. Granted, it's unplayable with RT on ultra even on a 3090 without reducing resolution (either natively or through DLSS), but if you lower that and add 100% details it's probably still 9-10GB required.


It's always an overestimate, but the question is by how much? According to a recent post in the ResetEra VRAM thread, they used the Ultra preset with RT on, DLSS off, at 4K. They claim variable usage between 8-9GB, but they also claimed that disabling BattleEyeLauncher allowed them to measure real per-process VRAM, and it reliably comes in at about 1GB less than the allocated value. What you're seeing in that video is not how much VRAM the game is using, but rather how much it has requested the video card allocate. These are different values.

The poster does exceed 8GB of usage at those settings, on what I assume is a 3080 (he doesn't say), but the killer is that while exceeding 8GB of VRAM in the screenshot he posted, he's getting 21fps.

So again, I maintain that future games will need more VRAM, but they will also need better GPUs, and that 10GB on the 3080 or 8GB on the 3070 is sufficient: the GPU gives out before the VRAM budget is reached. 21fps is not playable, let's not be stupid about this.

Source - poster RCSI in this thread: https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/page-3
 
Soldato
Joined
18 May 2010
Posts
22,297
Location
London
Nah, this doesn't usually work. This is why allocated memory is a bad way to measure usage: developers will often just allocate an arbitrarily high value, such as all or most of the free VRAM, whether they can use it or not. So in many games you'll see a 3080 assign 10GB or close to it, and a 3090 assign something like 20GB.

This is why you NEED to measure how much VRAM is actually in use. We can't do that with Watch Dogs yet because the DRM prevents third-party tools from inspecting the exe/process and its real memory use, but Denuvo has been cracked now, so it won't be long before we see a release we can test on, and when that happens I'll post real VRAM usage. Until my 3080 arrives I won't be able to test ray tracing, but at the very least we can prove the point again that ALLOCATED != USAGE.

I've shown that with evidence for Doom Eternal, Wolfenstein II, Resident Evil 3, etc. There's an entire thread on the ResetEra forums demonstrating this problem.

Fair enough.
 
Soldato
Joined
18 May 2010
Posts
22,297
Location
London
It's always an overestimate, but the question is by how much? According to a recent post in the ResetEra VRAM thread, they used the Ultra preset with RT on, DLSS off, at 4K. They claim variable usage between 8-9GB, but they also claimed that disabling BattleEyeLauncher allowed them to measure real per-process VRAM, and it reliably comes in at about 1GB less than the allocated value. What you're seeing in that video is not how much VRAM the game is using, but rather how much it has requested the video card allocate. These are different values.

The poster does exceed 8GB of usage at those settings, on what I assume is a 3080 (he doesn't say), but the killer is that while exceeding 8GB of VRAM in the screenshot he posted, he's getting 21fps.

So again, I maintain that future games will need more VRAM, but they will also need better GPUs, and that 10GB on the 3080 or 8GB on the 3070 is sufficient: the GPU gives out before the VRAM budget is reached. 21fps is not playable, let's not be stupid about this.

Source - poster RCSI in this thread: https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/page-3

With Watch Dogs Legion it seems to be true that you run out of compute before you run out of VRAM. But it's a fine line.

And an average of 35fps, although blooming low, is still playable.

I played Odyssey on my 1080 at around 40fps and, with G-Sync, it was smooth.

The other thing to consider is the texture packs that get released to bump games up to 4K textures; I don't think they cost anything in compute to render. They are just assets that sit in VRAM taking up space. So this is where a higher VRAM budget helps a lot, and a compute bottleneck would not be reached in this situation before the VRAM limit, as the textures don't cost any compute to render.

I'm still suspicious as to why AMD have put 16GB of RAM on their cards. Just marketing to make Nvidia look bad, or is there more to it?
 
Soldato
Joined
16 Aug 2009
Posts
7,728
The other thing to consider is the texture packs that get released to bump games up to 4K textures; I don't think they cost anything in compute to render. They are just assets that sit in VRAM taking up space. So this is where a higher VRAM budget helps a lot, and a compute bottleneck would not be reached in this situation before the VRAM limit, as the textures don't cost any compute to render.

No they don't, they just gobble up VRAM. The thing is, at 4K with large monitors you really notice low-res textures, and if a game is moddable one of the first things you do is search out high-res texture packs, because no one likes looking at poor-definition images. Then, of course, you start finding out just how much your VRAM is worth.
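
To put rough numbers on that (a back-of-envelope sketch, assuming uncompressed RGBA8 textures at 4 bytes per texel with a full mip chain; real games use block compression like BC1/BC7, which cuts these figures by roughly 4-8x):

```python
# Back-of-envelope VRAM cost of high-res textures (illustrative only).
def texture_mib(size, bytes_per_texel=4, mip_chain=True):
    base = size * size * bytes_per_texel  # base level, in bytes
    if mip_chain:
        base += base // 3                 # full mip chain adds ~33%
    return base / 2**20

for size in (1024, 2048, 4096):
    one = texture_mib(size)
    print(f"{size}x{size}: {one:6.1f} MiB each; "
          f"100 of them: {one * 100 / 1024:.1f} GiB")
```

A 4096x4096 texture lands around 85 MiB uncompressed, so a hundred of them is roughly 8.3 GiB on their own. Even with compression, a pack full of 4K textures eats VRAM fast, which fits with the HD pack being 20GB on disk.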
 
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
The Watch Dogs standard edition is 45GB; the HD texture pack is 20GB of extra storage. That is a huge amount of data compared to the standard game size.

It shows that things could be dialled way up with the upcoming gen, when you look at the quality of the textures in the game running at 4K.

In the Hardware Unboxed Watch Dogs Legion optimisation guide, toggling Textures from Low to Ultra gives hardly any performance penalty to the "GPU core" - if you have the VRAM available.

It could be argued that you may get a major win in visual fidelity from textures in next-gen games, with the power of the GPU playing little part (within reason), if VRAM is not a limiting factor.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
The other thing to consider is the texture packs that get released to bump games up to 4K textures; I don't think they cost anything in compute to render. They are just assets that sit in VRAM taking up space. So this is where a higher VRAM budget helps a lot, and a compute bottleneck would not be reached in this situation before the VRAM limit, as the textures don't cost any compute to render.

I'm still suspicious as to why AMD have put 16GB of RAM on their cards. Just marketing to make Nvidia look bad, or is there more to it?

Yeah, I mean the numbers speak for themselves. The benchmarks show that the 3070 doesn't suffer from 8GB of VRAM, and that frame rates are down in the ~45 region, which is arguably on the line of playable for many gamers. The 3080 sits happily inside 10GB of usage and again becomes unplayable before getting close to a 10GB VRAM budget.

16GB for AMD is likely down to architecture limitations: if you have a specific architecture with a specific memory bus width and a specific memory bandwidth target, then you have a limited number of VRAM configs you can put on the card, normally some multiple, and they decided 16GB was better than whatever the next step down was. Just like the 1080 Ti being 11GB: it never made use of the 3GB of RAM above its sister cards at 8GB, but they were kind of forced into that or something much lower.
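
To make the maths concrete (a rough sketch, assuming the usual one GDDR chip per 32-bit memory controller, chips in 1GB or 2GB densities, and ignoring clamshell layouts that double the chip count):

```python
# Why VRAM capacities come in fixed steps: one memory chip per
# 32-bit controller, and chips only come in certain densities.
def possible_configs(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32  # one chip per 32-bit controller
    return [chips * d for d in densities_gb]

for name, bus in [("3070 (256-bit)", 256),
                  ("3080 (320-bit)", 320),
                  ("1080 Ti (352-bit)", 352),
                  ("6800 XT (256-bit)", 256)]:
    print(f"{name}: {possible_configs(bus)} GB")
```

So a 352-bit bus gives 11GB or 22GB, a 320-bit bus gives 10GB or 20GB, and AMD's 256-bit bus with 2GB GDDR6 chips lands on 16GB; the next step down for the 6800 XT would have been 8GB. (The 1080 Ti used GDDR5X, where 2GB densities weren't practically available at launch, hence 11GB.)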

No they don't, they just gobble up VRAM. The thing is, at 4K with large monitors you really notice low-res textures, and if a game is moddable one of the first things you do is search out high-res texture packs, because no one likes looking at poor-definition images. Then, of course, you start finding out just how much your VRAM is worth.

Everything you put in VRAM that is actually in use to render the game has some impact on frame rate, and textures are no exception. But you are right in the sense that, as a ratio of their size in VRAM to their impact on GPU performance, they are the least impactful on GPU speed. Their impact is probably going to come from more indirect things: for example, a shader that samples texels over a texture to generate some output is likely to be more computationally expensive doing that over a 4K texture than a lower-res one. Higher-res textures aren't free, but they're certainly closer to free than most other graphical features.
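
To illustrate why they're nearly free (a toy sketch of the standard mip-selection idea, not any particular engine's code): with mipmapping, the mip level a shader samples is chosen from the texture-to-screen footprint, so the texels actually fetched per frame track screen resolution, not texture resolution:

```python
import math

# Toy mip-level selection: a surface covering `pixels_covered` screen
# pixels, textured with a `texture_size`-wide map, samples roughly
# mip log2(texture_size / pixels_covered), clamped at the base level.
def mip_level(texture_size, pixels_covered):
    return max(0.0, math.log2(texture_size / pixels_covered))

for tex in (1024, 4096):
    print(f"{tex}px texture on a 512px-wide surface -> "
          f"mip {mip_level(tex, 512):.0f}")
```

Both cases fetch roughly the same amount of data per frame; the 4K texture's extra detail levels mostly just sit in VRAM until you get close enough to need them, which is why the cost shows up in memory capacity rather than frame rate.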
 
Associate
Joined
25 Sep 2020
Posts
128
Nah, this doesn't usually work. This is why allocated memory is a bad way to measure usage: developers will often just allocate an arbitrarily high value, such as all or most of the free VRAM, whether they can use it or not. So in many games you'll see a 3080 assign 10GB or close to it, and a 3090 assign something like 20GB.

This is why you NEED to measure how much VRAM is actually in use. We can't do that with Watch Dogs yet because the DRM prevents third-party tools from inspecting the exe/process and its real memory use, but Denuvo has been cracked now, so it won't be long before we see a release we can test on, and when that happens I'll post real VRAM usage. Until my 3080 arrives I won't be able to test ray tracing, but at the very least we can prove the point again that ALLOCATED != USAGE.

I've shown that with evidence for Doom Eternal, Wolfenstein II, Resident Evil 3, etc. There's an entire thread on the ResetEra forums demonstrating this problem.
Yeah, I'm waiting for your benchmarks; I'm sure Watch Dogs Legion is good under 8GB too :p even at 4K! Just need some hard evidence, which I hope you can provide once the game is cracked :p
 
Soldato
Joined
18 Feb 2015
Posts
6,480
I'm still suspicious as to why AMD have put 16GB of RAM on their cards. Just marketing to make Nvidia look bad, or is there more to it?
Because they're the underdog, they have to fight harder to prove themselves and win over users compared to Nvidia, so they're gonna give us more (as usual) in order to do that. I think it's also a happy coincidence: the tech they're working on towards enabling an MCM future (Infinity Cache) also happens to afford them the advantage of doing more with lower-end memory and a smaller bus, compared to Nvidia needing to go GDDR6X. A sort of reversal of how it used to be, when Nvidia's design advantage let them use slower memory and still keep up with or outpace AMD, which was using HBM/HBM2 etc.
Either way, I'm very happy. To me the 6800 XT is the perfect card to upgrade to this time, with no compromises.
 
Associate
Joined
25 Sep 2020
Posts
128
Guys, it's finally done! We finally have real benchmarks of actual VRAM usage. I asked a benchmark YouTuber to install the new MSI beta and do his next benchmarks with the program. Here they are: 8GB should not be a *HUGE* issue at 1440p for at least a few years, even if VRAM requirements increase, and if we also get optimized games, 8GB at 1440p could last 4 years if you can turn down textures in some games.
Watch Dogs Legion without the HD texture pack uses only 5GB at 1440p lol

Anyways, here it is!
https://www.youtube.com/watch?v=rVMbkjtY9ko&lc=z23it1gifpylhvnwi04t1aokgxlcmstpmka1w12i3zjerk0h00410
 
Associate
Joined
20 Sep 2020
Posts
187
Guys, it's finally done! We finally have real benchmarks of actual VRAM usage. I asked a benchmark YouTuber to install the new MSI beta and do his next benchmarks with the program. Here they are: 8GB should not be a *HUGE* issue at 1440p for at least a few years, even if VRAM requirements increase, and if we also get optimized games, 8GB at 1440p could last 4 years if you can turn down textures in some games.
Watch Dogs Legion without the HD texture pack uses only 5GB at 1440p lol

Anyways, here it is!
https://www.youtube.com/watch?v=rVMbkjtY9ko&lc=z23it1gifpylhvnwi04t1aokgxlcmstpmka1w12i3zjerk0h00410
Can you run the Final Fantasy benchmark at these settings, to see how it compares to mine?

[screenshot: Final Fantasy benchmark result]
 