
Is 8GB of VRAM enough for the 3070?

Soldato
Joined
19 May 2012
Posts
3,633
NVIDIA can easily fix this. They just need to drop two cards with 16GB or 20GB of VRAM.

A 3070 Ti and a 3080 Ti, but the issue is that it leaves the 3070 and 3080 in a really, really weird place.

I think they'll get by for the next six months because, well, fanboys will be fanboys and go for the NVIDIA card regardless of performance and VRAM limitations. Then they can cut prices, or just phase out the old cards in six months and drop in the new ones. Maybe they're better off calling them SUPER rather than Ti.
 
Associate
Joined
25 Sep 2020
Posts
128
A dev cycle for Nvidia or AMD is usually two years. That means that when they sell you a product, they design it to last for two years.

If they didn't, they know for a fact people would stick with what they had, like many did with Turing: most just held onto their Pascal cards. It obviously took longer than that with Ampere, but only because AMD released some mid-range cards, so they could shift all of their stock before releasing Ampere.
Well I want to keep my card for 4 years at 1440p144hz...
 
Soldato
Joined
18 Feb 2015
Posts
6,480
Surely if Nvidia sells a lot of 8GB cards then game developers will make sure their games work on 8GB? Surely developers want sales? Why would they aim at high-end users only?

Of course they will work, but you will have to reduce texture quality; that's what devs do on PC, give you the option to adjust settings for your hardware. If you don't care about texture quality then it isn't an issue, and vice versa. Same as with anything.

But that's what many of us here are saying: if you feel that texture quality (and LOD models, and streaming distance, and many other VRAM-heavy but performance-light settings) is one of the most important visual upgrades for a game (and I do), then 8GB is a SERIOUS problem, at least at 4K. That's what makes the 3070 look so pathetic. For many settings (reflections, volumetrics etc.) you can drop below ultra and get 99% of the same visuals with much better performance, but with VRAM limits you can't. That's why VRAM is so important, and why people who try to say "oh, but it's not strong enough for 4K anyway" are completely missing the point.

Alas, this is always a dumb issue to argue over, because people never change their minds regardless of the facts (we've had this exact discussion in the past about 2 vs 3 vs 4 GB). At least we now have a choice: care about VRAM? AMD is giving you a 16GB option at a decent price. You don't? Then cool, 3070s are going to be plentiful. :)
 
Associate
Joined
25 Sep 2020
Posts
128
Of course they will work, but you will have to reduce texture quality; that's what devs do on PC, give you the option to adjust settings for your hardware. If you don't care about texture quality then it isn't an issue, and vice versa. Same as with anything.

But that's what many of us here are saying: if you feel that texture quality (and LOD models, and streaming distance, and many other VRAM-heavy but performance-light settings) is one of the most important visual upgrades for a game (and I do), then 8GB is a SERIOUS problem, at least at 4K. That's what makes the 3070 look so pathetic. For many settings (reflections, volumetrics etc.) you can drop below ultra and get 99% of the same visuals with much better performance, but with VRAM limits you can't. That's why VRAM is so important, and why people who try to say "oh, but it's not strong enough for 4K anyway" are completely missing the point.

Alas, this is always a dumb issue to argue over, because people never change their minds regardless of the facts (we've had this exact discussion in the past about 2 vs 3 vs 4 GB). At least we now have a choice: care about VRAM? AMD is giving you a 16GB option at a decent price. You don't? Then cool, 3070s are going to be plentiful. :)
How do you feel about the 3080 10GB for 1440p 144Hz? Is that plenty of VRAM going forward for 1440p?

I would buy the 6800 XT, but I kind of want DLSS...
 
Soldato
Joined
18 Feb 2015
Posts
6,480
How do you feel about the 3080 10GB for 1440p 144Hz? Is that plenty of VRAM going forward for 1440p?

I would buy the 6800 XT, but I kind of want DLSS...
Would be a great card for that! Funny thing is, though, the 6800 XT has a bigger performance advantage over the 3080 at 1440p than at 4K. Ampere is better suited to higher resolutions thanks to its design, while RDNA 2 is better for high-fps gaming. It's just the way they're designed.

Remember also, all DLSS gives you is an intermediate step between render resolution and output resolution in terms of visuals versus performance.

So what I mean is, if you're playing at 4K, think of it as giving you roughly 1800p image quality for 1440p performance (quality mode). That's it, and it requires per-game implementation. Once you understand that, you won't be so blown away by the DLSS "magic". It's decent, don't get me wrong, but it's blown way out of proportion by marketing material that has burrowed into people's minds. It's also much better at higher resolutions than lower ones, so it won't look as good at 1440p as at 4K, and it works best when you're heavily GPU-bound, so it's much less effective for high-fps gaming.
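To make the render-vs-output distinction concrete, here's a minimal sketch of what the DLSS modes mean for internal resolution. The per-axis scale factors are the commonly cited DLSS 2.0 values (Quality ~2/3, Balanced ~0.58, Performance 0.5); treat them as illustrative assumptions, since the exact factors can vary by game and DLSS version.

```python
# Per-axis render scale for each DLSS 2.0 mode (commonly cited values,
# used here as assumptions, not guaranteed numbers).
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def render_resolution(out_w, out_h, mode):
    """Internal resolution the GPU actually shades before DLSS upscales."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output in quality mode shades a 1440p image...
print(render_resolution(3840, 2160, "quality"))   # (2560, 1440)
# ...while 1440p output in quality mode shades only about 960p.
print(render_resolution(2560, 1440, "quality"))   # (1707, 960)
```

That's why the upscale has less information to work with at 1440p output than at 4K output, which matches the point above about DLSS looking better at higher resolutions.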

Q: Where does DLSS provide the biggest benefit? And why isn’t it available for all resolutions?

A: The results of DLSS vary a bit, because each game has different characteristics based on the game engine, complexity of content, and the time spent on training.
Our supercomputer never sleeps, and we continue to train and improve our deep learning neural network even after a game’s launch.

When we have improvements to performance or image quality ready, we provide them to you via NVIDIA software updates. DLSS is designed to boost frame rates at high GPU workloads (i.e. when your framerate is low and your GPU is working to its full capacity without bottlenecks or other limitations).

If your game is already running at high frame rates, your GPU’s frame rendering time may be shorter than the DLSS execution time. In this case, DLSS is not available because it would not improve your framerate. However, if your game is heavily utilizing the GPU (e.g. FPS is below ~60), DLSS provides an optimal performance boost. You can crank up your settings to maximize your gains. (Note: 60 FPS is an approximation -- the exact number varies by game and what graphics settings are enabled)

To put it a bit more technically, DLSS requires a fixed amount of GPU time per frame to run the deep neural network. Thus, games that run at lower frame rates (proportionally less fixed workload) or higher resolutions (greater pixel shading savings), benefit more from DLSS. For games running at high frame rates or low resolutions, DLSS may not boost performance. When your GPU’s frame rendering time is shorter than what it takes to execute the DLSS model, we don’t enable DLSS. We only enable DLSS for cases where you will receive a performance gain. DLSS availability is game-specific, and depends on your GPU and selected display resolution.

https://www.nvidia.com/en-us/geforce/forums/3d-vision/41/294436/nvidia-dlss-your-questions-answered/
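The FAQ's point about a fixed per-frame DLSS cost can be turned into a back-of-the-envelope model. Every number here is an illustrative assumption (the render scale, the fraction of frame time that scales with pixel count, the neural-network cost), not measured data:

```python
def dlss_speedup(native_ms, dnn_cost_ms, scale=2 / 3, resolution_bound=0.8):
    """Crude model of the FAQ's logic: a 'resolution_bound' fraction of the
    native frame time scales with pixel count (scale**2); the rest is
    resolution-independent. DLSS then adds a fixed per-frame DNN cost."""
    upscaled_ms = native_ms * (resolution_bound * scale ** 2
                               + (1 - resolution_bound)) + dnn_cost_ms
    return native_ms / upscaled_ms

# A heavy 4K scene (30 fps, 33.3 ms/frame) gains a lot from DLSS...
heavy = dlss_speedup(33.3, 1.5)
# ...a high-fps scene (144 fps, 6.9 ms/frame) gains much less, because the
# fixed DNN cost is a larger share of each frame.
light = dlss_speedup(6.9, 1.5)
print(f"heavy scene: {heavy:.2f}x, light scene: {light:.2f}x")
```

At some frame rate the fixed cost exceeds the shading savings entirely, which is exactly when the FAQ says DLSS is not enabled.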
 
Soldato
Joined
19 May 2012
Posts
3,633
Now this becomes a purely fundamental discussion.
Practically, it doesn't make sense to purchase a 3070 now given the new developments, unless NVIDIA responds with price cuts :)

The 3070 and 3090 are dead.

The 3080, I think, is still a fair deal. It trades blows with AMD with 6GB less VRAM, but it has RTX, DLSS and all of NVIDIA's perks.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,184
Location
Greater London
How do you feel about the 3080 10GB for 1440p 144Hz? Is that plenty of VRAM going forward for 1440p?

I would buy the 6800 XT, but I kind of want DLSS...
AMD's offering seems to be better at 1440p, so as you plan to keep the card for a long time, AMD is offering you 6GB of extra VRAM and better 1440p performance in all games, whereas Nvidia will likely only have DLSS in a small portion of the games released each year. In your position I would go AMD. That said, let's wait and see proper reviews before we jump to conclusions.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
There is no such thing as a console port, for starters. I'm not going to explain how games are made, but if you understood, you wouldn't call them that. "Cross-coded slop"? Maybe, but they are not ports.

As consoles basically became PCs with a fancy case, developers moved towards multi-platform engines; engines like Unreal Engine are designed to run natively on all these platforms. But that's purely from a technical standpoint. The issue I think most gamers are referring to when they say "port" is that developers focus on making one game fit all platforms, and to do that they target the so-called lowest common denominator, which is always the consoles. The PC then gets whatever was made for the consoles, with a few extra options if you're lucky. It's why we went through decades of low FOVs, forced mouse acceleration, auto-aim and the like baked into our games: console features that PC gamers generally want to avoid. The same is sadly true for video settings, although as PCs pull way ahead of the consoles, we sometimes see a developer throw us a bone with settings.

So every console is packing more than 8-10GB of VRAM, every AMD card has more than 8-10GB of VRAM...

But NVIDIA are telling us 8-10 is enough? OK...

Not more, no. They have 10GB at most to assign to video memory. And the better question isn't how much they have, but how much their GPU (APU) can actually make good use of. We've seen on PC that as you pile more assets into vRAM, frame rates decline, because it's more work for the GPU to do. But the consoles have really pretty mediocre GPUs, so how much vRAM can they actually make use of? I don't know, but if early looks at things like Dirt 5 are any indication https://www.youtube.com/watch?v=CF9A935XFkU they have to strip back the shadows, audience density and image quality to get the game playable at high resolutions or refresh rates.

No.

Games have texture options. So 8GB owners will just have to use high or medium textures whilst people with higher VRAM cards can use the ultra textures. And textures make a MASSIVE difference to visual fidelity IMO.
Most of the games for next gen are boasting photorealistic textures and they aren't cheap.

They aren't cheap in terms of vRAM, but they aren't cheap in terms of their impact on the GPU either. The GPU has to process those textures every frame; larger textures mean lower frame rates. Which is why, when you put more than 8GB of assets into vRAM, you get frame rates that are unplayable even on a 24GB 3090 (see FS2020 and Avengers at Ultra 4K).
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
I don't think 10GB is a hard limit for texture caching on the consoles, but if developers are interested in maximum performance then I can't see many games going over that, purely because of the split memory bus on the Series X. Hmm, split memory bus... chuckle.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
This is why I did not even attempt to spend hours waffling yesterday. I understand what happens to a card this generation when VRAM is breached: it streams from system memory, which is nowhere near as bad as the old way of streaming from your paging file. Thus the game can still be playable, but the FPS will be much lower, like we see in the video above of Doom Eternal being played properly and not cherry-picked, and so on.

I'm not getting into a stupid tit-for-tat argument with someone over this. I am not spending hours of my life trying to convince anyone of anything. If you blindly think 8GB is enough going forward, or can fit the metrics into your argument about how it is enough if you just drop settings, then fine.

However, if you are going to do that (start deviating from benchmarks in order to make a game run, and/or reduce settings), you may as well buy a console, which is the part of my argument some didn't seem to quite understand. I.e., if you are OK with making compromises, buy a console. That is not why people game on a PC, it is not why they become enthusiasts, and it certainly IS NOT about spending over double the money for what is, theoretically, the same thing.

If you want to continue debating it, quote the videos of the tech press who don't agree with you and put the world to rights that way, because as an enthusiast who has been burned three times by "not enough VRAM", you will get nowhere arguing the toss with me.

TNA: just because someone disagrees with pretty much the whole world and is prepared to slap out 10,000-word essays about the subject doesn't make them any more right. It just means they will continue to waffle.

Assuming you've read my posts, this is deeply dishonest. I've posted a lot in both the 8GB and 10GB threads, and I've taken all debate seriously. I've made an extremely high amount of effort to understand the nuances of memory usage in games and the different methods of measuring vRAM, with their drawbacks, such as:

1) The settings menu (an estimate; can be very wrong/misleading)
2) Memory allocated (an upper bound on what is requested, NOT what is actually used) <-- most tools report this
3) The new per-process memory measurement in the MSI Afterburner beta (which hooks the Direct3DKMT API and reports memory in use by the current process)
4) The Special K modding tool, which reports DXGI_Budget (the memory budget response from DirectX, used to check whether you're over budget or not)

Yes, I've made long posts that go into this detail, but it's not just "waffle"; it's a nuanced topic which people are attempting to oversimplify into a statement like "8GB is not enough". My actual arguments are evidence-based: I've made an effort to look at every example people have given me of games supposedly using large amounts of vRAM and inspect whether the claim is true, then linked to evidence from either my own testing OR other users' tests. Here are some examples:

1) FS2020 was originally claimed to use 12.5GB peak. It doesn't; it uses about 9.5GB peak when measured properly, confirmed with the in-game dev tools that display real memory usage. And it is unplayable at those settings even on a 3090.
2) Red Dead Redemption 2 was claimed to need 11GB for Ultra 4K. It doesn't; plenty of users showed that, measured properly, it takes about 6.5GB.
3) There's an entire thread on this at the ResetEra forums here https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/page-2 showing that, when measured properly:
3a) Gears Tactics, max settings, 3440x1440: 4,767MB of vRAM
3b) Control, max settings, 3440x1440: 4,164MB of vRAM
3c) Age of Empires 3: 1,680MB of vRAM
3d) Death Stranding: 3,764MB of vRAM
3e) Metro Exodus, max settings with RTX, 3440x1440: 5.1GB of vRAM
3f) There are loads of examples in that thread with evidence; just go and read and look at all the screenshots of memory usage.
4) Then someone claimed Wolf 2 with "16k textures" was using something like 20GB of vRAM, so I downloaded it and tested. It turns out they were just turning the engine's texture pool size up to 16GB for zero benefit; real memory usage was about 5GB. I posted screenshots as evidence here on the forum.
5) People made similar claims about Doom Eternal, so I downloaded it, tested, and posted evidence that at max it was using 7GB. But, just like Wolf 2 (same engine), all it was doing was reserving a massive texture pool that it never used: the "Ultra Nightmare" setting reserves 4.5GB just for the pool, while "High" reserves only 1.5GB, with no visual benefit. Not reserving the large unused pool took real memory usage from 7GB to 4.5GB.
6) And now we have more claims about Resident Evil 3, where I had previously pointed out that others had provided evidence of severely misleading menu estimates. So I've just downloaded it and tested again.

Here are completely maxed settings, including the highest "8GB" texture option, at 4K. The menu warns it will use 12.5GB of vRAM (LOL, the game is only a 22GB install). And here are random screenshots as I played through the start of the game, displaying both vRAM Allocated (what Afterburner would typically report) and vRAM Used (the newer per-process memory in use, as reported by the D3D API).

Warning: these images are very large, 7-19MB uncompressed. Sorry I had to post a bunch, but I'm facing accusations of cherry-picking now and I'm just not going to stand for that kind of bad-faith argument.

Resident-Evil-3-Remake-Screenshot-2020-10-28-19-21-53-45.png


Resident-Evil-3-Remake-Screenshot-2020-10-28-19-32-59-15.png


Resident-Evil-3-Remake-Screenshot-2020-10-28-19-35-13-90.png


Resident-Evil-3-Remake-Screenshot-2020-10-28-19-39-24-33.png


Resident-Evil-3-Remake-Screenshot-2020-10-28-19-43-05-89.png


For those of you who CBA to download/expand/zoom to see the OSD values, they are:

8018 Reported 5521 actually used
7099 Reported 5428 actually used
7701 Reported 6071 actually used
7409 Reported 5865 actually used
8018 Reported 5521 actually used
5850 Reported 4367 actually used

That's an average of about 5.4GB of memory actually used. The idea that this is bumping up against 8GB limits is WRONG; it's demonstrably wrong. And your reliance on reviewers is what is leading you to a bad conclusion. Consider that this new way of measuring vRAM was only incorporated into a beta of a public tool on 18th September this year. Prior to that there was no easy way to do it, and certainly no industry-standard tool; every tool except one extremely obscure modding tool reported memory allocated. So any reviewer who hasn't explicitly switched to this new method is not telling you how much vRAM games actually use; they're telling you what the game engine requests to be allocated, which is not the same thing. All the research on this shows that the difference between these two values can be extremely big: sometimes games will just allocate 90% of the vRAM you have available, in which case they appear to be using something like 22GB on a 3090 when they're really using more like 3GB.
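The allocated-vs-used gap in those screenshots is easy to check yourself; plugging the OSD values listed above into a few lines:

```python
# (allocated_MB, used_MB) pairs transcribed from the RE3 OSD readings above
samples = [(8018, 5521), (7099, 5428), (7701, 6071),
           (7409, 5865), (8018, 5521), (5850, 4367)]

avg_alloc = sum(a for a, _ in samples) / len(samples)
avg_used = sum(u for _, u in samples) / len(samples)

# average used comes out around 5.4 GB -- well under an 8 GB card's limit,
# while the allocated figure most tools report is ~1.9 GB higher
print(f"average allocated: {avg_alloc:.0f} MB")
print(f"average used:      {avg_used:.0f} MB")
```

Nothing clever here, just the arithmetic behind the "about 5.4GB" figure, but it shows how far the allocated number overshoots real usage.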
 
Soldato
Joined
31 Oct 2002
Posts
9,851
Assuming you've read my posts, this is deeply dishonest. I've posted a lot in both the 8GB and 10GB threads, and I've taken all debate seriously. I've made an extremely high amount of effort to understand the nuances of memory usage in games and the different methods of measuring vRAM, with their drawbacks, such as:

1) The settings menu (an estimate; can be very wrong/misleading)
2) Memory allocated (an upper bound on what is requested, NOT what is actually used) <-- most tools report this
3) The new per-process memory measurement in the MSI Afterburner beta (which hooks the Direct3DKMT API and reports memory in use by the current process)
4) The Special K modding tool, which reports DXGI_Budget (the memory budget response from DirectX, used to check whether you're over budget or not)

Yes, I've made long posts that go into this detail, but it's not just "waffle"; it's a nuanced topic which people are attempting to oversimplify into a statement like "8GB is not enough". My actual arguments are evidence-based: I've made an effort to look at every example people have given me of games supposedly using large amounts of vRAM and inspect whether the claim is true, then linked to evidence from either my own testing OR other users' tests. Here are some examples:

1) FS2020 was originally claimed to use 12.5GB peak. It doesn't; it uses about 9.5GB peak when measured properly, confirmed with the in-game dev tools that display real memory usage. And it is unplayable at those settings even on a 3090.
2) Red Dead Redemption 2 was claimed to need 11GB for Ultra 4K. It doesn't; plenty of users showed that, measured properly, it takes about 6.5GB.
3) There's an entire thread on this at the ResetEra forums here https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/page-2 showing that, when measured properly:
3a) Gears Tactics, max settings, 3440x1440: 4,767MB of vRAM
3b) Control, max settings, 3440x1440: 4,164MB of vRAM
3c) Age of Empires 3: 1,680MB of vRAM
3d) Death Stranding: 3,764MB of vRAM
3e) Metro Exodus, max settings with RTX, 3440x1440: 5.1GB of vRAM
3f) There are loads of examples in that thread with evidence; just go and read and look at all the screenshots of memory usage.
4) Then someone claimed Wolf 2 with "16k textures" was using something like 20GB of vRAM, so I downloaded it and tested. It turns out they were just turning the engine's texture pool size up to 16GB for zero benefit; real memory usage was about 5GB. I posted screenshots as evidence here on the forum.
5) People made similar claims about Doom Eternal, so I downloaded it, tested, and posted evidence that at max it was using 7GB. But, just like Wolf 2 (same engine), all it was doing was reserving a massive texture pool that it never used: the "Ultra Nightmare" setting reserves 4.5GB just for the pool, while "High" reserves only 1.5GB, with no visual benefit. Not reserving the large unused pool took real memory usage from 7GB to 4.5GB.
6) And now we have more claims about Resident Evil 3, where I had previously pointed out that others had provided evidence of severely misleading menu estimates. So I've just downloaded it and tested again.

Here are completely maxed settings, including the highest "8GB" texture option, at 4K. The menu warns it will use 12.5GB of vRAM (LOL, the game is only a 22GB install). And here are random screenshots as I played through the start of the game, displaying both vRAM Allocated (what Afterburner would typically report) and vRAM Used (the newer per-process memory in use, as reported by the D3D API).

Warning: these images are very large, 7-19MB uncompressed. Sorry I had to post a bunch, but I'm facing accusations of cherry-picking now and I'm just not going to stand for that kind of bad-faith argument.











For those of you who CBA to download/expand/zoom to see the OSD values, they are:

8018 Reported 5521 actually used
7099 Reported 5428 actually used
7701 Reported 6071 actually used
7409 Reported 5865 actually used
8018 Reported 5521 actually used
5850 Reported 4367 actually used

That's an average of about 5.4GB of memory actually used. The idea that this is bumping up against 8GB limits is WRONG; it's demonstrably wrong. And your reliance on reviewers is what is leading you to a bad conclusion. Consider that this new way of measuring vRAM was only incorporated into a beta of a public tool on 18th September this year. Prior to that there was no easy way to do it, and certainly no industry-standard tool; every tool except one extremely obscure modding tool reported memory allocated. So any reviewer who hasn't explicitly switched to this new method is not telling you how much vRAM games actually use; they're telling you what the game engine requests to be allocated, which is not the same thing. All the research on this shows that the difference between these two values can be extremely big: sometimes games will just allocate 90% of the vRAM you have available, in which case they appear to be using something like 22GB on a 3090 when they're really using more like 3GB.

I can counter your wall of text with a few facts:

1. Try playing Doom Eternal at 4K max on an 8GB card. You lose loads of performance relative to cards with more memory, because 8GB is not sufficient.
2. Consider that the new consoles (releasing soon!) are getting double the total memory of the previous console generation: 8GB total last gen, 16GB total this gen. We can roughly say that console games will be able to double their VRAM usage (whatever percentage of total memory gets dedicated to VRAM; consoles are highly optimised, so it's hard to compare to PC).
3. Next-gen AAA games (many of them console 'ports') will very likely use anywhere up to double the VRAM they use now. Game developers are under pressure to leverage every MB of storage and every MHz of processing power in the new consoles; you can bet texture sizes etc. are about to explode.
4. Some of the games you mentioned (Age of Empires 3...) are based on very old games. These will run on an overclocked potato and don't require a 3090. It's kind of pointless citing the amount of memory an old game uses.
5. Note to the few who are about to quote me and say that the consoles don't have 16GB of VRAM: I'm aware of this, and don't claim they do. They have 16GB of total memory, double the 8GB of the previous generation. Double VRAM requirements incoming, yo!
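Point 2's doubling logic is simple proportional scaling. A sketch of that back-of-the-envelope estimate, where the scaling assumption (VRAM use grows with total console memory) is the argument's premise, not an established fact:

```python
# Total unified memory per console generation (GB)
LAST_GEN_TOTAL = 8
NEXT_GEN_TOTAL = 16

def projected_vram(last_gen_vram_gb):
    """If games scale VRAM use with total console memory, a title using
    last_gen_vram_gb on old hardware would target proportionally more."""
    return last_gen_vram_gb * NEXT_GEN_TOTAL / LAST_GEN_TOTAL

# A last-gen game using ~4 GB of VRAM would, by this logic, target ~8 GB
print(projected_vram(4.0))  # 8.0
```

Whether games actually scale this way is exactly what the thread is arguing about; the snippet only makes the claimed arithmetic explicit.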
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
I can counter your wall of text with a few facts:

1. Try playing Doom Eternal at 4K max on an 8GB card. You lose loads of performance relative to cards with more memory, because 8GB is not sufficient.

I LITERALLY did that, AND I posted evidence that this is wrong in this very thread https://www.overclockers.co.uk/forums/posts/34123036/

AND I listed that evidence in the post you quoted but obviously did not read; it was point 5)
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,184
Location
Greater London
I LITERALLY did that, AND I posted evidence that this is wrong in this very thread https://www.overclockers.co.uk/forums/posts/34123036/

AND I listed that evidence in the post you quoted but obviously did not read; it was point 5)
No matter what you say, he will continue saying the same thing over and over. Before, he was saying 10GB was not enough; then we all proved beyond a shadow of a doubt that it is, and now he has moved on to 8GB. This is a guy who felt the 16GB on his Radeon VII would come in handy. It never did :p

Wasting your time.
 
Associate
Joined
2 Feb 2018
Posts
237
Location
Exeter
No matter what you say, he will continue saying the same thing over and over. Before, he was saying 10GB was not enough; then we all proved beyond a shadow of a doubt that it is, and now he has moved on to 8GB. This is a guy who felt the 16GB on his Radeon VII would come in handy. It never did :p

Wasting your time.

Too many people spreading false information without hard facts. Even when it's been debunked, they will not stop believing... Some people just don't like being wrong and will fight to the end... Sad, really.

Yet we seem to keep repeating ourselves...

Getting pretty tiresome. :o
 