Is 8GB of VRAM enough for the 3070?

8GB isn’t enough - it’s only just enough now. Nvidia knows this; that’s why they do it, so you’ll upgrade sooner rather than later.

They’ve always done it: the 680 had 2GB, the 780 had 3GB... AMD always gave more VRAM. I wouldn’t spend circa £600 on a card with 8GB of VRAM now.
 
No, not really. They know most people hold on to hardware for a generation or two.

They brought out 2GB and 4GB GTX 580s and GTX 680s. The GTX 780 was 3GB, but this was when they reintroduced the Ti variant, adding the GTX 780 Ti with 6GB VRAM.

Remember, the 3070 is not the top card; that's what they class the 3080 as.
Not sure what they're classing the 3090 as, since it's not a Titan either. They call it a "BFGPU" with Titan-like performance... not a Titan, though.

Point is, 8GB VRAM is enough for 4K gaming right now. The 3070 has the power to drive games at 4K 60fps+

I mainly game on my 38" LG 3840x1600 ultrawide rather than my 4K monitor. No game I own allocates more than 7.8GB - and that's allocated, not required, VRAM.

Cyberpunk is one game I'm ignoring right now, as it still needs optimising.
 
Hah, I do not remember accepting such a thing back then :p I even bought a second GTX 580 so I could improve my frames in BF3. Even further back I was chasing frames - I think you could jump a bit higher at maybe 125fps(?) in Quake 3, for example. I blame the timedemo function in Quake 1...

Me neither, but reviewers said this was the bare minimum for it to be playable.

Me too; I was running a pair of EVGA Classified 4GB 580s back then to drive my triple-screen setup.

I decided to install BF3 last night and have been replaying the campaign - it has really aged well.
 
No, not really. They know most people hold on to hardware for a generation or two.

A generation or two? That little? I would be surprised by this, especially when you look at something like the Steam hardware surveys, which show people seem to hang on to hardware longer than that. A lot of people are still using 900 series cards, for example.
 
I bought the 1060 6GB version for around 390 euros in 2016. It was hands down the best 390 I've ever spent on hardware. The card satisfied my 1080p gaming needs for 4 years. Only when I got a 3440x1440 monitor did I switch to a 1080, selling the 1060 for about 150 euros and getting the 1080 for around 260 euros instead. Absolutely smashing value. It held up for the entire year and kept up with the 3440x1440 demands; it didn't net me great frames, but it did the job.

Now, I am finding the VRAM limiting after 1 week of using my card. It MIGHT be memory leaks in Cyberpunk that are actually the culprit, but nonetheless, judging by other 3440x1440 benchmarks, I am pretty sure it's quite easy to hit this card's VRAM limit.

The GTX 10xx generation was a real baller - the truly future-proof generation. After that, I don't think RTX 20xx came even close for future-proofing. I mean, the 2080 Ti would be absolutely destroyed if the 3070 had at least 10GB of VRAM, given the price difference between the cards and their similar performance. I don't think people would choose the 2080 Ti over the 3070 even now, because of the sheer value. But my point stands: the 10xx generation was the most future-proof we have seen in years.

And tbh I understand that tech grows exponentially. I fully understand that you can't expect to buy a card and have it perform at current-gen status for 4 years. But even though I have a non-standard resolution, 1440p ultrawide, I fully expect a 2020 card to stay strong for at least a few years. I doubt it will, though.

Which is why I am looking into how to get something else ASAP. I can sell the card for a similar price to what I paid for it at the moment; that will not be the case once prices for the 3070/3080 drop. But the real challenge is getting a next-gen card with more VRAM for a reasonable price. I believe I could find a 3080 for around 900 euros, but that's still overpaying a lot, though I'd rather overpay for the 3080 than for the 3070 (690 euros, and that was the cheapest I could find in my region, basically via some connections). The biggest e-shops here are selling the 3070 at 3080 prices, ranging from 800-950 euros for the "best" brand ones. It's a bit ridiculous that even the e-shops are scalping. The 3080 in e-shops is about 1.2k euros, if even available. But since I am in the EU there are sites in Germany offering 3080s for around 1000 euros. All in all, it's not even about availability anymore, just about how much you're willing to overpay. I am just wondering how much I realistically overpaid for the 3070.

It's kind of disappointing. I am sure it's a great card, just not for 3440x1440. I mean, I'm sure that without ray tracing this card is more than enough. But I want to make use of the selling point of these cards (over AMD, that is), which is the ray tracing features... What good is the card to me if the ray tracing features eat even more VRAM? I am 99% sure that without RT I wouldn't hit any caps, but if I go into titles that are heavy on memory and turn RT on, I am 99% sure it will start capping out. Correct me if I'm wrong, but from what I've been able to gather from my own testing AND from benchmarks specifically at 3440x1440, this seems like it could be the case.
 
A generation or two? That little? I would be surprised by this, especially when you look at something like the Steam hardware surveys, which show people seem to hang on to hardware longer than that. A lot of people are still using 900 series cards, for example.

When you factor in that the GTX 1060 and 1080p are the most popular, a lot of that is down to laptops. In previous GTX generations, mobile GPUs used to be labelled with an "M" to show whether a part was a desktop or mobile variant. So that is not an accurate source, as those laptops cannot be upgraded - the GPU and CPU are soldered directly to the motherboard.


I don't understand why you are hitting VRAM limits at 3440x1440. Are you just going off MSI Afterburner and the VRAM usage it shows?
If so, this has been debunked: that figure is largely memory caching, and the memory isn't physically needed just because it shows 7.8GB of usage etc.
The more memory you have, the more memory caching there will be.
This is the same for system memory - the more you have, the more Windows will start caching.

I game at higher resolutions than you, on my 3840x1600 ultrawide and on my 4K monitor at 3840x2160, and I'm yet to hit VRAM limits.
Same goes for VR when running the Quest 2 at 1.4 SS.
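
If you want to sanity-check what the overlay is reporting yourself, here's a minimal sketch (assuming an Nvidia card, a recent driver and the pynvml package installed) that prints the driver's view of total VRAM in use next to the per-process figures. Bear in mind even these are allocation numbers, not what a game strictly needs - the same allocated-vs-required distinction as above.

[CODE]
# Minimal sketch: ask the driver how much VRAM is in use, overall and per process.
# Assumes an Nvidia GPU, a recent driver, and the pynvml (nvidia-ml-py) package.
# These are still allocation figures - not how much a game strictly needs.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # values are in bytes
print(f"VRAM used: {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")

# Per-process breakdown (usedGpuMemory can be None without sufficient privileges)
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = "n/a" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 2**30:.2f} GiB"
    print(f"pid {p.pid}: {used}")

pynvml.nvmlShutdown()
[/CODE]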
 
The 3070 should have had 10GB, the 3080 should have had 12GB. It’s typical Nvidia. It will be ‘resolved’ with the Ti variants so they can get ‘enthusiasts’ (i.e. people like us) to buy two cards out of a single generation...
 
If you're talking about generational leaps where memory doubled, then the 3070 should have had 16GB, the 3080 also 16GB, a 3080 Ti 22GB and an Ampere Titan 48GB.

If you own a 3070 now, it's simple: don't buy into the same generation again. Wait for the next generation and get a bigger performance boost - by then we can all better judge how much VRAM is really needed, with true next-gen game engines.
 
I'm not suggesting it should double every generation, but even the 1080 Ti, a card two generations old, has more VRAM than a 3080. The 3080 should have had 12GB minimum, perhaps 16GB for the Ti.
 
It's funny how some people, even with evidence to the contrary, will still deny what their eyes see and keep beating the drum of "totally enough, mate". I just don't understand how there can be so much denial about something as inconsequential as a PC part. I've seen atheists convert to religion more easily than Nvidia fans will admit the cards might be sold with too little VRAM. Crazy stuff! :eek:
 
I'm not suggesting it should double every generation, but even the 1080 Ti, a card two generations old, has more VRAM than a 3080. The 3080 should have had 12GB minimum, perhaps 16GB for the Ti.

I owned a 1080 Ti back in 2016 for 2 years, but even back then 11GB was overkill. The 2080 Ti also had 11GB, as again VRAM was never the limiting factor.
The 3080 with 10GB is not an issue right now either. (Nvidia directly compared it to a 2080, showing it to be 2x as fast, but the 2080 had 8GB of VRAM, so the 3080 got a 2GB VRAM boost.)
The 3080 Ti was never meant to exist, but competition forced Nvidia's hand to match the price/performance of the 6900 XT.
The 3080 Ti is the true successor to the 2080 Ti.
The 3090 should have been the 3080 Ti, but it was marketed as a new 90-series to make the £1500 price tag seem legit, as people would have kicked off if it was a "Ti" for £1500. It was just a marketing stunt aimed at those with deep pockets who want the best at any price, to line Nvidia's pockets.
So blame the people who bought those GPUs; that showed Nvidia, and other companies, that they can get away with charging what they like.

Back on topic:
Unless you are pushing beyond 4K resolution, these cards will be good for a couple of years, if not more.

When you actually hit VRAM limits, it isn't a little spike here and there - it becomes a slideshow. I've been there, as an early adopter of 4K who genuinely hit VRAM limits.
 
I don't understand why you are hitting VRAM limits at 3440x1440. Are you just going off MSI Afterburner and the VRAM usage it shows?
If so, this has been debunked: that figure is largely memory caching, and the memory isn't physically needed just because it shows 7.8GB of usage etc.
The more memory you have, the more memory caching there will be.
This is the same for system memory - the more you have, the more Windows will start caching.

I game at higher resolutions than you, on my 3840x1600 ultrawide and on my 4K monitor at 3840x2160, and I'm yet to hit VRAM limits.
Same goes for VR when running the Quest 2 at 1.4 SS.

Well, since you also use this card, and even run it at 4K, I am not going to dismiss what you have to say. I am sure that by turning settings down you could dance around the card's memory limit. But whether this is actually enough for the card is exactly the discussion I came to the forums for. Just FYI, I believe Cyberpunk at 4K with RT on will eat around 10GB of VRAM.

From what I've observed in Cyberpunk at least, the VRAM problem may be related to the memory leaks of the buggy/glitchy launch. I will be able to fully test this once the game is patched up and, hopefully, the leaks are resolved rather than being inherent to the game. But even without any leaks, the game eats a hefty 7GB on average at my resolution. Once again, I am not cranking the settings up without thought - I carefully set each graphical option to give me the best visual experience with RT on at the best performance for this title. And it is literally treading the line for VRAM, even though I am easily netting 60FPS, and above 70 in less intensive areas... How can I expect this to still be enough going into 2021 if the first next-gen title released eats almost 8GB for breakfast? I assume I should be fine as long as I compromise by reducing various settings... But making the compromise of turning RT off, on an RT card that isn't the low-tier one, just to give the memory limit some headroom, is pushing it IMO.

Honestly, I am simply explaining what I observe. How exactly is the VRAM graph debunked? I see the lag spikes line up with the memory graph trying to go over 8GB when the game has a lot going on in terms of changing environments, and going over the 7.8-9GB limit results in FPS drops, as shown in this benchmark (this is a different game, but it's an example of the same problem caused by the VRAM limit):
https://youtu.be/xejzQjm6Wes?t=215

I know the games are unoptimised and not the best examples. I also understand that by fine-tuning the settings, that extra 1GB of VRAM could have been avoided in that particular benchmark. And I agree that in general games offer way too many fidelity options with way too big a performance cost for little visual difference...

But you can clearly see in the benchmark that the FPS drops as a direct result of the VRAM cap, right? And this is not even at 4K, btw. Can we even talk about memory headroom for this card? If you want stable performance you need headroom, as not every game is perfectly optimised. Isn't that a fair assumption?
 
It's funny how some people, even with evidence to the contrary, will still deny what their eyes see and keep beating the drum of "totally enough, mate". I just don't understand how there can be so much denial about something as inconsequential as a PC part. I've seen atheists convert to religion more easily than Nvidia fans will admit the cards might be sold with too little VRAM. Crazy stuff! :eek:

Agreed. If VRAM is so unimportant, as is often claimed, then why did we move on from 2GB and 4GB cards? Surely anything over 6GB is overkill? :rolleyes::p
 
Anyone remember the Fury X? Nvidia fans were slating it for 4GB of VRAM, and rightly so, as it was not enough - just like 8GB is not enough now. It turned out the Fury X eventually had the grunt, but not the VRAM to back it up. The same will happen here.
 
It's one of the arguments you see used when discussing 8GB/10GB/16GB: that the speed of the card is more important than the size of the VRAM. That isn't totally wrong, but it's not exactly right either.

Well, I think this is the case. Memory speed is important in some ways, but as of now not by a lot. The headroom/amount of VRAM itself probably matters more IMO, because ironically enough, in the benchmark I linked the 10GB 3080 with the faster memory didn't go over 8GB, while the 3070 with only 8GB did. I am sure that with enough dedication you could demonstrate this in other titles too.
 

Honestly, I am simply explaining what I observe. How exactly is the VRAM graph debunked?
I see the lag spikes line up with the memory graph trying to go over 8GB when the game has a lot going on in terms of changing environments, and going over the 7.8-9GB limit results in FPS drops, as shown in this benchmark (this is a different game, but it's an example of the same problem caused by the VRAM limit):
https://youtu.be/xejzQjm6Wes?t=215

There are people on these forums who think that VRAM allocation is unimportant and that only what the game actually uses matters (these values differ). They think that as long as what the game uses fits within the VRAM, everything will be okay.

This is based on the assumption that there is no performance penalty if the game engine is unable to allocate the VRAM it wants. As far as I am aware this hasn't been thoroughly tested, and we also don't know how different game engines respond to this problem.
 
It's one of the arguments you see used when discussing 8GB/10GB/16GB: that the speed of the card is more important than the size of the VRAM. That isn't totally wrong, but it's not exactly right either.

No no no no, there's a massive gulf of a difference between VRAM being less important and VRAM being unimportant. Nobody said VRAM was unimportant; that's an absurd thing to say.
 
There are people on these forums who think that VRAM allocation is unimportant and that only what the game actually uses matters (these values differ). They think that as long as what the game uses fits within the VRAM, everything will be okay.

This is based on the assumption that there is no performance penalty if the game engine is unable to allocate the VRAM it wants. As far as I am aware this hasn't been thoroughly tested, and we also don't know how different game engines respond to this problem.

It's not rocket science. Yes, allocated VRAM is not an indicator of how much is ACTUALLY being used. But when VRAM runs out (which will happen - it's already happening and we can see it), the frame drops that come from going over the VRAM limit happen because the data no longer goes into VRAM but into system RAM, and that path introduces massive latency. That is why the frame drop/stutter/whatever happens. It's not the end of the world, but it's not something you want to see on a current-gen card playing current-gen titles. I would rather sacrifice 10 FPS than have an otherwise stable experience ruined by stutters because Nvidia skimped on VRAM like the greedy ******** they are.
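
To put rough numbers on why spilling into system RAM hurts, here's a back-of-the-envelope sketch using the published 3070 figures (256-bit bus, 14 Gbps GDDR6) against a best-case PCIe 4.0 x16 link - quoted specs, not anything measured:

[CODE]
# Back-of-the-envelope bandwidth comparison (published specs, not measurements)
vram_bw_gbs = 256 / 8 * 14   # RTX 3070: 256-bit bus * 14 Gbps GDDR6 = 448 GB/s on-card
pcie_bw_gbs = 31.5           # PCIe 4.0 x16, roughly best case per direction

print(f"VRAM: {vram_bw_gbs:.0f} GB/s")
print(f"PCIe: {pcie_bw_gbs:.1f} GB/s (~{vram_bw_gbs / pcie_bw_gbs:.0f}x slower)")
# Any data that has to be fetched from system RAM mid-frame pays roughly that
# penalty, which is what shows up as the stutter described above.
[/CODE]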
 