
Is 8GB of VRAM enough for the 3070?

Soldato | Joined: 17 Aug 2003 | Posts: 20,158 | Location: Woburn Sand Dunes
I had a 970. I was pretty miffed to find out the truth behind it. However, the truth didn't come out until later on. The only blind sheep are the people who bought them after the situation became clear, without reading reviews.

Performance didn't change after it all came out. Five years after all that (last year!), TechSpot did a review and found the 970 was STILL the more consistent performer vs the 290 non-X, at least in the suite of games they tested anyway. So much for the VRAM situation being such a big issue. Fine wine allowed the 290X to command a lead in newer titles. But that's reality, who cares about that, right?

What has the amount of VRAM on Ampere got to do with the 970?

Good question.

What looks better, 1440p mega ultra nightmare textures or 4K with one texture setting down?

Well, 4K would. But to be clear, it's not any texture quality setting causing all this hullabaloo, it's the texture pool size. The actual texture quality is identical whether the pool size is set to ultra or ultra nightmare, but this is continually ignored because... well, I don't know why.
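To illustrate the distinction, here's a minimal sketch of how a streaming texture pool behaves (purely illustrative, with made-up setting names and budget figures, not id Tech's actual code): the pool setting only changes how much VRAM the streamer reserves up front, while the source assets, and therefore the visible texture quality, stay the same.

```python
# Illustrative model of a texture streaming pool. The setting names and budget
# figures are hypothetical; this is not the id Tech implementation. Raising the
# "pool size" only enlarges the VRAM reservation the streamer works within -
# it does not change the source textures, so image quality is identical.

POOL_BUDGET_MB = {
    "medium": 1536,
    "high": 2048,
    "ultra": 3072,
    "ultra_nightmare": 4096,
}

class TexturePool:
    def __init__(self, setting: str):
        self.budget_mb = POOL_BUDGET_MB[setting]  # VRAM reserved up front
        self.resident_mb = 0

    def request(self, texture_mb: int) -> bool:
        """Stream a texture in if it fits the reserved budget."""
        if self.resident_mb + texture_mb > self.budget_mb:
            # Over budget: the streamer must evict or defer something.
            # The quality of the assets themselves is unchanged.
            return False
        self.resident_mb += texture_mb
        return True
```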
 
Soldato | Joined: 24 Aug 2013 | Posts: 4,549 | Location: Lincolnshire
@Grim5 The Xbox Series X has between 336GB/s and 560GB/s of memory bandwidth, depending on which memory pool is being used. So for the 10GB of faster RAM, the Xbox Series X actually has more bandwidth than an RTX3070. The PS5 has 448GB/s with a slower GPU than the Xbox Series X. So I doubt memory bandwidth is a problem for either of them!
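For anyone wanting to sanity-check those figures, GDDR6 bandwidth is just bus width times data rate. A rough back-of-the-envelope calculation using the commonly quoted bus widths and 14Gbps modules (a sketch, not official spec sheets):

```python
# Peak theoretical GDDR6 bandwidth: bus width (bits) / 8 * data rate (Gbps) = GB/s.
# Bus widths and data rates are the commonly quoted figures, not measurements.

def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

pools = {
    "Series X fast pool (10GB, 320-bit)": (320, 14),
    "Series X slow pool (6GB, 192-bit)": (192, 14),
    "PS5 (16GB, 256-bit)": (256, 14),
    "RTX 3070 (8GB, 256-bit)": (256, 14),
}

for name, (bus, rate) in pools.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# -> 560, 336, 448 and 448 GB/s respectively
```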

It won't be much of an issue on the consoles, if at all.

The 3070 still only has 2944 true CUDA cores, so I'm guessing that is why it starts to chug a bit at 4K and above vs the 2080Ti.

I'm also guessing the overclocks on a 3070 will be relatively minor, or will only yield minor gains, due to the above. The 2080Ti can get 10-15% extra, so it would be interesting to see OC vs OC performance.
 

TNA | Caporegime | Joined: 13 Mar 2008 | Posts: 27,922 | Location: Greater London
I've now watched several reviews and the 3070 has some memory issues.

This is easily seen when compared to the 2080Ti. In several games the 3070 beats the 2080Ti at 1440p but loses at 4K, and in almost all games the gap between the 3070 and 2080Ti gets smaller at 4K.

This doesn't necessarily mean the 3070 is running out of VRAM though. In some cases that might be happening, but I think the more likely culprit is the memory bandwidth - the 3070 is well down on bandwidth, and this is a topic I brought up in the console discussions a few days ago when I said next-gen games will be held back by the poor memory bandwidth on the consoles, not the total amount of VRAM.
Yeah, I did think looking at the review results that it was a mixture of either running out of VRAM or because of the memory bandwidth. There really is a huge difference between the 3080 and 3070 in memory bandwidth. That said, I could probably get by on a 3070 until Hopper comes along, but with games like Cyberpunk 2077 coming, I want a 3080, so hopefully mine will be delivered before the new delayed release date. It should be as I am not that high in the queue.
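For reference, the peak theoretical bandwidth gap from the stock memory specs (bus width times data rate; published figures, not measured ones):

```python
# Peak theoretical bandwidth from the published memory specs (bus width x data rate).
cards = {
    "RTX 3070 (256-bit, 14 Gbps GDDR6)": 256 / 8 * 14,    # 448 GB/s
    "RTX 2080Ti (352-bit, 14 Gbps GDDR6)": 352 / 8 * 14,  # 616 GB/s
    "RTX 3080 (320-bit, 19 Gbps GDDR6X)": 320 / 8 * 19,   # 760 GB/s
}

base = cards["RTX 3070 (256-bit, 14 Gbps GDDR6)"]
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s ({bw / base:.2f}x the 3070)")
# The 2080Ti has ~37% more bandwidth than the 3070, the 3080 roughly 70% more.
```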
 
Soldato | Joined: 9 Nov 2009 | Posts: 24,881 | Location: Planet Earth
It won't be much of an issue on the consoles, if at all.

The 3070 still only has 2944 true CUDA cores, so I'm guessing that is why it starts to chug a bit at 4K and above vs the 2080Ti.

I'm also guessing the overclocks on a 3070 will be relatively minor, or will only yield minor gains, due to the above. The 2080Ti can get 10-15% extra, so it would be interesting to see OC vs OC performance.

I suspect it's a bit of everything coming into play, TBF. It will be interesting to see if Nvidia launches a 16GB variant of the GA104 or one with faster memory modules - it would definitely give us some more insights into where the bottlenecks are!
 
Soldato | Joined: 23 Apr 2010 | Posts: 11,896 | Location: West Sussex
Then why are you posting at all? Because all you're doing is perpetuating the myth.

When you set the texture pool size to its maximum, you are literally asking the game to reserve more VRAM than it needs, more than your GPU has, and saying **** it. And then you get people making purchasing decisions off the back of that, and people like you dismissing anybody who tries to explain it because you can't be arsed. Well great, stay out of it and let the rest of us get on with it then.

And you think I don't know about that? Why?

I posted a video recently of my 2080Ti achieving what was apparently impossible (a figure at 1440p). However, in that video, if you paid attention, you would see the game was "asking for" between 9.5GB and 10.5GB depending on the scene, of which I played through quite a few just to make sure that my results couldn't be argued against by people saying "well, this level or that level is more demanding".

I am posting because I have listened to people like that in the past, banging on with a one-sided argument about how it is totally fine, yada yada. ORLY? If it was, then pray tell: when the Fury X launched, LOADS of people said "it's more than enough because it's HBM and won't use more than that" etc. I can tell you the results were the complete opposite. It was a royal PITA to live with. So what went wrong? Quite simply, it didn't have enough VRAM. End of story, no sequel, put the book down. And this was regardless of what ANYone said at launch. Had people actually been honest and educated, AMD would not have sold a single bloody card!

So it's very much a case of "once bitten", thank you very much, and I reserve the right to ignore the professors of poop who continually go around saying it is enough.

The absolutely infuriating thing, though, is how they segment these cards based on VRAM: the "8GB 99p fries version", the "10GB budget value meal version" and the "fine dining 24GB ridiculous version".

It's a joke, and they are doing it for a reason. We are also talking about right now, not what could happen a year or two from now. And given that many will be "upgrading" from a 1080Ti, I find it particularly infuriating that the card they are going out to buy is, in at least ONE area, technically inferior. That should not even be plausible, let alone actually happening.

But Nvidia know that if they shoved the price up by about £50, or whatever that extra bit of VRAM costs, you most probably WOULD NOT be going back for 7nm TSMC, Hopper, Bopper or whatever the F they want to call their next cash cow. They know full well what is about to happen when the dev cycle flips to these new consoles and minimum specs increase.
 
Soldato | Joined: 24 Aug 2013 | Posts: 4,549 | Location: Lincolnshire
I suspect it's a bit of everything coming into play, TBF. It will be interesting to see if Nvidia launches a 16GB variant of the GA104 or one with faster memory modules - it would definitely give us some more insights into where the bottlenecks are!

Yeah, very likely a combination of slower memory/smaller bus plus the CUDA design.

A 16/20GB variant of the 3070/80 would be very interesting indeed.
 
Soldato | Joined: 9 Nov 2009 | Posts: 24,881 | Location: Planet Earth
Yeah, very likely a combination of slower memory/smaller bus plus the CUDA design.

It will be interesting to see if a 3070 gets similar memory overclocking gains to the 2080Ti, which usually manages +1000-1200MHz.

There is already noise about the RTX3070Ti 16GB! The RTX3060Ti seems to have the same memory arrangement as the RTX3070 but with about 20% fewer CUDA cores.
 
Soldato | Joined: 23 Apr 2010 | Posts: 11,896 | Location: West Sussex
Because of how quickly you dismissed PrincessFrosty's explanation, of course. You quite clearly said you don't care about it, it being a setting that's very misunderstood and shouldn't be used as the basis of any argument regarding VRAM and how much of it we really need.

I don't care because it's just more recycled crap that I heard when buying my Fury X.

And it was all wrong, very wrong. Sadly it wasn't just forum talk either; every reviewer said the same thing: "it's more than enough".

And it wasn't. So ever since then I have not gone for the lower cards in the stack that don't have enough VRAM for the resolution I expect them to run at for that price.

Look, if none of this had happened yet? Fine! I could just be an old worry wart. However, given it is already happening, and we haven't seen any true next-gen titles yet, I think it's really silly of someone to stick their neck out and say it's easily enough, like he continues to do.

Let me put it this way: if it were enough, Nvidia would have set their reviewer's guide to 4K, not 1440p. So it isn't a 4K card? That's fine. There already are, and will be, much cheaper options for good 1440p cards.

This gen needed to be all about 4K, because that is what the consoles will be doing. Had they just increased the spec and price a tiny bit, then fine, it would have been a 4K card.

This should all become quite blatantly apparent tomorrow when AMD talk about their cards that are around the same price and same performance, yet oddly have 16GB of VRAM on them, leaving 8GB for their entry-level cards. And who out of the two companies would have a better grasp of what the next-gen consoles are going to be eating up? AMD, who basically designed the bloody things, or Nvidia, who didn't and have no idea what is going on in either company?
 
Soldato | Joined: 17 Aug 2003 | Posts: 20,158 | Location: Woburn Sand Dunes
Andy, you're going off on a number of tangents there. A setting in an id Tech engine game has nothing to do with whatever it was you heard when you bought a Fury X.

Let me put it this way: if it were enough, Nvidia would have set their reviewer's guide to 4K, not 1440p. So it isn't a 4K card? That's fine. There already are, and will be, much cheaper options for good 1440p cards.

Nvidia are positioning it as a 1440p card. So where's the issue, other than the price?

This gen needed to be all about 4K, because that is what the consoles will be doing. Had they just increased the spec and price a tiny bit, then fine, it would have been a 4K card.

Well, that's a bit of a broad statement. To be a little more precise, that's what the Series X will be targeting, with the PS5 probably sitting somewhere below that (dynamic resolution scaling and all will muddy things to hell and back, but one thing's for sure: it won't be on the same level as the Series X) and the Series S most definitely being a 1440p console, at best.

This should all become quite blatantly apparent tomorrow when AMD talk about their cards that are around the same price and same performance, yet oddly have 16GB of VRAM on them, leaving 8GB for their entry-level cards. And who out of the two companies would have a better grasp of what the next-gen consoles are going to be eating up? AMD, who basically designed the bloody things, or Nvidia, who didn't and have no idea what is going on in either company?

Well, two things: 1) I do believe AMD will be generous with the VRAM. But 2) just remember how much RAM the consoles actually have available for graphics, because it is NOT 16GB. Not even close.
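To put numbers on that last point: the 2.5GB OS reservation and 13.5GB game allocation below are Microsoft's publicly stated Series X figures, but how the remainder splits between CPU-side data and actual graphics assets is up to each developer, so treat this as a rough sketch.

```python
# Rough Series X memory budget using the publicly stated figures.
total_gb = 16.0
os_reserved_gb = 2.5
available_to_games_gb = total_gb - os_reserved_gb   # 13.5 GB for the whole game
gpu_optimal_pool_gb = 10.0                          # the faster 560 GB/s pool

# Game code, CPU-side data, audio and streaming buffers all come out of that
# 13.5 GB too, so the slice left purely for graphics is well under 16 GB.
print(f"Available to games: {available_to_games_gb:.1f} GB "
      f"(of which {gpu_optimal_pool_gb:.0f} GB is in the GPU-optimal pool)")
```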
 
Soldato | Joined: 6 Feb 2019 | Posts: 17,710
It won't be much of an issue on the consoles, if at all.

The 3070 still only has 2944 true CUDA cores, so I'm guessing that is why it starts to chug a bit at 4K and above vs the 2080Ti.

I'm also guessing the overclocks on a 3070 will be relatively minor, or will only yield minor gains, due to the above. The 2080Ti can get 10-15% extra, so it would be interesting to see OC vs OC performance.

Check out the Gamers Nexus review; they always test against OC'd cards.

The OC'd 3070 gets eaten alive by an OC'd Strix 2080Ti, losing to the 2080Ti by as much as 20% at 4K in some games.

This is caused by two things: 1) we know the 3070 drops performance at 4K due to low memory bandwidth, and 2) while the 2080Ti picks up 15% extra performance with overclocking, the 3070 only picks up 3%, leading to a significant ~12% advantage for the 2080Ti from overclocking headroom alone.
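The ~12% figure comes from compounding the two overclocking gains rather than just subtracting them (a small distinction at these numbers, but worth showing):

```python
# How the ~12% relative OC advantage falls out of the gains quoted above.
gain_2080ti = 1.15  # ~15% from overclocking the 2080Ti
gain_3070 = 1.03    # ~3% from overclocking the 3070

relative_advantage = gain_2080ti / gain_3070 - 1
print(f"{relative_advantage:.1%}")  # ~11.7%, i.e. roughly 12%
```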
 
Soldato | Joined: 6 Feb 2019 | Posts: 17,710
@Grim5 The Xbox Series X has between 336GB/s and 560GB/s of memory bandwidth, depending on which memory pool is being used. So for the 10GB of faster RAM, the Xbox Series X actually has more bandwidth than an RTX3070. The PS5 has 448GB/s with a slower GPU than the Xbox Series X. So I doubt memory bandwidth is a problem for either of them!

For machines that want to give developers the option of 8K output, I'd have expected 1TB/s of bandwidth. The Series X is a little better, but the PS5 is a bit slow; anything under 600GB/s seems a tad slow for 4K gaming. But maybe RDNA is more efficient with its memory, using superior compression to Nvidia's.
 
Soldato | Joined: 26 Aug 2004 | Posts: 5,036 | Location: South Wales
This gen needed to be all about 4K, because that is what the consoles will be doing. Had they just increased the spec and price a tiny bit, then fine, it would have been a 4K card.
The consoles cannot handle 4K, as evidenced by cut-down graphics, 30fps-locked games, and even the high-Hz performance mode on the Series X for Dirt dropping from 1440p to 1080p with scaled-back graphics. So it seems they can't even handle 1440p, never mind 4K.

Not sure what the fascination is with consoles doing 4K if they can't even do it properly.
 
Associate | Joined: 17 Sep 2020 | Posts: 624
I don't care because it's just more recycled crap that I heard when buying my Fury X.

And it was all wrong, very wrong. Sadly it wasn't just forum talk either; every reviewer said the same thing: "it's more than enough".

And it wasn't. So ever since then I have not gone for the lower cards in the stack that don't have enough VRAM for the resolution I expect them to run at for that price.

Look, if none of this had happened yet? Fine! I could just be an old worry wart. However, given it is already happening, and we haven't seen any true next-gen titles yet, I think it's really silly of someone to stick their neck out and say it's easily enough, like he continues to do.

Let me put it this way: if it were enough, Nvidia would have set their reviewer's guide to 4K, not 1440p. So it isn't a 4K card? That's fine. There already are, and will be, much cheaper options for good 1440p cards.

This gen needed to be all about 4K, because that is what the consoles will be doing. Had they just increased the spec and price a tiny bit, then fine, it would have been a 4K card.

This should all become quite blatantly apparent tomorrow when AMD talk about their cards that are around the same price and same performance, yet oddly have 16GB of VRAM on them, leaving 8GB for their entry-level cards. And who out of the two companies would have a better grasp of what the next-gen consoles are going to be eating up? AMD, who basically designed the bloody things, or Nvidia, who didn't and have no idea what is going on in either company?


I see your logic and understand the viewpoint; however, the way you have arrived at that conclusion is deeply flawed. I'm not dismissing your feelings on it at all, and I get why you arrived here, but there is so much information now as to why 10GB is enough on the 3080 and 8GB is enough on the 3070 that it baffles me why these threads continue, other than through trolls or people unable to grasp the basics.
 
Soldato | Joined: 30 Mar 2010 | Posts: 13,067 | Location: Under The Stairs!
The VRAM allocation makes most sense for 1440p gaming. 8GB of GDDR6 memory is fine for 1440p ultra settings right now, though at 4K we are starting to see one or two titles eat more than that.

https://www.kitguru.net/components/...s/nvidia-rtx-3070-founders-edition-review/31/

Now we can argue about the nature and choice of an 8GB GDDR6 framebuffer, but in most use cases, it's enough, and we understand the choice made here; NVIDIA needs to keep that bill of materials healthy, as otherwise, this 499 USD card would have easily been 650 USD based on that 16GB decision. With future games in mind, this will turn out to become a WQHD (2560x1440) resolution card

https://www.guru3d.com/articles_pages/geforce_rtx_3070_founder_review,32.html

At the end of the day, there is only one topic to be viewed critically with the GeForce RTX 3070: the memory configuration. At 8 GB, it is on par with the GeForce GTX 1070, which was released four years ago in June 2016. As of today, 8 GB is usually still enough for WQHD, the primary resolution of the GeForce RTX 3070; only Ghost Recon Breakpoint wants more at this resolution in the test suite. But after four years at this level, and with next-gen consoles on the doorstep, the likelihood that more memory will be useful in the future for high and highest details at WQHD has increased significantly at the end of 2020. This is even more true for UHD.

Apart from the memory, the GeForce RTX 3070 does everything right

https://www.computerbase.de/2020-10/nvidia-geforce-rtx-3070-test/4/#abschnitt_fazit

should you ever feel VRAM is running out, just sell the RTX 3070 and buy whatever card is right at that time

https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/41.html

The only potential gotcha is the card’s 8GB of memory. For the vast majority of games available today, 8GB should be adequate with maximum image quality, even at high resolutions, but moving forward that 8GB of memory may require some image quality concessions to maintain smooth framerates.

https://hothardware.com/reviews/nvidia-geforce-gtx-3070-ampere-gpu-review?page=4

you only have 8GB on the RTX 3070. But this is enough for the current and even next-gen games coming if you're only playing at 1080p, 1440p, and 3440 x 1440

https://www.tweaktown.com/reviews/9...unders-edition/index.html#Whats-Hot-Whats-Not

Some might have wanted to see more than 8GB of GDDR6 memory on the GeForce RTX 3070, but that shouldn’t be an issue on current game titles for 1440P gaming. If a game comes out in the future that needs more than 8GB for ultra image quality settings then the solution would be to just change the game settings

https://www.legitreviews.com/nvidia-geforce-rtx-3070-founders-edition-review_222986/15

The amount of memory on the RTX 3070 is also a point of discussion, because in 2020 the release of a video card costing more than 500 euros with eight gigabytes, the same amount of video memory as its two predecessors, makes us frown somewhat.

https://tweakers.net/reviews/8274/n...-edition-high-end-prestaties-voor-minder.html

Nvidia never tires of emphasizing that the memory capacities of the RTX 30 graphics cards were chosen deliberately, as they are sufficient. Our observations and measurements say otherwise - we don't look at the margin, after all. In fact, the GeForce RTX 3070 8GB shows symptoms of memory deficiency a little more often at the WQHD resolution intended for it than the GeForce RTX 3080 10GB does in Ultra HD. Both models are helped by gracious streaming systems, which address the graphics memory only up to a percentage upper limit and simply omit texture details. We are really not telling loyal PCGH readers anything new here. Let me tell everyone else: what is still enough today may be too little tomorrow. We are about to see a new generation of consoles launch, which will bring new multiplatform games with new graphics and new standards. Although new technologies such as Variable Rate Shading and DirectStorage are in the starting blocks to increase efficiency, experience shows that it will take a long time for these ideas to penetrate the market. We are talking about years, not months. On the other hand, there is the fact that most developers do not have the time to optimize their work for months; there will always be games that simply run poorly. PC gamers like to deal with this problem with strong hardware - but the GeForce RTX 3070 lacks this buffer. Memory hogs like Horizon: Zero Dawn, Ghost Recon Breakpoint, Wolfenstein Youngblood and a few others are already showing where the journey is headed. Ray tracing makes things worse due to the increased memory requirement.

https://www.pcgameshardware.de/Gefo...-Ti-Release-Benchmark-Review-Preis-1359987/4/




Best one for me was TPU: just punt it when it runs oot! :p
 
Soldato | Joined: 31 Oct 2002 | Posts: 9,887
And you think I don't know about that? Why?

I posted a video recently of my 2080Ti achieving what was apparently impossible (a figure at 1440p). However, in that video, if you paid attention, you would see the game was "asking for" between 9.5GB and 10.5GB depending on the scene, of which I played through quite a few just to make sure that my results couldn't be argued against by people saying "well, this level or that level is more demanding".

I am posting because I have listened to people like that in the past, banging on with a one-sided argument about how it is totally fine, yada yada. ORLY? If it was, then pray tell: when the Fury X launched, LOADS of people said "it's more than enough because it's HBM and won't use more than that" etc. I can tell you the results were the complete opposite. It was a royal PITA to live with. So what went wrong? Quite simply, it didn't have enough VRAM. End of story, no sequel, put the book down. And this was regardless of what ANYone said at launch. Had people actually been honest and educated, AMD would not have sold a single bloody card!

So it's very much a case of "once bitten", thank you very much, and I reserve the right to ignore the professors of poop who continually go around saying it is enough.

The absolutely infuriating thing, though, is how they segment these cards based on VRAM: the "8GB 99p fries version", the "10GB budget value meal version" and the "fine dining 24GB ridiculous version".

It's a joke, and they are doing it for a reason. We are also talking about right now, not what could happen a year or two from now. And given that many will be "upgrading" from a 1080Ti, I find it particularly infuriating that the card they are going out to buy is, in at least ONE area, technically inferior. That should not even be plausible, let alone actually happening.

But Nvidia know that if they shoved the price up by about £50, or whatever that extra bit of VRAM costs, you most probably WOULD NOT be going back for 7nm TSMC, Hopper, Bopper or whatever the F they want to call their next cash cow. They know full well what is about to happen when the dev cycle flips to these new consoles and minimum specs increase.

Pointless arguing with the majority of people in this thread. They're just blinded by the flashy, colourful YouTube views and believe 8GB on the 3070 and 10GB on the 3080 will be fine for years to come.

It doesn't matter what evidence you give them (Doom Eternal VRAM usage, for example); they just say it's fake news, or that no one plays that game, etc.

They are also all in denial that the new consoles have double the VRAM they currently have; it doesn't take a very high IQ to realise the implications of that...
 