
10GB VRAM enough for the 3080? Discuss...

Yes, it does objectively suggest that it is not considered fine in the minds of a significant proportion of people. It's been criticised in the media and among regular users, so it's clearly of enough concern to become a talking point and a source of contention.

In people's minds, sure. But right now this isn't any kind of objective fact. What we've seen so far is that no real games break the 10GB barrier in terms of what they need (if you measure actual usage via Special K or in-game dev tools), yet what we have seen is the GPU unable to keep up the frame rate once settings in the more graphically intensive games are cranked right up. Which suggests the amount of memory is fine. All the other speculation about it not being fine is largely based on people's feelings: what they think they "deserve" for a premium card, what their expectations about generational leaps ought to be, or that future games will need more memory, ignoring that those games will also require faster GPUs and pretending bottlenecks don't exist. These are all bad arguments, and until we have evidence that 10GB won't be enough, all we have is speculation.

We're in a paradigm shift, and people will have to start changing the way they approach these ideas or they're going to be way off the mark. You're going to create a market where, purely from a financial standpoint, it's in Nvidia's best interest to spin up a new product line for these people, who will then blow all that money on the extra 6-10GB (whatever it ends up being) and find out 2-3 years down the road that they never used that memory and it was a waste of money, much as the 11GB of the 1080 Ti was: no game in its lifetime ever really went above 8GB (again, when measuring actual usage accurately), and the games that could questionably use more than 8GB today would leave it at <10fps, meaning you'd have to drop the settings down and push memory usage back below 8GB anyway.

People need to pivot their thinking on this. How games use vRAM has changed radically over the last decade or so, and vRAM scaling doesn't need to be as aggressive anymore, which is why all the new technology is focused on speeding up the SSD/storage-to-GPU link rather than inflating GDDR6X usage.
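If anyone wants to sanity-check the allocated-vs-actually-used distinction themselves, here's a rough sketch using the pynvml bindings (my own code, illustrative only). Bear in mind NVML reports memory *allocated*, which already overstates what a game actively touches each frame; that gap is exactly what tools like Special K try to expose:

```python
# Rough sketch: poll GPU memory allocation with pynvml (nvidia-ml-py).
# Caveat: NVML reports memory *allocated*, not memory a game actively
# uses each frame, so treat these numbers as an upper bound on real need.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"device: {info.used / 2**30:.1f} / {info.total / 2**30:.1f} GiB allocated")
        # Per-process breakdown (graphics clients, e.g. a running game).
        # usedGpuMemory can be None on Windows/WDDM, hence the `or 0`.
        for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
            print(f"  pid {p.pid}: {(p.usedGpuMemory or 0) / 2**30:.1f} GiB")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```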
 
Heh. Yeah right.

By the time I saw your post, I see @james.miller had sorted you out anyway :D
You're funny, you should both do stand-up.

Or children's parties :p

Got to love all the strawmanning in here now by the "10GB is not enough" crowd. They are now trying to paint the picture that we are saying 10GB is enough for many generations, or using 8K to justify their positions. Lol.

I still stick by my position. If you are buying an Ampere card and intend to keep it for more than one gen, then sure, maybe consider more than 10GB. But for the rest of us who upgrade every gen, 10GB will be fine. By the time it becomes an issue I will be rocking a Hopper or Arcturus card with more VRAM anyway ;)

Right back at you ;) (From page 1, post 4)
If you plan on upgrading in two years' time then yes, it is enough, because games will probably only start to push the limits by then. However, if you intend on skipping a generation it will start to struggle IMO.
 
In people's minds, sure. But right now this isn't any kind of objective fact. What we've seen so far is that no real games break the 10GB barrier in terms of what they need (if you measure actual usage via Special K or in-game dev tools), yet what we have seen is the GPU unable to keep up the frame rate once settings in the more graphically intensive games are cranked right up. Which suggests the amount of memory is fine. All the other speculation about it not being fine is largely based on people's feelings: what they think they "deserve" for a premium card, what their expectations about generational leaps ought to be, or that future games will need more memory, ignoring that those games will also require faster GPUs and pretending bottlenecks don't exist. These are all bad arguments, and until we have evidence that 10GB won't be enough, all we have is speculation.

Your logic is very similar in principle to the people claiming 10GB isn't enough... you are saying it is enough. The fact is, in the games we have seen SO FAR, 10GB VRAM is indeed enough... but we are at the start of a brand-new generation, and to already have concerns about VRAM on a bleeding-edge flagship card is not a good thing. People have an absolute right to expect that what they are spending £700 on is specified well enough to last the generation without concerns such as VRAM limitations.

People need to pivot their thinking on this. How games use vRAM has changed radically over the last decade or so, and vRAM scaling doesn't need to be as aggressive anymore, which is why all the new technology is focused on speeding up the SSD/storage-to-GPU link rather than inflating GDDR6X usage.

Yeah, which is why Nvidia appear to be releasing 16GB 3070s and 20GB 3080s in December... because they think just the same as you do. I bet any Super editions later next year also have 10GB VRAM... honest. ;)
 
Your logic is very similar in principle to the people claiming 10GB isn't enough... you are saying it is enough. The fact is, in the games we have seen SO FAR, 10GB VRAM is indeed enough... but we are at the start of a brand-new generation, and to already have concerns about VRAM on a bleeding-edge flagship card is not a good thing. People have an absolute right to expect that what they are spending £700 on is specified well enough to last the generation without concerns such as VRAM limitations.

I mean, what I've said so far is that there's no evidence for what is being claimed, and there isn't. My position is a bit weaker because I'm not asserting it will definitely be enough in all cases, always; with all things there are edge cases that break trends, and there's lots of nuance. But I've basically already addressed what you're saying. Yes, upcoming games will increase demands on vRAM as the new console generation kicks into gear; that's absolutely true. BUT they're also going to increase demands on the GPU. That's a completely uncontroversial statement: the two things are intrinsically linked. If you're putting something into vRAM on a modern game engine, it's something you need to render the scene around you, and thus an additional burden on the GPU. What I've said is that the evidence so far shows that the only games which get anywhere close to 10GB of usage (8GB+) are games in which the GPU on the 3080 begins to struggle and delivers unplayable frame rates. And this is the rub: demands on both will increase. If you load up the GPU with too many effects and assets, it will fail to deliver a playable frame rate and you'll have to lower those settings; and as you lower the settings, the demands on vRAM go down.

This is what I mean by a paradigm shift: this is not how games are evolving today and in recent years. vRAM isn't scaling with game assets anymore, it's scaling with what is needed to feed the GPU; they're two different paradigms. Back in the day, vRAM was used more like a dumb cache: all the assets needed for a level would be cached in at the start during loading, then sit as a burden on vRAM until unloaded at the end of the level, so as games and their assets scaled up, vRAM had to scale to cope. Modern game engines keep vastly more assets on disk than in vRAM and just pre-load whatever is needed in the moment; it's how we managed to transition from claustrophobic "levels" to beautiful open-world games like GTA V.

The people saying "ah hah, but future games will be more demanding on vRAM" are referring to this old paradigm that we're leaving behind; that's just not how things work anymore. vRAM is going to scale much more with GPU speed, not with some arbitrary demand from game size, because (ideally) we only pack into vRAM what we need to render the next frame; we don't leave huge swaths of it tied up by assets you just happen to need later. And all the evidence now points to the new consoles, and Microsoft with Windows, focusing a lot more on improving that area of gaming, the streaming from SSD to GPU: that's why Microsoft focused on DirectStorage for the Xbox and Windows, and the PS5 focused on a super fast SSD and custom storage controller. This is all with a future vision of basically eliminating loading altogether.
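To make the streaming idea concrete, here's a toy sketch (entirely illustrative, all names are mine, nothing from a real engine) of a vRAM budget acting as a small hot cache over a much larger on-disk asset pool, with least-recently-used eviction:

```python
# Toy illustration: vRAM as a small LRU cache over a large on-disk asset
# pool. All names here are hypothetical; real engines are far more complex.
from collections import OrderedDict

class VramStreamingCache:
    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # asset_id -> size, in LRU order

    def request(self, asset_id: str, size: int) -> None:
        """Called when the renderer needs an asset for the next frame."""
        if asset_id in self.resident:
            self.resident.move_to_end(asset_id)  # mark as recently used
            return
        # Evict least-recently-used assets until the new one fits.
        while self.used + size > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        # The "upload" below stands in for the SSD -> vRAM copy that
        # DirectStorage-style tech is meant to make fast.
        self.resident[asset_id] = size
        self.used += size

GiB = 2**30
cache = VramStreamingCache(budget_bytes=10 * GiB)
cache.request("city_block_42_textures", 2 * GiB)
cache.request("player_model", 1 * GiB)
print(f"resident: {cache.used / GiB:.0f} of {cache.budget / GiB:.0f} GiB")
```

The point being: the working set that has to be resident at once tracks what the GPU can render, not the total size of the game's assets.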

Yeah, which is why Nvidia appear to be releasing 16GB 3070s and 20GB 3080s in December... because they think just the same as you do. I bet any Super editions later next year also have 10GB VRAM... honest. ;)

First of all, it's just rumours. And second, it's Nvidia's job to make money: if there's consumer demand large enough for a product, they'll make it, provided their research tells them enough people will buy it. They'll happily take a markup on the extra £100+ you'll have to spend on that additional 10GB of GDDR6X. It's like Razer making mice with 16,000 DPI when most gamers use them at 4,000 DPI max; people think the number matters, so they'll buy it. 20GB of vRAM on a 3080 variant is the video card equivalent of gold-plated HDMI connectors.
 
I've always used SLI/Crossfire so I'm used to not having enough VRAM anyway :) I'm surprised people are worried for single-card use though; I guess something must have changed with this gen. Did the 2080 have VRAM issues too?
 
A bigger man would admit his mistake when it's laid out in front of him. Not you though, you'd rather resort to playground insults. Oh well, one day ;)
Do you mean like here? I didn't realise it upset you that I wrapped it in a joke. It was such a minor mistake that it's laughable how triumphantly you declared it in your post, which stemmed from you getting upset that I pointed out that I didn't say something.

You are not in any position to talk about playground insults. There was also your spat in the AMD thread with another user.

I would recommend you drop the "bigger man" talk; there's enough glass on the floor already.
 
I mean, what I've said so far is that there's no evidence for what is being claimed, and there isn't. My position is a bit weaker because I'm not asserting it will definitely be enough in all cases, always; with all things there are edge cases that break trends, and there's lots of nuance. But I've basically already addressed what you're saying. Yes, upcoming games will increase demands on vRAM as the new console generation kicks into gear; that's absolutely true. BUT they're also going to increase demands on the GPU. That's a completely uncontroversial statement: the two things are intrinsically linked. If you're putting something into vRAM on a modern game engine, it's something you need to render the scene around you, and thus an additional burden on the GPU. What I've said is that the evidence so far shows that the only games which get anywhere close to 10GB of usage (8GB+) are games in which the GPU on the 3080 begins to struggle and delivers unplayable frame rates. And this is the rub: demands on both will increase. If you load up the GPU with too many effects and assets, it will fail to deliver a playable frame rate and you'll have to lower those settings; and as you lower the settings, the demands on vRAM go down.

This is what I mean by a paradigm shift: this is not how games are evolving today and in recent years. vRAM isn't scaling with game assets anymore, it's scaling with what is needed to feed the GPU; they're two different paradigms. Back in the day, vRAM was used more like a dumb cache: all the assets needed for a level would be cached in at the start during loading, then sit as a burden on vRAM until unloaded at the end of the level, so as games and their assets scaled up, vRAM had to scale to cope. Modern game engines keep vastly more assets on disk than in vRAM and just pre-load whatever is needed in the moment; it's how we managed to transition from claustrophobic "levels" to beautiful open-world games like GTA V.

The people saying "ah hah, but future games will be more demanding on vRAM" are referring to this old paradigm that we're leaving behind; that's just not how things work anymore. vRAM is going to scale much more with GPU speed, not with some arbitrary demand from game size, because (ideally) we only pack into vRAM what we need to render the next frame; we don't leave huge swaths of it tied up by assets you just happen to need later. And all the evidence now points to the new consoles, and Microsoft with Windows, focusing a lot more on improving that area of gaming, the streaming from SSD to GPU: that's why Microsoft focused on DirectStorage for the Xbox and Windows, and the PS5 focused on a super fast SSD and custom storage controller. This is all with a future vision of basically eliminating loading altogether.

Please... stop with the ceaseless waffling. You generally write 3x more than you need to, and to top it off you do it in a very droning, anal style. I am not going to read reams of text that appear to be largely just a repeat of the other reams of text you write. Write concisely; less is more.

First of all, it's just rumours. And second, it's Nvidia's job to make money: if there's consumer demand large enough for a product, they'll make it, provided their research tells them enough people will buy it. They'll happily take a markup on the extra £100+ you'll have to spend on that additional 10GB of GDDR6X. It's like Razer making mice with 16,000 DPI when most gamers use them at 4,000 DPI max; people think the number matters, so they'll buy it. 20GB of vRAM on a 3080 variant is the video card equivalent of gold-plated HDMI connectors.
We all know it's not "just rumours"; the SKUs have appeared in AIB marketing slides. To say it's "just rumours" is blanket denial of the evidence mounting up ahead of an official announcement. Higher-VRAM cards are coming as a response to the higher amounts of VRAM in AMD's high-end SKUs... at this point it's just obvious.
 
Please... stop with the ceaseless waffling. You generally write 3x more than you need to, and to top it off you do it in a very droning, anal style. I am not going to read reams of text that appear to be largely just a repeat of the other reams of text you write. Write concisely; less is more.

No, because the issue has nuance to it, which you're blindly ignoring if you demand that takes on this situation stay simple enough to digest in one sentence. I made a very specific series of points addressing some of the claims you've made or supported. If you don't care to defend them further, that's fine by me. But you countered with some points, and I'm countering those counters with actual reasons and explaining why it matters. If you stick yourself in an oversimplified view of the world, it'll only be you who suffers for it.

Who cares about marketing slides? Marketing changes all the time to meet the demands of the rest of the business. But I'm tempted to just give you this one and say: fine, you win a cookie, Nvidia is going to make a larger-vRAM variant of the 3080. So what? If they do, they're responding to demand from consumers, which is not the same thing as that amount of vRAM being required for anything. I think you think this proves your point, but it doesn't; it just proves some people are willing to buy anything if you put a big number on it.
 
I can tell you with certainty that 10GB is adequate - but only adequate, not more than enough, and frankly for an £800 upgrade I’d want more than enough, as 4K is fast becoming the standard people want as displays get cheaper.
I have no doubt in my mind that the 3080 of choice will be the 20GB version and that 10GB cards will be on eBay regularly once the 20GB version is released.
As a 2080 Ti Strix owner I wouldn’t buy a 10GB card - it’s that simple
 
No, because the issue has nuance to it, which you're blindly ignoring if you demand that takes on this situation stay simple enough to digest in one sentence. I made a very specific series of points addressing some of the claims you've made or supported. If you don't care to defend them further, that's fine by me. But you countered with some points, and I'm countering those counters with actual reasons and explaining why it matters. If you stick yourself in an oversimplified view of the world, it'll only be you who suffers for it.

Who cares about marketing slides? Marketing changes all the time to meet the demands of the rest of the business. But I'm tempted to just give you this one and say: fine, you win a cookie, Nvidia is going to make a larger-vRAM variant of the 3080. So what? If they do, they're responding to demand from consumers, which is not the same thing as that amount of vRAM being required for anything. I think you think this proves your point, but it doesn't; it just proves some people are willing to buy anything if you put a big number on it.
He will likely stick you on ignore now, as his brain just can't handle it. Lol. Guy needs to get off his high horse.
 
Do you mean like here? I didn't realise it upset you that I wrapped it in a joke. It was such a minor mistake that it's laughable how triumphantly you declared it in your post, which stemmed from you getting upset that I pointed out that I didn't say something.

You are not in any position to talk about playground insults. There was also your spat in the AMD thread with another user.

I would recommend you drop the "bigger man" talk; there's enough glass on the floor already.

...but you ARE insufferable, so what's your point?

chuk_chuk said:
There was also your spat in the AMD thread with another user.
What spat? Are you talking about that nutella guy? The one who kicked off after I asked him what the performance penalty was for the Nvidia drivers that fixed the 3080 crashing, the penalty he was really complaining about? You think it was my fault he kicked off like that, knowing he had no idea what he was talking about? That guy? :confused: You're clutching at straws; just stop.

Look, I said something; you tried to counter my point with a benchmark but misread the details, which meant it didn't support your claim at all. It happens to us all. Accept it, move on, bore off and remove the stick from your arse. You might cheer up a bit. Or continue to act like a petulant child, your call :)
 
What people fail to understand is that most games that kill my GPU are not doing it via lack of VRAM.

So newer games will require a new GPU because the GPU can't cope, not because there isn't enough VRAM.

So worrying about VRAM is dumb unless you only have 4GB of it.
 
What people fail to understand is that most games that kill my GPU are not doing it via lack of VRAM.

So newer games will require a new GPU because the GPU can't cope, not because there isn't enough VRAM.

So worrying about VRAM is dumb unless you only have 4GB of it.

There is a lot of truth in that; we've seen it in previous generations. I am also assuming DirectStorage will hopefully have a positive impact on VRAM cache footprints, but in reality that's probably a long way off yet.
 
There is a lot of truth in that; we've seen it in previous generations. I am also assuming DirectStorage will hopefully have a positive impact on VRAM cache footprints, but in reality that's probably a long way off yet.

Support for the PC, I'm hearing, will be early 2021, possibly January; it needs to be integrated into Windows to start with. And to my knowledge Nvidia's RTX IO, which complements it, is already done. Then all you need is support in games. What makes me think that'll come fast is that the consoles are going to be pushing this new technology hard, so hopefully a lot of the multi-platform engines, the big players like Unreal and the like, will get upgrades pretty quickly, and then it's just up to games to implement it. My bet is we'll see games that support it in 2021.
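Purely as an illustration of why that matters (this is not the real DirectStorage API, just a sketch of the idea in Python with made-up names): the win games are chasing is issuing many asset reads concurrently and overlapping them with rendering, instead of blocking on one big load:

```python
# Illustrative sketch only: the core idea behind DirectStorage-style
# loading is overlapping many concurrent asset reads with other work,
# rather than blocking on a single big loading screen. Names are made up.
import asyncio
from concurrent.futures import ThreadPoolExecutor

def read_asset(path: str) -> bytes:
    with open(path, "rb") as f:
        return f.read()

async def stream_assets(paths: list[str]) -> dict[str, bytes]:
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor() as pool:
        # Kick off every read at once; the OS and SSD service them in parallel.
        tasks = {p: loop.run_in_executor(pool, read_asset, p) for p in paths}
        return {p: await t for p, t in tasks.items()}

# Usage (hypothetical files): while these reads are in flight, a game
# keeps rendering whatever is already resident instead of stalling.
# assets = asyncio.run(stream_assets(["rock.tex", "tree.mesh"]))
```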
 
I can tell you with certainty that 10GB is adequate - but only adequate, not more than enough, and frankly for an £800 upgrade I’d want more than enough, as 4K is fast becoming the standard people want as displays get cheaper.
I have no doubt in my mind that the 3080 of choice will be the 20GB version and that 10GB cards will be on eBay regularly once the 20GB version is released.
As a 2080 Ti Strix owner I wouldn’t buy a 10GB card - it’s that simple

Exactly, mate. Seems people like you and I are some of the few with enough common sense to see that the 3080 should have had more than 10GB. Thankfully, Nvidia agree and are releasing a 20GB version. Or get the 16GB 3070 if you want to save a few bucks.
 
Exactly, mate. Seems people like you and I are some of the few with enough common sense to see that the 3080 should have had more than 10GB. Thankfully, Nvidia agree and are releasing a 20GB version. Or get the 16GB 3070 if you want to save a few bucks.

Yes, it makes perfect sense for Nvidia to lump on 20GB of VRAM that nobody needs but everyone must pay for, when supply is already limited, further slowing production, all to satisfy people who know little about how VRAM works because a bigger number must be better.

I'm surprised Nvidia hasn't already sent you a job offer in the post.
 
Exactly, mate. Seems people like you and I are some of the few with enough common sense to see that the 3080 should have had more than 10GB. Thankfully, Nvidia agree and are releasing a 20GB version. Or get the 16GB 3070 if you want to save a few bucks.

I would expect a 16GB 3070 to get beaten by a 10GB 3080 in everything now. In a few years (which is really what this VRAM discussion is about), I bet the 10GB 3080 will still beat a 16GB 3070 in the vast majority of new games. There may be a game here or there coded in such a way that it gobbles up VRAM without hammering the GPU... those select few titles might put a 16GB 3070 ahead of a 3080, if and when the settings are carefully selected to get that outcome.
 
I would expect a 16GB 3070 to get beaten by a 10GB 3080 in everything now. In a few years (which is really what this VRAM discussion is about), I bet the 10GB 3080 will still beat a 16GB 3070 in the vast majority of new games. There may be a game here or there coded in such a way that it gobbles up VRAM without hammering the GPU... those select few titles might put a 16GB 3070 ahead of a 3080, if and when the settings are carefully selected to get that outcome.

By the time I need more than 10GB I will have bought another card anyway.
 