10GB vram enough for the 3080? Discuss..

I have said since the start that it was enough at the time, but it isn't looking good moving forward. I don't know about you guys, but I 100% don't buy a £700+ (more like £1,400) graphics card to turn details down after 18 months.

It always smelled of planned obsolescence to me.


It is exactly that. They never wanted to give 3080 owners a GA102 chip, so they nerfed it to a lower CUDA count and far less VRAM than it should have had. The real 3080 was meant to be on GA104, the chip the 3070 we have now uses, and the 3080 Super was meant to be what is now the 3070 Ti, with more than 8GB of VRAM on them of course. They even nerfed the 3070s to 8GB for the same reason, and guess what, soon comes the real 3070 Ti with 16GB :rolleyes::cry:


8GB of VRAM is now good for Ultra settings at 1080p and medium to high at 1440p.

10GB of VRAM is now good for Ultra settings at 1440p and medium to high at 4K.

That is for now; for next-gen games you will need to lower settings or resolution.
 
The problems with the arguments people are making are twofold, and both stem from them being based on tech media reviews.

The first problem is that reviewers tend to test the same sample of games as each other, and those tend to be well-optimised titles. I have even seen reviewers state that they deliberately avoid badly optimised games in their review benches, which makes no sense to me, as it paints a false picture of reality. We don't all just play the big, well-optimised blockbuster PC shooters; some of us play games from developers who release bad port after bad port.

The second problem is that they are not measuring image quality. In addition, DF have now tweeted that there is a problem with the review industry using tools which fail to pick up certain stutters.

The number of times I have seen reviewers fail to pick up obvious problems in games (and hardware) is unreal.

Without going over repetitive old ground, be prepared for pages of getting nowhere from the usual suspects.

The counter-argument that makes the most sense comes from people who say 'well, by the time this could be an issue I will have already upgraded to the 40x0', which I think is hard to argue against, but they are not the chief audience who bought the card to last them a few years. Generally, if the issues do materialise in the second or third year of the generation's lifecycle, it's the people who upgrade less often who get punished. We will be able to flush this out, or not, when more games release. The problem has been that the cards are only a year old, and the games released last year would not have been targeting this hardware, in all honesty.

Getting back to the point, which was not 'debunked': why have Nvidia offered some cards with increased VRAM over the same models if it was never going to be an issue? To me this reads as what the spec of the card should have been aiming for from the beginning. So to counter your answering a question with a question: why not just leave the VRAM at 10GB and fluff up anything else?

As another forumite posted:

Denial is not just a river in Egypt! ;) :p

The VRAM issue, IMO, is only an issue for people who intend to keep the card for a very long time and for those who are inflexible and find the thought of tweaking settings unimaginable. Those people can buy a 16GB or 24GB card, though even with those there will be situations where you need to tweak settings. With an AMD card, for example, you will need to tweak RT settings.

This is what some have been saying. Just because you moved off the 3080 in such a short timeframe doesn't mean the problem isn't inbound. You were the perfect test case as you (like me) game on a 4K screen. When people post 'oh, it's not an issue' but only game at 1440p, it's less likely to be an issue at that resolution. The card was marketed at 4K.

i.e. :
Not saying the issue doesn't exist although I can't say I've ever had to deal with it at 1440p.

Not enough people have a 4K monitor yet. Wait until that sets in; unfortunately the displays are so expensive that most are not on them yet.
 
I have said since the start that it was enough at the time, but it isn't looking good moving forward. I don't know about you guys, but I 100% don't buy a £700+ (more like £1,400) graphics card to turn details down after 18 months.

It always smelled of planned obsolescence to me.

And what about the fact that even GPUs with more than 10GB of VRAM are already having to reduce settings in order to meet certain PC gamers' standards? Were they planned obsolescence too?

It is exactly that. They never wanted to give 3080 owners a GA102 chip, so they nerfed it to a lower CUDA count and far less VRAM than it should have had. The real 3080 was meant to be on GA104, the chip the 3070 we have now uses, and the 3080 Super was meant to be what is now the 3070 Ti, with more than 8GB of VRAM on them of course. They even nerfed the 3070s to 8GB for the same reason, and guess what, soon comes the real 3070 Ti with 16GB :rolleyes::cry:


8GB of VRAM is now good for Ultra settings at 1080p and medium to high at 1440p.

10GB of VRAM is now good for Ultra settings at 1440p and medium to high at 4K.

That is for now; for next-gen games you will need to lower settings or resolution.

Can't say I have had a single issue with VRAM at 4K yet on my 3080 where I have had to reduce my settings to stay within VRAM limits; however, I have had to reduce settings because of a lack of grunt, especially RT grunt, e.g. Cyberpunk (unless I decide to run at DLSS Performance...). Of course, without DLSS/FSR there would be plenty of games where settings would have to be reduced right across the board, but then you would also have to do the same with a 6900 XT/3090 if not using DLSS/FSR either.
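For anyone who wants to back this kind of claim with numbers rather than gut feel, here is a minimal sketch of how VRAM usage could be logged while a game runs. It assumes the third-party pynvml package (NVIDIA's NVML bindings), and note that NVML reports memory allocated, which games often over-allocate, so it is an upper bound rather than proof the card actually needs that much:

```python
# Minimal sketch: log VRAM allocation on GPU 0 every few seconds while a game runs.
# Assumes the third-party 'pynvml' package is installed.
# NVML reports *allocated* memory, not the true working set, so treat it as an upper bound.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU, e.g. the 3080

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM allocated: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")
        time.sleep(5)  # sample every 5 seconds while playing
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```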

Without going over repetitive old ground, be prepared for pages of getting nowhere from the usual suspects.

The counter-argument that makes the most sense comes from people who say 'well, by the time this could be an issue I will have already upgraded to the 40x0', which I think is hard to argue against, but they are not the chief audience who bought the card to last them a few years. Generally, if the issues do materialise in the second or third year of the generation's lifecycle, it's the people who upgrade less often who get punished. We will be able to flush this out, or not, when more games release. The problem has been that the cards are only a year old, and the games released last year would not have been targeting this hardware, in all honesty.

Getting back to the point, which was not 'debunked': why have Nvidia offered some cards with increased VRAM over the same models if it was never going to be an issue? To me this reads as what the spec of the card should have been aiming for from the beginning. So to counter your answering a question with a question: why not just leave the VRAM at 10GB and fluff up anything else?

As another forumite posted:

Denial is not just a river in Egypt! ;) :p

This is what some have been saying. Just because you moved off the 3080 in such a short timeframe doesn't mean the problem isn't inbound. You were the perfect test case as you (like me) game on a 4K screen. When people post 'oh, it's not an issue' but only game at 1440p, it's less likely to be an issue at that resolution. The card was marketed at 4K.

i.e. :


Not enough people have a 4K monitor yet. Wait until that sets in; unfortunately the displays are so expensive that most are not on them yet.

Funny thing is... all these pages and yet still no one, including you, has been able to post any of these supposed issues with the 3080's 10GB of VRAM with "evidence"... :cry:

If it is such a problem, surely there must be something you can post to show this supposed big issue with the 10GB of VRAM? We have only had one legitimate case. I can't remember who it was, but he was saying he encounters a VRAM issue because of his VR headset, which has some crazy high resolution of around 16k IIRC, so not exactly normal gaming usage, although he did go on to say he has to reduce settings anyway on his other high-end GPUs because of the lack of grunt, so it was a bit of a moot point.

As for why Nvidia have released a 12GB model... perhaps because they are a company that wants to make as much money as possible? Shocker, I know. Also, Nvidia like to saturate the market with a card for every performance and price sector; it's a pretty common business practice, not to mention it also means they dominate benchmark scoreboards. Just look at the 3060 choices and the 3070 choices incoming.

Again, still waiting on the below???

Has there been any proof to show that the extra 2GB of VRAM is actually benefitting the 3080 12GB (outside of the "overall" specs being better than the 10GB version)? If so, please post some links :)

Also, I'm not sure anyone has made the claim of it "never" being an issue. There will come a time when it becomes an issue, but by then every GPU is going to be reducing settings for one reason or another. This thread originally set out with people saying/expecting it to be an issue within the year, yet here we are, still not causing any issues after 1 year and 9 months (?). Again, feel free to post something that shows otherwise.

Out of interest, what cards do you consider to be "4K" capable? Bearing in mind a 3080 is still regarded as better than a 6800 XT for 4K by most reviewers (probably largely because of DLSS support, though...).

EDIT:

Also, at the time it was not physically possible for Nvidia to provide a 12GB GDDR6X 3080, and whatever else they could have provided with higher VRAM would have cost more than £650. Not to mention, you're talking about 1 year and 9 months ago... given the reviews of the 3080 12GB model, the majority of comments seem to like the saving/cost of their 3080 10GB in comparison.
 
And what about the fact that even GPUs with more than 10GB of VRAM are already having to reduce settings in order to meet certain PC gamers' standards? Were they planned obsolescence too?

Are you saying putting 16GB of VRAM on a card is the same thing? One is the card ageing; the other is deliberately short-stacking the memory (Nvidia called the 3080 their flagship during the announcement, and it had 1GB less than the previous two flagships, the 1080 Ti and 2080 Ti). How often do cards at the same tier regress in their amount of VRAM?
 
still not causing any issues after 1 year and 9 months (?). Again, feel free to post something that shows otherwise.
...you're talking about 1 year and 9 months ago

Time to debunk. As your famously strange self-wizardry skills should tell you... a quick check shows the card was released on 17 September 2020. To date, let's call it 17 Jan 2022, which is 16 months on from being usable by regular folk. That is not 1 year 9 months. Still not causing issues, but let's add on another 5 months, shall we!! :p
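As a quick sanity check on that date maths, a minimal sketch (the dates are taken from the post above; the third-party python-dateutil package is assumed):

```python
# Sketch: months between the 3080 launch and the date used in the post above.
# Assumes the third-party 'python-dateutil' package for relativedelta.
from datetime import date
from dateutil.relativedelta import relativedelta

launch = date(2020, 9, 17)  # RTX 3080 release date
today = date(2022, 1, 17)   # "let's call it 17 Jan 2022"

delta = relativedelta(today, launch)
print(delta.years * 12 + delta.months)  # 16 months, not the claimed 21 (1 year 9 months)
```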

Is this the same twisted time distortion easy rider has when he says he had the 3060 Ti, or whatever it was, for two years!! :rolleyes: :cry:
 
Are you saying putting 16GB of VRAM is the same? One is the card aging, one is deliberately short stacking the memory (Nvidia called the 3080 their flagship during the announcement and it had 1GB less than the previous 2 flagships in the 1080Ti and 2080Ti). How often do the same level of cards regress in their amount of VRAM?

AMD did the same when going from the Radeon R9 390X with 8 GB of GDDR5 to the Radeon R9 Fury X with 4 GB of HBM.

Nvidia's choice of GDDR6X introduced additional obsolescence and a terrible lifecycle-performance compromise.

AMD cleverly escaped this time by pairing slower GDDR6 with a next-level 128 MB on-die cache called Infinity Cache.
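To make the cache point concrete, here is a minimal back-of-the-envelope sketch of the usual effective-bandwidth approximation for a cached memory system. Only the raw GDDR bandwidth figures are the cards' actual specs; the hit rate and cache bandwidth values are illustrative placeholders, not AMD's published numbers:

```python
# Sketch of the standard effective-bandwidth approximation for a cached memory system:
#   effective = hit_rate * cache_bandwidth + (1 - hit_rate) * dram_bandwidth
# The raw DRAM figures below are the real card specs; HIT_RATE and CACHE_BW are
# illustrative placeholders only (actual Infinity Cache hit rates vary by game/resolution).

def effective_bandwidth_gbs(hit_rate: float, cache_bw_gbs: float, dram_bw_gbs: float) -> float:
    return hit_rate * cache_bw_gbs + (1.0 - hit_rate) * dram_bw_gbs

RTX_3080_GDDR6X = 760.0   # GB/s: 19 Gbps GDDR6X over a 320-bit bus
RX_6800XT_GDDR6 = 512.0   # GB/s: 16 Gbps GDDR6 over a 256-bit bus

HIT_RATE = 0.55           # placeholder 4K hit rate for the 128 MB Infinity Cache
CACHE_BW = 1600.0         # placeholder on-die cache bandwidth in GB/s

print(f"6800 XT effective: ~{effective_bandwidth_gbs(HIT_RATE, CACHE_BW, RX_6800XT_GDDR6):.0f} GB/s "
      f"vs 3080 raw: {RTX_3080_GDDR6X:.0f} GB/s")  # ~1110 GB/s vs 760 GB/s
```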
 
Are you saying putting 16GB of VRAM is the same? One is the card aging, one is deliberately short stacking the memory (Nvidia called the 3080 their flagship during the announcement and it had 1GB less than the previous 2 flagships in the 1080Ti and 2080Ti). How often do the same level of cards regress in their amount of VRAM?

No, what I am saying is that VRAM and grunt go hand in hand; it's pointless having loads of VRAM and not enough grunt, and vice versa.

What mesai said is exactly what will happen:

Games are going to get more demanding, especially with the new engines, so people playing at higher resolutions will need to scale back settings to find playable FPS, which, funnily enough, will reduce VRAM demand.

We are already seeing current gen gpus having to reduce settings because of not having enough grunt, especially in the ray tracing space.

I always found the "last-gen cards had more VRAM" argument bizarre too... correct me if I am wrong, but a 3080 obliterates a 1080 Ti and comfortably beats a 2080 Ti (even the highly overclocked ones), not to mention it has superior ray tracing performance... If offered, I don't think you'll find a single 1080 Ti/2080 Ti owner turning down a free upgrade to a 3080 because it has 1GB less VRAM.

Time to debunk. As your famously strange self-wizardry skills should tell you... a quick check shows the card was released on 17 September 2020. To date, let's call it 17 Jan 2022, which is 16 months on from being usable by regular folk. That is not 1 year 9 months. Still not causing issues, but let's add on another 5 months, shall we!! :p

Is this the same twisted time distortion easy rider has when he says he had the 3060 Ti, or whatever it was, for two years!! :rolleyes: :cry:

If only you would go to the same effort for your VRAM debate... maybe because there is nothing out there to back those claims up... :cry: :D

And no, it was a simple miscount. Having a look at upcoming games, though, I can't see anything that will cause VRAM-specific issues on the 3080, but there are plenty of ray tracing/intensive titles that will test the grunt of all cards ;)

Would love to see this addressed too:

Has there been any proof to show that the extra 2GB of VRAM is actually benefitting the 3080 12GB (outside of the "overall" specs being better than the 10GB version)? If so, please post some links :)
 
We are already seeing current gen gpus having to reduce settings because of not having enough grunt, especially in the ray tracing space.

I always found the "last-gen cards had more VRAM" argument bizarre too... correct me if I am wrong, but a 3080 obliterates a 1080 Ti and comfortably beats a 2080 Ti (even the highly overclocked ones), not to mention it has superior ray tracing performance... If offered, I don't think you'll find a single 1080 Ti/2080 Ti owner turning down a free upgrade to a 3080 because it has 1GB less VRAM.

No.
Such shenanigans do not apply when users leave their fanboyism and brand loyalty toward Nvidia behind and instead buy the better Radeon RX 6800 XT 16 GB.
 
No.
Such shenanigans do not apply when users leave their fanboyism and brand loyalty toward Nvidia behind and instead buy the better Radeon RX 6800 XT 16 GB.

Yeah, but that 6GB of extra VRAM would have required me to drop an extra couple of hundred pounds. It has basic RT and only basic scaling tech (it'll also be limited when it comes to DL), which matters for longevity.

More money for a card that’ll have a shorter lifespan ;)
 
AMD did the same when going from the Radeon R9 390X with 8 GB of GDDR5 to the Radeon R9 Fury X with 4 GB of HBM.

Nvidia's choice of GDDR6X introduced additional obsolescence and a terrible lifecycle-performance compromise.

AMD cleverly escaped this time by pairing slower GDDR6 with a next-level 128 MB on-die cache called Infinity Cache.

This was massively slated at the time too, and it was a limitation of HBM back then.

No, what I am saying is that VRAM and grunt go hand in hand; it's pointless having loads of VRAM and not enough grunt, and vice versa.

What mesai said is exactly what will happen:



We are already seeing current gen gpus having to reduce settings because of not having enough grunt, especially in the ray tracing space.

I always found the "last-gen cards had more VRAM" argument bizarre too... correct me if I am wrong, but a 3080 obliterates a 1080 Ti and comfortably beats a 2080 Ti (even the highly overclocked ones), not to mention it has superior ray tracing performance... If offered, I don't think you'll find a single 1080 Ti/2080 Ti owner turning down a free upgrade to a 3080 because it has 1GB less VRAM.



If only you would go to the same effort for your VRAM debate... maybe because there is nothing out there to back those claims up... :cry: :D

And no, it was a simple miscount. Having a look at upcoming games, though, I can't see anything that will cause VRAM-specific issues on the 3080, but there are plenty of ray tracing/intensive titles that will test the grunt of all cards ;)

Would love to see this addressed too:

I have never said the 3080 isn't the faster card, and I am not disputing that you need both grunt and VRAM for a properly balanced card, but that means either the 1080 Ti/2080 Ti were both given more VRAM than needed and shouldn't have had that much, or the 3080 was given a much quicker core but less memory, upsetting the balance of GPU power versus memory. Either Nvidia got it wrong with Pascal and Turing, or they got it wrong with Ampere.

There was one primary reason to give it 10GB: money. They could easily have made it 12GB from the off, and this question would never have arisen.
 
It is exactly that. They never wanted to give 3080 owners a GA102 chip, so they nerfed it to a lower CUDA count and far less VRAM than it should have had. The real 3080 was meant to be on GA104, the chip the 3070 we have now uses, and the 3080 Super was meant to be what is now the 3070 Ti, with more than 8GB of VRAM on them of course. They even nerfed the 3070s to 8GB for the same reason, and guess what, soon comes the real 3070 Ti with 16GB :rolleyes::cry:


8GB of VRAM is now good for Ultra settings at 1080p and medium to high at 1440p.

10GB of VRAM is now good for Ultra settings at 1440p and medium to high at 4K.

That is for now; for next-gen games you will need to lower settings or resolution.
All of the current top cards, including the 3090, will need to lower settings by the time the 4000 series is out to run next-gen AAA games above 60fps, regardless of VRAM. Can a Titan RTX run all the new games above 4K 60fps?
 
Yeah, but that 6GB of extra VRAM would have required me to drop an extra couple of hundred pounds. It has basic RT and only basic scaling tech (it'll also be limited when it comes to DL), which matters for longevity.

More money for a card that’ll have a shorter lifespan ;)

Wouldn't bother wasting time and effort replying to 4k; the guy is clueless and a massive AMD fan. Just look at his post history and check out the "image quality" comparisons he did in CS:S... :cry: :o Only person I have on ignore.

Fully agree with you though; it matches it on the whole in rasterization titles:

- having access to DLSS, NIS and FSR, and eventually Intel's upscaler, is quite the advantage
- superior RT performance, which is in 90% of the games and is the future

Amongst many other little advantages.

IMO, this is the first time Nvidia's current gen will age better than AMD's current-gen GPUs; at least we can say that for certain when it comes to ray tracing titles anyway.

Basically every thread here involves people who don’t have the thing, telling people who do have the thing, how bad the thing is.

Mostly this.

Despite every review, comparison and tech video out there as well as various user posts, a couple of people still insist that there is a problem yet somehow still can't manage to post anything to prove it :o

So debunked then. Better call gpuerrilla.. glad we cleared that one up! ;) :D

Still not coming back to any of those other points, I see. Keep on being you, gpuerrilla :cry:

This was massively slated at the time too, and it was a limitation of HBM back then.

I have never said the 3080 isn't the faster card, and I am not disputing that you need both grunt and VRAM for a properly balanced card, but that means either the 1080 Ti/2080 Ti were both given more VRAM than needed and shouldn't have had that much, or the 3080 was given a much quicker core but less memory, upsetting the balance of GPU power versus memory. Either Nvidia got it wrong with Pascal and Turing, or they got it wrong with Ampere.

There was one primary reason to give it 10GB: money. They could easily have made it 12GB from the off, and this question would never have arisen.

I remember those Fury X days well; some people's posts/viewpoints from back then would go down well if they were brought up now... given their views on the 10GB 3080 ;) :cry:

Money and the limitations of the tech were the main reasons we didn't get more VRAM on the 3080. Again, it was not possible for them to put more than 10GB of GDDR6X on the 3080 as configured; they could only have doubled it to 20GB of GDDR6X or gone with 16GB of GDDR6 (not even sure if that would have been possible given contract and/or supply conflicts with AMD?). Chances are, if they had gone with either of those two options, the 3080 would have cost far more than it did, in which case would people still have regarded it as the king of value? IMO, not a chance. £650 was already far more than what I and most were willing to pay; no way I would have paid an extra £100 or more just to get extra VRAM.
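As a rough sketch of why the GDDR6X options come in those fixed steps, assuming the 1GB (8Gb) modules that were the only GDDR6X density available at the 3080's launch: each chip sits on a 32-bit channel, so capacity is set by bus width, chip density and whether the chips are mounted clamshell.

```python
# Sketch: possible GDDR6X capacities for a given memory bus, assuming 1GB (8Gb) chips,
# the only GDDR6X density available at the 3080's launch.
# Each GDDR6X chip uses a 32-bit interface; clamshell mounts two chips per channel.

def vram_options_gb(bus_width_bits: int, chip_gb: int = 1) -> tuple[int, int]:
    chips = bus_width_bits // 32            # one chip per 32-bit channel
    return chips * chip_gb, chips * chip_gb * 2  # (normal, clamshell)

print(vram_options_gb(320))  # 3080's 320-bit bus -> (10, 20)
print(vram_options_gb(384))  # full GA102 384-bit bus -> (12, 24), as on the later 3080 12GB
```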

All of the current top cards, including the 3090, will need to lower settings by the time the 4000 series is out to run next-gen AAA games above 60fps, regardless of VRAM. Can a Titan RTX run all the new games above 4K 60fps?

Precisely this.

And I bet you the reason people will upgrade will largely be for the sheer extra grunt/performance especially around ray tracing and not just for more vram....
 
This is what some have been saying. Just because you moved off the 3080 in such a short timeframe doesn't mean the problem isn't inbound. You were the perfect test case as you (like me) game on a 4K screen. When people post 'oh, it's not an issue' but only game at 1440p, it's less likely to be an issue at that resolution. The card was marketed at 4K.

To be fair though, 4K is a moving target. The Fury X was also marketed as a 4K card, no?

Problems are always inbound, whether it's VRAM, raster or RT performance.
 