I have said since the start that it was enough at the time, but it isn't looking good moving forward. I don't know about you guys, but I 100% don't buy a £700+ (more like £1400) graphics card to turn details down after 18 months.
It always smelled of planned obsolescence to me.
And what about the fact that even GPUs with more than 10GB of VRAM are already having to reduce settings in order to meet certain PC gamers' standards? Were they planned obsolescence too?
It is exactly that: they never wanted to give 3080 owners a proper GA102 chip, so they nerfed it with a lower CUDA count and far less VRAM than it should have had. The 3070 we have now was really meant to be the 3080 on GA102, and the 3080 Super was meant to be the 3070 Ti, with more than 8GB of VRAM on it of course. They even nerfed the 3070s to 8GB for the same reason, and guess what: soon comes the real 3070 Ti with 16GB.
..
8GB VRAM is now 1080p high-end Ultra-level settings, 1440p medium to high.
10GB VRAM is now 1440p high-end Ultra-level settings, 4K medium to high.
That is for now; in next-gen games you will need to lower settings or resolution.
Can't say I have had a single VRAM issue at 4K yet on my 3080 where I have had to reduce settings to stay within VRAM limits. However, I have had to reduce settings because of a lack of grunt, especially RT grunt, e.g. Cyberpunk (unless I decide to run at DLSS Performance...). Of course, without DLSS/FSR there would be plenty of games where settings would have to be reduced right across the board, but then you would also have to do the same with a 6900 XT/3090 if not using DLSS/FSR either...
Without going over repetitive old ground, be prepared for pages of getting nowhere from the usual suspects.
The counter-argument which makes the most sense comes from people who say "well, by the time this could be an issue I will have already upgraded to the 40x0", which I think is hard to argue against, but at the same time that's not the chief audience who bought the card to last them a few years. Generally, if the issues do materialise in the second or third year of the generation's lifecycle, it's the people who upgrade less often who get punished. We will be able to flush this out, or not, when more games release. The problem is the cards are only a year old, and in all honesty the games released last year would not have been targeting this hardware.
Getting back to the point, which was not "debunked": why have Nvidia offered some cards with increased VRAM over the same models if it was never going to be an issue? To me this reads as what the card's spec should have aimed at from the beginning. So, to counter your answering a question with a question: why not just leave the VRAM at 10GB and fluff up everything else?
As another forumite posted:
Denial is not just a river in Egypt!
This is what some have been saying. Just because you moved off the 3080 in such a short timeframe doesn't mean the problem isn't inbound. You were the perfect test case, as you (like me) game on a 4K screen. When people post "oh, it's not an issue" but only game at 1440p, it's less likely to be an issue at that resolution. The card was marketed at 4K.
i.e. :
Not enough people have a 4K monitor yet. Wait until that sets in; unfortunately the displays are so expensive that most people are not on them yet.
Funny thing is... all these pages, and still no one, including you, has been able to post any of these supposed issues with the 3080's 10GB of VRAM with "evidence"...
If it is such a problem, surely there must be something you can post to show this supposed big issue with the 10GB of VRAM? We have only had one legitimate case; I can't remember who it was, but he said he encounters a VRAM issue because of his VR headset, which has some crazy-high resolution of like 16K IIRC, so not exactly normal gaming usage. Although he did go on to say he has to reduce settings anyway on his other high-end GPUs because of a lack of grunt, so it was a bit of a moot point.
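For anyone actually wanting to capture that kind of evidence rather than argue from memory, a minimal sketch below logs per-GPU VRAM usage via `nvidia-smi` (which ships with the NVIDIA driver). The `parse_vram_csv` helper name is my own; the query flags are standard `nvidia-smi` options, but treat this as a sketch, not a definitive tool.

```python
import subprocess

def parse_vram_csv(csv_text):
    """Parse the CSV output of
    `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits`
    into a list of (used_mib, total_mib) tuples, one per GPU."""
    readings = []
    for line in csv_text.strip().splitlines():
        used, total = (int(v.strip()) for v in line.split(","))
        readings.append((used, total))
    return readings

def log_vram():
    # Assumes nvidia-smi is on the PATH (installed with the NVIDIA driver).
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    for i, (used, total) in enumerate(parse_vram_csv(out)):
        print(f"GPU {i}: {used} MiB / {total} MiB ({100 * used / total:.0f}%)")

if __name__ == "__main__":
    log_vram()
```

Run it in a loop while gaming and you have an actual usage log to post, though note that "allocated" VRAM reported this way is not the same as VRAM a game strictly needs.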
As for why Nvidia have released a 12GB model... perhaps because they are a company that wants to make as much money as possible? Shocker, I know. Also, Nvidia like to saturate the market with a card for every performance and price sector; this is pretty common business practice, not to mention it also means they dominate benchmark scoreboards. Just look at the range of 3060 options and the 3070 options incoming.
Again, still waiting on the below:
Has there been any proof to show that the extra 2GB of VRAM is actually benefiting the 3080 12GB (outside of the "overall" specs being better than the 10GB version...)? If so, please post some links.
Also, I'm not sure anyone has claimed it will "never" be an issue; there will come a time when it is, but by then every GPU will be reducing settings for one reason or another. This thread originally set out with people saying/expecting it to be an issue within the year, yet here we are, still without issues after 1 year and 9 months (?). Again, feel free to post something that shows otherwise.
Out of interest, what cards do you consider to be "4K capable"? Bearing in mind a 3080 is still regarded by most reviewers as better than a 6800 XT for 4K (probably largely because of DLSS support, though...).
EDIT:
Also, at the time it was not physically possible for Nvidia to provide a 12GB GDDR6X 3080, and whatever else they could have provided with higher VRAM would have cost more than £650. Not to mention you're talking about 1 year and 9 months ago... Given the reviews of the 3080 12GB model, the majority of comments seem happy with the saving/cost of their 3080 10GB in comparison...