I do find it interesting how the narrative is subtly and slowly changing from where it started at "10GB VRAM is enough for this generation, period" to statements like: "Issues won't exist with most games. At most you won't be able to run the higher texture settings."
10GB VRAM is enough for this generation, period.
Ok, I don't know who you think said that, but that was not the narrative at all, and certainly not the general consensus. Depending on who you listen to, it's either a) 10GB isn't enough and Nvidia are a laughing stock, or b) we don't know, but it hasn't proven to be an issue YET.
Remember, this thread started by asking a question, and answers of "it's not enough because (x game)" were being given without proof. It wasn't until other people started looking into these games that holes started appearing in that argument. If anybody is getting the impression that I, or princessFrosty, or anybody else who's taken the time to do at least some investigation (and in all fairness, the majority of it was Frosty - he's got way more patience than I have with the abuse he's been getting), thinks that 10GB will always be enough, then you have got the wrong impression. Stop thinking that. What we have said is that, so far, there's no indication that any of the games given as examples are reasonable examples of a game that genuinely a) needs more than 10GB of VRAM and b) actually makes use of it.
Take this latest example, Watch Dogs at 1080p. People are going nuts about the 3070 tanking at 1080p, and yes, it really does... but by the looks of it, so does the 11GB 2080 Ti at the same resolution using the same settings. So is that a genuine VRAM-limited scenario, or is it just an as-per-usual terribly optimised game? We need more data before making any kind of conclusive decision.
RichDog said:
Do I think that a flagship card of a new generation is going to have enough power to max a game of this or the new generation? If not... don't you think it would be a bit... weird? At 1440p at least, it should be achievable to fully max any game at Ultra. 4K 60fps should be achievable in the majority of well-developed and optimised games.
You've asked a question along these lines already and I've answered it once. There are already games that the 3080 can't manage 4K 60fps on, and not because of VRAM. Where's all the fuss about that? Why are people bothered about turning down the texture setting one notch (especially in cases like Doom Eternal, where it makes ZERO visual difference...) while anything else gets a pass and gets ignored? That doesn't make any sense to me. What's worse is that this isn't even unique: I can't remember a time when a new card has ploughed through everything at max settings. There's always been a game or two that drops the average FPS below 60 at launch. This is, or was, perfectly normal. But suddenly now it isn't? What's changed then? Damned if I know...