So have a second model with more RAM then? Surely you realize they are using 10GB because they know 10GB will last two years at most, and they are using that to force people to upgrade. I want to keep textures and anisotropic filtering at max and lower the other settings, so that at 4K the game textures are as sharp and shiny as they can be but I can keep my FPS pretty high.
And how much does RAM cost? How much would it really cost to give 12GB instead of 10GB? Really, how much can it add? Also, just to be clear on my point: I ONLY upgrade when a product doubles my FPS. What I do NOT want is to be forced to upgrade by a VRAM shortage in the latest games. I prefer to trade other settings for the highest texture modes. One more point I must add: just wait until these 10GB cards are hit by the high-resolution texture packs in 2021 games, the same way Fallout 4, Shadow of Mordor, and Shadow of War were enhanced. GTAV Enhanced is coming too!
It's not like Nvidia haven't done this in the past. For various technical reasons you're limited in the vRAM configs you can have on a card, and we've seen cards in the past which ideally need about 4GB but can only pick 3GB or 6GB. In those circumstances Nvidia bit the bullet and just released two variants: the smaller vRAM size, which obviously has a bottleneck, and a 6GB variant that doesn't but is more expensive. Such situations are rare, though; normally the architecture is planned ahead so you don't end up in that kind of bad spot.
That said, it's not a foregone conclusion that the 3080 will need more RAM; this entire thread has been about arguing for and against that case. There's at least some evidence that 10GB might be enough: games out now that already have GPU bottlenecks, where the GPU itself becomes overloaded before the vRAM runs out. We see that with FS2020, Avengers, Watch Dogs Legion, and even the new CoD. If you ignore DLSS and want to run that game at 4K native with RT effects, you're going to find yourself at about 20FPS on the 6800XT or about 45FPS on the 3080, but not exceeding your vRAM budget.
The problem with vRAM is that you cannot just add however much you like. It seems intuitive that you could simply put 12×1GB modules on the card if you want 12GB of vRAM, but you can't do that. The architecture of the card splits a fixed memory bus width between a set of memory modules, and for complicated reasons this restricts you to certain multiples of certain module sizes. So, as with the example earlier of older Nvidia cards, there were really only two sensible options, 3GB or 6GB; if the ideal amount is, say, 4GB, then you're kind of stuffed: you either go too small and cheap, or too big and expensive. In that case they offered both and let the user pick, and the cards had distinctly different prices.
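That relationship between bus width and allowed capacities can be sketched roughly. This is a simplified model (it ignores clamshell mode and mixed-density layouts, and `possible_vram_configs` is a hypothetical helper, not anything from a real tool): each GDDR chip typically hangs off a 32-bit slice of the bus, and chips come in a few fixed sizes, so the bus width dictates the chip count and the total can only take a few values.

```python
# Simplified sketch: one GDDR chip per 32-bit channel, chips available
# only in fixed power-of-two capacities. Real boards have more options
# (e.g. clamshell mode doubles the chip count), but the principle holds.

def possible_vram_configs(bus_width_bits, chip_sizes_gb=(0.5, 1, 2)):
    """Hypothetical helper: vRAM totals a given bus width allows."""
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    return [chips * size for size in chip_sizes_gb]

# A 192-bit bus (e.g. the old GTX 1060) drives 6 chips,
# so 3GB or 6GB were the realistic options:
print(possible_vram_configs(192))   # [3.0, 6, 12]

# The 3080's 320-bit bus drives 10 chips: 10GB or 20GB, nothing between.
print(possible_vram_configs(320))   # [5.0, 10, 20]
```

This is why a "12GB 3080" wasn't on the table without redesigning the memory bus itself.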
For the GDDR6 which the AMD cards use, it's about $11.69 per 1GB module, so 16GB of that is $187.04, and that's just the raw cost to the AIBs. They of course need to make their 40% profit margin (or whatever their real margin is), so it's really going to be marked up to something like $262. And that's only the 14Gbps memory; the AMD cards use slightly faster memory which is probably more expensive again, but it's hard to get an accurate cost on it. Then of course Nvidia use GDDR6X, which we don't have good numbers on so far, because Nvidia sell it themselves and only sell GPU+memory bundles to the AIBs, so the real cost isn't known; we do know it's harder to manufacture and more expensive. Let's just pick $15 out of the air as the per-1GB module cost, which is probably in the ballpark. A 20GB variant of the 3080 would be 10GB more, so 10 × $15 plus the AIB markup comes to something like $210 of additional cost.
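The arithmetic above can be laid out as a quick back-of-envelope script. The $11.69/GB GDDR6 price and the 40% AIB margin are this thread's working assumptions, and the $15/GB GDDR6X figure is explicitly a guess, not a confirmed number:

```python
# Back-of-envelope vRAM cost estimate. All prices and the margin are
# assumptions from the discussion, not confirmed supplier figures.

def marked_up_cost(price_per_gb, gigabytes, margin=0.40):
    """Return (raw cost to AIB, cost after AIB margin)."""
    raw = price_per_gb * gigabytes
    return raw, raw * (1 + margin)

raw, retail = marked_up_cost(11.69, 16)       # 16GB of 14Gbps GDDR6
print(f"16GB GDDR6: ${raw:.2f} raw, ~${retail:.0f} after markup")
# -> 16GB GDDR6: $187.04 raw, ~$262 after markup

raw, retail = marked_up_cost(15.00, 10)       # extra 10GB for a 20GB 3080
print(f"extra 10GB GDDR6X: ${raw:.2f} raw, ~${retail:.0f} after markup")
# -> extra 10GB GDDR6X: $150.00 raw, ~$210 after markup
```

Even with rough inputs, the point stands: doubling the 3080's vRAM adds a triple-digit sum to the retail price.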
The bottom line is that vRAM is a very specialist type of RAM: it's faster than system RAM and expensive to produce. You don't want to put any more vRAM on your card than you absolutely need, because it drives up the cost of the card significantly, and it's the consumers who ultimately pay for it.