I don't know what the sweet spot is, but I'd guess around the 12GB mark would be the minimum to get away with ~4K resolutions at the current moment in time. If you plan on buying a card now that will last as long as possible then you need headroom, which is probably why Nvidia are now shipping the 20GB 3080 to cover that base.
Nothing at 4K right now needs 12GB of vRAM. Or, when actually measured properly (I'll come back to this), needs 10GB either.
I'd go with that: 12GB for gaming in 2021 and 2022. 12GB would be the minimum now, as we just don't know how games will use more and more textures and RAM, so yeah, this is why I'm really surprised by the 3080. It had to be down to cost, to keep the "RRP" (which is meaningless anyway) down... can't think of a single other reason to only put 10GB on a state-of-the-art card that is the pinnacle of gaming lol... hey ho, I don't care, not buying one anyway.
Well, the reason not to put more vRAM on the card is that it cannot make use of it for gaming. Any game that comes anywhere close to the 10GB vRAM limit is completely unplayable, because you run into a GPU bottleneck long before you hit 10GB. Putting more memory on the card than the GPU can realistically use costs more money and provides no benefit for gaming.
Check the VRAM on this at 1440p. This is just in the middle of space; cruising above a planet surface the VRAM overspills. I didn't have the OSD running in the second video, but it's pretty rubber-bandy on any planet surface.
Overspills what? Not 8GB of vRAM.
The problem is getting an accurate gauge on actual VRAM utilisation without identically performing cards that differ only in VRAM amount (or some really in-depth software analysis).
As I've mentioned before: using modded Skyrim on both a GTX 780 and a 970, I had to load it up until the VRAM use shown in the overlay exceeded 5GB (despite those being 3GB and "4GB"/3.5GB cards) before there was any performance impact at all. Other games will hit a wall the moment you go a few MB over your GPU's VRAM amount, and others like Battlefront will happily allocate 90% of your VRAM (up to a certain size) and sit at that amount almost constantly without actually using more than about 2.8GB at 1440p.
EDIT: Oh I see you are replying to someone on my ignore list - probably why they are talking nonsense.
We've finally got a good way to measure actual vRAM usage now. It was mentioned in the "is 10GB enough" thread and it's worth repeating here: the latest beta of MSI Afterburner can now look at actual process-specific memory usage, and the instructions for enabling this OSD are here:
https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/
The bottom line is that almost all the measurement tools we've been using to inspect vRAM usage have been measuring only what the engine has demanded from the video card in terms of allocation (how much memory is reserved exclusively for that process), but this doesn't tell us how much of that allocated memory is actually used by the game engine. Prior to this you either needed to rely on more accurate in-game dev tools, which most games don't have (or you can't get easy access to), or on a more recent application called Special K, which hooks the game files to inspect what's going on and has limited support and a bunch of other problems.
Anyway, the bottom line is that actual memory usage is always lower than what is allocated, often by a lot. The difference can be as much as 3GB in the case of something like FS2020, where the 12.5GB at 4K Ultra originally measured in benchmarks is in reality about 9.5GB. This would almost certainly be the reason modded Skyrim didn't take a performance hit when exceeding the card's vRAM. And as you've rightly pointed out, memory allocation is often done somewhat arbitrarily; a developer may just request 90% of the available vRAM in a system, so a 24GB 3090 might report something like 21GB of vRAM in "use", which sounds shockingly high, but in reality the game may only need 5-6GB.
So it's worth taking the examples of high memory usage above with a pinch of salt. What those people need to do is visit that link, upgrade MSI Afterburner to the new beta, and enable both vRAM usage as normally measured and the "GPU Dedicated Memory Usage \ Process" counter for comparison, to see what is actually in use.
Here is what my quick first test showed with my GTX 1080 on Wolfenstein II, with maxed settings at 4K (minus motion blur, and using SMAA (TX1)): 6.5GB allocated but only 5.4GB in use, a delta of 1.1GB.
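If you'd rather script a sanity check than read the OSD, here's a minimal sketch that reads per-process dedicated VRAM from Windows 10's "GPU Process Memory" performance counters (the same data Task Manager's GPU tab draws from) via the typeperf CLI. Assumptions on my part: the instance-name format shown in the comments, and that this counter lines up with Afterburner's per-process figure; I haven't verified that it matches exactly, so treat it as a rough cross-check rather than a definitive measurement.

```python
# Rough cross-check of per-process dedicated VRAM on Windows 10+.
# Assumes the "\GPU Process Memory(*)\Dedicated Usage" performance counter
# exists (1709 or later) and that instance names contain "pid_<number>_".
import csv
import subprocess


def dedicated_vram_by_pid():
    """Return {pid: dedicated_vram_bytes}, sampled once via typeperf."""
    out = subprocess.run(
        ["typeperf", r"\GPU Process Memory(*)\Dedicated Usage", "-sc", "1"],
        capture_output=True, text=True, check=True,
    ).stdout
    # typeperf prints CSV: a header row of counter paths, then one sample
    # row; status messages afterwards don't start with a quote, so skip them.
    rows = [ln for ln in out.splitlines() if ln.startswith('"')]
    header = next(csv.reader([rows[0]]))
    sample = next(csv.reader([rows[1]]))
    usage = {}
    for path, value in zip(header[1:], sample[1:]):  # skip timestamp column
        if "pid_" not in path:
            continue
        # Assumed instance format: "pid_1234_luid_0x..._phys_0"
        pid = int(path.split("pid_")[1].split("_")[0])
        usage[pid] = usage.get(pid, 0.0) + float(value or 0)
    return usage


if __name__ == "__main__":
    for pid, used in sorted(dedicated_vram_by_pid().items(),
                            key=lambda kv: -kv[1]):
        print(f"PID {pid}: {used / 2**30:.2f} GiB dedicated")
```

Run it while the game is up and compare the biggest entry against both the overlay's allocated figure and the new per-process counter; the gap between the two is the over-allocation being discussed above.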