I believe that's capacity, not usage.
It is. And look, we've been over this back at the start of the thread. People were on about Doom and Wolfenstein (both id Tech engines) and how you could push "8k textures" and then "16k textures", when it turned out all anyone was doing was manually setting the engine variable that sizes the texture memory pool. These games rely heavily on texture streaming from disk: they reserve a fixed pool of VRAM for texture data, and the engine manages the contents of that pool internally. If you look at the engine variables before and after changing visual settings in the menu, you can see that the texture pool size setting in the graphics menu maps directly to that variable, and something like Ultra Nightmare just whacks it up to 4.5 GB. It's not actually using that space. We can infer that by leaving the value on High, which I think is 2 GB or 2.5 GB, and simply comparing the visuals: there is literally no difference. I tested this extensively, and it's also been done in loads of YouTube videos comparing the same thing. Sadly, as far as I can see, id provides no metrics on what's going on inside that memory pool or how much of it the engine considers "in use", which is a shame, because I'd actually be interested to know what it's really doing and really using.
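To make the capacity-vs-usage distinction concrete, here's a minimal sketch of how a streaming texture pool like this typically behaves. This is a hypothetical toy model, not id Tech's actual internals: `TexturePool`, its capacity numbers, and the texture names are all made up for illustration. The point it demonstrates is that the cvar only sets how much the engine reserves; the same scene streams in the same textures whatever the capacity, so the in-use footprint doesn't change.

```python
# Hypothetical sketch (not id Tech's real internals): a fixed-size
# texture pool whose capacity comes from a menu-driven engine variable.
# Raising the capacity grows what the engine *reserves* (allocated VRAM),
# not what the streamed textures actually occupy.

class TexturePool:
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb  # reserved up front; shows as "allocated" VRAM
        self.resident = {}              # texture id -> MB actually streamed in

    def used_mb(self):
        # Only the textures the streamer has made resident count as "in use".
        return sum(self.resident.values())

    def stream_in(self, tex_id, size_mb):
        # Evict the oldest resident textures only once the pool is full.
        while self.used_mb() + size_mb > self.capacity_mb and self.resident:
            self.resident.pop(next(iter(self.resident)))
        self.resident[tex_id] = size_mb

# The same scene streams the same textures regardless of capacity:
high = TexturePool(capacity_mb=2560)   # roughly a "High" pool setting
ultra = TexturePool(capacity_mb=4608)  # roughly an "Ultra Nightmare" pool setting
for tex, size in [("wall", 128), ("floor", 96), ("monster", 256)]:
    high.stream_in(tex, size)
    ultra.stream_in(tex, size)

print(high.used_mb(), ultra.used_mb())  # identical in-use footprint
```

The bigger pool would only matter once the working set actually exceeds the smaller capacity, which is exactly what the no-visual-difference comparisons suggest isn't happening at these settings.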
And of course, on top of that, people are measuring allocated VRAM rather than in-use VRAM, so you have two levels of overestimation inflating this value. The largest figure is literally what the game has requested from the driver (the malloc, effectively). Within that sits the fixed-size texture memory pool, which is some fraction of it. And then, of that fraction, a certain amount is just reserved space that's never used. So only some fraction of the reported total is actually in use and necessary.
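The nesting above is easy to put numbers on. These figures are entirely made up for illustration, but they show how the headline number a monitoring tool reports can be far above what the frame actually needs:

```python
# Toy numbers (hypothetical, purely illustrative) for the nested figures
# people conflate. Only the innermost one is what the scene really needs.

allocated_by_driver_gb = 9.0   # what monitoring tools report: everything requested from the driver
texture_pool_gb = 4.5          # the fixed pool carved out of that allocation by the cvar
textures_in_use_gb = 1.8       # what the streamer actually has resident for the current scene

other_allocations_gb = allocated_by_driver_gb - texture_pool_gb
truly_needed_gb = other_allocations_gb + textures_in_use_gb

print(f"reported: {allocated_by_driver_gb} GB, really needed: ~{truly_needed_gb:.1f} GB")
```

Even this understates the gap, since the non-texture allocations carry their own allocated-versus-used slack.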
We could just be really dumb about this and use the console command to set that value to, say, 18 GB, driving total memory usage to about 22-23 GB, and now we can say you need a 3090 and none of the 16 GB AMD cards are sufficient. That's essentially the argument we saw earlier in the thread over whether 10 GB is enough, just pointed the other way. Either we care that the memory we've assigned is being used for something or we don't. I care, because it has actual practical implications and it's not just pee-pee measuring. If you also care, then things need measuring carefully. Doom is terrible for this, and that has been fairly well documented in this thread already.