This is obviously the real kicker for those with AMD cards who are clutching at straws to justify 16GB of memory: the competition has a card with way more vRAM. And any weird argument that justifies 16GB over 10GB would equally justify 24GB over 16GB.
It's also super amusing to me that CoD Cold War's video settings menu lets you literally set how much vRAM to allocate — you get options for 70%, 80% or 90%, which is the first time I recall seeing this level of control over vRAM. So if it wasn't obvious enough to people yet that vRAM allocated is a dumb metric, this should hopefully shatter that illusion. Put that bad boy on 90% with a 3080 and you'll get ~9GB allocated; then drop all your visual settings to low and the very same menu will tell you that your options are only actually using about 2.5GB of vRAM. The old way of measuring vRAM (allocated) would scream "hey you're right near your 10GB limit!!!!!11oneone", when in fact the vRAM used value just confirms what the menu advises: you're really only using a few GB.

Looking at other modern games, how much they allocate is often based on rules of thumb that are little better than that 90% option — which is why you end up with some games allocating 22GB of a 3090's vRAM.
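To see why those big allocation numbers tell you nothing, here's a toy sketch (my assumption about how such a rule of thumb behaves, not how any engine actually computes it): if allocation is just a fixed fraction of whatever card is installed, the number scales with the card, not with what the game needs.

```python
def allocated_gb(total_vram_gb: float, fraction: float = 0.9) -> float:
    """Rule-of-thumb allocation: a fixed fraction of whatever vRAM the card has."""
    return total_vram_gb * fraction

# The same 90% rule applied to different cards:
print(allocated_gb(10))  # 3080 (10GB): reports ~9 GB "allocated"
print(allocated_gb(24))  # 3090 (24GB): reports ~21.6 GB "allocated"
# ...while the game's actual working set is the same few GB on either card.
```

Same game, same settings, wildly different "usage" numbers — purely because the denominator changed.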
I also see people with 3090s in YouTube videos of Cold War measuring vRAM allocated at around 20GB, when we know maxing the in-game settings doesn't push real, measured in-game usage above about 8GB — roughly on par with what the menu advises your settings will use.
"Hey u guise, Cold War uses 20GB of memories, time to upgrade from your cruddy old measly 16GB AMD cards to a real 24GB king that can manage all 20GB of usage, rite u guise ¯\_(ツ)_/¯"