Cheers for the reply, Andy.
Crysis was a serious graphical masterpiece compared to anything else around at the time, though. The Last of Us, I'm afraid, isn't. Nor was FC6, nor was Hogwarts. They're games that have pushed the boundaries of the current game engines, which are mostly about to move on a generation.
It's just that the recommended specs say 8GB, and that's not minimum specs - recommended means you really need 8GB to play it properly. Even 24GB doesn't manage that at the moment. Post #21 shows there's near as makes no difference between high and ultra settings on higher-tier cards.
Saying NV has weaponised VRAM is a tad strong. I'd agree games have run out of VRAM, but in the grand scheme of things, the only ones that have are games tested upon release. Like you say, it will be fixed - I dunno how much the textures get lowered etc., but post #21 shows near as makes no difference in performance between high and ultra at 1080p, though that could be down to a CPU bottleneck. I'd still expect much higher performance from a 4090 at 1080p, mind. Plus, textures at 1080p saturating 8GB... I just don't think that's right. Compared to every other game at 1080p, it would really have to be a visual wow to demand that much, and that would be apparent at higher resolutions too.
It's only AAA games in the last year or two that have been shipped in terrible states. Maybe VRAM is just the first thing to highlight this. Heck, I remember when BF4 had a memory leak and stuttered when you reached your card's limit. Crash - start again. This isn't much different from that. If people think that TLOU really is next-level gfx, even at 1080p, then I'm all ready for some screenshots. These will need to be saved for when the game is fixed, because if, as you say, the textures get downgraded (not optimised), then the difference in graphics should be easily apparent.
I just wouldn't advise people to upgrade anytime soon based on day-one release performance, only because every game touted like this before has gone on to be fixed, look great, and run on cards that showed an issue on release. Giving game devs infinite amounts of VRAM to soak up poorly written games rushed out to meet shareholders' demands will only make things more expensive for the consumer.
Really, games built on Unreal Engine 5 and the other next-gen engines will be the proper place for seeing how much VRAM game engines need.
As you say, seeing as most people have <8GB, Naughty Dog won't be selling many copies, which isn't a great way to sell or market your game when you want MSRP for it. I want this game too.
This VRAM debate will be much better had when the games on Unreal engines start to release. Of course they'll run less than optimally at the beginning - just in time before many are drawn into next-gen graphics.
With BF4, I saw loads of people buying i7s during the beta for the hyper-threading, as it ran better. On release, it ran fine on the i5s they'd all swapped out. The CPU got the blame back then.
Right, off to take the boys to the pub as they have two weeks off for Easter, and the ground's too wet to do anything anyway. Hopefully no drunken posts later.

Apologies in advance.