Exactly.
Also, surprise surprise, they're using a benchmark from when the game first came out, even though it's had multiple patches since to improve perf etc.

I'd rather reduce settings than play at 30 fps, or take the obvious choice: use DLSS and get a better experience overall, especially with ray reconstruction, which massively improves IQ as evidenced.
PCGH did an update recently with the new DLC, if we're looking at 1440p:
Aside from the 4080 and above, no GPU has playable performance there imo, so the question then is: is the likes of a 4080/4090 a worthwhile purchase for the rare demanding titles like this?
No doubt, AW 2 is a game which will prefer more VRAM at res such as 4K. Even if you were to use DLSS, 12+ GB is probably required for the "best" experience (and even more if using Nvidia's frame gen, since it uses more VRAM).
The only ways a 3090 would have ever been worth it imo were:
- if you really couldn't get a 3080 for <£900
- if you needed the extra VRAM for workloads
- if you mod your games with extreme high-res texture packs
"future proofing" as shown is a stupid reason when it comes to purchasing anything these days.
There was a questionable time last year with some awful releases regarding VRAM, more so for 8 GB GPUs, but as shown, a few patches came out for such games like TLOU and sorted most of the issues. Now with UE5, as also picked up by a few channels, VRAM optimisation is in a far better place, with UE5 generally using quite low amounts compared to other games. Snowdrop is the best engine to date for VRAM optimisation with the way its buffer system works, and thankfully all of Ubi's titles will now be using this engine, to my knowledge.
The fact that Nvidia changed this approach with the 4080 and 4090 shows they needed to make the xx90 more worthwhile, and given how successful the 4090 has been, it's safe to say that, from their POV, they did the right thing.