Sadly I think TPU's analysis of VRAM usage is pretty bad; the only reviewer who handles this stuff properly, in my opinion, is Digital Foundry. E.g. DF pointed out that FF7 Remake has a widespread stuttering problem which has been shown to be caused by textures and shaders swapping in and out of VRAM. DF noted that every time the game stutters, the commonly used performance metric tools don't pick it up, so someone analysing GPU-Z or Afterburner logs would see no performance issues, while DF's own external tools (which they had to develop because they review a lot of consoles that lack internal tools) did pick up the stutters. They had the balls to tweet that the review industry got FF7R wrong, but sadly not the balls to put it in their video.
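To illustrate DF's point about averages hiding hitches: a frame-time log makes this kind of stutter obvious even when the average FPS barely moves. A minimal sketch in Python (the spike threshold, window size, and synthetic data are my own assumptions, not any specific tool's log format):

```python
# Sketch: flag frame-time spikes that an average-FPS counter would hide.
# Threshold and window are illustrative assumptions, not from any real tool.
import statistics

def find_stutters(frametimes_ms, window=30, factor=2.5):
    """Return indices of frames whose time exceeds `factor` x the
    median of the surrounding `window` frames (a simple spike test)."""
    spikes = []
    for i, ft in enumerate(frametimes_ms):
        lo = max(0, i - window)
        neighbours = frametimes_ms[lo:i] + frametimes_ms[i + 1:i + 1 + window]
        if ft > factor * statistics.median(neighbours):
            spikes.append(i)
    return spikes

# Synthetic example: a steady ~16.7 ms (60 fps) stream with one 120 ms hitch.
frames = [16.7] * 100
frames[50] = 120.0
print(find_stutters(frames))      # the single hitch is flagged
print(sum(frames) / len(frames))  # yet the average frame time barely moves
```

The point of the sketch is that one 120 ms hitch is instantly visible in a per-frame view, while the session average still reads as roughly 60 fps.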
Another issue the review industry has with VRAM analysis is that they don't analyse texture streaming and texture quality. E.g. an 8 GB card might get the same framerate on ultra textures as a 12 GB card, but if you examine the textures they might be lower quality: UE4 games don't have static texture quality, they have "variable" quality which dynamically adjusts texture LOD based on VRAM and other resource availability (the texture quality setting in a UE4 game only sets the maximum for this variable quality; it is not guaranteed). I remember tinkering in FF7R on the UE4 console trying to turn this behaviour off, but it seems difficult to do. Then of course you also have texture pop-in, which again increases when VRAM is under stress. (On the 10 GB 3080 thread someone kindly made screenshots for me from a game where he said he couldn't tell the difference between texture quality levels, and I could immediately see it, so different gamers have different sensitivity to this.) For me texture quality is one of the most important quality metrics in games; it's very jarring to see a high-res character standing next to a low-res blurry wall.
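For anyone wanting to poke at this behaviour themselves, UE4's standard streaming cvars are the relevant knobs. A sketch of `Engine.ini` overrides (the cvar names are stock UE4; whether FF7R actually honours them from an ini rather than an injected console is an assumption on my part):

```ini
; Standard UE4 texture streaming cvars - assumed to apply to FF7R, untested there.
[SystemSettings]
r.TextureStreaming=1                 ; streaming on (the engine default)
r.Streaming.FullyLoadUsedTextures=1  ; request full mips for textures in use
```

In-game, `stat streaming` on the UE4 console shows the current pool size and how much of it the streamer is actually using, which is how you can catch the "same setting, lower quality" situation described above.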
I compared FF7R on the PS5 to the PC version set to high detail textures, the max setting. In chapter 8, for example, the PC was loading the textures the PS4 version had: very low-res, PS3-quality textures, where the PS5 has a much higher LOD. In other areas the PC matches the PS5. Chapter 8 also has a lot of pop-in issues on the PC, which suggests it's demanding VRAM-wise. In areas like sector 7 (chapter 3), and especially in the Yuffie DLC where they expanded the explorable area, various stutters can happen due to the sheer number of objects being streamed in and out.
I found someone with a 3090 who was prepared to help me with testing; we did it live on Discord, and we discovered together that if you manually increase the VRAM budget on the UE4 console in FF7R, most of the issues disappear: chapter 8 textures look like the PS5's, stuttering doesn't vanish but it's 90% better, and pop-in goes away. However, by default the game doesn't increase the VRAM budget much on a 3090 by itself, so if you're just swapping hardware like the TPU testers seem to do, you're not really testing 24 GB of VRAM, because the game isn't being allowed to use it; it has to be manually configured on the UE4 console. On my 3080 the game sets the VRAM budget to 5500 MB; on the 3090 it was set to 6500. When we bumped it manually to 20000 it was way better. I should also note that if I left mine at 5500 the game crashed with a VRAM OOM; I have to reduce it to 5000 or lower (or drop shadows or textures to low quality), because the game also uses VRAM for RAM-type data. That is going to get more common in newer console ports due to the shared memory architecture. FF7R has extremely low RAM usage because most RAM-type data is loaded into VRAM instead; e.g. it needs over 3 GB of VRAM just to reach the title menu. The new Far Cry game was originally coded the same way, and the patch that mitigated its VRAM issues moved that data from VRAM to RAM to free up VRAM for textures.
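What we actually changed boils down to one cvar, `r.Streaming.PoolSize`, which sets the texture streaming pool budget in MB. On the console it's just `r.Streaming.PoolSize 20000`; the equivalent `Engine.ini` form is sketched below (whether FF7R reads it from the ini is an assumption, and as noted, set the value too high for your card and the game can OOM):

```ini
; Raise the UE4 texture streaming pool budget - value is in MB.
; 20000 suited a 24 GB 3090; an 8-10 GB card needs a much lower number.
[SystemSettings]
r.Streaming.PoolSize=20000
r.Streaming.LimitPoolSizeToVRAM=0  ; don't clamp the pool to the engine's VRAM heuristic
```

The key takeaway is that the budget is a fixed number the game picks, not something that scales up automatically with a bigger card, which is exactly why a stock 3090 run doesn't test 24 GB of VRAM.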
FF15 is another game with VRAM issues, and the reason I originally upgraded my 1070 to a 1080 Ti. The game had amazing visuals once you turned on all the hidden options and used the 4K texture pack, but the VRAM usage was crazy high. In addition, if Nvidia grass was enabled it had a nasty memory leak: the game overflowed VRAM usage into system RAM, so if you ran out of VRAM the game didn't crash, you just took a large performance hit, but the leak carried on and the entire Windows OS would hit a critical memory condition if you didn't restart the game before all virtual memory was consumed. FF15 is also what triggered me to upgrade from 16 GB of RAM to 32. This bug never got patched, and likewise the FF7R issues haven't been patched.
I hope this post explains how easy it is for reviewers to fail to analyse these problems properly (I have read TPU's analysis of VRAM's effect on games and it's just embarrassingly lazy). When I think about every game I have played that has stuttering issues (usually JRPGs), the problems are always linked to texture asset swapping. Texture asset swapping is basically a hack to get around under-spec'd memory resources, initially developed for the Xbox 360 and PS3, which had an extremely low amount of memory, just a few hundred megs. Ironically the industry has kind of done a 180: it's now PC GPUs that may be under-spec'd for VRAM, with consoles having more generous amounts. With the development of things like DirectStorage, the UE4 engine finally seems to have these issues resolved, but of course PCs don't have that API yet, so the problems remain.