Mipmaps get generated as standard by any engine, not just for the performance benefits of the LOD system but because texture filtering needs them to avoid rendering artefacts. You're not going to see significant shifts in memory usage by placing a few more lights. It's about the large number of graphical effects that go into modern rendering, each needing some additional memory. I've demonstrated this on a very nice looking game, RDR2: there's a very long list of options in the graphics menu that go into how good it looks, and I went through them all and showed most of them carry some additional memory overhead. As for your counter example, it does show low non-texture memory usage, but you only need to look at the screenshots side by side to see an extremely stark difference between something like RDR2 and whatever you're showing.
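To put a quick number on the mipmap point: a full mip chain only adds about a third on top of the base texture, so mips aren't where the memory goes. A minimal sketch, assuming an uncompressed RGBA8 texture purely for illustration (real games use block compression, but the ratio works out the same):

```python
# Rough sketch: memory cost of a full mip chain for one texture.
# Assumes 4 bytes per texel (RGBA8) for illustration only.

def mip_chain_bytes(width, height, bytes_per_texel=4):
    total = 0
    while True:
        total += width * height * bytes_per_texel
        if width == 1 and height == 1:
            break
        width = max(1, width // 2)
        height = max(1, height // 2)
    return total

base = 4096 * 4096 * 4                # 64 MiB base level
full = mip_chain_bytes(4096, 4096)    # base level plus all mips
print(full / base)                    # ~1.333: the chain adds roughly 33% extra
```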
The left is using things like tessellation on the ground and on the trees, screen space reflections in the water/puddles, self-shadowing on models, global illumination on most light sources, volumetric clouds/fog/explosions, screen space ambient occlusion, fur rendering on animals and hair, parallax occlusion mapping, motion blur, reflection/mirror effects, anti-aliasing, soft shadows, dynamic lighting on light sources that can move (including time-of-day long shadows), physics simulation on trees/grass/water, lighting from particles, etc.
My post here, where I break down the additional memory usage for each of these, shows each one is relatively small:
https://www.overclockers.co.uk/forums/posts/34927660
But add them all up when you run the game maxed out and they account for the vast majority of the memory in use. I even have the memory counter on screen showing the usage in my screenshot above.
In fact, here's a comparison shot from the benchmark in roughly the same place with the lowest settings for everything except textures, which stay at Ultra, so you can see the real memory difference all those combined changes make: 6768 MB real usage for all Ultra versus 3222 MB for all low with Ultra textures. That actually demonstrates that the menu's memory estimate for the settings is pretty darn accurate.
That's a delta of 3546 MB. I'm not sure what you think is happening here. Is RDR2 some outlier? Or maybe you think they did a bad job on optimization at high settings and it's just wasting memory unnecessarily?
I think what's happening is something I come back to a lot, which is that games did used to work like this for a long time. Games mostly used video memory to store assets for the levels; they'd cram that memory full and when they ran out, that was it. We wanted more memory because we wanted more assets and higher quality assets. Then around the era of the original Crysis (or thereabouts) we started seeing big open worlds with assets that stream in and out of memory, which let games blow past the memory limitations of video cards. If you zoned out your levels well, your game map could hold vastly more assets than fit into memory.

Since then things have changed significantly. We've had huge advancements in all sorts of rendering effects, and most of them need their own buffers in memory. Textures and assets have still got better and use more memory, but their proportion relative to other effects has gone down, and that shift is accelerating, especially now we have RT, where BVH structures take up a substantial amount of memory. People today carry a lot of assumptions about how these things work that are holdovers from gaming 10+ years ago and just aren't true any more. Which is kind of why you're seeing what you're seeing in your example: lots of models and textures, very sparse on other graphical effects, and it visually looks like a game from 10+ years ago.
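To make the "effects need their own buffers" point concrete, here's a back-of-envelope sketch of what a handful of screen-space effects might cost in render targets at 4K. The formats, resolutions and buffer counts are my own illustrative assumptions, not figures from RDR2 or any particular engine:

```python
# Rough per-effect render-target memory at 4K. All formats/counts are
# illustrative guesses, not taken from any real engine.

WIDTH, HEIGHT = 3840, 2160

def target_mb(bytes_per_pixel, count=1):
    """MB for `count` full-resolution render targets."""
    return WIDTH * HEIGHT * bytes_per_pixel * count / (1024 * 1024)

effects = {
    "G-buffer (4x RGBA8 + depth)":  target_mb(4, 4) + target_mb(4),
    "SSAO (half-res AO + blur)":    target_mb(1, 2) / 4,
    "SSR (colour + hit buffer)":    target_mb(8) + target_mb(4),
    "Motion vectors (RG16F)":       target_mb(4),
    "TAA history (RGBA16F)":        target_mb(8),
    "Volumetric fog (froxel grid)": 160 * 90 * 64 * 8 / (1024 * 1024),
}

for name, mb in effects.items():
    print(f"{name}: {mb:.1f} MB")
print(f"total: {sum(effects.values()):.1f} MB")
```

Even with these made-up numbers you land in the hundreds of MB before counting a single texture, and that's one set of effects at one resolution; stack up shadow cascades, GI probes and BVH data on top and you can see how the non-asset share grows.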
This is why I've moved away from the conventional wisdom that future games will use all this memory for assets, and thus that the amount of memory you need is tied to what the games demand, towards a different paradigm where a lot of the memory is used for graphical effects and the amount of memory you need is tied to how powerful the GPU is. If your GPU is slow and can't run these effects at a decent frame rate, you turn them down or off and that frees up memory. What we should be asking is: is 10GB enough to service the GA102's memory needs? The answer seems to be yes. By the time you've filled that 10GB, as in say FS2020, one of the few games that gets close, the frame rate is in the toilet.
I appreciate you've gone to lengths to show metrics in whatever Unity game that is, and I take your point. But anyone can get free access to Unity, drop in store-bought assets, and produce any numbers they like. What I'm talking about is real commercial games using modern rendering techniques and leveraging every optimization trick to get the game looking as nice as possible for the least performance cost.