I brought up my GTX 560 to illustrate that there is a point where a GPU can't use any more VRAM because it isn't strong enough to make use of it.
How do you calculate where that point is, or do you actually think the amount of VRAM any GPU can use is unlimited?
Look, you asked about the GTX 1080 Ti, saying you doubted it would run out of VRAM. I ran out of VRAM at QHD on a slower GTX 1080 in those scenarios in 2017, and the GPU wasn't at full utilisation in either case. After that I was more careful about which settings and mods I used, to keep usage within the 8GB framebuffer.
If I had a 4K monitor, then yes, your GTX 1080 Ti could fill its 11GB of VRAM without even being fully utilised; people have run out of VRAM in modded games at higher resolutions using one. Once you run out of VRAM, the overflow gets cached to system RAM, which causes stuttering, and GPU utilisation falls.
A lot of people keep saying low VRAM is fine, but most don't actually bother to test what happens when a game goes over it. If going over leads to performance drops, you have run out.
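If you want to actually test this rather than guess, you can watch VRAM usage while you play. Here's a minimal sketch that assumes an NVIDIA card with `nvidia-smi` on the PATH (the `headroom_mb` threshold is my own arbitrary pick, not anything official):

```python
import subprocess

def query_vram_mb():
    """Ask nvidia-smi (assumed installed, NVIDIA-only) for used/total VRAM in MiB."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(", "))
    return used, total

def near_limit(used_mb, total_mb, headroom_mb=256):
    """True when usage is within headroom_mb of the framebuffer limit."""
    return total_mb - used_mb <= headroom_mb

if __name__ == "__main__":
    used, total = query_vram_mb()
    status = "close to the limit!" if near_limit(used, total) else "fine"
    print(f"{used} MiB / {total} MiB -> {status}")
```

Run it in the background while the game is loaded; if usage pins near the limit and you see stutter plus falling GPU utilisation, that's the overflow-to-system-RAM behaviour described above.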
The thing is, unlike 10 years ago, VRAM increases have slowed down from generation to generation. It used to be common to get a doubling (or at least a 50% increase) in VRAM quantity each generation.
Between the Nvidia 7800 series/X1900 (512MB) and Kepler/R9 290, we went from 512MB to 4GB/6GB of VRAM. That is a 12X improvement (or 8X if we ignore the Titan), in eight years. We also went from GDDR3 to 512-bit GDDR5 memory buses.
In the six years since, we have not even seen a doubling (6GB to 11GB), despite GPUs becoming much more powerful (the Titan is now so expensive it's an outlier, but even it is only 4X).
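The multiples above work out like this (capacities in GB, with 512MB written as 0.5):

```python
# VRAM scaling arithmetic from the post (capacities in GB).
eras = {
    "7800/X1900 -> R9 290 (ignoring Titan)": (0.5, 4),
    "7800/X1900 -> Kepler Titan": (0.5, 6),
    "Kepler Titan -> GTX 1080 Ti, six years on": (6, 11),
}
for label, (old, new) in eras.items():
    print(f"{label}: {new / old:.1f}x")
# 0.5 -> 4 is 8x and 0.5 -> 6 is 12x over eight years,
# while 6 -> 11 is only ~1.8x: not even a doubling.
```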
The old consoles only had 8GB or a bit more of RAM IIRC, so there was at least some attempt to manage things, and you can see that with textures.
Many games don't use textures as high-resolution as they could, due to console limitations, but certain games can, which pushes VRAM usage up. Modded games can also move textures up to 4K or even 8K. In the past, games increased texture resolutions more quickly each generation, because VRAM quantities in dGPUs were rising.
Now we have consoles which have not only increased their VRAM amount, but can also use very fast SSDs to cache textures. That means more VRAM usage, and more need for faster storage to cache files.
Most PCs don't even support PCI-E 4.0, so they can't run something similar. We also have increased use of DX12/Vulkan, which likewise seems to push VRAM usage up.