Looking at the actual VRAM breakdown in many games today shows that's not really true. Many games use a fixed-size texture pool that textures stream in and out of. You can often measure the size of that pool through engine variables, as with Doom, and see that texture memory sits around, say, 2 GB, whereas the game as a whole takes up to 8-10 GB of VRAM maxed out with ray tracing. So textures are roughly 1/4 to 1/5 of the total, and the same is true in many other games. You can often see the memory breakdown in the graphics settings and confirm this really isn't the case.
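For the total-VRAM side of that comparison, here's a minimal sketch of how you could check a running game's overall VRAM footprint from outside the process, assuming an NVIDIA card, GPU index 0, and the pynvml bindings installed; the texture-pool figure itself would still come from the game's own console/settings readout, not from this query:

```python
# Rough check of per-process VRAM usage via NVML (NVIDIA only, illustrative).
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)           # first GPU in the system
    total = nvmlDeviceGetMemoryInfo(handle).total    # card's total VRAM in bytes
    for proc in nvmlDeviceGetGraphicsRunningProcesses(handle):
        if proc.usedGpuMemory is None:               # driver may withhold per-process figures
            continue
        print(f"pid {proc.pid}: {proc.usedGpuMemory / 2**30:.2f} GiB "
              f"of {total / 2**30:.1f} GiB")
finally:
    nvmlShutdown()
```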
This is a holdover from the past, when games couldn't stream textures and VRAM limited how many assets could exist in the world. Once texture streaming became standard, VRAM usage stopped growing with texture size and started growing with other effects that need buffers in memory and also load on the GPU. The idea that you need more than 10 GB to avoid blurry textures is kinda crazy; there's no evidence for that at all. High-res textures are nice, but people can't tell the difference between UHD textures and something lower quality on a surface that isn't right in front of their face, so most of the time the high-res textures just get flushed out of the memory pool, lower-quality variants take their place, and the engine streams those textures in and out of memory without a problem.
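To make the fixed-pool idea concrete, here's a toy sketch of a texture pool with a hard budget that evicts the least recently used entries when a new request won't fit; the names, sizes, and eviction policy are illustrative and not taken from any particular engine:

```python
from collections import OrderedDict

class TexturePool:
    """Toy fixed-budget texture pool with least-recently-used eviction.
    Real engines stream individual mip levels, but the principle is the
    same: the pool never grows past its budget, whatever the asset sizes."""

    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.resident = OrderedDict()   # name -> size in MB, coldest first

    def request(self, name, size_mb):
        if name in self.resident:                       # already resident: mark as recently used
            self.resident.move_to_end(name)
            return
        while self.used + size_mb > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size                   # flush the coldest texture
        self.resident[name] = size_mb
        self.used += size_mb

pool = TexturePool(budget_mb=2048)                      # ~2 GB pool, as in the post above
pool.request("rock_albedo_4k", 85)
pool.request("rock_normal_4k", 85)
pool.request("sky_cubemap", 256)
print(f"resident: {list(pool.resident)}  used: {pool.used} MB")
```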
How are you able to look at the memory allocation map for Doom and many of today's games?
Do you have the source code, or are you one of the original programmers?
The source code for these games is proprietary; they won't let you run it through a debugger or memory profiler.
Do you have some sort of hacking tool that can profile other people's game programs?