You think they are not going to increase vram usage in future games?
I know you don't know what future games' vram usage will be, and whether or not 12 GB will prove too little. Another thing I know is that UE5 is seeing wide adoption, and from what we've seen the new tech they've developed leads to much lower vram usage than UE4 (and other engines) for assets of comparable quality, thanks to Nanite. So thus far I see solid reasons why 12 GB will be fine even for future games, and I have yet to hear any solid reasons why it won't, besides "numbers go up because they've always gone up". Without a timeframe & performance impact, that's a moot point.
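To put rough numbers on what texture vram usage actually means, here's a quick back-of-envelope sketch. All figures are my own assumptions for illustration, not measurements from UE5 or any shipped game:

```python
# Approximate GPU memory for one block-compressed 2D texture.
# BC1 is ~4 bits/texel, BC7 ~8 bits/texel; a full mip chain adds ~1/3.

def texture_vram_bytes(width, height, bits_per_texel, with_mips=True):
    base = width * height * bits_per_texel / 8
    return base * 4 / 3 if with_mips else base

one_4k_bc7 = texture_vram_bytes(4096, 4096, 8)
print(f"one 4K BC7 texture ~ {one_4k_bc7 / 2**20:.0f} MiB")          # ~21 MiB
print(f"300 resident at once ~ {300 * one_4k_bc7 / 2**30:.2f} GiB")  # ~6.25 GiB
```

The point being: naively keeping a few hundred 4K textures resident already eats half of 12 GB, which is exactly why virtualised streaming (UE5's virtual textures, and Nanite keeping only visible detail resident) changes the equation rather than just "numbers go up".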
If future game engines use VRAM more efficiently, Jevons Paradox will apply, ensuring that textures (or something else) expand to fill the available space. Resource usage is not going to go down.
That's not how games actually work. In practice we see varying vram usage depending on the game engine as well as per-studio philosophy (as it relates to asset budgeting). Moreover, the target around which these games are authored (which very much includes texture quality) is the consoles, and that hasn't changed since 2020.

Just like what happened with SSD speed, it's not only about the hardware available to be used but also about the actual authoring of the assets and how they fit the gameplay. That's why even a game like R&C Rift Apart, which was meant to be a showcase for maximising the fancy SSD in the PS5 (and wasn't burdened by having to run on other hardware/platforms), fails to do that and in fact runs just as well on much slower SSDs. Just because the resources are available doesn't mean they will be fully utilised, because the main goal of game developers isn't to maximise hardware usage but to make the game they want to make. Very often those two goals are in conflict (maximising hardware usage costs extra development time/resources), and it's the latter that ends up prevailing, as it should (see the streaming sketch below).

As for texture quality, I distinctly remember the case of The Surge, which actually shipped with more "optimal" textures: the developers deliberately went with lower quality than the maximum they could have shipped, because they wanted the storage savings and felt the quality loss from heavier compression was not noticeable. Even if we look at AAA examples (f.ex. AC Origins/Odyssey come to mind), they shipped a minimum viable quality based on the consoles, and PC had to make do with that rather than getting a fancier, higher quality version. We do sometimes get an HD texture pack, but it's rare, and it's usually just slightly higher res rather than radically different (and now in the PS5 generation it will likely all be the same quality).
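On the streaming point, here's a toy sketch of why resident memory stays bounded regardless of how fast the SSD is. Purely illustrative; TileCache, tile_id etc. are made up for this example and are not any engine's actual API:

```python
from collections import OrderedDict

class TileCache:
    """Fixed-budget residency cache: only recently used tiles stay in memory."""
    def __init__(self, budget_tiles):
        self.budget = budget_tiles
        self.resident = OrderedDict()           # tile_id -> data, in LRU order

    def request(self, tile_id):
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)  # hit: mark most recently used
            return self.resident[tile_id]
        if len(self.resident) >= self.budget:
            self.resident.popitem(last=False)   # full: evict least recently used
        data = f"<tile {tile_id} streamed from disk>"  # stand-in for real I/O
        self.resident[tile_id] = data
        return data

cache = TileCache(budget_tiles=4)
for t in [1, 2, 3, 1, 4, 5]:                    # camera moves, tiles requested
    cache.request(t)
print(list(cache.resident))                     # [3, 1, 4, 5]
```

The budget is set by what the game needs, not by the hardware ceiling; a faster SSD just means cache misses resolve sooner. That's the Rift Apart situation in miniature.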
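And on The Surge's compression call, the scale of the saving is easy to show. The format bit-rates below are the standard ones; the per-game numbers are assumed for illustration:

```python
# Same 4096x4096 source texture stored at different on-GPU formats.
FORMATS_BITS_PER_TEXEL = {
    "RGBA8 (uncompressed)":        32,
    "BC7  (high quality)":          8,
    "BC1  (aggressive, no alpha)":  4,
}
texels = 4096 * 4096
for name, bits in FORMATS_BITS_PER_TEXEL.items():
    print(f"{name:30s} {texels * bits / 8 / 2**20:5.0f} MiB")
# RGBA8: 64 MiB, BC7: 16 MiB, BC1: 8 MiB -- across thousands of textures,
# picking the heavier compression where the loss isn't noticeable adds up fast.
```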
Ultimately, let's not get lost in the weeds of different conversations. The topic remains the 4070 Ti's vram and whether it will be enough for 4K*. Given that devs target the consoles, plus the vram efficiency gains of UE5 (as well as DirectStorage, still missing on PC for now, which will further help with vram), I contend that it is enough, for all the reasons I've mentioned before.
* 4K = to be understood as using FSR/DLSS so as to fit within a reasonable performance window, f.ex. 60 fps; otherwise the vram is irrelevant, as the card already buckles to low fps at native 4K without DLSS.
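For a sense of what that footnote saves: DLSS/FSR "Quality" renders internally at roughly 2/3 of each output axis (4K output -> 2560x1440 internal), and render-target memory scales with the internal resolution. The ~48 bytes/pixel total below is an assumption, since real G-buffer layouts vary a lot per engine:

```python
def render_targets_mib(w, h, bytes_per_pixel=48):  # assumed total across all targets
    return w * h * bytes_per_pixel / 2**20

native  = render_targets_mib(3840, 2160)  # native 4K
quality = render_targets_mib(2560, 1440)  # DLSS/FSR Quality internal res
print(f"native 4K   ~ {native:.0f} MiB")   # ~380 MiB
print(f"Quality int ~ {quality:.0f} MiB")  # ~169 MiB
print(f"saved       ~ {native - quality:.0f} MiB")
```

A couple hundred MiB back, on top of the fps headroom, is part of why vram arguments about "4K" should always specify whether upscaling is in play.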