Simply not true. You don't have to use the graphical presets in games. If you have the VRAM, you can keep every other setting the same whilst cranking the texture setting up to ultra. Texture quality mainly determines how much VRAM gets used rather than how hard the GPU has to work, so as long as the textures actually fit, it has essentially zero effect on performance whilst often providing a nice boost to image quality. Running a game at medium settings with ultra textures is still a hell of a lot better than running it at medium with medium textures. Feel free to fire up Red Dead Redemption 2 if you'd like to see a stark contrast first-hand.
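To put some very rough numbers on why textures are "free" right up until they aren't, here's a back-of-the-envelope sketch. Every figure in it (texture count, resolutions, the assumed compression format) is an illustrative assumption, not data from any real game:

```python
# Rough, illustrative VRAM arithmetic for texture quality settings.
# All figures (texture count, resolutions, compression format) are
# assumptions for the sake of the example, not measurements from any game.

def texture_size_mb(width, height, bytes_per_texel=1.0, mip_chain=True):
    """Approximate size of one texture in MB.

    bytes_per_texel=1.0 roughly corresponds to a block-compressed
    format like BC7 (8 bits per texel). A full mip chain adds ~33%.
    """
    size = width * height * bytes_per_texel
    if mip_chain:
        size *= 4 / 3  # the mip pyramid sums to roughly 1.33x the base level
    return size / (1024 ** 2)

# Hypothetical scene: 400 unique textures resident in VRAM at once.
NUM_TEXTURES = 400

presets = {
    "medium (1K textures)": 1024,
    "high (2K textures)":   2048,
    "ultra (4K textures)":  4096,
}

for name, res in presets.items():
    total_gb = NUM_TEXTURES * texture_size_mb(res, res) / 1024
    print(f"{name}: ~{total_gb:.1f} GB of VRAM just for textures")
```

The takeaway from the arithmetic is that each step up in texture resolution roughly quadruples the memory footprint, whilst the cost of actually sampling those textures barely changes, which is why it's the one setting that costs nothing until the data no longer fits and the card starts thrashing.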
This is actually something I've run into first-hand recently whilst revisiting the R9 Fury. In scenarios where it isn't limited by VRAM, the Fury is generally around 10-15% faster than an RX 580 (a card which is itself still absolutely fine for playing the latest games at 1080p). Yet in a lot of newer titles I've found it to be a complete stutterfest unless I dial the texture setting down a notch or two. In some games that doesn't have too large an effect on image quality, but in others (like RDR2) it's huge, so even an 8GB 580 provides a notably better experience despite the Fury's extra grunt. Other titles, like Doom Eternal, lock 4GB cards out of their higher texture settings altogether.
We'll see how VRAM usage fares over the next couple of years, but I'd expect it to balloon significantly when ports of games designed with the new consoles in mind start arriving. I reckon a 10GB 3080 will start looking a lot like a 4GB Fury before Nvidia's next cards arrive, especially since it's a card basically made for 4K+ gaming in upcoming AAA titles, not scratching around at 1080p in games designed with an APU from 2013 as a baseline.