I personally don't see the point. Those who can afford a 5090/5080 won't care much about 100W more or less. What's that gonna cost, another 20p a day? Surely not an issue for somebody who buys a GPU for 2 grand. And as for mid-tier GPUs, they don't consume that much power in the first place, so why bother undervolting to save a few pennies? To be clear, I'm not talking specifically about "undervolting", because, as you've said, these terms come with loose definitions. I'm talking about the incentive to reduce power draw in order to make the card as efficient as possible at the cost of some performance.

Describing GPU tuning with just the words 'overclocking' or 'undervolting' ends up in confusion. Firstly, they're fundamentally the same thing - setting an efficient voltage for *your* particular chip. Secondly, two words aren't enough to describe everything people are doing. If I set a more efficient curve but don't allow the card to draw more power than stock, is that overclocking? If I do limit the power but the card boosts higher than stock while also using less power, is that overclocking or undervolting? And saying things like 'undervolting is better than overclocking' - what does that even mean? If I tune my curve so it's as efficient as possible, it is not intrinsically 'better' to then cap power below stock; that's just spending the headroom you gained on a different outcome. Creating an efficient curve is the most important thing.
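For what it's worth, here's roughly how the "same knobs, different outcome" point looks in practice. This is a minimal sketch assuming an NVIDIA card and the nvidia-ml-py (pynvml) bindings; it only reads clocks/power and adjusts the board power limit, since the actual V/F curve offset is normally applied through a tool like Afterburner rather than through NVML. The device index and the example multiplier are placeholders.

```python
# Minimal sketch, assuming an NVIDIA GPU and the nvidia-ml-py (pynvml) bindings.
# It only reads clocks/power and changes the board power limit; the actual
# voltage/frequency curve offset is normally set in a tool like Afterburner.
# Changing the limit needs admin/root rights; device index 0 is a placeholder.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# What the card is actually doing right now.
power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0                    # milliwatts -> watts
core_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
stock_limit_w = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu) / 1000.0
print(f"{core_mhz} MHz at {power_w:.0f} W (stock limit {stock_limit_w:.0f} W)")

# Same efficient curve, two different outcomes:
#  - leave the limit at stock and the card boosts higher for the same power budget
#  - lower the limit and the card holds roughly stock clocks while drawing less
new_limit_w = stock_limit_w          # or e.g. stock_limit_w * 0.8 for the efficiency route
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, int(new_limit_w * 1000))    # expects milliwatts

pynvml.nvmlShutdown()
```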
My intent is always to increase FPS whilst retaining stability - more FPS in games without crashes, artifacts or stutter/lag. That's all I want. I don't care if my card draws another 50W or runs hotter (as long as it doesn't deteriorate the components, of course). Usually my overclock results in a stable 5-10% performance gain. I could push further for benchmarks, but I'm not really interested in questionable VBIOS flashes and LN2 stuff (the thought of doing that to my precious and expensive GPU makes me cringe).