Just an addendum to the above. I think we need to change our mentality towards overclocking GPUs.
Constant max core clocks are archaic now in the face of Kepler. In fact, they are useful for one thing and one thing only: benchmarking.
For anyone who could not care less about benchmarks (and really, that should be 100% of gamers at the end of the day), the Kepler dynamic approach is far more sensible and, in reality, far more beneficial to gamers.
You want the high boosts when you need them to keep your minimum and average frame rates up; maximum frame rates are irrelevant in games. Fully dynamic voltage is preferable: it keeps your heat output down when you don't need a boost to maintain minimum frames, and gives you headroom to push the core hard when you do.
That beats fixing your voltage and core clock and generating heat 100% of the time for the benefit of nothing but maximum frame rates.
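To make the trade-off concrete, here's a toy model of the idea (not NVIDIA's actual algorithm, and all clock numbers are made up for illustration): boost only when the frame rate dips below target, versus holding a fixed max overclock all the time.

```python
# Toy model of dynamic boost vs. a fixed max overclock.
# Illustrative only - these are NOT NVIDIA's real clocks or logic.

BASE_CLOCK = 1006    # MHz, clock when no boost is needed
BOOST_CLOCK = 1110   # MHz, clock applied when frame rate dips
TARGET_FPS = 60

def choose_clock_dynamic(current_fps):
    """Boost only when below the fps target, otherwise stay at base."""
    return BOOST_CLOCK if current_fps < TARGET_FPS else BASE_CLOCK

def choose_clock_fixed(_current_fps):
    """Fixed max overclock: full clock (and heat) 100% of the time."""
    return BOOST_CLOCK

# Frame-rate trace: a demanding scene followed by an easy one
trace = [48, 52, 55, 60, 60, 60, 60, 60]

dynamic = [choose_clock_dynamic(f) for f in trace]
fixed = [choose_clock_fixed(f) for f in trace]

# Average clock is a rough proxy for heat/power draw. The dynamic
# scheme averages lower, yet the boost lands exactly where fps dipped.
print(sum(dynamic) / len(dynamic))  # 1045.0
print(sum(fixed) / len(fixed))      # 1110.0
```

The point of the sketch: both schemes rescue the three slow frames equally, but the fixed overclock pays the full heat cost on the five frames that never needed it.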
I think NVIDIA got this one spot on, and the "delivering consistent, smooth gaming performance" sentiment is exactly what we gamers should want. At the end of the day, does it matter how it does it?
I 100% agree with that.
I'm finding the GPU Boost to be an awesome addition to this card and that's coming from someone who has never bothered with overclocking the GPU because (a) the upped clocks either proved unstable or only raised the framerate by 1-3 fps; and (b) the thought of altering voltages and frying my card in the process filled me with dread!
I have set the GPU clock offset to +135 MHz and the memory clock offset to +300 MHz on my GTX 680, and so far in every benchmark (Heaven 3.0 and 3DMark11 included) and game (over 40 so far) GPU Boost has worked beautifully.
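For context, a +135 MHz offset doesn't pin the card at one speed; it shifts the whole boost curve upward, and GPU Boost can still opportunistically exceed the shifted figure when thermal headroom allows. Assuming reference GTX 680 clocks (1006 MHz base, 1058 MHz boost clock; partner cards vary), the rough arithmetic is:

```python
# Back-of-the-envelope arithmetic for a core clock offset,
# assuming reference GTX 680 clocks. Partner cards ship higher.

BASE_CLOCK = 1006   # MHz, reference base clock
BOOST_CLOCK = 1058  # MHz, reference (typical) boost clock
CORE_OFFSET = 135   # MHz, the offset quoted above

print(BASE_CLOCK + CORE_OFFSET)   # 1141 MHz shifted base
print(BOOST_CLOCK + CORE_OFFSET)  # 1193 MHz shifted typical boost
```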
Now I play games on a 24" 1920x1200 HP LP2475w monitor with v-sync and triple buffering enabled so my ideal experience is to have a flawless 60 fps. The GTX 680 along with the GPU Boost works to achieve that, giving me smoother running games with higher minimum framerates than my previous GTX 580. And it achieves that by only overclocking when needed so I'm not generating excess heat nor wasting electricity doing it.
Enthusiasts and benchmarkers might gripe about not being able to disable GPU Boost but the majority of people will no doubt be pleased with it. That said, I don't see why NVIDIA can't add an option in their drivers for people to disable it if they want.