Has the age of overclocking and chasing numbers ended?

I used to build overclocked workstations day in, day out. For years they were our most popular product. I now do the odd one or two for special customers for whom money is little to no object; other than that it's not really worth it from a commercial perspective. The only company I've seen still doing it builds for HFT. They're very specialised and their systems are very well developed and built, but for a market where the cost of the hardware means very little, and where very high performance and ultra-low latency can be the difference between making a multi-million-dollar trade and not.
 
I only overclock for fun when I get a new CPU or GPU, just to chase numbers. I bought a 7900 XTX last week, overclocked it, and enabled PBO on my CPU. Used AMD's built-in auto GPU overclocking and got this result in 3DMark...

https://www.3dmark.com/spy/42768929

It's not in any way stable and has frozen and black-screened on me, but it got through 3DMark and I got a score; that's all I wanted.
 
Traditional overclocking has been dead for almost a decade, but number chasing is still very much a thing.

The focus now is maintaining your boost clocks for as long as possible by maximising your thermal headroom. If a CPU holds max boost until it reaches X temperature, then undervolting as far as possible generates less heat, which means the CPU takes longer to reach X and therefore boosts for longer.
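That boost-duration argument can be sketched with a toy lumped thermal model. All the numbers below are made up for illustration (they are not real CPU power or cooler figures):

```python
# Toy thermal model: the chip boosts until it hits a temperature limit,
# so anything that lowers package power -- e.g. undervolting -- stretches
# out the time spent at full boost clocks. Illustrative numbers only.
def seconds_at_boost(power_w, temp_limit_c, ambient_c=25.0,
                     heat_capacity_j_per_c=60.0, cooling_w_per_c=1.5):
    # Steady-state temperature the cooler would settle at for this power.
    equilibrium_c = ambient_c + power_w / cooling_w_per_c
    if equilibrium_c <= temp_limit_c:
        return float("inf")  # cooler keeps up: boost never throttles
    temp_c, t, dt = ambient_c, 0.0, 0.1
    while temp_c < temp_limit_c:
        # Net heating = package power minus what the cooler removes.
        net_w = power_w - cooling_w_per_c * (temp_c - ambient_c)
        temp_c += net_w / heat_capacity_j_per_c * dt  # dT = (P_net / C) * dt
        t += dt
    return t

print(seconds_at_boost(power_w=140, temp_limit_c=95))  # stock voltage
print(seconds_at_boost(power_w=115, temp_limit_c=95))  # undervolted: longer
```

The exact figures don't matter; the point is that with less heat going into the cooler, the same temperature limit is reached later, so the boost window is longer.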

Undervolting cores, memory timings, fabric/interconnect frequencies, chasing all these numbers is still very much the domain of the enthusiast, especially with AMD/Radeon hardware. It's just not as black and white as "overclock to X to get Y performance boost" any more.
 
I was never really one for chasing WRs, but one thing that really ruined it for me: someone could go to massive effort with a retail card, tweaking it and learning how to max it out to take a WR, and then someone from one of the manufacturers could come along with a factory cherry-picked core on a variant not available at retail and wipe the floor with that record without half the effort.

Personally I liked the satisfaction of having a well-tweaked, optimised system, especially if you could buy an upper-midrange part and, with some effort, get within the ballpark of top-end performance.
 
Unless you work for one of the companies, are part of a media outfit such as LTT, GN or J2C, are backed/sponsored by one of those companies, or are just particularly wealthy, it's pretty pointless using their scores on 3DMark, Cinebench et al. as anything but a guiding benchmark. They often have multiple staff and tens if not hundreds of thousands of pounds in development budget behind those leaderboard scores. There will be the odd person who can come close and even be competitive... and then they often get employed by one of these companies, and the envelope gets pushed further out of reach of normal enthusiasts.

By all means, get overclocking to squeeze out every bit of performance you can and make today's products better value for the price; it's your money at the end of the day, unlike most of the people working at these companies, who more often than not don't have any skin in the game.
 

IIRC some of the recent overclocking pics of the new TR are from inside the actual AMD CPU lab. Can't get much more support than that!
 
Overclocking across most hardware now is not as worthwhile as it used to be.

People still push their hardware, but they do it in a different way.

Before it was a case of increasing voltages and bumping up multipliers, or core speeds on graphics cards.

Now it's all about undervolting. On my 4090 I set the voltage to 0.97 V from the 1.05 V stock and get 2800 MHz on the core.

With the CPU I can undervolt it and bump clocks up to 5.8 GHz all-core.

The days of buying a 4.0 GHz CPU and bumping the voltage and clock speed up to 5.0 GHz are dead.

Hardware is pushed to the limit at factory.
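As a back-of-envelope check on why a 1.05 V to 0.97 V undervolt saves so much heat: dynamic switching power scales roughly with voltage squared at a given clock (P ≈ C·V²·f). This is only a rule of thumb — leakage and real boost behaviour complicate it — but the arithmetic is simple:

```python
# Rough dynamic-power scaling: at the same clock, power goes with V^2,
# so a modest voltage drop gives a disproportionate heat reduction.
v_stock = 1.05  # stock core voltage (volts)
v_uv = 0.97     # undervolted core voltage (volts)

power_ratio = (v_uv / v_stock) ** 2
print(f"power at same clock: {power_ratio:.0%} of stock")  # ~85%
```

About a 15% cut in switching power for the same clocks, which is exactly the thermal headroom that lets modern cards hold their boost bins.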
 
I've noticed this.

Looking for upgrade options for a 2019 build - 9600K pre-binned at 5.0 GHz / RTX 2070. Previously had a Q6600 too, I think.

Naturally I've looked for the latest bang-for-buck 'Q6600 but overclocked' option. I can't keep up with the constant generational changes in CPUs, but it seems Intel have done the overclocking for us now.

Cost of hardware is alarming now too.
 
I stopped ages ago. I'll let the BIOS manage it automatically but the real world gains are so small it's not worth the time.

Over the past decade progress with gaming has just halted. Look at Starfield, it doesn't look much better than Skyrim did in 2011. The drive to upgrade as often and overclock things isn't there anymore. The coin mining craze and huge price hikes (which ofc have stuck afterwards) have killed it.
 
Spot on, and it's not like it makes something go from unplayable to playable either. My big issue is the lack of innovation in gameplay. I'm getting on a bit now and I feel like I've seen it all before. Great for the kids coming to it new, but it all feels like a rehash of old games, and if you're lucky they look a bit shinier.
 
The old days of running my Intel Pentium 200 MMX at 233 MHz and thinking WOW :cry:

My first ever CPU was a Cyrix 166 or 200 and it was crap. :(
 

That's a poor game to use as an example. Cyberpunk 2077 shows a big leap over both Skyrim and Starfield in both graphical fidelity and the computational power required to run the play environment. Game progress hasn't halted, but it has slowed down.
 
I miss the old days when overclocking a CPU was easily testable. It either worked or it failed a torture test.

Now, especially with Ryzen CPUs, a chip may pass hours of torture testing but then crash at idle, doing nothing but sitting at the desktop or other trivial stuff. Not to mention clock stretching, where you think you're getting more performance but in reality it's faking it.
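For reference, the "effective clock" that monitoring tools derive is just cycles actually executed divided by elapsed time, and clock stretching shows up as a gap between that and the reported frequency. A sketch with hypothetical counter values (not from any real tool's output):

```python
# Clock stretching illustration: the CPU reports one frequency, but the
# cycles it actually retires per second tell a different story.
def effective_clock_mhz(cycles_executed, elapsed_s):
    """Effective clock = cycles actually executed / elapsed time."""
    return cycles_executed / elapsed_s / 1e6

reported_mhz = 5800   # what the firmware claims it is running at
cycles = 5.45e9       # cycles actually retired in the interval (assumed value)
elapsed = 1.0         # seconds

eff = effective_clock_mhz(cycles, elapsed)
stretch_pct = (1 - eff / reported_mhz) * 100
print(f"reported {reported_mhz} MHz, effective {eff:.0f} MHz "
      f"({stretch_pct:.1f}% lost to clock stretching)")
```

So a rig can sit there "running" 5.8 GHz while delivering noticeably less, which is exactly why the old "it boots and passes, so the clocks are real" mindset no longer holds.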

It used to be simpler... I miss those days. Buy an average 'overclockable' CPU, optimise the cooling so you could up the voltage and gain clock speed (often spending all the money you saved on the CPU on the cooling/case/tinkering, not to mention hours of time and effort) --> more performance. It was straightforward!
 
Used to spend a fortune searching for specific steppings, just to get an extra 50 MHz. Then setting an alarm for 3am
so you could get a screenshot of that 12hr Prime95 run, with the neighbours wondering why you had a window open in the middle of January.

Now it doesn't seem worth the bother of doing anything more than setting PBO to +100 MHz and fiddling with the IF a bit. The gains are tiny.
 
I think polygons are going to die out. The computational power to process billions and multi-trillions of them is going to hit a stagnation point, and 'they' will have to come up with something else. I lived through Centipede on the ST, Amstrad, Commodore; tape to cartridge to CD to DVD, blah blah. VR isn't the answer. I don't know what it will be, but it'll probably be mind-blowing.
 