More than 8-10 CPU cores just for games, is it pointless?

I noticed that some games do benefit from the 10900/10850's 10 CPU cores, but the difference is small vs 8-core CPUs like the 10700K and 5800X.

None of the CPU benchmarks I looked at showed any advantage in gaming for CPUs above 10 cores.

So, should gamers stick to 8-10 core CPUs for now, then save their money for the AMD 5nm / Intel 10nm CPUs due in the next 1-2 years? Surely that would be much better bang for buck?

Also, PS5/Series X games are optimised/designed for 8 CPU cores, so I'm doubtful most developers will bother optimising for 10 or more CPU cores on PC ports.
 
Virtually every new generation brings performance bonuses. 8 cores is a good shout as the consoles are now 8 cores, so games that can make use of them eventually will.
Beyond that, most titles will want clock speed and IPC over cores.
Any modern high-end processor will happily play games for years; if you have one, stick with it and add a better GFX card rather than touching the processor at all.
 
Tbf, I think some strategy games like TW: Warhammer 2 may use 12-16 CPU cores in scenes with lots of units in them, particularly in large 4-player matches.
 
I noticed a difference in my current game, No Man's Sky, going from a 4.9GHz 8700K, with my 1080 Ti @ 3440x1440. GPU load and power went up, 140 watts to 200, and it felt smoother, obviously rendering more frames. I have an old tablet which I run with Remote System Monitor, and noticed good utilisation of threads in No Man's Sky, which surprised me.
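
If anyone wants to eyeball per-thread load without a spare tablet, a quick Python sketch along these lines does the job (assumes the psutil package is installed; the 50% "busy" threshold is an arbitrary choice of mine):

```python
# Minimal per-core load logger (needs: pip install psutil).
# Run it while the game is active to see how many logical cores get used.
import psutil

for sample in range(10):  # ten one-second samples
    # percpu=True returns one utilisation figure per logical core
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    busy = sum(1 for load in per_core if load > 50)  # arbitrary threshold
    print(f"sample {sample}: {busy} cores over 50% | "
          + " ".join(f"{load:4.0f}%" for load in per_core))
```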

It's not all about clock speed. Intel's been well and truly left breathing fumes by the latest AMD chips.
 
Well yeah, that's the raw clock speed advantage I was talking about. As much as I love Ryzen, I still believe an 8-core Intel CPU like the x900Ks is the best chip for pure gamers. But Ryzen is just so good in other areas like editing, rendering etc.
 
My son has just inherited my old 2700X and 2080Ti. Should do better than the consoles for quite some time.

Gave me an excuse to get a 5950X for myself, so the more cores the merrier in my book. Seems to boost just over 5GHz fresh out of the box, but I'll eventually get around to looking into PBO2 to eke a little more out of it.

Should do me easily for the next 5 years with just a GPU upgrade somewhere along the way.
 
I've heard this said a lot about 4K resolution. It's not really true: games with high CPU utilization at 1080p will still have high CPU utilization at 4K, provided the graphics card is powerful enough to cope with the higher res. CPU utilization is generally only reduced if the GPU can't cope (running at lower framerates).

This is even more true with DLSS 2.0 enabled at 4K, which can drastically reduce the load on the GPU (it's similar to running at a lower display resolution) and likely increases CPU utilization. GPU utilization can usually be lowered quite a bit just by tweaking graphics settings too (if needed).

Higher-clocked CPUs with more cores (8-10 CPU cores, but maybe more in the future) tend to deliver higher minimum framerates, especially when combined with high-speed RAM.
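
To put the same point in numbers, here's a toy model (all figures made up for illustration, not benchmarks of any real game): the achieved framerate is capped by whichever of the CPU or GPU is slower per frame, and CPU load only drops when the GPU cap is the lower one.

```python
# Toy model of the CPU/GPU bottleneck point above. All figures are
# made-up illustrations, not measurements from any real game.
def achieved_fps(cpu_fps_cap, gpu_fps_cap):
    return min(cpu_fps_cap, gpu_fps_cap)  # the slower component sets the framerate

def cpu_load(cpu_fps_cap, fps):
    # CPU load scales with how close the game runs to the CPU's own cap
    return 100 * fps / cpu_fps_cap

cpu_cap = 140  # frames/s the CPU could feed on its own (assumed)
for res, gpu_cap in [("1080p", 220), ("4K", 70), ("4K + DLSS", 130)]:
    fps = achieved_fps(cpu_cap, gpu_cap)
    print(f"{res:10s} -> {fps:3d} fps, CPU ~{cpu_load(cpu_cap, fps):.0f}% loaded")
```

Same CPU in all three cases; it only looks "less utilized" at native 4K because the GPU is holding the framerate down.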
 
I think we're still in the remnants of game engines designed for lower core counts. It will take a few years of the 8-core consoles to push the underlying engines forward to where they can truly make the most of higher core counts, and so create the need for higher-core CPUs.

Diminishing returns on core count is not limited to games either. You see it in design/production applications as well, such as the Adobe suite. Puget Systems do a number of articles which often show the benefits of extra cores tailing off above 8.

For the OP question, 6 will get you by a couple of years, 8 has a bit more insurance, but for just games I don’t see there being much benefit beyond it. As mentioned, the GPU tends to quickly be the limiting factor ... followed closely by the display refresh rate.
 
6-8 new(ish) cores would be best for now. Save the money instead of buying more cores you likely won't use, and upgrade to 2nd or 3rd gen CPUs on the next DDR5 platforms; that will likely get you a 30-40%+ gain.
For the OP question, 6 will get you by a couple of years, 8 has a bit more insurance, but for just games I don’t see there being much benefit beyond it. As mentioned, the GPU tends to quickly be the limiting factor ... followed closely by the display refresh rate.
You'd already be at 8ms with 120Hz and 6ms with 165Hz; going higher is almost pointless unless you play at 720p or something and can maintain 200+ FPS. Who would go 480Hz when you'd have to maintain that in FPS for 2ms? Gets really stupid. 1440p and 120/144/165Hz is the sweet spot if you care about speed and quality tbh, and even then you can't maintain that kind of performance in newer games even with a high-end GPU.
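
The frame time maths is just 1000ms divided by the refresh rate, which shows how fast the returns shrink:

```python
# Frame time per refresh cycle: 1000 ms / refresh rate.
for hz in (60, 120, 144, 165, 240, 480):
    print(f"{hz:3d} Hz -> {1000 / hz:.1f} ms per frame")
```

60 to 120Hz saves you over 8ms a frame; 240 to 480Hz saves barely 2ms, and you'd have to hold 480 FPS to see any of it.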
 
Games which utilise more than 8 threads are the exception, but they are out there. Most games so far don't really utilise beyond 6 cores / 12 threads, but take CP2077 for instance: while it mostly doesn't demand it, there are areas where 8 cores / 16 threads provides a small advantage (it doesn't really need 16 threads, but it does need at least 8 real cores and around 10-12 threads in those circumstances, or you will see GPU utilisation drop from mid-to-high 90s to low 80s).
 
For modern games I'd go for 8 cores minimum these days, especially as that's what the new generation of consoles have; anything more is just nice to have for the moment.

10+ cores will be used in the future. Remember when 4 cores were "pointless"?

I remember when lots of people were using Core 2 Duos and saying they were way better for gaming than running a quad-core Q6600. That didn't work out well not long after, once games went multithreaded on top of running the OS and all the crap you had in the background. That old quad core way outlived those dual cores, no matter how high you clocked them.
 
I've heard this said a lot about 4K resolution. It's not really true: games with high CPU utilization at 1080p will still have high CPU utilization at 4K, provided the graphics card is powerful enough to cope with the higher res. CPU utilization is generally only reduced if the GPU can't cope (running at lower framerates).
While CPU utilisation may be the same at 4K, the difference in FPS between the lower and higher end CPUs is smaller.

At 1080p it's more prominent and you need a fast CPU; check out Cyberpunk.


https://www.tomshardware.com/amp/news/cyberpunk-2077-cpu-scaling-benchmarks
 
Well, he does have a point, as it's more to do with the fact that FPS is lower at 4K than with the resolution itself. The same could be said if you were playing at 1080p but simply had a weaker GPU.
 
A PC will use any number of cores to some extent; the only reason they seem superfluous to gaming is that the bottlenecks tend to be in single-threaded workloads.
The problem is, and always has been, that certain software functions are difficult to run in parallel, so the additional work more cores can do gets wasted waiting on bottlenecks.

More cores will at the very least free up the system so more time slices go to the main application, although 8-10 cores are certainly not necessary at the moment.
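
That's essentially Amdahl's law: if only part of the per-frame work can be spread across cores, the serial part puts a hard ceiling on the speedup no matter how many cores you throw at it. Quick sketch below; the 70% parallel fraction is just an assumption for illustration, not a measurement of any real engine.

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
# fraction of the work that parallelises and n is the core count.
def speedup(p, n):
    return 1 / ((1 - p) + p / n)

p = 0.7  # assume 70% of frame work parallelises (illustrative only)
for cores in (2, 4, 8, 16, 32):
    print(f"{cores:2d} cores -> {speedup(p, cores):.2f}x speedup")
```

With that assumption, 8 cores get you about 2.6x and 32 cores only about 3.1x, which is the diminishing-returns curve in a nutshell.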
 