In terms of gaming performance, there's nothing out there that's significantly faster than a 2500K, which is disappointing really. Other chips may pull ahead once games that can actually utilise 6 or 8 cores arrive, but for the time being, no.
It really does depend on the game, the number of GPUs and the settings you're running at.
For most games and most people, though, it's definitely true.
I've yet to see any benchmarks showing a CPU at the 2500K's level becoming the bottleneck in games before the graphics card does in any meaningful way. When the following happens, I'll be upgrading...
Crysis 4 with a GTX 1080 and a 2500K - 30fps
Crysis 4 with a GTX 1080 and a 6500K - 60fps
But I just can't see it this console generation, 8 cores or not.
I think we will; however, a lot of benchmarks tend to ignore 2500K-era chips and only test the more recent ones.
But as above, it depends on the games. This new console generation has barely started; we've had the first wave of games, and the next are still in development. I expect we'll start to see changes fairly soon.
The CPU is responsible for game logic, physics and shunting data about from RAM, but the vast majority of the work is done by the GPU, hence why 2500Ks are still going so strong even with multiple generations of graphics cards released since.
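To make that division of labour concrete, here's a minimal toy sketch of one frame. All the names are hypothetical (this is not a real graphics API): the CPU side runs game logic and physics and builds a list of draw commands, and the "GPU" just consumes whatever was submitted.

```python
# Toy model of a single frame. The CPU does logic/physics and prepares
# draw commands; the "GPU" (here just a function) executes them.
# Everything here is illustrative, not a real engine or API.

def cpu_frame(entities, dt):
    """CPU-side work: game logic, physics, and command generation."""
    commands = []
    for e in entities:
        e["x"] += e["vx"] * dt                        # trivial physics step
        commands.append(("draw", e["mesh"], e["x"]))  # data shunted toward the GPU
    return commands

def gpu_execute(commands):
    """Stand-in for the GPU: rasterise whatever the CPU submitted."""
    return len(commands)  # pretend each command becomes pixels

entities = [{"mesh": "crate", "x": 0.0, "vx": 1.0},
            {"mesh": "barrel", "x": 5.0, "vx": -0.5}]
cmds = cpu_frame(entities, dt=0.016)
print(gpu_execute(cmds))  # 2
```

The point of the sketch: the CPU's per-frame cost scales with logic/physics and command submission, while the heavy lifting (what `gpu_execute` stands in for) scales with resolution and shader work, which is why GPU upgrades keep paying off on an old CPU.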
Improved physics will hit the CPU, but even then we're seeing it offloaded to the GPU to an extent with PhysX etc., so who knows.
It depends. The CPU tells the GPU what it needs to do (in simple terms). That's the point of Mantle and DX12: reduce the CPU overhead per submission, bringing minimum framerates up on PCs that are using weaker CPUs.
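A hedged back-of-the-envelope model of that overhead argument (the numbers are made up purely for illustration): a "thinner" API charges the CPU less time per draw call, so the same scene costs a weak CPU far less, which is where the improved minimums come from.

```python
# Illustrative only: model per-draw-call CPU cost. A low-overhead API
# (the Mantle/DX12 pitch) charges less CPU time per submission.
# All figures are invented for the sake of the example.

def cpu_cost(num_draw_calls, overhead_per_call_us, work_per_call_us=2):
    """Total CPU microseconds spent submitting one frame's draw calls."""
    return num_draw_calls * (overhead_per_call_us + work_per_call_us)

scene = 5000                                            # draw calls per frame
thick_api = cpu_cost(scene, overhead_per_call_us=10)    # high-overhead driver path
thin_api = cpu_cost(scene, overhead_per_call_us=1)      # low-overhead driver path

print(thick_api, thin_api)  # 60000 15000
```

Same scene, same GPU work; only the per-call tax changes, which is why the benefit shows up mostly on CPU-limited systems rather than in average framerates on fast chips.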
With regards to PhysX, it really is an unfortunate situation. It's barely used at all. Very few games actually offload physics to the GPU, because nVidia are more interested in using it as a checkbox feature than actually putting some real work into it.
Plus, its proprietary nature means very limited take-up from developers anyway: no developer can fully integrate GPU PhysX and make a PhysX-capable GPU a minimum requirement without damaging sales, as anyone without an nVidia GPU couldn't play the game.
So devs are limited to using it for tacked-on features that aren't game-changing.
I hope to still have this chip in 2020 tbh. I overclocked it to 4.6GHz stable, but then put it back to stock because I realised there was no point lol. When games come out that challenge it, I'll take it back up to 4.2GHz... then 4.6GHz... then see about an upgrade.
I suppose it's one of those things, even if it's bottlenecking, if the performance you're getting is satisfactory, then it doesn't matter all that much.
Wtf is with the walls of text from spoffle and Dave?? Grow up kids!
Yeah, no. That's also offensive to kids; a lot of kids are actually happy to learn and take in information rather than completely rejecting it just because they don't like or understand it.
Dave didn't want to listen to how games are actually developed. I see people complain about consoles and blame them for PC games not being more advanced than they currently are, as well as moaning and whinging about "console ports".
So I addressed it and pointed out to him that games aren't made on consoles and then converted from the console version to run on PC. The guy didn't want to know, and kept insisting that his completely mistaken idea was how it was done.
This is something everyone should be bothered about: when the majority of people think buggy games are the consoles' fault, the people responsible for shipping buggy games don't have to take responsibility, because very few people are calling them out on it.
So the more people that realise how games are ACTUALLY developed, the better; they'll stop blaming consoles for crappy builds of games on PC and start blaming shoddy developers like Ubisoft for shipping crap.