@Terrorfirmer
You slap a 2080 Ti on a 4.2GHz overclocked Ryzen 2600X vs a 2500K at 4.5GHz across every game in Steam's library and you'll find 95% will favour Intel and its faster core speed and IPC.
Latest games... there are hundreds released every day that are still only single/dual core. What you meant to say is recent AAA titles that are Vulkan/DX12 supported or have good multi-core coding under DX11.
Going 2600X at 1080p to use a 144Hz screen will get you ZERO % increase in most games, especially with your current GPU. Upgrade the GPU first, check CPU usage, and wait till the Ryzen 3600 is released.
Hundreds of videos out there showing Intel 4-cores beating Ryzen 1000/2000 series at 1080p gaming.
Sorry, but this is absurdly off-base. I think we can all take as a given that when someone comes asking for advice about a new gaming PC, they're not talking about playing CS 1.6 or Chip's Challenge.
A 2500K at 4.5GHz is no match for a Ryzen 2600X in modern games. It's not even close, so to say that you'd get zero increase in most games on a 144Hz monitor is really misinformed, sorry. Have you ever tried to play Battlefield 1 or 5 on 64-player servers, for example? The 2500K runs like total crap (dips to 30-35fps) while the 2600X runs like butter, because extra cores and threads are exactly what many modern games are demanding. Assassin's Creed is another CPU cruncher.
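To put rough numbers on why those dips matter on a 144Hz panel, here's the frame-time arithmetic as a quick sketch. The 30-35fps figures are my recollection from playing, not a formal benchmark:

```python
# Rough frame-time arithmetic, assuming the illustrative fps figures above
# (30-35 fps dips on the 2500K, 144 Hz target) rather than measured data.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available (or taken) per frame at a given fps."""
    return 1000.0 / fps

target_144hz = frame_time_ms(144)  # ~6.9 ms per frame to hold 144 fps
dip_35fps = frame_time_ms(35)      # ~28.6 ms per frame during a 35 fps dip
dip_30fps = frame_time_ms(30)      # ~33.3 ms per frame during a 30 fps dip

print(f"144 Hz budget : {target_144hz:.1f} ms/frame")
print(f"35 fps dip    : {dip_35fps:.1f} ms/frame ({dip_35fps / target_144hz:.1f}x over budget)")
print(f"30 fps dip    : {dip_30fps:.1f} ms/frame ({dip_30fps / target_144hz:.1f}x over budget)")
```

In other words, a dip to 30-35fps means each frame takes roughly four times the budget a 144Hz refresh gives you, which is exactly why it feels like stutter rather than a small fps loss.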
I've personally played some modern games on a 2500K at 4GHz and some were insufferable with the bottlenecking and bad minimum frames. Obviously I'm not saying "every single game made since Asteroids will run way better", but to suggest that an OC'd 2500K from 2011 is just as good as a Ryzen 2600X in modern games, or that the latter isn't a worthwhile upgrade, is total nonsense.
But more than likely a Ryzen 3600X will perform better than your current 4c i5.
Honestly, you're out of touch with reality if you think a Ryzen 3600X will only 'likely' perform better than an i5 from 2011 in modern games. The 2600X already destroys it, and the 3600X will be what, 20% faster than that?
Compare modern, demanding, popular games and they aren't even in the same league. Obviously in games from 2013/2014 the i5 is fine. In the latest mainstream titles, the 2600X is significantly faster, and in some cases the i5 cannot cope well at all. To suggest they're the same is ludicrous.
I don't know if posting links is allowed, but it's easy to find benchmarks online comparing CPUs in modern games. Take the 4c/4t i3-8350K with DDR4, which is easily better than an OC'd 2500K: according to Techspot's testing with a 2080 Ti in BFV on 64-player servers, it's still almost 50% slower than a 2600X at stock.
And that's a modern, 8th-gen 4c/4t part with DDR4 RAM. I would wager Assassin's Creed would tell you the same thing. Obviously some games are more sensitive than others, but on a 144Hz monitor there's a vast gulf between a 2500K and a Ryzen 2600X, and most certainly the upcoming 3600X.
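Just to make the arithmetic behind those percentages explicit, here's a back-of-the-envelope sketch. It normalises the i3-8350K to 100, reads "almost 50% slower" as roughly 45%, assumes the OC'd 2500K is at best on par with that i3, and uses my speculative ~20% guess for the 3600X, so it's an illustration of the scaling, not benchmark data:

```python
# Back-of-the-envelope relative fps, using the rough figures quoted above.
# The 45% gap and the 20% uplift for the 3600X are assumptions, not measurements.

i3_8350k = 100.0                  # normalise the i3-8350K's average fps to 100
oc_2500k = i3_8350k               # assume an OC'd 2500K is at best on par with the i3
r5_2600x = i3_8350k / (1 - 0.45)  # "almost 50% slower" -> ~45%, so the 2600X lands near 182
r5_3600x = r5_2600x * 1.20        # speculative ~20% uplift over the 2600X

for name, fps in [("OC'd 2500K (at best)", oc_2500k),
                  ("Ryzen 2600X", r5_2600x),
                  ("Ryzen 3600X (guess)", r5_3600x)]:
    print(f"{name:22s} ~{fps:.0f} (relative to i3-8350K = 100)")
```

However you shade those numbers, the ordering doesn't change: the hex-core Ryzen parts sit far ahead of any 4c/4t Sandy Bridge i5 in that kind of workload.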
I work with PCs, so I've personally played on all sorts of machines; at this point I've used the vast majority of consumer CPUs imaginable with all combinations of GPUs. For the last two years or so I've found 2nd/3rd-gen i5s and all FX processors intolerable due to bad minimum frames in the likes of Battlefield and Assassin's Creed titles. Obviously they're OK (or at least >60fps) in some other titles, but they're a mile off the reliability and consistency of Ryzen hex-core parts.
Anyway, the bottom line is that the 2600X destroys the 2500K; to suggest they're somehow on par in 2019 is way off. It's not even close. Throw a 144Hz monitor into the mix and it's even worse, entering chalk-and-cheese territory.