Totally false. You still run into CPU walls even at 4K, more so if you are talking raytracing, and particularly if you use an Nvidia GPU. Reviewers are still slow on the uptake & mostly incompetent, so you aren't going to see proper benchmarks on CPU RT cost, but ask the users out in the wild.
- So, if you're buying an Nvidia GPU, then the 3700X will 100% be a bottleneck, with or without raytracing (see HW Unboxed for tests).
I can tell you from my own experience with an RX 6800 & i7-6800K that in Cyberpunk the fps more than halves once raytracing kicks in (this is at 360p or so, so it's not the GPU), staying closer to 45-50 fps than 60 fps. In general, open-world games are particularly brutal on the CPU once you add RT; WD:L was also quite hard to run properly at first until they did some more work, and it's still not perfect (esp. if you start ungimping the streaming settings that are meant for consoles).
- If you're turning raytracing on, then you'll run into a bottleneck with either GPU vendor (you'll have to hunt down more data yourself, but see the tests here: https://www.pcgameshardware.de/Rayt...acing-CPU-Cost-Benchmarks-Frametimes-1371787/).
I would say that for next-gen games you'll most likely want to upgrade the CPU too, with Zen 3 as the bare minimum, if you get yourself a nice beefy RDNA 3 or other GPU. The CPU requirements will only go up from here on out. Games like Avatar? That's gonna be an absolute bloodbath with its RT foliage; heck, even TD2 already gives CPUs a very nice workout.
I am the "users out in the wild" and been happily 4k gaming with a ryzen 5 2600 (2080ti/3080ti).
I can't run GTA 5 with grass on Ultra, so I guess I should just throw it in the bin.