The real test is open-world games; the rest can more easily be fudged (the corridor-like progression of Gears et al. in particular). When we do look at OWGs, we see a big gulf between PC and consoles, because everything gets stressed: GPU, CPU, memory, storage, the software itself. And that's without even adding RT, which will absolutely keep the hardware on its knees and barely let it breathe. If you want 60 fps you can't stay on consoles, because even nuking the settings it's going to be tough to maintain, particularly as more productions start making RT mandatory (Avatar: Frontiers of Pandora, Jedi: Survivor, etc.). That means medium/high raster settings plus a half-nuked RT implementation compared to PC, and hopefully a stable 30 fps, usually upscaling from 1080p-ish.
On the other hand, look at something like the 4070: it's rumoured to be ~40 TFLOPS (FP32), roughly 4x the PS5 (which is itself already more than 2x the actual baseline, the Series S). Raw TFLOPS alone don't make it 4x faster, but factor in how much faster RT runs on Nvidia, plus DLSS (which has a huge performance advantage over standard TAAU/FSR 2.0), and it's easily going to run 3x faster at a minimum. Sure, assuming the MSRP stays at $499, it's not going to be cheaper than a console, but considering how much faster it is and the cost savings on PC overall... it's tough to stay committed to consoles if you're not computer-illiterate, and that's only two years in! Given how much longer console generations run now, it's better to invest in a PC than a console (hell, you'll be able to re-use everything in the PC for a "next-gen PS6" build besides the CPU/mobo/GPU).
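To put rough numbers on that raw-throughput gap, here's a minimal back-of-the-envelope sketch in Python. It assumes the rumoured ~40 TFLOPS figure for the 4070 from above plus the published FP32 specs for PS5 (~10.3 TFLOPS) and Series S (~4 TFLOPS); the ratios are illustrative only, not real-world game performance.

```python
# Back-of-the-envelope FP32 throughput comparison (illustrative only).
# PS5 and Series S values are published specs; the 4070 figure is the
# rumoured ~40 TFLOPS mentioned above, not a confirmed spec.
GPU_TFLOPS = {
    "Series S (baseline)": 4.0,
    "PS5": 10.3,
    "RTX 4070 (rumoured)": 40.0,
}

baseline = GPU_TFLOPS["Series S (baseline)"]
ps5 = GPU_TFLOPS["PS5"]

for name, tflops in GPU_TFLOPS.items():
    print(f"{name}: {tflops:.1f} TFLOPS = "
          f"{tflops / baseline:.1f}x Series S, {tflops / ps5:.1f}x PS5")

# Note: raw TFLOPS ignore RT-core throughput, DLSS vs TAAU/FSR 2.0,
# memory bandwidth, etc., so real-game gaps will differ.
```

Running that gives roughly 2.6x Series S for the PS5 and about 3.9x PS5 / 10x Series S for the rumoured 4070, which is where the "~4x the PS5" shorthand comes from.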