As a 4K owner ever since 4K was a thing, I can confirm this to be accurate. I use 4K because my main reason for a PC is gaming, with a significant amount of Photoshop thrown in, so 4K at a decent screen size is a must for texture creation. Prior to VRR you needed to get as close to 60 FPS as possible for smooth gameplay, or put up with tearing and/or stuttering.
- When the 980 Ti was top dog, it could not push The Witcher 3 anywhere near a 60 FPS average, even with reduced settings.
- When the GTX 1080 was the top GPU, Deus Ex: Mankind Divided brought it to its knees. The 1080 Ti helped a bit, but the same game was still nowhere near a 60 FPS average on ultra settings.
- The 2080 Ti was massively overpriced but allegedly the first true 4K 60 Hz GPU. Except you now had ray tracing to contend with (once some actual RT games were eventually released).
- The RTX 3080, and even the 3090, can't push Cyberpunk 2077 with RT, or some other recent non-RT titles, to 60 FPS at 4K ultra.
So there is always some new game out there that destroys the "top end" GPUs at 4K.
When driving the latest and most demanding games at 4K, it has always been necessary to reduce settings to achieve playable frame rates. Honestly, the biggest impact on 4K gaming has not been GPU power but the introduction of VRR. The 4K 32" FreeSync screen I use has a 33-60 Hz VRR range. Without VRR, even 45-50 FPS can be unplayable; with VRR, even the mid 30s can be playable. The quick sketch below shows why that range matters.
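Just to illustrate the point, here is a minimal sketch (the 33-60 Hz window is my monitor's spec; the frame rates are made-up example values) of checking whether a given frame rate sits inside the panel's VRR window, i.e. whether the display can refresh in step with each frame instead of tearing or stuttering:

```python
# Illustrative only: a frame rate is "covered" by VRR when it sits inside
# the panel's refresh window (33-60 Hz for my FreeSync monitor).
VRR_MIN_HZ = 33
VRR_MAX_HZ = 60

def vrr_covers(fps: float) -> bool:
    """True if the panel can sync its refresh to this frame rate."""
    return VRR_MIN_HZ <= fps <= VRR_MAX_HZ

# Example frame rates (hypothetical values, not benchmarks).
for fps in (30, 36, 45, 50, 65):
    frame_time_ms = 1000 / fps
    status = "synced by VRR" if vrr_covers(fps) else "outside VRR range"
    print(f"{fps:>3} FPS ({frame_time_ms:.1f} ms/frame): {status}")
```

Anything in that window, even mid-30s FPS, gets a matching refresh; drop below 33 and the panel falls back to fixed-rate behaviour, which is where the stutter comes back.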
So for me the next big thing in 4K is not higher refresh rates (they would still be nice, of course), but HDR.