Any game can cripple any GPU. Hairworks and RTX are the two obvious culprits. I've been gaming at 4K since I got a 780 Ti.
Try running RDR 2 at 4K on that 780 Ti at native resolution.
I played on triple 1080p screens for a while as well (I prefer the larger FoV and can always fall back to a single 1080p screen when performance isn't enough or the game doesn't support multiple displays), and that's about 75% of 4K. There are quite a lot of new and relatively older games that require tweaking to various degrees to keep them stable at 60fps - and by stable I mean no dropping into the 50s or even lower, no bouncing between 50 and 70fps and calling it 60. And no, no "dirty" Hairworks (aka The Witcher 3) or RT of any kind. For instance, RDR 2 has the majority of its settings on low or disabled. Never mind "medium" across the board, as that would dip constantly into the 50s (and perhaps lower).
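(For reference, that ~75% figure is just a pixel-count ratio; here's a minimal sketch in Python working it out, assuming nothing beyond the standard 1080p and UHD resolutions.)

```python
# Pixel-count comparison: three 1080p screens vs a single native 4K (UHD) display.
triple_1080p = 3 * 1920 * 1080   # 6,220,800 pixels across three screens
uhd_4k = 3840 * 2160             # 8,294,400 pixels

print(f"Triple 1080p: {triple_1080p:,} px")
print(f"4K UHD:       {uhd_4k:,} px")
print(f"Ratio:        {triple_1080p / uhd_4k:.0%}")  # -> 75%
```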
Anthem, Crysis 3, Deus Ex MD, Just Cause 4, Kingdom Come Deliverance (if I remember correctly), Metro Exodus (without RT), Tom Clancy's The Division 2 (and the first one, I think), Watch Dogs 2 and Quantum Break (I think) are just some that require various adjustments - some minor, some significant (as in "it bothers me that I have to lower settings that much" or the difference is clearly noticeable). Increase the resolution further to 4K and the drop becomes even more significant, to the point where it would probably bother me a lot.
Sure, to each their own, some won't mind 30-40fps or even lower - after all, there are millions of people playing games that drop into the 20s and don't have the best image quality (plenty of examples on consoles), or who just drop settings as low as needed, so wanting a constant 60fps at relatively high settings (not maximum, mind you!) may be seen as... ridiculous?
Still, in my book, if it's X resolution at Y fps, then that Y fps has to be maintained and frame rates should not drop lower. Everything else is just marketing.
Can the next consoles do 4K@60fps now, at native resolution? Sure, for the most part it should be no problem, but in the future... well, that's another story.