Aside from the fact that FSR is still very young on the market (Intel's solution will soon have the same problem) and that it took NVIDIA roughly 3 years to get where they are now - it's still very subjective. Apart from Cyberpunk 2077 (which I only NOW started playing, because only now does it feel playable), I have zero games on my 6800XT that would actually need DLSS to run properly at 1440p ultrawide (so not far from 4K). CP2077 could really use FSR too (CAS is nice, but definitely not the same quality) - though dropping from Ultra to Very High keeps it consistently above 60 FPS with minimal loss in visual fidelity (as in, I can't see a difference without a magnifying glass on still frames).
People tend to forget that "Ultra" settings in games are added "for the future", not for the current generation - that's nothing new, it's been like that for many years. In many games it kills FPS with hardly any visual difference, just so you can come back in a few years on a new GPU, say "Nice, I can finally run it on Ultra!", and then stop playing after 2 minutes, because by then that Ultra usually looks nothing compared to the "Very High" of the AAA games current at that point.
EDIT: RT is a different matter, but I simply don't care about it in its current state - maybe in 10-20 years, once it matures, games stop resembling highly polished mirrors, and GPU performance catches up with its requirements.