Now listen here...
I've always said, and will still say, that I use DLDSR where appropriate. Hogwarts, at the time I was piling hours into it, was in a bad state engine-wise, and actually still is, having recently reinstalled it twice to see what's new. It's an engine where I felt DLDSR made no meaningful difference because it's so broken: the High settings look better than Ultra, ray tracing is basically still broken, and the traversal stuttering continues. So pixel peeping DSR res was the least of my concerns at the time.
Cyberpunk, on the other hand, looks superb with or without DLDSR, so I favour 3440x1440 DLSS Quality over DLDSR 5160x2160 DLSS Performance. My screenshots are proof of the image quality, especially since the 2.12 update, which added some additional engine tweaks for path tracing that cleaned up more noise and such.
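For anyone wondering why that trade-off isn't obvious, here's a quick sketch of the internal resolution each path actually renders at before upscaling. The scale factors are NVIDIA's usual published DLSS ratios (Quality ~0.667, Performance 0.5); individual titles can tweak these, so treat the numbers as ballpark:

```python
# Rough internal render resolutions for the two configurations above.
# Assumed DLSS scale factors: Quality ~2/3, Performance 1/2
# (NVIDIA's standard ratios; some games deviate slightly).

def internal_res(out_w, out_h, scale):
    """Resolution DLSS actually renders at before upscaling to the output."""
    return round(out_w * scale), round(out_h * scale)

print(internal_res(3440, 1440, 2 / 3))  # ~(2293, 960)  native + DLSS Quality
print(internal_res(5160, 2160, 0.5))    # (2580, 1080)  DLDSR + DLSS Performance
```

So the DLDSR route actually renders a touch more pixels internally, but it then runs through a longer upscale-and-downsample chain, which is why which one looks better genuinely comes down to the game.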
The reason for such a high render res in TotK is an engine/emulator quirk. The game's shadow resolution can be increased to 8192 via mods, but it's unstable because the Switch was never designed for such a high memory allocation for shadow maps, and the emulation is basically cloning what a Switch does with some tweaks. Shadow resolution is also internally tied to render resolution, so the higher your render res, the less jagged shadow edges look at a distance. At native 3440x1440 there is visible stepping on the edges of shadows out in the world regardless of what AA/AF etc. is used. The only way to smooth this out is to render at 2x native or greater, hence 6880x2880. It's insanely demanding on the PC, as evidenced by the RAM usage, but it can hit 60fps, which is all that's needed for console emulation; any more and you run into cutscene timing issues, which isn't ideal, lol.
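To put the memory side of that in perspective, here's a back-of-envelope for square shadow map cost. The 4-bytes-per-texel depth format and the 1024 stock baseline are assumptions for illustration, not something I've verified against the game:

```python
# Ballpark memory cost of a square shadow map at 4 bytes per texel
# (a common 32-bit depth format; the game's real format is an assumption).

def shadow_map_mib(res, bytes_per_texel=4):
    return res * res * bytes_per_texel / (1024 ** 2)

print(shadow_map_mib(1024))  # 4.0 MiB   - a plausible stock Switch-class budget
print(shadow_map_mib(8192))  # 256.0 MiB - the modded value above
```

A 64x jump in a single buffer, on hardware that never budgeted for it, is exactly the kind of thing that makes an emulated title unstable.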
As the Ryujinx devs say too, anything above 2x is really overkill for any system, as 2x cleans things up nicely.
So yeah, I use higher render res based on context, where it matters most
And exhale!