There seems to be at least one thorough analysis of Radeon Image Sharpening rendering at 0.7-0.8x of 4K and then upscaling, with apparently excellent preservation of image quality and a very decent bump to frame rates.
I'm wondering how lowering the render resolution and relying on RIS works at 1440p - or, more specifically, at 3440x1440, which shares 1440p's DPI but is getting on toward 4K in the sheer number of pixels to push.
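For a rough sense of the numbers, here is a quick back-of-the-envelope comparison - assuming the 0.7-0.8x scale is applied per axis, which is how in-game resolution scale sliders are usually expressed:

```python
# Rough pixel-count comparison for the resolutions discussed above.
resolutions = {
    "2560x1440 (16:9 1440p)": (2560, 1440),
    "3440x1440 (ultrawide)": (3440, 1440),
    "3840x2160 (4K)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels")

# What 3440x1440 would render at before RIS sharpening/upscaling,
# assuming the scale factor applies to each axis.
for scale in (0.7, 0.8):
    w, h = int(3440 * scale), int(1440 * scale)
    print(f"{scale:.0%} scale: {w}x{h} ({w * h / 1e6:.2f} megapixels)")
```

By that math, 3440x1440 pushes roughly a third more pixels than 16:9 1440p, and an 80% per-axis scale would bring the render load back down below standard 1440p.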
I know I'm not alone in asking this, having seen the question posed and go unanswered elsewhere.
I was running a 1070 Ti with my 100 Hz 3440x1440 monitor. While I know the 5700 XT is more powerful than that card in all but a small minority of games and settings, the ability to turn 65 fps into 75-80 fps with no real loss of image quality would be a game changer in my opinion, and would no doubt put the 5700 XT ahead of the 2070 Super/2080. Yet comments about this possibility are almost non-existent, and where the question does come up, the responses tend to be about FidelityFX on Vega cards, which RIS is not.