AMD Radeon Image Sharpening

Well, I've had a quick search and I'm not sure anything answers my actual question. I'm currently running a 28" 4K monitor, and I've owned my 5700XT for a few days, but I only noticed this RIS thing last night and wondered what it was... started reading into it and it seemed too good to be true. This could be a very stupid question, so please excuse my ignorance, but the main article I read seemed to imply I just reduce the texture quality to around 80% rather than the actual res of the game.

https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/

I get that RIS can sharpen everything, but when they state it can bring detail levels up from a lower res, what is "lower res"? Do they mean the texture res or the actual game output res?

For example, in Battlefield 5, when I run native 4K on ULTRA (without AA) I'm getting around 60fps. So, obviously wanting more fps without quality loss, when using RIS do I literally:

a) Run the game at a lower res, or
b) Is it as I think it is: I keep the actual output of BF5 at 3840×2160 but reduce the textures to, say, 83% (3187×1793) and RIS upgrades the textures?

I'm assuming it's b). Anyway, last night I tried option b) with the textures at 83% in BF5 and the sharpness left at the Radeon default of 80%, and I got around 25-30fps more consistently and couldn't really notice any reduction in the quality of the game when running around. Normally when you reduce the texture quality everything starts looking a bit crap...

I know this sounds like a stupid question, but I wanna make sure I've got it straight in my head... it might be that you can do a combination of a) and b). As long as I know it'll work either way, I can mess around until I have a picture that's as close to pure 4K as possible with the highest FPS.

If I'm not being thicker than a thick thing, and it's working as I think, i.e. b), then wow. Like I say, I was running around and just couldn't notice any difference in quality tbh, YET I was in the high 90s fps as a billy bonus. I'm just wondering why more people (unless once again I'm being thick) aren't shouting from the rooftops about this, as it "seems" to be one of the best inventions in graphics for a while, something that actually works!
 
I think it's for when you'd usually run 1440p: drop it to 1080p and you can use IS to bump the quality back towards 1440p, with sharp textures instead of blurry ones. Not sure if it affects UI or 2D stuff as much though.
 
There's a lot to say, honestly, and there's a lot of confusion in your post, because the truth is a lot of things are obfuscated and even sites like TechSpot are generally clueless and don't bother getting things right. I'll try to summarise and give you the useful info.

First, RIS itself. It's nothing more than a sharpening filter, but with the benefit over other sharpening filters that it keys off local contrast to decide how much to sharpen, which means fewer side-effects (ringing artifacts etc.), and it runs with very little performance cost. It does NOT add detail and it canNOT compensate for resolution loss, but in terms of PERCEPTION it can help with the loss of visual clarity when running games at lower resolutions, or when games have very heavy temporal AA solutions.
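To make the "contrast-adaptive" part concrete, here's a rough Python/NumPy sketch of the idea. To be clear, this is a hypothetical illustration of the technique rather than AMD's actual shader (the real thing ships as FidelityFX CAS); the constants and the exact contrast measure are placeholder choices of mine:

```python
# Simplified contrast-adaptive sharpening sketch (illustrative only,
# not AMD's CAS shader; constants are placeholders, not AMD's values).
import numpy as np

def cas_like_sharpen(img: np.ndarray, sharpness: float = 0.8) -> np.ndarray:
    """img: float32 greyscale array (H, W) with values in [0, 1]."""
    p = np.pad(img, 1, mode="edge")
    # Cross-shaped neighbourhood: centre plus up/down/left/right.
    c, up, down = p[1:-1, 1:-1], p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]

    mn = np.minimum.reduce([c, up, down, left, right])
    mx = np.maximum.reduce([c, up, down, left, right])

    # Adaptive term: ~1 in flat, low-contrast areas, ~0 where local
    # contrast is already high or values are near clipping. This is
    # what suppresses the ringing/halos a fixed unsharp mask produces.
    amp = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / (mx + 1e-5), 0.0, 1.0))

    # Negative weight on the neighbours = sharpening; magnitude scales
    # with both the user sharpness setting and the adaptive term.
    w = -amp / (8.0 - 3.0 * sharpness)

    # Normalised weighted sum keeps overall brightness unchanged.
    out = (c + w * (up + down + left + right)) / (1.0 + 4.0 * w)
    return np.clip(out, 0.0, 1.0)
```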

Secondly, when you hear talk of reducing render/resolution scale, it has NOTHING to do with textures; it's about the actual resolution the game renders at. (To clarify: textures can appear sharper and clearer at higher resolutions even though the TEXTURE RESOLUTION itself does not change. Render scale doesn't touch the textures in particular; it affects everything: shadows, particles etc.) You can think of it as no more complex than just reducing the resolution in the first place. There are a couple of caveats here:

1) You usually want to reduce render scale rather than reducing resolution outright, because that keeps UI elements at native resolution, so they look clearer.

2) Reducing render scale and reducing resolution aren't exactly 1:1, because in some cases, especially for games that use TAA to upsample the image, the visual results differ. E.g. I noticed this very clearly in RDR2 recently, where reducing render scale to an equivalent resolution gave better visuals than simply reducing resolution to that level (e.g. ~65% of 4K, which is roughly equivalent to 1440p, vs 1440p itself). So sometimes the results will differ, and since we don't know the inner workings of the games, we can't pinpoint exactly why. In rare cases (like RDR2) it will be more obvious, but most of the time it won't make a difference, and you can absolutely treat lowering the render resolution as no different from lowering the resolution itself.
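To put numbers on that (a quick sketch; I'm assuming the scale is applied per axis, which matches the 83% → 3187×1793 figure quoted above, though some games scale by total pixel count instead):

```python
# Render scale applied per axis (an assumption that matches the
# 83% -> 3187x1793 BF5 figure above; some games scale total pixel
# count instead).
def scaled_res(width: int, height: int, scale: float) -> tuple[int, int]:
    return round(width * scale), round(height * scale)

print(scaled_res(3840, 2160, 0.83))         # (3187, 1793): the BF5 setting above
print(scaled_res(3840, 2160, 2560 / 3840))  # (2560, 1440): ~67% of 4K == 1440p
# The shaded pixel count falls with the SQUARE of the scale, which is
# where the fps headroom comes from:
print(f"{1 - 0.83 ** 2:.0%} fewer pixels at 83% scale")  # ~31%
```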

Practically, yes: just reduce render resolution until the image becomes very obviously downgraded or you've reached your performance target, and then add whatever amount of RIS looks most pleasing. It boils down to simple trial and error. While what I said is true about not being able to gain or preserve detail when reducing render resolution, in reality vision is very complex, so as you (and so many others) have noticed, it's possible to render at lower than 4K and get near-indistinguishable results for various reasons (setup, eyesight, subjectivity, the way the game itself renders, etc.). In general, what I've noticed (4K, 55" TV, <1.5m from it) is that 1800p (with rare exceptions) looks very, very close to 4K, but once you go to 1440p it starts becoming subtly blocky enough that it looks more video-gamey rather than getting closer to realistic rendering (which 4K approaches). It's like the difference between mosaic (or impressionist) and hyperrealist art: if you stand far enough away from a mosaic you can see more clearly what it's trying to represent, but it never quite gets that feeling of realism. That's not bad, but it's a marked difference in style.
 