One of the problems with SDR is that games typically use only 6-bit colour depth, which is why software HDR via ReShade has such a big effect as well. It would be interesting to see a comparison of HDR10 vs HDR via ReShade.
This could probably be done with Final Fantasy XV, as that game supports HDR and also has a ReShade profile.
There is absolutely no question, though, that the posted screenshot showing basic SDR vs HDR is a big difference. The issue with HDR, of course, is that one needs to break the bank to get a display that fully supports it. However, given that HDR 400 displays will still accept the wider-gamut colours being used, and do at least have somewhat higher luminance, I expect there is still a benefit (I know from experience that software HDR is very noticeable, and that's on sRGB screens). The problem being that if you have seen and experienced HDR on a £1000+ OLED, then HDR 400 will of course look comparatively poor. HDR 600 should probably be the minimum standard though, if only to stop monitor manufacturers from cheaping out so much.
I expect most people who are amazed by HDR10 have never seen ReShade before.
As an example, it seems a lot of people think that to get vibrant colours you need a DCI-P3 display, or that to get better visual contrast you need a VA or OLED screen. Yet many games out now don't fully utilise the extremes of the ranges on existing screens, so the difference is not all down to hardware.
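As a rough illustration of that last point, you can actually measure how much of the 0-255 SDR range a captured frame uses. This is only a sketch with a synthetic "washed-out" frame standing in for a real screenshot (no specific game is assumed); with a real capture you would load the screenshot's pixel data instead:

```python
import numpy as np

def used_range(lum: np.ndarray) -> tuple[int, int]:
    """Return the min/max luminance values actually present in a frame."""
    return int(lum.min()), int(lum.max())

# Synthetic stand-in for a game screenshot: luminance only spans 40-210,
# like a game that never pushes true black or full white.
frame = np.linspace(40, 210, 256, dtype=np.uint8).reshape(1, -1).repeat(64, axis=0)

lo, hi = used_range(frame)
print(f"Frame uses {lo}-{hi}, i.e. {(hi - lo) / 255:.0%} of the 0-255 SDR range")
```

A game that only ever outputs, say, 40-210 is leaving both deep shadows and bright highlights on the table, which is exactly the headroom a ReShade contrast/levels pass exploits on an ordinary sRGB screen.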