Nvidia cards, Hardware Calibration, HDR and Banding

Hi,

I've got a 20-series card connected to a four-year-old mid-range Samsung TV, and a 30-series card connected to an HDR1000 edge-lit monitor.

The banding bothers me the most; I'm aware of the ReShade debanding filter for games.

HDR is non-existent on the TV and worse than SDR on the monitor. The monitor can go from black to painfully bright without HDR enabled; it's a Philips 436M6. The TV does nothing exciting at all.

Would buying a SpyderX Pro and calibrating the displays help with either issue? I've googled and I'm more confused than when I started.

Thanks.
 
Hi, thanks for the input. I can enable HDR and it looks fine in videos, but not a great deal different to SDR; I understand that's a limitation of the displays. In most games I get higher brightness but washed-out colours. It looks better just leaving it in SDR and bumping the brightness up a notch, so I wondered if it was possible to do a calibration while in HDR mode and get a more usable profile?

Banding is apparent on both displays if I look for it, even in my VR headset. It's probably an issue with the content more than anything. I've seen mention of hacks to enable dithering in the Nvidia drivers, ReShade filters, and the suggestion that calibrating the displays may help. Has anyone had any experience of trying to fix it? To check whether it's the panels or the content, I knocked up a quick gradient test pattern (sketch below).
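This is just a rough sketch, assuming Python with NumPy and Pillow installed and a 4K display (change WIDTH/HEIGHT to match your panel). It writes an 8-bit greyscale ramp to a PNG; view it full screen at native resolution. If you see distinct vertical steps, the banding is coming from the display or the signal chain; if the ramp looks smooth, the banding you see in games is in the content itself.

```python
# Gradient test pattern: separate display/signal banding from content banding.
# Assumes NumPy and Pillow; WIDTH/HEIGHT are placeholders for your resolution.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 3840, 2160  # set to the display's native resolution

# Horizontal 0-255 ramp: every 8-bit grey level appears as a vertical band.
ramp = np.linspace(0, 255, WIDTH)
row = np.rint(ramp).astype(np.uint8)
image = np.tile(row, (HEIGHT, 1))

Image.fromarray(image, mode="L").save("gradient_test.png")
print("Saved gradient_test.png - view full screen: visible steps point to the "
      "display or signal chain, a smooth ramp points to the content.")
```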
 
Thanks for that. Very slight banding on the TV, none on the monitor. Seems to be a content issue, mostly in games, sometimes on the background of desktop apps.
I've tried ReShade on a couple of the worst gaming offenders and it helps. I'll stick with that and stop worrying about it.
 