Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Not only would AMD be all over it, but Nvidia drivers would fail MS certification. DX specifies in detail the exact precision of output required for every function in the API. The framebuffer output has to be exact within the limits of floating-point arithmetic. There simply isn't any scope to change the image quality of the internal rendering. Anyone talking about compression or drivers not rendering things is talking complete BS. The differences are simply in the output and how each vendor defines their colour space, contrast, white balance and gamma defaults.
Nvidia aim for a more professional look; AMD go for an overly saturated look, like TVs in a showroom. Either can be made to look like the other within 30 seconds of driver settings.
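If anyone wants to check that for themselves rather than argue about camera shots, here's a rough sketch (Python with Pillow/NumPy; the filenames are just placeholders, not anything from this thread) that compares two lossless screenshots of the same frame pixel by pixel. If the rendering itself were being degraded you'd see it here, not just in colour defaults.

```python
# Hypothetical comparison of two lossless captures of the same frame,
# e.g. one per GPU vendor. Filenames are made up for illustration.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("capture_gpu_a.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("capture_gpu_b.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)
print("max per-channel difference:", diff.max())
print("pixels that differ at all: ", int((diff.sum(axis=2) > 0).sum()))
```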
Isn't this a little like 'Here's a screenshot from my HDR monitor, look how glorious it is on your SDR system!'
No? This has been discussed so many times in this thread alone. It shouldn't need repeating. It's just one setting, a setting that defaults to the wrong parameters for *some* displays on HDMI only. How is yours a balanced opinion when you still ignore this?
Jesus, GC is an absolute cesspit.
It does matter, because people like you lambaste nVidia for a simple setting that takes seconds to change. If it was any other piece of kit you would ignore out-of-the-box settings, but because it's nVidia you cling to it like faeces. It's dishonest at best and it needs to stop.
Clearly, no it isn't. It's just one of many 'problems' people are misunderstanding.
Never seen you call out 4K8K, and he is the king of misinformation in these parts. Just saying.
Oh geez, we're back to this again. Nvidia uses 0-255 by default on my system. If you connect it to a TV it will use 16-235, which is correct, as the majority of TVs are 16-235. It's an auto-detect feature.
I always love how some speak of limited/16-235 as if it were some god-awful setting. Your broadcasts and movies are viewed in 16-235. Full or limited is only bad if the two ends are set up wrong, which so many fall foul of. Google shows plenty of people who have set it up wrong, or don't know they have.
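For anyone unsure what the two ranges actually mean, here's a quick sketch (Python; the helper names are my own, the 16-235 scaling itself is the standard one) of the mapping and why a mismatch between what the GPU sends and what the TV expects either washes the picture out or crushes it:

```python
# Full-range (0-255) vs limited/video-range (16-235) RGB, rough illustration.

def full_to_limited(v):
    # 0..255 squeezed into 16..235
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    # 16..235 expanded back to 0..255
    return round((v - 16) * 255 / 219)

# Correctly matched ends: black stays black, white stays white.
print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255

# Mismatch one way: source sends limited but the display treats it as full,
# so "black" arrives as 16 and is shown as dark grey -> washed-out picture.
# Mismatch the other way: source sends full but the display expects limited,
# so everything below 16 and above 235 gets clipped -> crushed blacks/whites.
```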
It's no wonder old man yells at the cloud.
Yes, DisplayPort. Even DVI uses 0-255. HDMI is auto-detect for TVs. You can tell right away if the range is wrong with test patterns or the Lagom LCD test site.
Can you confirm the blacks aren't crushed if AMD defaults to full? Assuming the TV is also set to full, or whatever it is called on that model. Some people think the image looks better until you find out their shadow and white detail is being crushed.
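If anyone wants to check that on their own setup, here's a rough sketch (Python with Pillow; the patch values are just ones I picked) that generates a near-black test image along the same lines as the Lagom black-level test. If the first few patches are indistinguishable from the background, your blacks are being crushed somewhere in the chain.

```python
# Home-made black-level test pattern: near-black grey squares on pure black.
from PIL import Image, ImageDraw

img = Image.new("RGB", (840, 200), (0, 0, 0))   # pure black background
draw = ImageDraw.Draw(img)

levels = [1, 2, 4, 6, 8, 12, 16, 20, 24, 32]    # near-black grey patches
for i, level in enumerate(levels):
    x = 20 + i * 80
    draw.rectangle([x, 50, x + 60, 150], fill=(level, level, level))

img.save("black_level_test.png")
```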
For the people thinking they'll never see anything from me again with regards to image quality in this thread, this is what I've spent a lot of today doing. Yes, my camera stand is a modified Pringles pot, and yes, I did draw a circle around it and marker lines so I can get it into the same place every time.
I'm still trying to finish off some tests, but hopefully I'll have some shots and videos very soon, even if I'm not completely finished testing.
Probably already posted, but the conclusion was no difference bar minor profile colour.