Associate
- Joined: 29 Dec 2009
- Posts: 649
- Location: Germany
Hey.
I am currently using an HSG1074 as my primary monitor. It's advertised as supporting "1080p" and has a huge 1080p sticker at the bottom, as well as numerous "HD ready 1080p" stickers on the back.
However, the way it handles this really confuses me. When I send it a 1920x1080 signal, it "downscales" the entire image (leaving a ~1 inch border on each side), making it completely blurry and unreadable.
If I use 1680x1050, it covers the entire monitor area, but again it's awfully blurry.
The only resolution at which it doesn't blur everything is 1776x1000, but even then there's a ~1" border on each side of the image (which is really annoying).
Now here's my guess: could my ATI graphics card be automatically assuming that my HDMI monitor overscans, and deliberately adding a black border to compensate? That would explain why 1080p gets downscaled, since the full image would then be "larger" than the underscanned area.
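If that's what's happening, the fix would presumably be on the driver side rather than the monitor. I haven't tried this yet, but as far as I know the open-source radeon/amdgpu driver on Linux exposes an "underscan" property on HDMI outputs that can be toggled through xrandr (on Windows the equivalent would be the HDMI scaling slider in Catalyst Control Center). A rough sketch of what I mean, assuming the output is called HDMI-0 (the real name would come from xrandr -q):

    import subprocess

    # Hypothetical output name; check `xrandr -q` for the one the driver actually reports.
    OUTPUT = "HDMI-0"

    # Ask the radeon/amdgpu driver to stop underscanning (shrinking) the image on this output.
    subprocess.run(["xrandr", "--output", OUTPUT, "--set", "underscan", "off"], check=True)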
The monitor itself is attached straight to the HDMI port; there's no adapter of any kind in between.
Otherwise, why would they advertise a monitor as "1080p ready" if it doesn't truly support 1080p as its native resolution? It seems like the image should fit: guesstimating the border left around the image when running at 1776x1000, it could be about 40 pixels on the top and bottom and 72 pixels on each side.
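For what it's worth, those borders add up exactly to 1080p, which fits the underscan theory. A quick sanity check (plain Python, using my guesstimated numbers, which may be off by a few pixels):

    # Visible area at 1776x1000 plus the guesstimated borders: 72 px left/right, 40 px top/bottom.
    visible_w, visible_h = 1776, 1000
    border_x, border_y = 72, 40

    print(visible_w + 2 * border_x, visible_h + 2 * border_y)  # -> 1920 1080, i.e. exactly 1080p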
I don't currently have another 1080p-capable device to test it with, but if I get the chance I'll be sure to try. It just seems strange that the monitor wouldn't display the image over the entire screen area if it has enough pixels to fit.
Maybe you could post how to fix this, to help others with the same problem?

