Poor image quality using HDMI cable

Hey all.

I have an older AOC HD-ready TV that I use with my PC and to watch telly.

It only has HDMI ports and a VGA port.

Until recently I've just used a VGA cable with a DVI adapter into the graphics card, and everything works fine apart from some horrible banding on white or black backgrounds, which I just put up with.

However, I recently bought an HDMI cable to use instead of the VGA, but when I use the HDMI cable the image quality is really bad: the image looks very grainy and text is all blotchy.

I've tried all the different settings in CCC and nothing makes any difference.

I've done a search and read that a few people have had similar issues on other HD TVs, but I can't find a fix.

Some people have suggested setting the HDMI port to PC on the TV, but my TV won't allow that; it seems the PC mode is fixed to the VGA port.

Am I just going to have to accept that my TV simply isn't compatible with an HDMI connection from a PC, or could it be something else?

Any help is much appreciated.
 
Cheers for the reply.

The TV's resolution is 1280x768, and I've tried every resolution option available.

I'm not sure whether it's a resolution problem or something else.
 
We could do with knowing the refresh rate as well; on top of that, some old HD TVs don't accept PC HDMI input at all, so check the manual.

Even on my new Samsung I need to rename the HDMI input to "PC", so check the manual for any instructions relating to PC input.
 
The problem is most likely that the TV is only HD-ready.

"HD-ready" combined with HDMI is nowadays pretty much equal to showing a 1280x720 or 1920x1080 resolution via native 1366x768 resolution (or in your case, 1280x768...?). Meaning you can't get the native resolution in any easy way with HDMI. The reason VGA works better is because it actually lets the TV to accept a true 1366x768 signal (or again, in your case 1280x768), and therefore to show it pixel by pixel, without interpolating it to a different resolution. With "Full-HD" TVs (native 1920x1080) there's no such problem.

Haven't tried this myself, as I fortunately have 1920x1080, but if you're interested, have a look at this (aka. "the-not-so-easy-way"):
http://pixelmapping.wikispaces.com/Guide+to+1366x768

It's from the Windows XP era, but still worth a look.

Ps. If you want to know more about why HD-ready TVs are 1366x768 instead of the actual 1280x720, then read this:
http://hd1080i.blogspot.fi/2006/12/1080i-on-1366x768-resolution-problems.html
 
Can't check that right now as I'm using the VGA cable, but is that the setting along the lines of RGB 4:4:4?

If so I tried all available options and it made no difference.


If I were to force a resolution as in the link above using PowerStrip, should I use 1280x768 or 1366x768?
I'm pretty sure I've tried both of those already with no luck.
 
Right, quick update.

After a good few hours messing around with PowerStrip, trying all sorts of different settings and configurations, I've still had no luck getting it to work.

Seems this TV just can't display a good PC signal via the HDMI port.

On a side note, I borrowed an LG full HD TV off my brother and it worked perfectly.
 
With regards to settings, just in case:
You should first try your native resolution. 1280x768 is quite unusual, though, which is why I put the "...?" in my first post; usually it's 1366x768. 1280x768 is closer to a 16:10 aspect ratio, which is quite rare for TVs from what I've seen (they're usually 16:9), though that's not to say they don't exist. It might also be that the TV is a "real" 1280x720, but those are quite rare too, and if it were 1280x720 there probably wouldn't be a problem to begin with.

Well, in any case, try the native resolution first. If that doesn't work, then try the nearest widths divisible by 8, like 1360x768 and 1368x768 (see the sketch below). If 1280x768 really is the native, then that's already divisible by 8 in itself. And if none of that works, I'm not sure what to do next.
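If it helps, here's a trivial sketch of what I mean by the nearest widths divisible by 8 (plain Python, nothing PowerStrip-specific):

```python
# Given a claimed native width, list the nearest widths divisible by 8,
# since those are the ones GPUs/TVs most often agree on.
def nearest_multiples_of_8(width):
    lower = (width // 8) * 8
    upper = lower if lower == width else lower + 8
    return sorted({lower, upper, width})

for w in (1366, 1280):
    print(w, "->", nearest_multiples_of_8(w))
# 1366 -> [1360, 1366, 1368]  (try 1366 first, then 1360 and 1368)
# 1280 -> [1280]              (already divisible by 8)
```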

To determine whether you have succeeded in getting the native resolution and 1:1 pixel mapping over the VGA, check this:
http://superuser.com/questions/51694/dvi-and-pixel-mapping

The checkerboard link in the "best answer" will also be useful for the HDMI 1:1 pixel mapping check.
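If you'd rather generate your own checkerboard than use the linked one, something like this works (Pillow assumed; the 1366x768 size is a guess, use your native resolution):

```python
# Quick 1px checkerboard generator (alternative to the linked test image).
from PIL import Image

width, height = 1366, 768  # assumed native resolution -- change as needed

img = Image.new("RGB", (width, height))
px = img.load()
for y in range(height):
    for x in range(width):
        # Alternate pure white and pure black on every single pixel.
        px[x, y] = (255, 255, 255) if (x + y) % 2 == 0 else (0, 0, 0)

img.save("checkerboard_1366x768.png")
# View it full-screen at 100% zoom: with true 1:1 pixel mapping it looks
# like a uniform grey; any shimmering, banding or moire means scaling.
```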

Personally, I would recommend sticking with the VGA until you get a new monitor/TV, as this can REALLY become a hassle if it doesn't work on the first try.

Ps. It's also important that the TV doesn't try to auto-zoom, auto-scale or anything like that. Check for possible overscan settings, too. It's possible the manufacturer hasn't given a user option for those, or provides only the basic ones, like a choice of "4:3", "16:9", "Pan-scan" and "Letterbox", which makes your journey even more difficult.

Ps2. As a comparison, my HDTV (1080p) has an "Unscaled" format option, which is what I use, and it enforces 1:1 pixel mapping. And so, if I choose a resolution lower than 1920x1080, it will give black bars around the actual image.

Ps3. Just wanted to make sure: do you have a DVI-HDMI cable at hand? Or even a DVI-HDMI adapter? The "1366x768" guide assumes that you do. They recommend (or rather require) DVI-HDMI, as an HDMI-HDMI connection can be a little finicky about which resolutions it's willing to output (and even more so about what sort of data).
 
Cheers for all the help, aatu.

Just done a quick search on the TV (should have done this first) and it seems the native resolution is 1366x768.

I tried several different resolutions in PowerStrip, including 1366x768 at 60Hz, and it didn't fix the issue.

The TV's options are extremely limited. In fact, I have no options other than to change the format from auto to 4:3 or 16:9.
I can change the colour options and the phase, and there is also an auto-sync option which doesn't seem to do much, and that's about it.

I don't have a DVI cable; I only have a VGA cable with a DVI adapter and an HDMI cable.

Would a DVI-to-HDMI cable give different results?
 
Check this link:
http://forums.entechtaiwan.com/index.php?topic=2578.0

I would also recommend reading the thread a bit further, to get a little perspective on what's going on behind the scenes (and what to do with the given numbers/strings). The thread is from 2005, but that shouldn't matter, I think.

In any case, try focusing on 1360 and 1368. The native width (1366) is usually just something to try in case it miraculously "just works", as that would be the optimal resolution, but the widths divisible by 8 are the ones that usually work, if anything does.

With regards to HDMI-HDMI vs. DVI-HDMI:
It COULD affect the result, but it might also just be a waste of money. The main thing people get from DVI-HDMI is that it automatically drops the audio data from the stream, which for some reason is the usual culprit for breaking 4:4:4 chroma subsampling and 1:1 pixel mapping. But then again, some GPUs (and TVs) are fine with either one, so there's no guarantee it would even make a difference in your case.

In other words, if you already had one, then yes, you should definitely use it, but I wouldn't purchase one just for this. Unless, of course, you can get one very cheap, like around £3. I'm not sure whether a simple adapter would be sufficient (those are cheap); I'd recommend a proper cable.
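If you're curious what losing 4:4:4 actually does to text, here's a rough simulation (Pillow again; "desktop.png" is just a placeholder screenshot name, and this only approximates what the TV does, it's not a claim about your exact signal chain):

```python
# Rough simulation of 4:2:0-style chroma subsampling on a screenshot,
# to show why coloured text and sharp edges go blotchy when 4:4:4 is lost.
from PIL import Image

img = Image.open("desktop.png").convert("YCbCr")  # placeholder screenshot
y, cb, cr = img.split()

# Throw away 3/4 of the chroma samples, then stretch them back out,
# which is roughly what a 4:2:0 link does to colour detail.
small = (img.width // 2, img.height // 2)
cb = cb.resize(small, Image.BILINEAR).resize(img.size, Image.NEAREST)
cr = cr.resize(small, Image.BILINEAR).resize(img.size, Image.NEAREST)

Image.merge("YCbCr", (y, cb, cr)).convert("RGB").save("desktop_420.png")
# Compare the two files zoomed in on small text: the luma (sharpness)
# survives, but the colour fringes smear across neighbouring pixels.
```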

With regards to the available TV settings:
The color settings probably won't affect the clarity, but the phase setting might affect the VGA output. Check this for more info and a test pattern (also good for checking whether you're running at the native resolution):
http://www.lagom.nl/lcd-test/clock_phase.php
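If the lagom page is awkward to get up on the TV, the same sort of pattern can be generated locally (again just a sketch with Pillow, and the size is an assumption):

```python
# Vertical 1px black/white stripes -- the classic VGA clock/phase test,
# similar to the lagom page above.
from PIL import Image

width, height = 1366, 768  # assumed native resolution
row = bytes((0 if x % 2 else 255) for x in range(width))
img = Image.frombytes("L", (width, height), row * height)
img.save("clock_phase_1366x768.png")
# Shown full-screen over VGA: adjust the TV's clock/phase (or hit auto-sync)
# until the shimmer and banding across the stripes disappears.
```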

Not sure about auto-sync; it would depend on where it sits. If it's in the phase section, then it could indeed be useful (for the VGA).

Furthermore, it seems like you're getting a fairly functional image over VGA, apart from slight color banding. Don't get your hopes up about the banding going away; it might not be caused by the VGA at all, but by the TV itself. And you should definitely try PowerStrip's VGA settings (from the link) before you purchase anything.

Also, if you have a VGA port on your computer, I would suggest using VGA-to-VGA instead of DVI-to-VGA. But I wouldn't recommend buying a new cable (or GPU!) if you don't have one now.

Ps. One thing I want to clarify a bit:
1280x768 was fairly common back in the day (2005 and earlier), though nowadays it's indeed quite rare. But as your native resolution is 1366x768, you don't have to concern yourself with this anymore, in any case.
 