(LCD tv + laptop) * resolution = confusion

I'm afraid this is going to be a bit of a long post, and that pretty much reflects my level of confusion. I would really appreciate some input though. I'll start by saying I'm a bit of a noob when it comes to LCD TVs but, in preparation for building an HTPC and buying an LCD TV, I had thought I was starting to get somewhere.

Now I'm at home for Christmas and my Dad has a 37" Philips LCD TV that's a couple of years old. I assume it's of reasonable quality as it cost about a grand. I thought it would be interesting to see how well the TV supported output from my laptop (bear in mind this is all really me testing how easy it's going to be to hook an HTPC up to a TV). The laptop runs Vista Business and has an ATI Radeon x1600 card. The TV has various inputs, but the two that were of interest to me were the VGA D-Sub and DVI-I. Sadly we only had a VGA lead, so I connected said laptop and TV. Now what I (naively) expected was for the TV to be picked up like a normal monitor and for me to be able to select the resolution in the usual way.

Well, the first problem was that I got a 'resolution not supported' message on the TV when I picked the highest available resolution (first question: why can't the TV tell the PC what resolutions it supports, like a monitor does?). Going through the other available settings yielded some more 'resolution not supported' messages and some that displayed. Of the ones that displayed, none filled the screen; indeed none seemed to match the proportions of the screen (i.e. the picture was always letterboxed at the top and bottom rather than at the sides, or vice versa). Am I mad for thinking it should be possible to find a 1:1 match, where the card just delivers a signal matching the number of pixels on the screen?

Now to make things more complicated, I found that by accessing a certain menu I could stretch the picture to fill the screen. However, even this seemed imperfect: invariably a bit of the sides and/or the top and bottom of the desktop was missing.

Unfortunately there were further levels of complexity. The main one was that, under a separate set of menus on the TV, there was an option to switch to HD input (which it claimed could still be delivered over the VGA connection). Changing to this did make the image slightly less blurry, though the main effect was to show up lots of noise and something that looked like horrible colour fringing. I still had the problem that some of the display resolutions did not display at all, and the others had the edges of the image missing.

I don't know if messing around with the range of available refresh rates would have helped, but by this point I'd run out of steam.

So, sorry for the long post, and I guess it didn't really contain any questions, mainly observations. I'm now really worried that even if I do build an HTPC I'll have a horrific time trying to get an image onto the screen. Is this typical? Thanks for listening, and Merry Christmas.

edit: Should add, the manual for the TV was next to useless, basically just showing where to plug in the leads.
 
Panel resolution is likely to be 1360x768/1366x768

You should be able to find 1360x768 listed as a selectable resolution.

If not, try the Omega drivers for the graphics card, because that's what I use and I can select that resolution :p
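
If you want to see exactly what modes the laptop is offering, here's a rough sketch (Python on Windows, using the standard Win32 EnumDisplaySettings call via ctypes; I haven't run it on Vista or an x1600 specifically, so treat it as a starting point) that just prints every mode the graphics driver currently exposes. If 1360x768 doesn't show up in the list, the driver isn't offering it and you'll need different drivers or a custom-resolution tool:

[code]
# Rough sketch: list every display mode the Windows graphics driver currently
# exposes, using the Win32 EnumDisplaySettingsW API via ctypes (Windows-only).
# If 1360x768 isn't printed, the driver simply isn't offering it to the TV.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Display-device layout of DEVMODEW up to dmDisplayFrequency; the trailing
    # padding stands in for the newer fields we don't read by name.
    _fields_ = [
        ("dmDeviceName",         wintypes.WCHAR * 32),
        ("dmSpecVersion",        wintypes.WORD),
        ("dmDriverVersion",      wintypes.WORD),
        ("dmSize",               wintypes.WORD),
        ("dmDriverExtra",        wintypes.WORD),
        ("dmFields",             wintypes.DWORD),
        ("dmPositionX",          wintypes.LONG),
        ("dmPositionY",          wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor",              ctypes.c_short),
        ("dmDuplex",             ctypes.c_short),
        ("dmYResolution",        ctypes.c_short),
        ("dmTTOption",           ctypes.c_short),
        ("dmCollate",            ctypes.c_short),
        ("dmFormName",           wintypes.WCHAR * 32),
        ("dmLogPixels",          wintypes.WORD),
        ("dmBitsPerPel",         wintypes.DWORD),
        ("dmPelsWidth",          wintypes.DWORD),
        ("dmPelsHeight",         wintypes.DWORD),
        ("dmDisplayFlags",       wintypes.DWORD),
        ("dmDisplayFrequency",   wintypes.DWORD),
        ("dmPadding",            ctypes.c_byte * 32),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

modes = set()
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    modes.add((mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency))
    i += 1

for width, height, hz in sorted(modes):
    print("%4d x %4d @ %d Hz" % (width, height, hz))
[/code]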
 
Ah right, Omega drivers are a good idea. I'm off to download them now. I've also had a quick look behind the TV to find out what model it is (Philips 37PF5520D), which does have a native resolution of 1366 x 768. I was quite surprised to see that lots of 720p panels are 1366 or 1360 by 768; wouldn't you have thought they'd be something x 720? I'm either getting old before my time or this really is quite confusing ...


edit: Damn, Omega drivers are still not available for Vista...
 
Yes, that is incredibly stupid, but I assume for whatever reason they are cheaper to make.

1280x720 would be a more sensible resolution by far, as a 1366x768 panel will have to scale, well, everything you play through it, bar maybe a PC input.

Still, most HD panels are this res, unless you get a 'Full HD' panel, which will be a proper 1920x1080 :)
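
Just to put numbers on the scaling point, here's a quick back-of-the-envelope calculation (a throwaway Python snippet, nothing official) of the scale factors a 1366x768 panel needs for the common HD source formats; none of them come out as whole numbers, which is why everything gets resampled:

[code]
# Quick sanity check: the scale factors a 1366x768 panel has to apply to the
# common HD source formats. None of them are whole numbers, so every pixel
# gets resampled.
panel_w, panel_h = 1366, 768
sources = [("720p", 1280, 720), ("1080i/1080p", 1920, 1080)]
for name, src_w, src_h in sources:
    print("%-12s scale x%.3f horizontally, x%.3f vertically"
          % (name, panel_w / src_w, panel_h / src_h))
[/code]

A 1280x720 source on a true 1280x720 panel (or 1920x1080 on a Full HD one) would be 1:1 and need no resampling at all.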
 
I just use PowerStrip. Even though my laptop's graphics drivers allow me to add custom resolutions, it's not totally ideal. PowerStrip lets me choose everything and then just click add; when I open up Display Properties the resolution is there.
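
For anyone wondering what PowerStrip's timing numbers actually mean, the sums behind them are simple: total pixels per line times total lines per frame times refresh rate gives the pixel clock. A rough worked example is below; the porch/sync figures are the commonly quoted standard timing for 1360x768 at 60Hz, so treat them as illustrative rather than what this particular Philips wants:

[code]
# Rough worked example of the arithmetic behind PowerStrip's timing fields:
# pixel clock = total pixels per line * total lines per frame * refresh rate.
# The porch/sync figures below are the commonly quoted standard timing for
# 1360x768 @ 60Hz; treat them as illustrative, not gospel for any given TV.
h_active, h_front_porch, h_sync, h_back_porch = 1360, 64, 112, 256   # pixels
v_active, v_front_porch, v_sync, v_back_porch = 768, 3, 6, 18        # lines
refresh_hz = 60

h_total = h_active + h_front_porch + h_sync + h_back_porch   # 1792 pixels/line
v_total = v_active + v_front_porch + v_sync + v_back_porch   # 795 lines/frame
pixel_clock_hz = h_total * v_total * refresh_hz               # roughly 85.5 MHz

print("h_total = %d, v_total = %d, pixel clock = %.2f MHz"
      % (h_total, v_total, pixel_clock_hz / 1e6))
[/code]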
 
Well, I've tried connecting the laptop to the TV again and it doesn't look good. Using the ATI control centre or the Vista display settings it's only possible for me to go up to a resolution of 1280 x 1024, so narrower than the TV's 1366 x 768. I guess it's possible that my graphics card (Radeon x1600) just can't deliver a sufficiently wide signal.

I'm not too worried about this, to be fair; after all, it's my Dad's TV and my laptop. I am, however, keen to avoid this when I build an HTPC of my own and buy a TV. So how can I avoid hitting a similar problem? Should a half-decent card (perhaps an ATI 2400) be up to delivering 1366 x 768?
 