Associate
Joined: 18 Oct 2002
Posts: 466
I'm afraid this is going to be a bit of a long post, and that pretty much reflects my level of confusion. I would really appreciate some input though. I'll start by saying I'm a bit of a noob when it comes to LCD TVs, but in preparation for building an HTPC and buying an LCD I had thought I was starting to get somewhere.
Now I'm at home for Christmas and my Dad has a 37" Philips LCD TV that's a couple of years old. I assume it's of reasonable quality as it cost about a grand. I thought it would be interesting to see how well the TV supported output from my laptop (bear in mind this is all really me testing how easy it's going to be to hook an HTPC up to a TV). The laptop runs Vista Business and has an ATI Radeon X1600 card. The TV has various inputs, but the two of interest to me were the VGA D-Sub and DVI-I. Sadly we only had a VGA lead, so I connected the laptop and TV with that. What I (naively) expected was for the TV to be picked up like a normal monitor, letting me select the resolution in the usual way.
Well, the first problem was that I got a 'resolution not supported' message on the TV when I picked the highest resolution available (first question: why can't the TV tell the PC what resolutions it supports, like a monitor does?). Going through the other available settings, some gave 'resolution not supported' messages and some displayed. Of the ones that displayed, none filled the screen; indeed, none seemed to match the proportions of the panel (i.e. the picture was always letterboxed at the top and bottom rather than at the sides, or vice versa). Am I mad for thinking it should be possible to find a 1:1 match, where the card just delivers a signal matching the number of pixels on the screen?
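On the "why can't the TV tell the PC" point: monitors (and some TVs) do report their supported modes over EDID, and Windows exposes whatever came back through the display driver's mode list. Purely as an illustration, here's a rough Python sketch (ctypes against the Win32 EnumDisplaySettingsW call) that dumps the modes Vista thinks the attached screen can do. The cut-down DEVMODEW layout below is my own transcription of the documented struct, so treat the whole thing as a sketch rather than anything definitive:

```python
import ctypes
from ctypes import wintypes

CCHDEVICENAME = 32
CCHFORMNAME = 32

class DEVMODEW(ctypes.Structure):
    # Byte-compatible transcription of the Win32 DEVMODEW struct;
    # only the display-related members are actually read below.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * CCHDEVICENAME),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmUnion", ctypes.c_byte * 16),   # printer/position union, unused here
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * CCHFORMNAME),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

devmode = DEVMODEW()
devmode.dmSize = ctypes.sizeof(DEVMODEW)

# Passing None enumerates the display the calling thread runs on (usually the
# primary screen); to target the TV specifically you'd pass its device name
# as returned by EnumDisplayDevices.
modes = set()
i = 0
while ctypes.windll.user32.EnumDisplaySettingsW(None, i, ctypes.byref(devmode)):
    modes.add((devmode.dmPelsWidth, devmode.dmPelsHeight, devmode.dmDisplayFrequency))
    i += 1

for width, height, hz in sorted(modes):
    print(f"{width}x{height} @ {hz} Hz")
```

My guess is that if the TV's EDID is incomplete, or the driver ignores it over VGA, that list ends up being a generic set of PC modes rather than the panel's native one, which would explain the trial-and-error I was doing.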
To make things more complicated, I found that by accessing a certain menu I could stretch the picture to fill the screen. However, even this seemed imperfect: invariably a bit of the sides and/or the top and bottom of the desktop was missing.
Unfortunately there were further levels of complexity. The main one was that, under a separate set of menus on the TV, there was an option to switch to an HD input mode (which it claimed could still be delivered over the VGA connection). Changing to this did make the image slightly less blurry, though the main effect was to show up lots of noise and what looked like horrible colour fringing. I still had the problem that some display resolutions did not show at all and the others had the edges of the image cut off.
I don't know if messing around with the range of refresh rates available would have helped but by this point I'd run out of steam.
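If I ever pick this up again, the plan would be to force the card to the panel's native resolution at a known-good refresh rate and see whether the TV accepts it. As a rough sketch of that idea (using the third-party pywin32 wrapper around ChangeDisplaySettings; the 1366x768 @ 60 Hz figures are just placeholders for whatever the panel's spec sheet actually says):

```python
# Requires the third-party pywin32 package (pip install pywin32).
import win32api
import win32con

# Assumed native mode of the TV panel; substitute the real figures.
WIDTH, HEIGHT, HZ = 1366, 768, 60

# Start from the current mode and override just the size and refresh rate.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
devmode.PelsWidth = WIDTH
devmode.PelsHeight = HEIGHT
devmode.DisplayFrequency = HZ
devmode.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT
                  | win32con.DM_DISPLAYFREQUENCY)

# CDS_TEST only asks the driver whether the mode would be accepted,
# without actually switching to it.
result = win32api.ChangeDisplaySettings(devmode, win32con.CDS_TEST)
if result == win32con.DISP_CHANGE_SUCCESSFUL:
    win32api.ChangeDisplaySettings(devmode, 0)  # apply for real
else:
    print(f"Driver rejected {WIDTH}x{HEIGHT} @ {HZ} Hz (code {result})")
```

I'd guess the Catalyst Control Center's custom resolution and overscan options do much the same sort of thing under the hood, which is presumably the more sensible route for the missing-edges problem.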
So, sorry for the long post; I guess it didn't really contain many questions, mainly observations. I'm now really worried that even if I do build an HTPC I'll have a horrific time trying to get an image onto the screen. Is this typical? Thanks for listening, and Merry Christmas.
edit: I should add that the manual for the TV was next to useless, basically just showing where to plug in the leads.