
DVI - HDMI overscan..

I've got my PC connected to my TV via a DVI to HDMI cable...

But the TV has a native resolution of 1366 x 768, and I can't pick that resolution...

How can I get the native resolution with 1:1 pixel mapping? It's connected to a Panasonic Viera 32" LCD and a GeForce 6800 with the latest drivers.


Thanks
 
Hi,

If the TV is capable of HD resolutions (such as 1080p), then using the converter will probably only allow you to output in HD resolutions.

My HDTV works with a D-SUB cable at 1366x768, but that's only because I don't have an HD-compliant GPU; the HDTV itself will actually have 1920x1080 pixels (1080p resolution).

So if you're running at the TV's maximum HD resolution, then you've got 1:1 pixel mapping. But as far as I know, a GeForce 6800 won't output in HD...

I'm pretty sure that's what's happening, although I might be wrong with all of this, so if anyone knows better then tell me.

;_;

Banjo
 
Only very new 'true' HDTVs do 1080p and 1920 horizontal.

Mine is currently set at 1360 by 768 even though the panel is 1366; I believe this is deliberate.

A 1366 by 768 TV does 720p or 1080i (I think).
 
What I can't get my head round is: how can a 1366x768 set show 1080i when it only has 768 vertical pixels? Yet 1080p is just the same, only not interlaced. I do have a 1366x768 LCD and you can tell the difference in picture quality between 1080i and 1080p, so is this just down to a conversion and scaling down?
 
My TV is capable of 720p = 1366 x 768 pixels.

If I connect via VGA I get that exact resolution in Windows and all is well. But I want to use the DVI to HDMI cable as my VGA input is in use by the Xbox.

Now Windows detects the TV and lets me choose 720p... but it's 1280 x 720, not 1366 x 768 (I think this is PAL 720p).

BUT the picture is also too large for the screen, so you have to use the overscan function in the drivers to shrink the image back to its intended size... but it says the final resolution is 1176 x 664, so I'm losing a fair few pixels...

I'm a bit confused... anyone else got this setup with some advice? I have tried Googling it and it seems to be a common problem with this cable type, but most people are using American 720p HDTVs, i.e. 1280x720.
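For a sense of how much that overscan compensation costs, here's a rough back-of-the-envelope sketch in Python, using the figures quoted above:

```python
# Rough cost of the driver's overscan-compensation step, using the
# figures from the post above.
native = (1280, 720)        # what the card outputs over DVI-HDMI (720p)
compensated = (1176, 664)   # what's left after shrinking to fit the visible area

native_px = native[0] * native[1]
compensated_px = compensated[0] * compensated[1]
lost = native_px - compensated_px

print(f"{lost} pixels lost ({lost / native_px:.1%} of the 720p frame)")
# -> 140736 pixels lost (15.3% of the 720p frame)
```

So roughly 15% of the frame is thrown away, and what remains is rescaled on top of that, which helps explain why the image looks worse than a native VGA connection.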
 
If your TV supports a res of 1366x768, you are best off using a VGA lead and running 1360x768 as the desktop resolution. A friend and I have tried all the other options and came to the conclusion that VGA (D-SUB) is the way to go, even if it means using a VGA switcher if you've got something else using the VGA port on your HDTV. I can't say which is better at the higher resolution (true 1080p), but DVI/HDMI was either too vibrant or overscanned too much. Just bear in mind, if you do choose a VGA lead, make sure it has decent shielding to prevent ghosting.
 
To the OP: you can't do what you want, I'm afraid; the only way you'll get 1360x768 1:1 mapped is by using the VGA input.

HDMI can only do HDTV (and SD) resolutions - 720p, 1080i, 1080p etc. - not standard PC resolutions. So yeah, that's just the way it is.

billbennett said:
What I can't get my head round is: how can a 1366x768 set show 1080i when it only has 768 vertical pixels? Yet 1080p is just the same, only not interlaced. I do have a 1366x768 LCD and you can tell the difference in picture quality between 1080i and 1080p, so is this just down to a conversion and scaling down?

The picture is just downscaled; you will lose a fair amount of detail.
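As a rough illustration of what that downscale means (assuming the set simply rescales the whole frame to its 768-line panel, which is what fixed-pixel displays generally do), a quick sketch in Python:

```python
# Rough illustration: fitting a 1080-line source onto a 768-line panel.
source_lines = 1080   # lines in a 1080i/1080p frame
panel_lines = 768     # native vertical resolution of a 1366x768 panel

scale = panel_lines / source_lines
print(f"vertical scale factor: {scale:.3f}")                               # ~0.711
print(f"source lines merged or discarded: {source_lines - panel_lines}")   # 312

# 1080i also has to be deinterlaced before it is scaled, whereas 1080p
# doesn't, which is one reason a progressive source can still look
# cleaner even though both end up on the same 768 lines.
```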
 
wellibob said:
If your TV supports a res of 1366x768, you are best off using a VGA lead and running 1360x768 as the desktop resolution. A friend and I have tried all the other options and came to the conclusion that VGA (D-SUB) is the way to go, even if it means using a VGA switcher if you've got something else using the VGA port on your HDTV. I can't say which is better at the higher resolution (true 1080p), but DVI/HDMI was either too vibrant or overscanned too much. Just bear in mind, if you do choose a VGA lead, make sure it has decent shielding to prevent ghosting.


OK, I think I've come to the same conclusion... been playing around with this for hours now.

I'll get a VGA lead from work tomorrow and try that. Will I need to add a custom resolution?
 
Whether you can achieve 1360x768 will depend on your HDTV and your graphics card. It should do it, but it has been known that, for some reason or another, it may only see 1280x720. Suck it and see really, but you will not be able to get 1366x768; it's a divisibility thing with graphics cards/Windows and your panel's display.
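As a quick illustration of the divisibility point (the commonly cited limit is that horizontal resolutions must be a multiple of 8, though the exact behaviour varies by card and driver):

```python
# Why 1360x768 is commonly offered while 1366x768 usually isn't:
# most cards/drivers only accept horizontal resolutions that are a
# multiple of 8.
for width in (1366, 1360, 1280):
    print(f"{width}: multiple of 8? {width % 8 == 0}  ({width / 8})")
# 1366: multiple of 8? False  (170.75)
# 1360: multiple of 8? True  (170.0)
# 1280: multiple of 8? True  (160.0)
```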
 
I keep hearing that, but I've had no problems setting up and running a 1366x768 resolution with my 8800. I am *positive* you can do it with other cards as well.

My TV is capable of 720p = 1366 x 768 pixels.

1280x720 = 720p. 1366x768 = 768p. The clue is in the vertical resolution.
 
If you get a 1366x768 pixel display on a 1366x768 HDTV panel, then you must be using PowerStrip or something. I've not been able to achieve that res on my A321 display, mainly because 1360x768 (a tiny, wafer-thin vertical line on the left of my screen) is good enough not to get upset about. I'm talking about connecting my graphics card via a VGA lead here.
 
I used the NVIDIA drivers in XP, the same way I did it to get 1080p output to my 40W2000 :) Unfortunately there doesn't appear to be a facility to enter custom resolutions with the current drivers in Vista, which is a bit of a pain as that means no 1080p for me either, lol.

But for the record, yes, this is with the VGA output.
 
I have a Samsung HDTV.
1360x768 is the best IQ you'll get.
It should give you a very, very thin line on the left and right of the screen, 3 pixels each side, for 1:1 pixel mapping ;)
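The 3-pixel figure follows directly from the difference between the panel width and the desktop width:

```python
# Where the thin borders come from when running a 1360x768 desktop on a
# 1366x768 panel with 1:1 pixel mapping.
panel_width = 1366
desktop_width = 1360

unused = panel_width - desktop_width               # 6 unlit columns in total
print(f"{unused // 2} black pixels on each side")  # -> 3 black pixels on each side
```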
 