Hey guys, I've currently got this new setup going:
The TV is an LG 32LH5000, and it can go up to 1920x1080@100Hz. However, when I plug it in and my graphics card detects the display, it thinks it can only show 1360x768@60Hz and won't let me go any higher. The only way I've worked around this so far is to connect my previous monitor, let the card detect that, set the display to 1920x1080 (still only at 60Hz though), and then plug the TV back in.
This is all well and good until I restart my PC, at which point I'm back to 1360x768 again.
Could connecting the display with just a VGA cable (VGA to VGA) be causing this sort of problem, where it doesn't detect the proper display?
I have the latest drivers for my GTX 260, and I know the screen can display 1920x1080 because it has run at that resolution before.
If you don't think it's the cable, does anyone have any ideas or suggestions? Thanks.
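For what it's worth, here's a rough way to see exactly which modes the driver thinks the current display supports, so you can tell whether 1920x1080 is even being offered while the TV is on VGA. This is just a sketch, assuming Windows and plain Python (standard ctypes only, nothing NVIDIA-specific); it walks the Win32 EnumDisplaySettingsW call over the primary display's mode list.

import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Layout follows the Win32 DEVMODEW structure (display half of its union).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32

devmode = DEVMODEW()
devmode.dmSize = ctypes.sizeof(DEVMODEW)

# Walk mode index 0, 1, 2, ... until the call returns 0 (no more modes).
modes = set()
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(devmode)):
    modes.add((devmode.dmPelsWidth, devmode.dmPelsHeight, devmode.dmDisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print("%dx%d @ %dHz" % (w, h, hz))

If 1920x1080 never shows up in that list while the TV is connected over VGA, the card is presumably only being offered the limited mode list the TV advertises on its VGA input, rather than it being a driver-settings issue.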
