It doesn't always work though, as I said earlier. You're still taking a risk that the TV can pick up the signal, even over HDMI-to-DVI...
I don't see why the manufacturers can't agree on a standard and stick to it; it's clearly rubbish firmware in the TV and the way it scans for signals.
You shouldn't be taking a risk at all with HDMI; it's a very precise and quite small standard, and it's fully compatible image-wise with DVI.
Any HDMI TV will report via EDID what resolutions it supports over HDMI, and there are only three of note:
720p (1280*720) @ 50/60Hz
1080i (1920*1080) @ 50/60Hz
1080p (1920*1080) @ 50/60Hz
As long as your PC video card can generate one of these, it'll work. Just about every video card from the last 4-5 years supports all of these with ease, and every LCD TV from the last 3 years or so also seems to accept both its native resolution and the standard HDMI resolutions.
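If you want to see what the TV is actually reporting, you can decode its EDID yourself. Here's a quick sketch that pulls out the first detailed timing descriptor (by convention the preferred/native mode). It assumes a Linux box that exposes raw EDID blobs under /sys/class/drm (that path is an assumption; on Windows you'd dump the same 128-byte blob with an EDID viewer tool and read it from a file instead):

```python
# Sketch: print the native resolution a display reports via EDID.
# Assumes Linux exposing raw EDID blobs under /sys/class/drm.
import glob

def native_resolution(edid: bytes):
    """Decode the first detailed timing descriptor (bytes 54-71) of a
    128-byte EDID base block; by convention this is the preferred mode."""
    if len(edid) < 128:
        return None
    d = edid[54:72]
    pixel_clock = int.from_bytes(d[0:2], "little")  # units of 10 kHz
    if pixel_clock == 0:
        return None  # not a timing descriptor
    h_active = d[2] | ((d[4] & 0xF0) << 4)  # upper 4 bits live in byte 4
    v_active = d[5] | ((d[7] & 0xF0) << 4)  # upper 4 bits live in byte 7
    return h_active, v_active

for path in glob.glob("/sys/class/drm/card*-HDMI-*/edid"):
    with open(path, "rb") as f:
        blob = f.read()
    res = native_resolution(blob)
    if res:
        print(f"{path}: native mode {res[0]}x{res[1]}")
```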
The only real issue I'm aware of is that some older cards/drivers struggle to detect the display at startup when there are two outputs. In that case, hook a VGA monitor up to the other connection, set the video card up in Windows (detecting the TV etc.), and it usually works fine from then on. Sometimes I have just forced 1920*1080 @ 60Hz on the PC to ensure that, no matter what, the HDMI->DVI connection always works once you're in Windows. This can lead to nothing being displayed while booting on some very old video cards, but we're talking about 6-year-old hardware here.
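Forcing the mode is normally just a couple of clicks in the driver's control panel or Windows display settings, but it can be scripted too. A rough sketch, assuming the pywin32 package on Windows (package choice is mine, not from the thread):

```python
# Sketch: force the current display to 1920x1080 @ 60 Hz so the
# HDMI->DVI link always gets a mode the TV accepts once in Windows.
import win32api
import win32con

# Start from the current mode and override width/height/refresh.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
devmode.PelsWidth = 1920
devmode.PelsHeight = 1080
devmode.DisplayFrequency = 60
devmode.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT
                  | win32con.DM_DISPLAYFREQUENCY)

result = win32api.ChangeDisplaySettings(devmode, 0)
if result != win32con.DISP_CHANGE_SUCCESSFUL:
    print("Mode change failed, code", result)
```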
I've built and installed so many HTPCs on so many TVs over the last 5 years, and I haven't seen a single instance of a video card/TV refusing to play ball, other than, as mentioned above, having to initially get the video card drivers set up correctly.
Some older LCDs (not Full HD ones) only support 720p or 1080i, and as Malachy mentions, 1080i isn't ideal for PC use. Even then, if you know the native resolution of the screen, many will accept that over HDMI, or at worst 720p, which, if it isn't the native resolution, will still look odd for PC text but fine for video.