I think people get confused because there is an (incorrect) perception that a computer screen needs to natively support 1080p to display it. That is only true of TVs.
On a computer, the 1920×1080 image can be resampled to fit whatever monitor you have. On a 20-22" LCD it would be resampled down to 1680×1050, resulting in a loss of detail but no visible degradation in image quality. On a 24" LCD (1920×1200) it would be displayed at native resolution, or possibly resampled up (keeping all the detail but slightly softening the image), depending on the aspect ratio of the image.
This is exactly the same as with a standard DVD - the picture is scaled to fit the window or the full screen. The only difference is the higher source resolution, so on most displays currently in use, 1080p will be downsampled rather than upsampled.
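The scale-to-fit arithmetic above can be sketched in a few lines of Python (a minimal illustration; `fit_to_screen` is just a name I've made up for this sketch):

```python
def fit_to_screen(src_w, src_h, scr_w, scr_h):
    """Scale a src_w x src_h image to fit inside scr_w x scr_h,
    preserving its aspect ratio (letterboxed/pillarboxed as needed)."""
    scale = min(scr_w / src_w, scr_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# 1080p frame on a 1680x1050 monitor: downsampled
print(fit_to_screen(1920, 1080, 1680, 1050))  # (1680, 945)

# 1080p frame on a 1920x1200 monitor: shown at native size, letterboxed
print(fit_to_screen(1920, 1080, 1920, 1200))  # (1920, 1080)
```

As the second case shows, a 1920×1200 panel displays the full 1080p frame pixel-for-pixel, with black bars filling the leftover 120 rows.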
The other cause of confusion is HDCP. HDCP is a copy-protection system that encrypts the signal to prevent it from being intercepted between the player's output and the display's input. For this to work on a PC, the operating system, graphics card and monitor all need HDCP support.
HDCP is not widely used on HD DVD or Blu-ray at the moment but is likely to become commonplace within a few years. HDCP is only relevant to HD movie playback from an HD DVD or Blu-ray disc. It is not a factor for games or the desktop.
Bottom line:
If you plan to keep your display for 2 or more years and you want to watch HD films on it, you should probably buy one with HDCP support even if your graphics card doesn't currently have it.
As far as I know, your graphics card DOES NOT support HDCP. If you want to be sure, try this tool:
http://www.cyberlink.com/english/support/bdhd_support/diagnosis.jsp