Hi all, please forgive the spam; I asked a similar question in the Graphics Cards section, but they suspected my power supply and so on, which I am 100% sure is fine. Hence, I'm asking in the other relevant section: Monitors.
My question is simple really, I just need an answer: would a dual-DVI 8800GTX 'require' a DVI->DVI cable? At the moment I'm running an adapter on the graphics-card end, which adapts my old VGA/D-Sub cable to the DVI port on my 8800GTX. The other end of the cable is standard old-school VGA, plugged into the D-Sub port on the back of my 19" monitor. Whenever I install the Nvidia drivers, the monitor goes black, drops into standby, and stays there. It works fine in Safe Mode when the Nvidia drivers aren't running, but my only option then is just to uninstall them. Naturally, without the graphics drivers installed the computer runs sluggish and is a useless POS.
I'm really not sure what could be causing the problem; I'm clutching at straws here. Some forums say it happens when the graphics card is overclocked too far... mine is an OCUK pre-build and isn't overclocked at all. Other forums say the drivers are trying to set a refresh rate that's impossible for the screen... my screen has a native hardware refresh rate that can't be changed in Windows anyway. I've also swapped the cable between both of the GTX's DVI ports, but I get the same problem either way.
So, in short: a DVI->DVI cable. Necessary? Or should it work with D-Sub?
Thanks,