I'm having a rather complex issue with a couple of monitors (ViewSonic 19" jobbies, a couple of years old) jacked into my new HD4870's dual DVI. Apologies for the length of this, but as you'll see, it really is an odd one and needs a proper explanation.
Backstory: my old GPU (an ATI X1900XTX) was running at 105 degrees C under load, so I took it out of the PC, blew the dust out of it, and put it back. Temps were fine, but I suddenly had a weird problem with my monitors (described below). I put it down to a static charge from handling/blowing the GPU semi-breaking a component (I had a similar issue with a PSU several years back), so I tried to flash the GPU BIOS. The flash utility didn't work, it bricked the GPU, and I bought a new ATI HD4870. I plugged everything in, and the monitor problem had vanished!
Until today, that is, about three weeks and several reboots later. Whilst uninstalling and reinstalling an external sound card, the monitor problem came back. Given that it's now happened across two graphics cards, I don't think it's the cards, and I bricked my old one for no reason, grr. (Although nerdcraft runs at max settings at 1280x1024 on this, so it's kind of a bonus. Anyway...)
Problem: one of the monitors won't display the desktop when it's in "use both monitors independently as separate desktops" mode, but only sometimes. In "mirror the same desktop on both monitors" mode it always displays something, but as soon as I switch to "separate desktops" mode... it breaks. And by "breaks" I mean Windows still renders the 'other' desktop (it still exists), but the monitor goes into some weird mode where the power light blinks and the screen goes blank (not the same as its normal 'no power' standby mode). The mouse can still be moved onto the 'hidden' monitor and, with a bit of lucky clicking and dragging, programs that were open can be 'fished' back onto the one that's still displaying stuff. Windows still knows the blank monitor is there too, as I get the familiar "device added/removed" noise if I yank the DVI cable out.

Given that it's worked OK for the last three weeks in "separate" mode, and that the problem coming back today was triggered by me messing about uninstalling stuff (the aforementioned external sound card), I think it's a software issue rather than the monitor, despite it only affecting one physical monitor (confirmed to an absolute degree by hours of swapping cables and spare monitors around; sadly my spares are too slow and blur visibly in games). Surely the signal that gets sent to the monitor in "mirror" mode is the same as in "separate" mode, right? Just 1280x1024 pixel colour values? So the monitor itself (I theorise/hope) is fine, but there's some residual broken stuff in Windows somewhere.
Finally, the question: has anyone got a clue what I can do/try here? What can I do to remove every last trace of this particular physical monitor from Windows (registry, drivers, etc.)?
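For reference, the one place I'm aware of where Windows caches per-monitor identity (EDID data and the like) is the `Enum\DISPLAY` registry key; I don't know whether deleting entries from there by hand is safe, so this is just a read-only starting point for seeing what traces of the monitor exist:

```shell
:: List the monitor device instances Windows has cached (read-only query).
:: Each subkey under DISPLAY is a monitor model ID; the subkeys below that
:: are individual plugged-in instances, with EDID data under "Device Parameters".
reg query "HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY" /s
```

The gentler route, as I understand it, would be uninstalling the monitor from Device Manager's "Monitors" node (with hidden/non-present devices shown) and letting Windows redetect it on a rescan, rather than hacking the registry directly.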
Thanks for bearing with me on that.