It's going to be a crying shame having to upgrade from that old 560 Ti to something much faster!
Haha

I think I'm resigned to the fact I'm going to need to replace it come pay day though.
It doesn't show up on that one either.
All I get is Standard VGA Graphics Adapter, and if I untick the box, it's all the same options as the Intel one, but no sign of a Microsoft Basic Display Adapter.
I have changed the 560 Ti so that it now shows as 'Standard VGA Graphics', disabled the Intel HD one, and rebooted the PC into normal mode; the display is working.
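In case it helps anyone checking the same thing, here's a quick sketch (Python calling the stock wmic tool that ships with Windows 7/8, not something from this thread) that lists which display adapters Windows sees and what driver each one is running. While the fallback driver is in use, the 560 Ti should show up as 'Standard VGA Graphics Adapter':

```python
# Quick, throwaway check of which display adapters Windows sees and what
# driver each one is on. Uses the stock wmic tool, so Windows only.
import subprocess

def list_video_controllers():
    # Win32_VideoController is the WMI class behind Device Manager's
    # "Display adapters" section; Name reads "Standard VGA Graphics
    # Adapter" while the fallback driver is in use.
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController",
         "get", "Name,DriverVersion,Status", "/format:list"],
        capture_output=True, text=True, check=True,
    )
    for line in out.stdout.splitlines():
        line = line.strip()
        if line:
            print(line)

if __name__ == "__main__":
    list_video_controllers()
```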
Hmmm then you probably don't use Windows 7 or 8. The screenshot of your Device Manager seemed to be from Windows XP.
Glad you fixed it by disabling the Intel iGPU.
After reading all the information here, I'd hazard a guess that a setting is pushing the refresh rate beyond what the monitor can handle, and that it only affects the nVidia card when its drivers are installed (since the card works with the generic VGA drivers). But if you can access the nVidia card's settings in safe mode and manually set the refresh rate, you might be able to fix the problem without having to replace the card. My only experience with this issue was with a game that ran at a dodgy refresh rate the monitor I was using didn't support; I fixed that by going into the DirectX diagnostic tool and force-setting a single refresh rate, which might help here (though it might not be possible in Windows 7, since that option has been removed as of the more recent versions of Windows).
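If you want to see what the driver is actually offering before changing anything, here's a rough sketch (Python with ctypes against the Win32 EnumDisplaySettings/ChangeDisplaySettings calls; the 1920x1080 @ 60 Hz values at the bottom are just placeholders, not from this thread) that lists the modes the active driver reports and test-sets a refresh rate before applying it:

```python
# List the display modes the active driver reports and (optionally) force a
# specific resolution/refresh rate, testing it first. Windows only.
import ctypes
from ctypes import wintypes

# DEVMODEW layout; only the display-related fields are named, and the
# printer/position union is kept as a 16-byte placeholder so offsets line up.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("_union", ctypes.c_byte * 16),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.WinDLL("user32", use_last_error=True)

DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000
CDS_TEST = 0x00000002
DISP_CHANGE_SUCCESSFUL = 0

def supported_modes():
    """Yield (width, height, hz) for every mode the active driver reports."""
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i = 0
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        yield dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency
        i += 1

def force_refresh_rate(width, height, hz):
    """Test the requested mode first, then apply it only if it is accepted."""
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency = width, height, hz
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY
    if user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_TEST) != DISP_CHANGE_SUCCESSFUL:
        return False  # mode rejected; nothing is changed
    return user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0) == DISP_CHANGE_SUCCESSFUL

if __name__ == "__main__":
    for w, h, hz in sorted(set(supported_modes())):
        print(f"{w}x{h} @ {hz}Hz")
    # Example (placeholder values - pick a mode the list above actually shows):
    # print(force_refresh_rate(1920, 1080, 60))
```

The CDS_TEST pass asks the driver to validate the mode without actually switching to it, so a mode it won't accept is rejected up front instead of blanking the screen.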
Forgive me if this is a stupid question, but would reinstalling Windows and freshly installing all the Nvidia stuff not have defaulted the settings in there anyway? Although what you are describing certainly seems plausible to me.