Suddenly can't put any load on week old 2080 Ti without crashing

Associate
OP
Joined
21 Aug 2018
Posts
15
When you say that it can't boot into windows, could you be a little bit more specific?

Does it hang on the login screen? Does it turn black on the login screen? Does the boot process appear to complete? Does it lock while the Windows loader is spinning?

My 2080 Ti runs perfectly, but if I were to swap it out for another video card temporarily and then switch back, as you've done, mine would lock at the Windows login screen every single time until I force it to switch over to the NVidia drivers rather than the default Windows "safe" ones.

You might have fixed the original issue, but are now experiencing what I have.

By "can't boot into Windows" I mean as soon as I turn the PC on, the fans on the GPU spin up to 100% and my monitor stays black (stopping me from even making it to the BIOS, let alone the login screen) - basically replicating the "crash" I was having last night while under load. However, I just checked Windows notification and saw this:

5LD226F.png

13:56 would have been roughly the time when I last turned the PC on with the 2080 Ti in, so it would seem the PC is actually managing to boot into Windows, just without me actually being able to see it.
 
Associate
Joined
28 Oct 2006
Posts
457
By "can't boot into Windows" I mean as soon as I turn the PC on, the fans on the GPU spin up to 100% and my monitor stays black (stopping me from even making it to the BIOS, let alone the login screen) - basically replicating the "crash" I was having last night while under load. However, I just checked Windows notification and saw this:

5LD226F.png

13:56 would have been roughly the time when I last turned the PC on with the 2080 Ti in, so it would seem the PC is actually managing to boot into Windows, just without me actually being able to see it.

So it is a driver issue as GordyR suggested.
 
Soldato
Joined
1 Dec 2003
Posts
6,476
Location
Kent
Interesting... It sounds like what you're experiencing now is separate to your original problem, and it's indeed a driver issue.

Here's what I would do.

1. Re-seat the 2080 Ti once more, checking all connections etc.
2. Plug your monitor into your motherboard's iGPU output so that you're using onboard graphics to boot (even though your 2080 Ti is still installed).
3. Load into Windows, pop into Device Manager and uninstall any video drivers other than the one for your onboard graphics. Actually, use Control Panel to remove any NVidia software as well.
4. Reboot once more.
5. Install the latest NVidia drivers, even though you're not currently using the card.
6. Once installed, shut down your PC, switch your monitor's HDMI/DP cable back to your 2080 Ti.
7. Boot her up and see if she makes it to the Windows login screen.

If you make it to the login screen and the resolution isn't your monitor's native one, and is instead lower (usually 1024x768 or something), then it will probably hang as the system is trying to use Microsoft's default, supposedly "safe" drivers.

If that happens, then simply hard reset your system and let it boot again. It's occasionally taken a couple of resets like this for me, then eventually it picks up the correct drivers and I get native 4k at the login screen.

From then on, everything is perfect.
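If you want to double-check that step 7 really did land you on the NVidia driver rather than the Microsoft fallback, here's a minimal Python sketch that lists the video controllers Windows can see and the driver each is using. It just shells out to PowerShell's Get-CimInstance and assumes Python is installed - nothing NVidia-specific about it:

```python
# Minimal sketch: list each video controller and its current driver. If the
# 2080 Ti shows up as "Microsoft Basic Display Adapter" (or with a Microsoft
# driver version), Windows is still on the fallback driver described above.
import subprocess

subprocess.run([
    "powershell", "-NoProfile", "-Command",
    "Get-CimInstance Win32_VideoController | "
    "Select-Object Name, DriverVersion, DriverDate | Format-List",
], check=True)
```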
 
Last edited:
Soldato
Joined
1 Dec 2003
Posts
6,476
Location
Kent
The driver will fail if the card has a problem though. Which is why you get TDR errors in the event log from the driver when the card is unstable.

Indeed, but it seems a little coincidental that this appears to only have surfaced since he swapped in his old GPU, suggesting that it's an entirely separate issue.

I experience very similar locking up issues if Windows attempts to fire up any driver other than the most recent NVidia ones, even the standard "Microsoft Display Adaptor" drivers, that are usually considered "safe".

So it's definitely worth eliminating that as the cause, at least.
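If you want to check that event log yourself, here's a rough Python sketch that shells out to wevtutil. It assumes the usual TDR signature (provider "Display", event ID 4101 in the System log), so adjust the query if your entries are logged differently:

```python
# Rough sketch: pull the most recent TDR ("display driver stopped responding")
# entries from the System event log. The provider name and event ID below are
# the usual TDR signature - treat them as assumptions and tweak if needed.
import subprocess

QUERY = "*[System[Provider[@Name='Display'] and (EventID=4101)]]"

result = subprocess.run(
    ["wevtutil", "qe", "System", f"/q:{QUERY}", "/c:5", "/rd:true", "/f:text"],
    capture_output=True, text=True,
)
print(result.stdout or "No TDR events found (or the query needs adjusting).")
```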
 
Associate
OP
Joined
21 Aug 2018
Posts
15
Indeed, but it seems a little coincidental that this appears to only have surfaced since he swapped in his old GPU, suggesting that it's an entirely separate issue.

I experience very similar locking up issues if Windows attempts to fire up any driver other than the most recent NVidia ones, even the standard "Microsoft Display Adaptor" drivers, that are usually considered "safe".

So it's definitely worth eliminating that as the cause, at least.

Thank you for your advice, I'm gonna try the steps you gave me as soon as I get the chance, so fingers crossed!

I should clarify though, this issue didn't appear for the first time after I swapped in my old GPU - that was just the last step I took so I could boot into Windows and check whether a different card worked fine. What initially triggered the fan speed/black screen problem when switching on the PC was me reseating the 2080 Ti. Before that, it only happened under load; it's only been like this since the reseat.
 
Soldato
Joined
28 Sep 2014
Posts
3,437
Location
Scotland
It looks like your Corsair TX750 PSU could be around 11 years old, since it launched sometime in 2007. It's not fully modular, and it can handle 60A on a single 12V rail.

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=73

I think your PSU, at its very old age, handled the RTX 2080 Ti's heavy load demand without issue for the first few days, but struggled and finally gave up after a week.

I suggest you use HWiNFO to check the motherboard's +12V reading under load to see if it falls below 12V - if it does, you will need a brand new PSU.
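For a rough sanity check on those HWiNFO numbers, here's a quick Python sketch. The 60A figure is the one from the review linked above; the ±5% window is the standard ATX tolerance for the +12V rail, not anything specific to this unit:

```python
# Quick back-of-envelope numbers. 60 A comes from the TX750 review linked
# above; the +/-5% window is the standard ATX tolerance for the +12V rail.
RAIL_VOLTS = 12.0
RAIL_AMPS = 60
ATX_TOLERANCE = 0.05

def rail_capacity_watts(volts=RAIL_VOLTS, amps=RAIL_AMPS):
    """Total power available on the +12V rail."""
    return volts * amps                      # 12 V * 60 A = 720 W

def reading_in_spec(reading, nominal=RAIL_VOLTS, tol=ATX_TOLERANCE):
    """True if a +12V sensor reading sits inside the ATX +/-5% window."""
    return nominal * (1 - tol) <= reading <= nominal * (1 + tol)   # 11.4 V .. 12.6 V

print(rail_capacity_watts())    # 720.0
print(reading_in_spec(11.9))    # True  - slightly under 12 V is fine
print(reading_in_spec(11.2))    # False - sagging this far under load is a red flag
```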

Also download the latest Nvidia driver, 416.16, which supports the Windows 10 October 2018 Update and enables DirectX Raytracing.
 
Soldato
Joined
1 Dec 2003
Posts
6,476
Location
Kent
What initially triggered the fan speed/black screen problem when switching on the PC was me reseating the 2080 Ti. Before that, it only happened under load; it's only been like this since the reseat.

Oh I see. Well given that you received a display driver fail notification, I would still recommend giving the steps I've mentioned a go regardless.

Anything that triggers Windows reverting to the basic Microsoft drivers causes hard locks and all sorts of issues for me too, of exactly the sort you're describing.

The MS drivers simply don't play ball with the card.

Granted, I haven't tried this since the latest Windows 10 1809 update a couple of days ago, so it's possible that MS have just updated their standard VGA drivers, but I was having this problem all last week.

If we can get you booting back into Windows, using the 2080 Ti, with the correct drivers at full resolution, then you just might find that everything is working properly now that you've re-seated your GPU and are using two PCIe power cables.

If not, then I'm sorry to say it, but you just might have a dud. :(

My money is still on it being a PSU/power issue though, especially since as AthlonXP1800 mentioned, it's probably quite old now.
 
Last edited:
Associate
OP
Joined
21 Aug 2018
Posts
15
Interesting... It sounds like what you're experiencing now is separate to your original problem, and it's indeed a driver issue.

Here's what I would do.

1. Re-seat the 2080 Ti once more, checking all connections etc.
2. Plug your monitor into your motherboard's iGPU output so that you're using onboard graphics to boot (even though your 2080 Ti is still installed).
3. Load into Windows, pop into Device Manager and uninstall any video drivers other than the one for your onboard graphics. Actually, use Control Panel to remove any NVidia software as well.
4. Reboot once more.
5. Install the latest NVidia drivers, even though you're not currently using the card.
6. Once installed, shut down your PC, switch your monitor's HDMI/DP cable back to your 2080 Ti.
7. Boot her up and see if she makes it to the Windows login screen.

If you make it to the login screen and the resolution isn't your monitor's native one, and is instead lower (usually 1024x768 or something), then it will probably hang as the system is trying to use Microsoft's default, supposedly "safe" drivers.

If that happens, then simply hard reset your system and let it boot again. It's occasionally taken a couple of resets like this for me, then eventually it picks up the correct drivers and I get native 4k at the login screen.

From then on, everything is perfect.

Tried to follow these steps, but ran into a snag of my own making - the prongs from the I/O shield are poking into the HDMI port so that I can't plug anything into it, as seen here:

https://i.imgur.com/rl1nn94.jpg

I could use DVI... but my monitors have no DVI ports. My older monitor does have a VGA port... but I don't have any VGA cables. After spending a while searching I got annoyed, gave up and put the R9 390X back in for now. My own fault for not checking properly when I installed it, I guess.

I think what I'm gonna do is go ahead and buy a new PSU to arrive on Saturday, get that installed, reseat the motherboard and hope for the best. I really can't put into words how annoying it is to work with this current power supply's endless stream of cables so I'm trying to see this whole situation as a positive now.

Not sure if this shows any problems, but here are the voltage readings in HWInfo:

[Attached screenshot (U2cGDkb.png): HWInfo voltage readings]

Every time I've seen +12V before now it's been on 12.000 exactly. Is it normal for it to be a bit below?
 
Soldato
Joined
1 Dec 2003
Posts
6,476
Location
Kent
the prongs from the I/O shield are poking into the HDMI port so that I can't plug anything into it

Been there, done that :p

I think you're doing the right thing by updating your PSU anyway. Even if it doesn't turn out to be the culprit it will be a good upgrade for you going forward.

Not that your current PSU is a bad one at all, it's just that it's a little old to be paired with a 2080 Ti.

Sometimes even high-end, otherwise great products can have issues when they're slightly older and are matched with something very new. This is particularly the case with PSUs, as power requirements have changed a lot over the years.

But anyway, if you can't wait, depending upon the cable, you can often bend the prongs outwards enough to let it plug in.
 
Soldato
Joined
12 Feb 2009
Posts
4,326
"Multi Rail" PSUs are essentially two separate power circuits in one unit. Single rail PSUs have just one. IIRC. So a 1000Watt dual rail PSU behaves more like two 500W PSUs working in tandem. If you power a highly overclocked CPU and a powerful GPU from just one of the two rails, it will shut down as it tries to draw more than 500W even though it's a "1000 W" PSU. Even on a single rail PSU you probably don't want to be using a pair of 8 pin PCIe connectors which are connected to the same port on the PSU as you can draw too much power from that port.

The use of the term "Multi Rail" is misused for virtually all PC PSUs anyway when it is used to describe multiple +12v lines.
(The 3.3v & 5v lines are separate rails, so technically all PSU are multi rail, but not in the context we are using here)

The +12v circuit is rarely separated, in reality it just has load limits on multi power lines/groups.
This is good in that it does offer some protection if you overload the line it cuts out before catching fire, but can limit your options on high end hardware.

Very high spikes and shorts will still cut out a so called single rail PSU with no line limits (other than the total output), but it's certainly possible to overload a cable without it cutting out and you could get a melting or burning cable/connector.

There was a 1000W Corsair I believe that was effectively two 500 W PSUs stuck together that could really claim to have "Multi Rails" as it had to separate +12v circuits. :)
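To put rough numbers on that, here's a toy Python sketch of the per-rail limit (all the wattages are made-up examples, not the specs of any particular PSU):

```python
# Toy illustration of the per-rail limit described above. All wattages are
# made-up examples, not the specs of any particular PSU.
def rail_would_trip(loads_watts, rail_limit_watts):
    """True if the devices wired to one rail/group exceed its load limit."""
    return sum(loads_watts) > rail_limit_watts

cpu, gpu = 250, 300   # hypothetical overclocked CPU + power-hungry GPU

# A "1000 W" dual-rail unit with a 500 W limit per rail:
print(rail_would_trip([cpu, gpu], 500))   # True  - trips at 550 W despite the 1000 W label
print(rail_would_trip([cpu], 500), rail_would_trip([gpu], 500))   # False False - fine once split across rails
```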


Anyway, this problem looks likely to be either a faulty card or the PSU not performing correctly.
The voltage regulation could be starting to go on the old PSU; too much ripple could cause the card to crash under load. Your older GPU may be more tolerant of higher ripple, or the PSU may only be performing badly under a higher load.

If you had enough to stump up the £70 extra, would it be worth going for this one instead?

https://www.overclockers.co.uk/seas...-titanium-modular-power-supply-ca-061-ss.html

Titanium is a bit overkill; you don't really need peak efficiency at idle loads. If you were going to spend an extra £70, I'd just get a higher-output PSU, but that's not necessary. :p (Subject to it fitting in your case.)
 
Associate
Joined
28 Oct 2006
Posts
457
Indeed, but it seems a little coincidental that this appears to only have surfaced since he swapped in his old GPU, suggesting that it's an entirely separate issue.

I experience very similar locking up issues if Windows attempts to fire up any driver other than the most recent NVidia ones, even the standard "Microsoft Display Adaptor" drivers, that are usually considered "safe".

So it's definitely worth eliminating that as the cause, at least.
I think I will make sure to update my wife's Nvidia drivers *before* replacing the 980 Ti - I'm gonna have fun with my machine though, which has an R9 390X in it....
 