At the risk of flogging a dead horse, what is the current consensus on the best way to connect, say, a 5090: the 12VHPWR cable included with the PSU, or the 4x PCIe cable adaptor included with the GPU?
Push it in gently until you hear the click; job done.
I'd choose the native cable because there's less mess, but ultimately the weakest point is in exactly the same place (the graphics card's power connector) whichever option you choose. It's rare for them to melt at the other end.
Yeah, I do wonder if it might have been better to rate the 12-pin cable at 300W and then use two of them if you were going much over that, in much the same way that an 8-pin is roughly double a 6-pin. The connector could still be smaller, so it takes up less room, if it's not carrying so much wattage.

Several people in this field whom I trust have said that multiple connectors on the PSU end are preferable due to the larger surface area and thus better heat dissipation, though the difference is probably pretty small. Corsair didn't even want to put a single native socket on their PSUs and only did so due to demand. No one really knows whether the 12V-2x6 connector itself achieved anything at all; again, most people knowledgeable about these things think the new connector and sense pins were a complete waste of time.
The real answer is to not buy a 5090, but you probably don't want to hear that. IMO ~350W is a more sane limit for 12VHPWR.
I think Nvidia are hell-bent on having a single connector, and that's the only reason 12VHPWR exists in the first place. You kind of need to ask whether 600W GPUs over 12V are a good idea at all.
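For a sense of why a 300W rating would change the picture, here's a back-of-the-envelope sketch. The pin counts and ratings used are the commonly cited nominal figures, not anything stated in this thread:

```python
# Rough per-pin current estimate, assuming nominal figures:
# 12VHPWR/12V-2x6 carries power over 6 x 12V pins; an 8-pin
# PCIe connector uses 3 x 12V pins at a 150W rating.

def amps_per_pin(watts, pins, volts=12.0):
    """Total current divided evenly across the 12V pins (the ideal case)."""
    return watts / volts / pins

# 600W over 6 pins: ~8.3A per pin, close to the pins' commonly
# quoted rating, so one badly seated pin pushes the rest past it.
print(round(amps_per_pin(600, 6), 2))   # 8.33
# A 300W rating would leave roughly 2x headroom: ~4.2A per pin.
print(round(amps_per_pin(300, 6), 2))   # 4.17
# 8-pin PCIe at 150W for comparison: ~4.2A per pin.
print(round(amps_per_pin(150, 3), 2))   # 4.17
```

The point being that at 600W the connector runs with very little margin, whereas an 8-pin at its rated load sits at roughly half the per-pin current.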
Even if they'd made it 400W, there might have been far fewer issues.
But it's not (just) Nvidia that gets to decide that, is it? If the standard had been agreed with a lower output, Nvidia would have had to work with that.
It pretty much is, though.

I think the only way to be safe is to get the Thermal Grizzly 12VHPWR cable monitor, which alerts on high temps/current. It's not cheap, though.
In short, in many cases it's just pure chance: you could do everything right, use all the right components, and it will melt anyway. There's just no good way to know for sure with this PoS connector, sadly.

That does seem to be the long and the short of it, yes.
Could using the 8-pin PCIe adaptor that comes with the cards be the safer option for now?

They'll struggle to reject it, I think, if you use your PSU's native cable as well. But as per the end of the video: if you have to go the RMA route, just don't say anything about the PSU.
It probably won't fail on the PCIe side, and if it fails on the GPU side they can't reject your warranty, as you used the adaptor they included.