Check Your 4090's 12VHPWR Connectors To Make Sure They Are in Pristine Condition

The funny thing is just how many failure points there are. It's not just one issue; the 12VHPWR has a whole range of issues that can, individually or in combination, cause failure.

I wonder, when 6- and 8-pin were introduced a couple of decades ago, did they have these teething issues as well?
Daisy-chaining was the main cause I remember, plus the occasional connector not seated properly. IMO, if we take the survey as representative of reality, the difference between 3.3% and 4% doesn't warrant the jump in reaction from "3.3 is fine" to "4, OMG, what a bunch of whoppers". Although I suspect real-world numbers will be lower for both.
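
Quick sanity check on that gap, with my own arithmetic on the survey's two figures; interestingly, it lands right on the ~22% headline number quoted later in the thread:

```python
# Relative jump between the two surveyed failure rates.
old_rate, new_rate = 3.3, 4.0    # percent of respondents reporting a failure
print(f"{(new_rate - old_rate) / old_rate:.0%} more likely")  # ~21%
```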
 
Just watching that video now. I don't see why it had to change, but Nvidia thought otherwise, the absolute whoppers.
I don't think Nvidia really shoulders much blame over all this (for once). The idea of introducing a new connector to serve the needs of power-hungry modern cards isn't a bad one. Frankly, it's silly having to run three or four separate 8-pin cables to higher-end cards. It's just that the resulting design is a complete mess thanks to a lack of proper standards from PCI-SIG and manufacturers cutting corners to save half a cent.
 
Gamers Nexus has done a new survey of 23k users

What they found is that 12VHPWR is 22% more likely to fail than 6- and 8-pin PCIe

I thought they mentioned that most of the failures with the old (6/8-pin PCIe) connectors involved daisy-chained setups. That's an extra level of stress that none of the 12VHPWR connectors have had to deal with.

So, we are comparing failure rates of old connectors that failed while doing double-duty to 12VHPWR connectors that, as far as I know, have never been daisy-chained at the GPU (failure) side like that.

It's not apples to apples, if that's the case.

The 12VHPWR cable is designed within an inch of its life, with nowhere near the margin of error the old connectors had. To make matters worse, the clip allows "slop" on the opposite side of the connector, creating a situation where a fully-locked connector can *still* end up with a big enough gap to be concerning.

The old connectors also allowed for gaps opposite the locking mechanism, but they have so much headroom that the gap is far less likely to cause a problem in the wild.
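
To put rough numbers on that headroom, here's a back-of-the-envelope sketch; the wattage limits are the connector specs, while the per-pin ampacities (~8A for the old Mini-Fit Jr terminals, ~9.5A for 12VHPWR) are commonly quoted ratings, so treat them as approximate:

```python
def amps_per_pin(watts, live_pins, volts=12):
    """Current each 12V pin carries at the connector's rated wattage."""
    return watts / volts / live_pins

pcie_8pin = amps_per_pin(150, 3)   # ~4.2 A vs ~8 A rating: roughly 2x headroom
hpwr = amps_per_pin(600, 6)        # ~8.3 A vs ~9.5 A rating: barely ~1.1x
print(f"8-pin PCIe: {pcie_8pin:.1f} A/pin, 12VHPWR: {hpwr:.1f} A/pin")
```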

The updated 12VHPWR connectors should fail less than the originals, or at the very least, fail "better" in that the mode of failure should be failure to power up rather than melting.
 
Steve did have a couple of interesting lines in the video regarding the safety upgrades for 12VHPWR. Firstly, nothing is mandatory, and secondly, any manufacturer can adopt any or none of the updates and features. That's why, even after the supposed updates to the spec, you had adapters and cables releasing that didn't support any of the changes; there are even adapters and cables that have the four sense pins, and they do little or nothing.

The second thing mentioned is the room for error, which twinz mentions above as well. The entire design is more compact and more precise than 6- and 8-pin, so any manufacturer who wants to cheap out is going to build a failing product. You have manufacturers in China who are used to how cheap and easy it was to build 6- and 8-pin cables and adapters, and they are now struggling with the 12-pin because the manufacturing required to get it right is so much more precise. I wonder if there are actually any cables or adapters that 100% meet the 12VHPWR spec in build quality and feature support.
 
The second thing mentioned is the room for error, which twinz mentions above as well. The entire design is more compact and more precise than 6- and 8-pin, so any manufacturer who wants to cheap out is going to build a failing product. You have manufacturers in China who are used to how cheap and easy it was to build 6- and 8-pin cables and adapters, and they are now struggling with the 12-pin because the manufacturing required to get it right is so much more precise. I wonder if there are actually any cables or adapters that 100% meet the 12VHPWR spec in build quality and feature support.
Good point about manufacturers skimping on the build; this has been an issue for a long time. Certain cable specs are rated way above what's actually needed at the time, so cable makers build to a lower spec and nobody notices, because nothing comes anywhere near the claimed spec. This became very apparent when HDMI 2.1 opened up 120Hz and many people suddenly had issues. Another time this happened was with USB cables and VR: VR was the first hardware that needed the full spec, and all of a sudden many people found out their cables couldn't handle the bandwidth. In many cases even the board manufacturers had skimped, so there were motherboards that couldn't handle it either.
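
The HDMI case is easy to put numbers on: the raw pixel data alone for 4K120 already exceeds what an HDMI 2.0-era link can carry (standard published figures; blanking overhead is ignored here, so the real requirement is higher still):

```python
width, height, hz, bpp = 3840, 2160, 120, 24   # 4K120, 8-bit RGB
gbps = width * height * hz * bpp / 1e9
print(f"{gbps:.1f} Gbit/s")   # ~23.9 -- past HDMI 2.0's ~14.4 Gbit/s payload
```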
 
I don't think Nvidia really shoulders much blame over all this (for once). The idea of introducing a new connector to serve the needs of power-hungry modern cards isn't a bad one. Frankly, it's silly having to run three or four separate 8-pin cables to higher-end cards. It's just that the resulting design is a complete mess thanks to a lack of proper standards from PCI-SIG and manufacturers cutting corners to save half a cent.
What they should be doing is pushing for more efficiency and less power draw. Nvidia's upcoming 5090 is rumoured to be a 600W+ card. This is obscene; things are supposed to get better, not worse. Everyone said the GTX 480 was a joke with its power draw, but even that pales into insignificance compared to these new cards. Same with CPUs: AMD's FX-9590 was a 220W part and could draw even more than that when pushed, yet Intel's 14900K sails way past that and nobody bats an eyelid. The big three, Intel, Nvidia and AMD, all need to get on top of power efficiency. In AMD's favour, at least Ryzen is going in the right direction, but they have a lot of work to do on their GPUs.
 
I wonder what these PSUs with native 12VHPWR outputs are going to do when the RTX 5090 launches with two connectors. Many PSUs only support one 12VHPWR, so how will the second be wired if, as rumoured, the 5090 has two 12VHPWR connectors?

They will either have to buy a new PSU or, most likely, use an 8-pin to 16-pin adapter. My Corsair AX1600i has no 12VHPWR connector to begin with, so if I go 5090 and it happens to use two, I'll be splitting the load over two 8-pin-to-16-pin cables (four 8-pins total, two per 12VHPWR connector), as these cables are capable of 600W over two 8-pins anyway.
 
What they should be doing is pushing for more efficiency and less power draw.
Efficiency and power draw aren't the same thing though. The 4090 is one of the most power-efficient cards ever made, despite its high power draw. It's only beaten in that respect by the 4080 and 4080 Super, which are also 300W+ cards. Despite their high power consumption, they deliver the performance to justify it. Parts like the 9590 and 14900K do not, and simply chug power for relatively underwhelming performance increases over other options.

Additional performance can't just be magicked up out of nowhere when it comes to GPUs. The way you make them perform better is by making them bigger, which also makes them consume more power. That's somewhat offset by architectural improvements and node jumps, but there are limits. It's going to be even more of a struggle with the RTX 5000 series, as this time Nvidia don't have a large node jump to help them out. It's essentially the same manufacturing process as the 4000 series, so beyond whatever architectural improvements they've come up with (and most of the low-hanging fruit when it comes to efficiency was picked a long time ago), the only way to push more performance is going to be bigger GPUs and higher power consumption.
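
A trivial way to see the distinction; the ratios here are made-up placeholders, not benchmark results:

```python
# A hypothetical card drawing 1.5x the power while delivering 2x the
# performance is *more* efficient despite the higher absolute draw.
perf_ratio, power_ratio = 2.0, 1.5
print(f"{perf_ratio / power_ratio:.2f}x perf-per-watt")   # 1.33x
```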

Ultimately, people want to see meaningful performance increases gen on gen. You only need to look at the reception the Ryzen 9000 series got recently: it delivered some impressive power-efficiency gains over the 7000 series but little in the way of a performance uplift, and it was roundly panned for that. The average consumer doesn't care about power efficiency and just wants to see the benchmark number go up.
 
What they should be doing is pushing for more efficiency and less power draw. Nvidia's upcoming 5090 is rumoured to be a 600W+ card. This is obscene; things are supposed to get better, not worse.
It is, in the lower SKUs. The xx90 series is for enthusiasts who don't care about power use. Also, these chips come from datacentre designs, where performance matters most and power use isn't a big issue; datacentres have plenty of power and cooling and don't care about noise, which isn't the case at home. Still, we are at the edge of what silicon tech can do, so in most cases you can get more performance or lower power use, not both anymore. In addition, it seems power efficiency in PCs just doesn't sell: customers historically didn't care one bit about it, they only cared about more performance.
 
They will either have to buy a new PSU or, most likely, use an 8-pin to 16-pin adapter. My Corsair AX1600i has no 12VHPWR connector to begin with, so if I go 5090 and it happens to use two, I'll be splitting the load over two 8-pin-to-16-pin cables (four 8-pins total, two per 12VHPWR connector), as these cables are capable of 600W over two 8-pins anyway.
Corsair's original cable, which can be used with many of their PSUs, indeed has 2x 8-pin on one side and 12VHPWR on the other. You just need two of these; there should be no problem with that. Each can handle 600W by itself anyway, so with two it would just be sharing the load, 300W max per cable, which is even easier and safer to run. I'm currently using one of them with my 850W PSU and a 4090 with zero issues (so far, but you never know with this bloody connector!).
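
For what it's worth, the load maths (the 600W-per-cable figure is Corsair's own rating):

```python
card_watts, cables = 600, 2
per_cable = card_watts / cables   # 300 W per 12VHPWR cable
per_8pin = per_cable / 2          # 150 W per PSU-side 8-pin
print(per_cable, per_8pin)        # 300.0 150.0 -- half each cable's rating
```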
 
I wonder what these PSUs with native 12VHPWR outputs are going to do when the RTX 5090 launches with two connectors. Many PSUs only support one 12VHPWR, so how will the second be wired if, as rumoured, the 5090 has two 12VHPWR connectors?

Still don’t think they’ll need two cables as the 5090 should come in under 600W. Could be wrong but I can’t see it happening.
 
I'm really hoping it comes in under 600W. I've got a Corsair RM1200X Shift, which has one 16-pin socket. That should be enough for a 600W 5090 (paired with a 7950X3D), and even if the card had two 12VHPWR ports, I assume I could use an 8-to-16-pin adapter for the second. But still, I just don't think this is reasonable or healthy.

The 4090 can retain almost all of its performance well below 450W, so I doubt a 5090 is really going to get much out of going to 600W. Plus, at 600W, Nvidia really would be waving their buttocks at environmental regulators, particularly the EU, and inviting them to take their best shot. It only takes one or two articles on the growing power cost of gaming PCs in reasonably mainstream news outlets to draw the Eye of Sauron.
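
For anyone who wants to try that on their own card, the stock way to cap board power is nvidia-smi's power-limit switch. A minimal sketch, noting it needs admin rights and that 350 is just an example value, not a recommendation:

```python
import subprocess

# Cap the GPU board power limit (watts). Requires nvidia-smi on PATH
# and admin/root; the valid range depends on the card's VBIOS.
subprocess.run(["nvidia-smi", "-pl", "350"], check=True)
```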
 
I’ve “only” got a 1000W PSU but still expect it to be fine for a 5090FE and a 9950X3D.

EDIT: Having just written that it’s almost inevitable that it won’t be enough now!
 
Gamers Nexus has done a new survey of 23k users

What they found is that 12VHPWR is 22% more likely to fail than 6- and 8-pin PCIe

I think you're understating the difference. PCIe 6/8 pin connectors have been around for about 20 years now, while 12VHPWR has only existed for 2. If we assume that failures are distributed evenly over time, then it would actually be a 12x difference. If we assume a bathtub curve for failure rate then it's even worse (because GPUs using 12VHPWR connectors are still a long way from the end of their lives, and thus aren't expected to fail yet).
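
Roughly, under that even-over-time assumption:

```python
relative = 1.22               # 12VHPWR 22% more likely to have failed (survey)
years_old, years_new = 20, 2  # approximate time each connector has been in service
print(f"~{relative * years_old / years_new:.0f}x per-year failure rate")  # ~12x
```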
 
I’ve “only” got a 1000W PSU but still expect it to be fine for a 5090FE and a 9950X3D.

EDIT: Having just written that it’s almost inevitable that it won’t be enough now!

It'll be fine (as long as it's a reputable make).

I have a Phanteks 1000W PSU powering a 12700K, a 4090, multiple RGB fans and lots of drives, and it never gets stressed.

If I was planning on getting a 5090, I wouldn't give a second thought to my PSU not being powerful enough.
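
A rough worst-case budget for that build backs this up; the TBP/PL2 figures are the vendors' published limits, and the 100W for fans, drives and motherboard is my own loose allowance:

```python
gpu_w, cpu_w, rest_w = 450, 190, 100   # 4090 TBP, 12700K PL2, everything else
print(f"{gpu_w + cpu_w + rest_w} W peak vs 1000 W PSU")  # 740 W -- plenty spare
```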
 
I think you're understating the difference. PCIe 6/8 pin connectors have been around for about 20 years now, while 12VHPWR has only existed for 2. If we assume that failures are distributed evenly over time, then it would actually be a 12x difference. If we assume a bathtub curve for failure rate then it's even worse (because GPUs using 12VHPWR connectors are still a long way from the end of their lives, and thus aren't expected to fail yet).
We will find out that 12VHPWR was a mistake when Nvidia finally decides to change it. :)
 
Still don’t think they’ll need two cables as the 5090 should come in under 600W. Could be wrong but I can’t see it happening.

Even if it's under 600W, I could still see a scenario where they put two connectors on it; that way each cable carries less current and is less likely to catch fire.
 
Even if it's under 600W, I could still see a scenario where they put two connectors on it; that way each cable carries less current and is less likely to catch fire.
This is what I was thinking.

I watch my 4090's connector like a hawk, but I'll only check my 4080S when I open the case for other reasons because it's only trying to pull about half of the spec's rated wattage.

A 600W card with two 12VHPWR connectors would actually have a comfortable margin for error.
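
Using the same approximate ~9.5A terminal rating as the earlier sketch, the per-pin numbers bear that out:

```python
watts_per_connector, volts, pins, rating_a = 300, 12, 6, 9.5  # 600 W over two plugs
amps = watts_per_connector / volts / pins
print(f"{amps:.1f} A/pin ({amps / rating_a:.0%} of rating)")  # ~4.2 A (~44%)
```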
 