Friendly warning concerning Vega 64 and PCIe plugs

Associate
Joined
26 Aug 2016
Posts
561
That didn't happen because you used one cable. I am certain the single cable from the PSU is good for the 300W maximum draw the two 8-pin connectors on the end of it are rated for. No, that kind of localized damage almost always happens as a result of a faulty connection.
Agreed. That looks like a bad connection or a short at the socket end.

They would not provide or certify doubled plugs on a single cable if there was a potential fire hazard. The PSU rail is far more likely to trip out before the cable even gets warm.
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
Agreed. That looks like a bad connection or a short at the socket end.

They would not provide or certify doubled plugs on a single cable if there was a potential fire hazard. The PSU rail is far more likely to trip out before the cable even gets warm.

The fire hazard is down to what the card can pull at the other end, which is sometimes beyond the PCI-E spec.

I have owned cards in the past that could pull nearly 600W; imagine most of that going through a single cable. :eek:
 
Associate
Joined
26 Aug 2016
Posts
561
The fire hazard is down to what the card can pull at the other end, which is sometimes beyond the PCI-E spec.

I have owned cards in the past that could pull nearly 600W; imagine most of that going through a single cable. :eek:
It's not a single wire though. It's 4 pairs (although not all are 12V), and you forget that your motherboard supplies up to 75W of that directly.

Edit: I believe each 8-pin PCI-E socket should be rated at 150W. A double-headed cable would be rated for at least 300W or there is no way it would be allowed to be sold. So connecting a "single" cable with two plugs to a single card should be good to supply the card with 300W (plus an additional 75W from the PCIe slot itself).

How many sockets did your 600W card have, out of curiosity (or was it a custom socket with a custom cable)?

If the cable was a problem, I reckon it would have melted the insulation rather than a specific pin on the plug. This is more likely a soldering defect or a bad connection between the socket and pin.
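
To put rough numbers on that budget, here's a minimal sketch using only the figures quoted above (150W per 8-pin plug, 75W from the slot); the constant and function names are just illustrative:

```python
# In-spec power budget for a card, using the figures quoted above.
PCIE_8PIN_W = 150   # spec rating per 8-pin PCI-E plug, in watts
PCIE_SLOT_W = 75    # spec rating for the motherboard slot, in watts

def available_budget(num_8pin_plugs: int) -> int:
    """Total in-spec power available to the card, in watts."""
    return num_8pin_plugs * PCIE_8PIN_W + PCIE_SLOT_W

print(available_budget(2))  # 375 W total, of which 300 W comes over the one cable
```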
 
Last edited:
Soldato
Joined
14 Aug 2007
Posts
4,100
It's a mistake that many of us have made, though, in the pursuit of neatness. When I had my R9 390X I used only one cable; there were no warnings against doing so in the manuals of either the card or the PSU, and I simply assumed that the fact there were two plugs on the cable meant it could handle it. Thankfully no damage was done. Now I know better of course, but the companies that sell the hardware need to do a much better job of informing customers, particularly when they buy power-hungry cards.

Yep, completely agree with this. I recently put together a new system; there were no instructions with the PSU and the instructions that came with the GPU were absolute *****. I used two separate cables as I happened to have read about this before, but there was nothing to tell me I should. It seems fair to assume that if there are two connectors on one cable it should be safe to use them both, and yeah, it would certainly be a lot tidier.
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
I hate the fact that my cables have 2x 6+2-pin plugs on them. None of them are just 1x 8-pin or 1x 6+2, so I'm stuck with two extra plugs just hanging there by the GPU sockets. How do you tidy that up?
 
Soldato
Joined
26 May 2009
Posts
22,101
This again? /facepalm

Seriously, this **** is going to keep on happening until PSU manufacturers either stop putting 2x 8-pin on their cables or make it extremely clear in the instructions (or better yet, on the PSU itself) that the two plugs are there so people can use 8+6 in whichever order they prefer, not so people can use 8+8 >.>
 
Soldato
Joined
26 May 2009
Posts
22,101
That didn't happen because you used one cable.
It did; this is a known issue with PSUs that has been occurring on and off for about a decade now (more common with the Superflower design that used a smaller wire gauge 9-pin connector instead of the regular 8-pin, though).

Basically, only the connectors at the end of the cable that connects to the GPU have to conform to the ATX standard; the manufacturers are free to put whatever the hell they like on the modular end, and most (due to simplicity/historical precedent) use a single 8-pin connector, which isn't capable of powering two 8-pins on the other end. The reason for this is that you're not supposed to use both 6+2 connectors in 8-pin configuration at the same time; sadly, most PSU manufacturers fail to make this clear, resulting in dozens of burnt connectors over the years with high-power GPUs.
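
A rough way to put numbers on that concern (just a sketch; the modular-end pin count and per-pin rating below are assumptions for illustration, not figures from this thread):

```python
# Two 8-pin plugs at the GPU end can be asked for 2 x 150 W in spec, but all of
# it returns through the 12 V pins of the single connector on the modular end.
GPU_END_W = 2 * 150          # both 6+2 plugs used in 8-pin configuration
VOLTS = 12.0
MODULAR_12V_PINS = 4         # assumption: 12 V pins on the modular-end connector
PIN_RATING_A = 8.0           # assumption: continuous rating per pin/terminal

amps_per_pin = GPU_END_W / VOLTS / MODULAR_12V_PINS
print(f"{amps_per_pin:.1f} A per 12 V pin vs an assumed {PIN_RATING_A:.0f} A rating")
# ~6.3 A per pin while the card stays in spec; a card pulling well beyond spec
# (as discussed above) eats into that margin quickly.
```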
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
The AX860 uses 18AWG wiring throughout. That lead has 4 pairs of pins for 12V transmission at the PSU end. If that Vega was pulling 400W from the PCI-E cable alone, that's 400W/4 = 100W per pair, or about 8 amps at 12V. That's barely enough to warm the conductor up. It's way more likely that pin was ill-fitting from the start or damaged on insertion at some point, which led to arcing and eventual burn-out, in which case using a second lead might have stopped it happening, but it still would have been possible.
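
The same arithmetic as a quick helper (a sketch only; the four 12V conductor pairs are as stated in the post above):

```python
def amps_per_pair(cable_watts: float, pairs: int = 4, volts: float = 12.0) -> float:
    """Current through each 12 V conductor pair for a given load on the cable."""
    return cable_watts / pairs / volts

print(round(amps_per_pair(400), 1))  # ~8.3 A per pair for a 400 W cable load
```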
 
Last edited:
Soldato
OP
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
Okay, okay, let's get one thing clear before this topic goes completely off the rails. The fault for this little mishap is mine, the OP's, and mine alone. Not Corsair's, not AMD/Vega's. If I had done my research I would have known that the Vega chip would go out of spec if allowed. I had just become so used to Nvidia's heavy power constraints that I forgot that AMD/RTG don't do that. Again, the fault is my own, and this topic was meant as a friendly warning/reminder to others not to repeat the mistake I made.

The AX860 uses 18AWG wiring throughout. That lead has 4 pairs of pins for 12V transmission at the PSU end. If that Vega was pulling 400W from the PCI-E cable alone, that's 400W/4 = 100W per pair, or about 8 amps at 12V. That's barely enough to warm the conductor up. It's way more likely that pin was ill-fitting from the start or damaged on insertion at some point, which led to arcing and eventual burn-out, in which case using a second lead might have stopped it happening, but it still would have been possible.

Considering I've measured peaks in power draw, at the wall socket, in the 750-watt area, I would say it has been pulling way north of 400W at times, given that I was benching only the GPU and the system sits at 150W idle.
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
It's not a single wire though. It's 4 pairs (although not all are 12V), and you forget that your motherboard supplies up to 75W of that directly.

Edit: I believe each 8-pin PCI-E socket should be rated at 150W. A double-headed cable would be rated for at least 300W or there is no way it would be allowed to be sold. So connecting a "single" cable with two plugs to a single card should be good to supply the card with 300W (plus an additional 75W from the PCIe slot itself).

How many sockets did your 600W card have, out of curiosity (or was it a custom socket with a custom cable)?

If the cable was a problem, I reckon it would have melted the insulation rather than a specific pin on the plug. This is more likely a soldering defect or a bad connection between the socket and pin.

You really don't get it, do you?

High-end graphics cards can and do pull more watts than the PCI-E standard allows; this is the bit that causes the problems.

Anyone who is into overclocking is not going to be put off buying a Kingpin 2080 Ti, for example, because it totally trashes the PCI-E standards. IIRC, for cards like this 8 Pack uses a PSU dedicated to each card.

Also decent motherboards can supply a lot more than the 75W you quoted above.

For high end GPUs the PCI-E standards go out the window.
 
Soldato
OP
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
An interesting finding, btw: I've only really played Path of Exile of late, which is why I'm baffled when I see 740 watts at the wall, but just now I noticed that I had Global Illumination turned on in the game. Turning it off drops the power draw at the wall by as much as 150 watts; it goes from 740 to around 600. Now of course some of it is due to the GPU going down to 85% usage while gaining 30+ fps, but that is still mental for not much in return. Guess I had better test and see if this is similar in other games. Anyone have some pointers to games with Global Illumination?
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
If I had done my research I would have known that the Vega chip would go out of spec if allowed.

Considering I've measured peaks in power draw, at the wall socket, in the 750-watt area, I would say it has been pulling way north of 400W at times, given that I was benching only the GPU and the system sits at 150W idle.

Crikey, that's more than most average PSUs will provide, lol. Your measurements would suggest the card is drawing up to 600 watts (load minus idle wattage, of course). Up to 75W of that will be pulled from the PCI-E slot assuming the card stays within the limits, so call it 525W from the PCI-E leads. 525W / 4 = 131.25W per pair of conductors, or 10.9A at 12V. Or a bit more if the voltage was sagging. Not much of a difference, and I still blame the plug.

If you ran another lead from the PSU you'd halve that load (in a perfect world; I'm not considering resistance and temperature here), so call it 5.5 amps per pair. It's not really much of a drop and still enough to cause arcing on a faulty pin. In other words, it would probably have happened with both plugged in. It is still good practice to use two leads, of course - that's still an important point to make. :)

kapstaad said:
Also decent motherboards can supply a lot more than the 75W you quoted above

That just means less of the total load is pulled from the cables, but I don't for a second think that the 12V rails to the card are just summed up in a way that makes that very useful.
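
As a sketch of the one-lead vs two-lead comparison a couple of paragraphs up (the ~600W card figure and 75W slot share are the ones quoted there; the even split between leads is an idealisation):

```python
CARD_W = 600.0        # load-minus-idle estimate from the wall measurement
SLOT_W = 75.0         # assumed in-spec draw through the PCI-E slot
PAIRS_PER_LEAD = 4    # 12 V conductor pairs per PCI-E lead
VOLTS = 12.0

cable_w = CARD_W - SLOT_W  # ~525 W carried by the PCI-E lead(s)

for leads in (1, 2):
    amps = cable_w / (leads * PAIRS_PER_LEAD) / VOLTS
    print(f"{leads} lead(s): {amps:.1f} A per conductor pair")
# 1 lead: ~10.9 A per pair; 2 leads: ~5.5 A per pair - the current halves,
# but is still enough to arc on a faulty pin, which is the point being made.
```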
 
Last edited:
Soldato
OP
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
Crikey, that's more than most average PSUs will provide, lol. Your measurements would suggest the card is drawing up to 600 watts (load minus idle wattage, of course). Up to 75W of that will be pulled from the PCI-E slot assuming the card stays within the limits, so call it 525W from the PCI-E leads. 525W / 4 = 131.25W per pair of conductors, or 10.9A at 12V. Or a bit more if the voltage was sagging. Not much of a difference, and I still blame the plug.

If you ran another lead from the PSU you'd halve that load (in a perfect world; I'm not considering resistance and temperature here), so call it 5.5 amps per pair. It's not really much of a drop and still enough to cause arcing on a faulty pin. In other words, it would probably have happened with both plugged in. It is still good practice to use two leads, of course - that's still an important point to make. :)



That just means less of the total load is pulled from the cables, but I don't for a second think that the 12V rails to the card are just summed up in a way that makes that very useful.

Well, I guess we just have to disagree then, which is perfectly fine. There is room for both of our opinions. :)
 