
NVIDIA RTX 50 SERIES - Technical/General Discussion

The 5070 Ti was the card that, on paper, made the most sense, albeit 750 is still big money for that range of card when, not long ago, you could get the top card for that price.

Looking at the prices in America, which are generally reflected here (UK) as $1 = £1, it would be foolish to buy it. You'd be better off waiting for the 5080 to come back into stock if you're adamant on buying a 50 series card.

I was adamant I was buying a new card, but with the rumoured AMD prices as high as being reported, and the Nvidia cards extremely expensive, I think I'll hold onto my 6950 XT, which was only supposed to be a temporary card in my system from the summer until the launch of the new cards.

There's no value/stock in the market
 
Apologies if this has been answered already, but do we have any good data yet on how far you can dial down the power limit on the 5090 without losing, say, more than 3% performance? When mine finally does ship, I'd definitely like to run it below 575W (500W or less would be ideal). I've capped my 4090 at 90% power from the day I got it.
 
Apologies if this has been answered already, but do we have any good data yet on how far you can dial down the power limit on the 5090 without losing, say, more than 3% performance? When mine finally does ship, I'd definitely like to run it below 575W (500W or less would be ideal). I've capped my 4090 at 90% power from the day I got it.
Have a look at this thread for some discussion on undervolting a 5090 and the minimal performance loss: https://old.reddit.com/r/nvidia/comments/1irplq5/my_results_of_undervolting_a_rtx_5090_founders/

You can do the same on your 4090 instead of just power capping it.
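Since different tools express the cap differently (MSI Afterburner's slider is a percentage of the reference board power, while `nvidia-smi -pl` takes watts), a tiny helper can translate between the two. A minimal sketch, assuming the commonly cited reference board powers of 450 W for the 4090 and 575 W for the 5090:

```python
def percent_limit(target_watts: float, board_power_watts: float) -> int:
    """Percentage power limit to request for a given target draw.

    Tools like Afterburner take a percentage of the card's reference
    board power; `nvidia-smi -pl` takes a raw wattage instead.
    """
    return round(100 * target_watts / board_power_watts)

# 4090 at the 90% cap mentioned above: 405 W out of 450 W
print(percent_limit(405, 450))  # 90
# A 500 W target on a 575 W 5090 works out to roughly 87%
print(percent_limit(500, 575))  # 87
```

So a "500 W or less" 5090 target is only around a 13% cut from stock, which is why the benchmarks in that thread show such a small performance loss.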
 
I still don't understand why people weren't more curious about the 8-pin side of the adapters, and why *that* end of the included adapters was not what was showing up on Reddit.

It was as if people were plugging in one side of the adapter correctly (the 8-pin side) and then forgetting how to plug in cables when it came time to plug in the other end of the same adapter (the 12VHPWR side).

The difference between the two sides of the adapter isn't the user plugging in the connectors or their propensity for "error". No, the difference has always been the safety margin (or lack thereof) on the two different kinds of connector on that adapter.

12VHPWR has been designed within an inch of its life. I think that is the root cause of this issue.
True, it's designed with too little headroom. The 8-pin to 12V connectors may offer a degree of extra safety, as the PSU side will cap draw from each 8-pin socket.
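The "too little headroom" point can be put in rough numbers. A back-of-envelope sketch, assuming the commonly quoted per-terminal current ratings (around 8 A for the Mini-Fit Jr terminals in an 8-pin, 9.5 A for 12VHPWR terminals); the exact figures vary by terminal grade:

```python
def per_pin_margin(spec_watts: float, live_pins: int,
                   pin_rating_amps: float, volts: float = 12.0) -> float:
    """Ratio of a pin's current rating to its draw at the connector's spec."""
    amps_per_pin = spec_watts / volts / live_pins
    return pin_rating_amps / amps_per_pin

# 8-pin PCIe: 150 W spread over 3 live 12 V pins, ~8 A terminals
print(round(per_pin_margin(150, 3, 8.0), 2))   # 1.92
# 12VHPWR: 600 W spread over 6 live 12 V pins, ~9.5 A terminals
print(round(per_pin_margin(600, 6, 9.5), 2))   # 1.14
```

Under these assumptions the 8-pin runs at barely half its terminals' rating, while 12VHPWR sits within about 15% of it, and that's before any uneven current sharing between pins.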
 

Oh goody, I sure am looking forward to upgrading my PSU again after already upgrading to an ATX3.1 thinking it would prevent cables melting. Oh and then again to ATX3.2 when Nvidia finally fix it properly.

Here's an idea: how about those same PSU manufacturers go back to the PCI-SIG they're part of and fix the goddamn broken standard they had a hand in creating?

If this logic is purely in the cable and doesn't require a new PSU, firstly, it would be breaking the spec as the sense pins cannot change during operation. Secondly, there's no way people have the room to fit even more width between their GPU connector and side panel.
 
True, it's designed with too little headroom. The 8-pin to 12V connectors may offer a degree of extra safety, as the PSU side will cap draw from each 8-pin socket.
Are you sure about that? It's still a single 12V rail in the PSU. 150W is only the standard, not the limit of the physics.
 

Oh goody, I sure am looking forward to upgrading my PSU again after already upgrading to an ATX3.1 thinking it would prevent cables melting. Oh and then again to ATX3.2 when Nvidia finally fix it properly.

Here's an idea: how about those same PSU manufacturers go back to the PCI-SIG they're part of and fix the goddamn broken standard they had a hand in creating?

If this logic is purely in the cable and doesn't require a new PSU, firstly, it would be breaking the spec as the sense pins cannot change during operation. Secondly, there's no way people have the room to fit even more width between their GPU connector and side panel.
It was Nvidia, with a little input from Dell, that designed the connector, not the PSU manufacturers.
 
Are you sure about that? It's still a single 12V rail in the PSU. 150W is only the standard, not the limit of the physics.
Using multiple 8-pins on the PSU side is an improvement (in theory at least), as potential problems are spread over more strands, plus the surface area of two plugs is much higher, which dissipates heat better.

This has been mentioned by Aris (Hardware Busters/PSU cert), the R&D head from Corsair, and the Intel electrical engineer.
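The heat-dissipation point follows directly from resistive heating at each contact, which scales with the square of the current. A rough sketch, assuming an illustrative 6 mΩ contact resistance for a worn or poorly crimped terminal (real values depend on terminal condition):

```python
def contact_heat_watts(amps: float, contact_resistance_ohms: float) -> float:
    """Resistive heating dissipated at a single contact: P = I^2 * R."""
    return amps ** 2 * contact_resistance_ohms

R = 0.006  # 6 milliohm: an illustrative figure for a degraded crimp

# 600 W shared evenly across six 12 V pins: ~8.33 A per pin
print(round(contact_heat_watts(600 / 12 / 6, R), 2))  # 0.42
# The same 600 W squeezed through just two good pins: 25 A each
print(round(contact_heat_watts(25, R), 2))            # 3.75
```

Because the heating is quadratic in current, concentrating the load on fewer pins multiplies the heat at each contact far faster than the current itself rises, which is why spreading the draw over more strands and more plug surface area matters.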
 
Are you sure about that? It's still a single 12V rail in the PSU. 150W is only the standard, not the limit of the physics.
AFAIK (not a sparkie by any means) PSUs also limit the draw of their connectors, separately to single/multi-rail OCP. For example, my Corsair PSU has a limit of 300W per 8-pin (others are likely lower), which my CableMod cable translates to two pairs of 12V cables. So a 300W limit per two cables; it's not much, but it's some protection.

By comparison, a 12V connector on the PSU side would just limit draw to ~600W, which could in theory all go down one cable. It just shows how screwed-up and inferior the 12V connector spec is at every end.
 
True, it's designed with too little headroom. The 8-pin to 12V connectors may offer a degree of extra safety, as the PSU side will cap draw from each 8-pin socket.
It would not; a PSU will supply whatever power a device draws, right up to the PSU's rated maximum output.

Things that supply power (computer PSUs, the wall sockets in your house, the local substation, even power stations) will supply whatever power something draws, regardless of the type of socket, cable, or whatever it is that's drawing that power. We put measures in place to cut power if something tries to draw more than the system is rated for, but those are safety measures; they're not "only send this much power down this wire, through this socket" measures.
my Corsair PSU has a limit of 300W per 8-pin (others are likely lower), which my CableMod cable translates to two pairs of 12V cables. So a 300W limit per two cables; it's not much, but it's some protection.
It says that's what each 8-pin is rated for; it does not say you could not draw more than 300W if you were silly enough to try.

If you took a DMM to one of the +12V pins on those 8-pin PCI-E connectors, I'd pretty much guarantee you it would measure near enough the wattage and amperage listed in the specs for the entire +12V rail.
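The gap between what a rail can deliver and what one connector is nominally rated for is easy to illustrate. A sketch, using a hypothetical 1000 W single-rail unit as the example; actual OCP trip points vary by PSU and sit somewhat above the rated figure:

```python
def rail_amps(psu_watts: float, volts: float = 12.0) -> float:
    """Current available from a single 12 V rail at the PSU's rating."""
    return psu_watts / volts

# Hypothetical 1000 W single-rail PSU: OCP only trips somewhere above this
print(round(rail_amps(1000), 1))  # 83.3
# One 8-pin PCIe connector's nominal 150 W spec, in amps
print(round(150 / 12, 1))         # 12.5
```

On a single-rail unit, rail-level OCP won't intervene until the draw approaches the full ~83 A, so nothing at the rail level stops one connector from being pulled far beyond its nominal 12.5 A; any per-connector limiting has to be implemented separately, as discussed above.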
 