NVIDIA RTX 50 SERIES - Technical/General Discussion

Yeah it’s just a waiting game…

As much as I want to just get the waiting over and done with, I think I might cancel my pre-order if that retailer does decide to increase prices… whether that's down to manufacturing costs, tariffs, "they don't have a choice" or whatever.

As ‘suboptimal’ as a 5080 is compared to a 5090 (if you’re lucky enough to be able to stretch to one), there is absolutely no way an AIB 5090 is worth >2.5x the cost of an AIB 5080. That’s just silly. I think I’d rather have the 5080… or stick with my 3090 for now.
Sounds sensible. The amount you have paid for the Suprim is already plenty. Any more is utter madness.
 
Absolutely do not waste money on a new ATX 3.1 power supply when you have a perfectly good PSU. 3.1 is actually a relaxation of the spec, so in theory (although unlikely) a v3.1 PSU could be *worse* than a v3.0. The sense pins on the 12V-2x6 ports also achieve basically nothing. You don't even need the single 12VHPWR port on the PSU, as multiple 8-pin into 12V-2x6 (in a single cable at least) is considered better than the native 12VHPWR-to-12VHPWR cable*. ATX 3.1 was a complete waste of time.

* mentioned by both the Intel engineer and the Corsair PSU R&D head.
This is really interesting. It's intuitive to me, as the PSU can then at least sense and limit the power going out of each 8-pin (which goes to two of the 12V-2x6 pins in my cable). Do you have a link to either reference?
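For a rough sense of why the multi-8-pin side of that argument is appealing, here's a minimal back-of-the-envelope sketch (not taken from either of those references; the per-contact current ratings and the perfectly even current sharing are assumptions for illustration):

```python
# Back-of-the-envelope: 600 W delivered over a single 12V-2x6 cable
# (6 x 12 V contacts) vs. over three PSU-side 8-pin leads (3 x 12 V
# contacts each). Both pin ratings below are assumptions, not spec quotes,
# and perfectly even current sharing is assumed.

TOTAL_WATTS = 600.0
VOLTS = 12.0
ASSUMED_RATING_12V2X6_A = 9.5   # amps per 12V-2x6 contact (assumption)
ASSUMED_RATING_8PIN_A = 8.0     # amps per 8-pin Mini-Fit contact (assumption)

def per_pin_amps(total_watts: float, live_pins: int) -> float:
    """Average current per 12 V contact with even sharing."""
    return total_watts / VOLTS / live_pins

single = per_pin_amps(TOTAL_WATTS, 6)        # one 12V-2x6 end: 6 live pins
triple = per_pin_amps(TOTAL_WATTS, 3 * 3)    # three 8-pin leads: 9 live pins

print(f"Single 12V-2x6 : {single:.2f} A/pin "
      f"({100 * (1 - single / ASSUMED_RATING_12V2X6_A):.0f}% headroom)")
print(f"Three 8-pin    : {triple:.2f} A/pin "
      f"({100 * (1 - triple / ASSUMED_RATING_8PIN_A):.0f}% headroom)")
```

Even on this idealised model the per-contact headroom is noticeably larger on the 8-pin side, which is presumably the gist of the comments referred to above.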
 
Absolutely do not waste money on a new ATX 3.1 power supply when you have a perfectly good PSU. 3.1 is actually a relaxation of the spec, so in theory (although unlikely) a v3.1 PSU could be *worse* than a v3.0. The sense pins on the 12V-2x6 ports also achieve basically nothing. You don't even need the single 12VHPWR port on the PSU, as multiple 8-pin into 12V-2x6 (in a single cable at least) is considered better than the native 12VHPWR-to-12VHPWR cable*. ATX 3.1 was a complete waste of time.

As in, the revision from H+ sockets to H++ sockets? I don’t think that’s right. There is greater scope for an improper connection / error with an H+ socket, compared to an H++ socket.

Of the single cable / single socket solutions (assuming you have all pins at their proper spec lengths):

Best by far: H++ to H++

Works but isn’t ideal: H+ to H++ (because having H++ at one end won’t save you if there is a problem at the other end).

Worst by far: H+ to H+
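The usual explanation for this ranking is that the revised H++ (12V-2x6) connector recesses its sense contacts relative to the power contacts, so a plug that isn't fully seated drops the sense pins (and the card should refuse full power) before the power contacts go marginal. Here's a toy Python sketch of that idea; the engagement depth and the 1.5 mm recess figure are assumptions for illustration, not spec quotations:

```python
# Toy model: which contacts still mate when a plug is backed out by gap_mm.
# An H+ end has sense pins as long as the power pins; an H++ end recesses
# them, so they disconnect first. All dimensions are illustrative assumptions.

ASSUMED_POWER_ENGAGEMENT_MM = 3.0               # power pins stay mated up to this gap
ASSUMED_SENSE_RECESS_MM = {"H+": 0.0, "H++": 1.5}

def contacts(revision: str, gap_mm: float) -> tuple[bool, bool]:
    """Return (power_mated, sense_mated) for a plug backed out by gap_mm."""
    power = gap_mm < ASSUMED_POWER_ENGAGEMENT_MM
    sense = gap_mm < ASSUMED_POWER_ENGAGEMENT_MM - ASSUMED_SENSE_RECESS_MM[revision]
    return power, sense

for revision in ("H+", "H++"):
    for gap in (0.0, 1.0, 2.0):                 # mm of improper seating
        power, sense = contacts(revision, gap)
        # The dangerous case: power pins still carrying current while the
        # card has no reason to back off (sense still asserted).
        print(f"{revision:>3} backed out {gap:.1f} mm -> power={power}, "
              f"sense={sense}, full power allowed={power and sense}")
```

On this toy model, whichever end is still H+ keeps that failure mode open, which is why H+ to H++ only "works but isn't ideal".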
 
Both buildzoid and Der8auer have said the new sense pins are a waste of time.
 
The 5070 Ti is being listed at $1000 by Micro Center in the US. This is why you don't listen to people who tell you to wait for new GPUs on launch day. Black Friday was the best time to buy GPUs.
People paying more at launch for slower cards than were previously available is quite a new thing.
 
Both buildzoid and Der8auer have said the new sense pins are a waste of time.

Yes, but only in the context of the ‘true’ solution being ditching the 12VHPWR cables entirely and using something else.

If we are sticking with 12VHPWR (which is the reality that we’re currently living in - unfortunately!) then the move from H+ to H++ is meaningful and not a waste of time.

Put another way, if you ARE using a 12VHPWR cable and you want to maximise safety, then you should be aware of whether your PSU socket is H+ and (if so) try to avoid that socket if possible.

The way you are writing it suggests that nobody should be concerned with whether their sockets are H+ or H++. I’m saying that you definitely SHOULD be concerned with it, or at least aware of it.
 
try to avoid that socket if possible.
I mean sure, if you have 2 power supplies in front of you, one with, one without, then why not. But don't go spending money on a whole new PSU for this crappy feature that's a complete red herring.

There is no way to make a high power 40 or 50 series card safe. There is no special cable or technique to fix this. The 12VHPWR spec is bunk. The power regulation on the cards is bunk. It was bunk on the 4090 and it's bunk again now. Your choice is to just avoid it all completely OR try and determine what level of power draw is actually safe for these cards. Spoiler: it's not 600W.
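To put a number on the "what level of power draw is actually safe" question, here's a purely illustrative calculation. The per-pin rating and the worst-case imbalance factor are assumptions, not measurements from any specific card:

```python
# Purely illustrative arithmetic: if the card cannot balance current across
# the six 12 V contacts, one bad contact can carry far more than its 1/6
# share. Both the pin rating and the imbalance factor are assumptions.

VOLTS = 12.0
PINS = 6
ASSUMED_PIN_RATING_A = 9.5      # amps per contact (assumption)
ASSUMED_WORST_IMBALANCE = 2.0   # hottest pin carries 2x its even share (assumption)

def max_safe_power(imbalance: float) -> float:
    """Board power at which the hottest pin just reaches the assumed rating."""
    hottest_share = imbalance / PINS              # fraction of total current
    max_total_current = ASSUMED_PIN_RATING_A / hottest_share
    return max_total_current * VOLTS

print(f"Even sharing:      {max_safe_power(1.0):.0f} W")                      # ~684 W
print(f"2x worst-case pin: {max_safe_power(ASSUMED_WORST_IMBALANCE):.0f} W")  # ~342 W
```

With an even split the six contacts would tolerate roughly 680 W, but a single badly seated pin carrying twice its share drags the "safe" figure down to roughly half that, which is the spirit of the point above.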
 
Both buildzoid and Der8auer have said the new sense pins are a waste of time.
It's not so much that they're a waste of time, it's that they're badly named: while they do 'sense' something, what they're actually sensing is how much power the PSU can deliver over the 12VHPWR connector. They do this...

[Image: PCIe-Gen5-Power-Cable-5.png] (Source)

Is calling them sense pins misleading? IDK, I'll let others decide. I've only ever seen 600W 12VHPWR cables.
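For anyone who can't see the image, the sideband signalling is commonly described like this. A minimal sketch; the wattage table is taken from public write-ups of the 12VHPWR/12V-2x6 sideband and should be treated as an assumption rather than a spec quotation:

```python
# Each of the two sideband "sense" pins is either tied to ground or left
# open by the PSU/cable; the combination advertises how much power the
# card may draw. Table values are assumptions from public write-ups.

ADVERTISED_LIMIT_W = {
    # (SENSE0, SENSE1): maximum sustained power the cable/PSU advertises
    ("ground", "ground"): 600,
    ("open",   "ground"): 450,
    ("ground", "open"):   300,
    ("open",   "open"):   150,
}

def cable_power_limit(sense0: str, sense1: str) -> int:
    """Power the card is allowed to request for this sideband combination."""
    return ADVERTISED_LIMIT_W[(sense0, sense1)]

# A typical 600 W cable simply ties both sense pins to ground:
print(cable_power_limit("ground", "ground"))   # 600
```

That would also fit only ever seeing 600W cables: a 600W cable just grounds both sense pins.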
 
@UKGigaProb thinking about it, if you are currently using a “4 way octopus” adaptor for your 4090 (and you don’t want to change your PSU) your best solution is probably to just use whatever equivalent adaptor arrives with your 5090.

That way, if you got an MSI 5090 (or whoever), the GPU maker and the PSU maker can’t accuse you of using a ‘third party cable’ if anything goes wrong.

Plus, the adapters issued with the 5090s should all be made in the H++ era (I doubt any 5090 is coming with a dodgy, poorly made octopus adaptor considering they would be crucified if it melts).
Well, I'm using a CableMod cable now and I'm not going to change the PSU anytime soon, so I was thinking of using the cable adapter that came with my 5090, which I will need to use as I don't trust my old cable. The problem is that the EVGA PCIe cables are rigid and thick, and with four of them connected to the adapter, and Nvidia now putting the connector on its side, the strain from the EVGA cables will be tugging on the connector.

Will be fitting it tomorrow and I will see if it looks okay and go from there.
 
I mean sure, if you have 2 power supplies in front of you, one with, one without, then why not. But don't go spending money on a whole new PSU for this crappy feature that's a complete red herring.

There is no way to make a high power 40 or 50 series card safe. There is no special cable or technique to fix this. The 12VHPWR spec is bunk. The power regulation on the cards is bunk. It was bunk on the 4090 and it's bunk again now. Your choice is to just avoid it all completely OR try and determine what level of power draw is actually safe for these cards. Spoiler: it's not 600W.

I agree in principle.

I’m commenting on this because a few people here are currently in positions where their existing PSU is giving sub-optimal options and a new PSU might be the best way forward.

For example, some PSUs don’t have enough spare PCIe sockets to use the 4-way octopus adaptor that comes with a 5090 (real examples of this came up in this thread earlier today). In which case, they still might have the option of using whatever 12VHPWR option is available on that PSU. In those circumstances:

(i) if the socket on the PSU was H++, I’d go with getting a new 12VHPWR cable issued by that PSU manufacturer (or using the one issued with the PSU).

(ii) if the socket on the PSU was H+… I’d personally consider buying a new PSU that enabled using the octopus adaptor… or at least one that had an H++ socket.

It’s a quirky situation, but it explains how this sort of thing can be relevant. Of course, what a load of rubbish that this is even a thing…!
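A rough sketch of the decision logic in (i)/(ii) above, purely to illustrate the reasoning; the socket names and thresholds just mirror the posts in this thread, nothing official:

```python
from typing import Optional

def cabling_advice(spare_8pin_sockets: int,
                   psu_12vhpwr_socket: Optional[str]) -> str:
    """Mirror of the (i)/(ii) reasoning above; purely illustrative."""
    if spare_8pin_sockets >= 4:
        return "use the 4-way octopus adaptor bundled with the card"
    if psu_12vhpwr_socket == "H++":
        return "use the PSU maker's own 12VHPWR/12V-2x6 cable into that socket"
    if psu_12vhpwr_socket == "H+":
        return ("consider a new PSU: one with enough 8-pin sockets for the "
                "adaptor, or at least an H++ socket")
    return "no workable option on this PSU - a new unit is the way forward"

# Example: two spare 8-pin sockets and an older H+ style 12VHPWR socket.
print(cabling_advice(spare_8pin_sockets=2, psu_12vhpwr_socket="H+"))
```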
 
Nvidia is one of the biggest companies in the world. Why can't they mass produce their cards rather than a few FE editions, instead of leaving gamers at the mercy of these AIB partners? Imagine if Sony/Microsoft decided to limit their own consoles and left it to brands like Asus and Gigabyte.
 
Because they sell the same silicon to AI buyers for 10x more money, so they are obviously prioritising those customers above gamers, who get the overpriced leftover scraps.
 