NVIDIA RTX 50 SERIES - Technical/General Discussion

I’d agree, but the problem is you can’t get an RTX 5080 anywhere close to the RRP either. You’re going to be paying nearer £1100-1200 even when they’re eventually in stock, as far as I can see.

On that basis I can see why many will still be tempted by an RTX 5070Ti even at £800-900.

More frustratingly, from what I’ve seen so far, even when the stock situation improves there’s still no reason to expect prices to fall. The only chance of that happening is if the new AMD cards are competitive in terms of performance and features.

Between features like DLSS and ray tracing, Nvidia seem to have a monopoly on features/game support even if AMD can match the performance of the 5070 series.
It remains to be seen how good FSR4 is compared to DLSS, but the RT in this generation is apparently much closer and the gap has closed more than expected, though I'll wait to see 3rd party reviews before taking that as gospel.

On the 8/900-1200 quid for 5070 cards, it just doesn't appeal to me at all; it's overpriced for what it is as far as I'm concerned. But it depends on what you're coming from. If it's any 4000 series card, I really don't think the value is there to upgrade. If it's a 3080 or upwards, again I don't see the value, and coming from the card I bought as a stopgap in the summer, the 6950xt, I don't see the value either.

That is just my personal opinion; everyone else can make up their own mind about what they're happy to pay. People will pay £1100 for a 5070ti; I personally think that's nuts, but each to their own.
 

NVIDIA RTX 50 series doesn’t support GPU PhysX for 32-bit games


Modern CPUs still can't run advanced PhysX.

As shown in the examples, using a 5090 with a 7950X3D and turning on PhysX in Cryostasis results in 13 fps, whereas on a 4000 series GPU it is 100 fps, as PhysX can run on those GPUs.
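
To put those figures in perspective, here's a minimal sketch converting them into per-frame time budgets. The 13 fps and 100 fps numbers are the ones cited above, not my own measurements:

```python
# Convert the quoted fps figures into per-frame time budgets.
# 13 fps (CPU PhysX fallback on a 5090) and 100 fps (GPU PhysX on a
# 4000 series card) are the numbers cited above, not my own tests.

def frame_time_ms(fps: float) -> float:
    """Average time spent producing each frame, in milliseconds."""
    return 1000.0 / fps

gpu_physx_fps = 100
cpu_physx_fps = 13

print(f"GPU PhysX: {frame_time_ms(gpu_physx_fps):.1f} ms/frame")  # ~10.0 ms
print(f"CPU PhysX: {frame_time_ms(cpu_physx_fps):.1f} ms/frame")  # ~76.9 ms
ratio = frame_time_ms(cpu_physx_fps) / frame_time_ms(gpu_physx_fps)
print(f"Roughly {ratio:.1f}x more time per frame on the CPU path")  # ~7.7x
```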
 
Sentences matter; you can prove anything if you quote out of context.
Your response suggests you realise that you're wrong, but let's make it obvious for anyone else reading. You said:
....a PSU will supply whatever power a device draws right up to the PSU's rated maximum output.

Things that supply power, computer PSU's, the wall sockets in your house, the local substation, and even power stations will supply whatever power something draws regardless of the type of socket, cable, or whatever it is that's drawing that power....
At your request I provided a link from techpowerup which proves the above is incorrect:
According to Corsair, this unit's +12V OCP is set to 40 A on each modular 8-pin connector (for PCIe and EPS connectors),
However, you misunderstood the article, saying that it meant the opposite, i.e. that the PSU was:
Limited to the output of the single +12v source rail.

Limited to 40 amps across all 8-pin, EPS, 24-pin, and 6-pin connectors combined.
Really? You think the HX-1000 is limited to under 40A x 12v = 480w? A 1 kW power supply?

That aside, Techpowerup disproves the claim that a PSU will supply whatever a device draws, regardless of socket, which is what we've been 'discussing' for several days.
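
To spell the arithmetic out, here's a minimal sketch using only the 40 A per-connector figure from the Techpowerup quote above; it shows why a per-connector OCP limit is not the same thing as a limit on the whole PSU:

```python
# Per-connector OCP vs the PSU's total rating, per the Techpowerup quote.
# 40 A per modular 8-pin is Corsair's stated OCP for this unit; 1000 W
# is the unit's overall rating.

RAIL_VOLTAGE_V = 12.0
OCP_PER_CONNECTOR_A = 40.0
PSU_RATED_W = 1000.0

per_connector_w = OCP_PER_CONNECTOR_A * RAIL_VOLTAGE_V
print(f"Each 8-pin connector trips OCP at {per_connector_w:.0f} W")  # 480 W

# The 480 W cap is per connector, not for the PSU as a whole: connectors
# on separate OCP rails can together draw up to the full rating.
print(f"Total output is still limited only by the {PSU_RATED_W:.0f} W rating")
```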
 
Your response suggests you realise that you're wrong, but let's make it obvious for anyone else reading. You said:
No it does not, and I know what I said. Look, obviously you're trying to argue that electricity doesn't work how electricity works, that maths isn't maths, and nothing is going to dissuade you from that, so have at it.

You clearly don't understand what you're talking about, so I'll leave you to it; maybe someone else will have more success in teaching you about electricity.
That aside, Techpowerup disproves the claim that a PSU will supply whatever a device draws, regardless of socket, which is what we've been 'discussing' for several days.
Like I said, take a DMM to one of the +12v DC pins on your PSU and you can easily prove me wrong if you like.
 
That's an interesting point. Der8auer used the Corsair multiple 8-pin to 12VHPWR cable when showing that massive current imbalance where 2 strands of cable were taking most of the load. I think however this cable just merges all 8x 12V strands into one blob of 12V before sending it down the 6x 12V strands to the GPU, so it doesn't actually ensure that each half of the 6 strands can only receive half the load; one half could end up carrying the full 600w, i.e. 200w per strand.

Mods might want to insist all power cable talk is moved to the 12VHPWR thread; this might be annoying for people who don't care lol
It is an interesting one, although Corsair at least only limit the 8-pin to 480w, not the 300w as I thought, so the protection is a lot less. My specific cable merges each 8-pin into two live and two ground wires and connects that all the way to the GPU socket, so I guess there's that? But the PCB of the 4090 Strix shows what looks like two shunt resistors in parallel coming off the 12VHPWR, so there's probably slightly more protection there. Either way it's still worrying and just a poor design by Nvidia.
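
As an aside on how that kind of imbalance happens at all: parallel strands share current in proportion to their conductance, so a couple of low-resistance pins end up carrying nearly everything. A minimal sketch with entirely made-up contact resistances:

```python
# Current sharing between the six live strands of a 12V-2x6 cable.
# All resistance values are invented for illustration: a worn or badly
# seated pin has higher contact resistance, carries less current, and
# pushes the remainder onto the good pins.

TOTAL_CURRENT_A = 50.0  # ~600 W / 12 V across six live strands

# Per-strand resistance (ohms): four degraded pins, two good ones.
strand_resistance = [0.030, 0.030, 0.030, 0.030, 0.004, 0.004]

# Parallel strands see the same voltage drop, so each strand's share of
# the total current is proportional to its conductance (1/R).
conductance = [1.0 / r for r in strand_resistance]
total_conductance = sum(conductance)

for i, g in enumerate(conductance, start=1):
    amps = TOTAL_CURRENT_A * g / total_conductance
    print(f"strand {i}: {amps:5.1f} A (~{amps * 12:.0f} W carried)")
# The two good strands end up carrying ~20 A each while the other four
# carry ~2.6 A each - the same shape of imbalance Der8auer showed.
```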

Fair point re: the spam; should probably minimise it.

If you check the manual of a Corsair PSU, for example the AX1200i, it states that the OCP is per rail (single or multiple).

Page 5
You're right. However, the manual says nothing about what each rail is linked to in a multi-rail configuration. We know from the Techpowerup article that there is one 40A rail per 8-pin in the HX-1000i at least.

It's not great, but 480w max would be slightly better than the 600-700w GN etc. have seen drawn over 12V-2x6.

Like I said, take a DMM to one of the +12v DC pins on your PSU and you can easily prove me wrong if you like.
Calm down, nothing worse than a sore loser.
 
If you have one rail, you get the full rated current from any connector on that rail. It's the responsibility of the part drawing power to limit its draw to meet the standard.

If you go multi rail, you can limit each rail to less than the full rated power of the supply.

Nearly all supplies these days are single rail.

If you have a 1kW PSU, it will hold up that 12v rail until you ask for more than 83.3A. That's it.
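
A minimal sketch of that arithmetic, using the figures already in this thread (a 1 kW unit and 40 A per-rail OCP, nothing measured):

```python
# Single rail vs multi rail, per the explanation above.
# Figures are the ones discussed in-thread, not measurements.

RAIL_V = 12.0

# Single rail: any one connector can pull current up to the full rating.
psu_rated_w = 1000.0
single_rail_limit_a = psu_rated_w / RAIL_V
print(f"Single rail: up to {single_rail_limit_a:.1f} A from any connector")  # 83.3 A

# Multi rail: each rail's OCP caps what its own connectors can draw,
# regardless of the PSU's total rating.
ocp_per_rail_a = 40.0
per_rail_w = ocp_per_rail_a * RAIL_V
print(f"Multi rail: each rail trips at {ocp_per_rail_a:.0f} A ({per_rail_w:.0f} W)")
```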
 
Nvidia press release about how well the 5000 series launch has gone....

The screen I get when trying to download the latest driver.
 
It's not looking good guys :D

I'm not sure where that level of FOMO comes from (assuming people are buying at those prices). At least during the mining craze, I could see people rationalize that they could get their money back in x amount of time. GPUs were basically money-printers back then. I don't think GPUs can print money for end-users anymore, and I can't imagine anyone getting ten thousand dollars' worth of "happy" out of a graphics card.
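
For what it's worth, that mining-era rationalisation was just a payback calculation. A minimal sketch with entirely hypothetical numbers (none of these figures are real):

```python
# Payback-period logic from the mining craze, with made-up numbers.
# The point is the shape of the calculation, not the values: a card
# that earns money can justify its price; a gaming-only card cannot.

card_price = 1500.0          # hypothetical inflated GPU price
daily_mining_revenue = 12.0  # hypothetical earnings per day
daily_power_cost = 3.0       # hypothetical electricity cost per day

daily_profit = daily_mining_revenue - daily_power_cost
payback_days = card_price / daily_profit
print(f"Card pays for itself in ~{payback_days:.0f} days")  # ~167 days
```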
 

Just one month ago, Cablemod was saying that 12V-2x6 also included changes to the cable plug. FFS, what is it with cable companies talking nonsense?

What they have said there makes sense.

The new H++ standard didn’t change the minimum requirement spec for their cables, but they still made improvements to their cables (regardless of the new H++ standard) and incorporated those changes into the cables marked H++.
 
They're basically doing what MODDIY did: make some small improvements to the cable that are NOT part of the spec but their own choice, and then call the new cable '12V-2x6' to try and frame it like it's part of the new spec. They also seem to be telling people the new cable is required for the 50 series. Seems like they're being misleading on purpose.

Look at this answer to this person:

[image: mixed-info-on-12vhpwr-compatibility-with-50-series-v0-8aa9j85hk4he1.jpeg]
 
^ that’s not ‘wrong’ though. They are just saying “don’t use cable 1, use cable 2 - because it has improvements”.

That’s not going quite as far as Moddiy, who did say ‘the specs have changed because of the new standard’ - that is the bit that’s BS.

In the Reddit thread you posted, they do generally stick with ‘use cable 2 - it’s better’.

I agree that does get a little murky when they say ‘Nvidia recommends using cable 2 with 50 series’ - have they? I’m not sure they have… publicly, at least.

Was all of this confusion avoidable? Undoubtedly, yes.
 