
NVIDIA RTX 50 SERIES - Technical/General Discussion

3rd party cables < 1st party cables even if it's been fine for 2 years on the last card. Just get the official cable with the PSU or use the new better Nvidia adapter.
 
3rd party cables < 1st party cables even if it's been fine for 2 years on the last card. Just get the official cable with the PSU or use the new better Nvidia adapter.
That's half the problem, but I suspect it may happen more at these wattages unless Nvidia don't ship as many as the 40x series.
 
And so it begins, haha. So much for it being fixed, and there's hardly any out there yet. It's only going to get worse at 575W.

Ok, he was using a Moddiy cable, but it was still certified. It's just too close to the edge of the power limit imo.
I can see a lot of discussion about 12VHPWR vs 12v-2x6, and many comments correctly identify that the change in spec does NOT affect the cable or the cable plugs. The changes are on the female connectors on the device side. That means the GPU and, perhaps more importantly in this case, the PSU as well.

The OP's Asus Loki SFX-L appears to use a 16-pin connector on the PSU side, which is likely the old 12VHPWR spec and not the updated 12v-2x6. The Asus Loki SFX-L launched in early 2023 and 12v-2x6 in late 2023. It seems unlikely the PSU is 12v-2x6 unless it was bought recently and the PSU received a revision (as far as I can see, it did not).

All that being said, even if the GPU and cable are 12v-2x6, the PSU is not.
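To see why people say 575W is "too close to the edge", here's a back-of-the-envelope per-pin current calculation. The figures are assumptions based on commonly cited numbers for this connector family (six current-carrying 12V pins, roughly 9.5A per-pin terminal rating), not quotes from the spec:

```python
# Rough per-pin current for a 16-pin GPU power connector.
# Assumptions (hedged): 6 current-carrying 12V pins, a perfectly even
# current split, and the commonly cited ~9.5 A per-pin terminal rating.

PINS_12V = 6
PIN_RATING_A = 9.5   # commonly cited figure, treat as an assumption
VOLTAGE = 12.0

def per_pin_current(watts: float) -> float:
    """Current through each 12V pin if the load is shared perfectly."""
    return watts / VOLTAGE / PINS_12V

def headroom(watts: float) -> float:
    """Fraction of the per-pin rating left over at a given board power."""
    return 1.0 - per_pin_current(watts) / PIN_RATING_A

for w in (300, 450, 575, 600):
    print(f"{w} W -> {per_pin_current(w):.2f} A/pin, "
          f"{headroom(w):.0%} headroom")
```

At 575W that works out to roughly 8A per pin, leaving only about 16% margin, and that's the best case: it assumes perfect current sharing, whereas the reported failures involve one or two pins carrying far more than their share.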
 
This is partly why I've shied away from chasing the 5090 at the mo, aside from the obvious facts that they are ridiculous money and you can't get them.

But pushing 600W is just ludicrous for a consumer GPU. I'm just not comfortable with that at all. I'm riding the tiger a bit using a 3rd party cable adapter for my MSI Trio 5080, but I'm curve undervolting now and don't plan on going over 300W, so fingers crossed eh...
 
Everyone will blame the user or the cable when we all know 90 series cards are pulling too much power through these cables.
I mean, I think both can be true. I think they should have reconsidered the connector, but using third party connectors is always a risk.
 
And so it begins, haha. So much for it being fixed, and there's hardly any out there yet. It's only going to get worse at 575W.

Ok, he was using a Moddiy cable, but it was still certified. It's just too close to the edge of the power limit imo.

By his own admission he was using this:

[image]

Really, he should have been using this:

[image]
It seems silly to enable issues like this by using such similar connectors. The whole ‘yeah it’s all backwards compatible’ stuff is confusing / misleading.

Some manufacturers are suggesting that the H+ and H++ cables are identical and it's only the GPU and PSU ends that have changed. But if that were true, why are they selling two different cables with two different power ratings?

… confusing!
 
Some manufacturers are suggesting that the H+ and H++ cables are identical and it's only the GPU and PSU ends that have changed. But if that were true, why are they selling two different cables with two different power ratings?
The gauge of the cable might be different? (I haven't looked into it as I haven't got the card)
 
3rd party cables < 1st party cables even if it's been fine for 2 years on the last card. Just get the official cable with the PSU or use the new better Nvidia adapter.
Yeah, I was using my old Corsair cable, which was the first aftermarket cable they did, and it served my 4090 well. Then someone in this thread recommended using the adapter that came with it, so I switched over. At least if anything goes wrong I've got warranty.
 
The only game that I own that has a setting requiring more than 16GB is Indiana Jones. With a 5080, you need to reduce the texture pool setting to ultra (reduce to ultra... read that again lol), which just means the cache size is smaller... but it's still a massive cache size. In reality this just means higher-res textures are drawn in closer to the camera, but this has been tested by reviewers and they all say that you can't see the difference. The textures still look identical; it doesn't change the texture quality or anything like that. It's just a cache thing. You could argue that the supreme cache setting is actually completely pointless.

A 3090 can't run Indiana Jones at an acceptable frame rate at the highest settings, even with Lossless Scaling, especially in the Thailand level where it's just unplayable. A 5080 can.

Another use case might be a heavily modded Skyrim. Pretty niche I would say.

The whole 16GB VRAM thing is massively overblown, especially when you look at the reality of it in games.
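As a rough sanity check on how far a texture pool goes, here's a hypothetical back-of-the-envelope estimate. The numbers are assumptions typical of BC7-style block compression (about 1 byte per texel, plus roughly a third extra for the mip chain), not figures from any particular game:

```python
# Hypothetical texture-budget estimate, not real game data.
# Assumes BC7-style block compression (~1 byte per texel) and a
# full mip chain (~1/3 extra on top of the base level).

BYTES_PER_TEXEL = 1.0      # assumed compression ratio
MIP_OVERHEAD = 1.0 / 3.0   # full mip chain adds about a third

def texture_mib(side: int) -> float:
    """Approximate size in MiB of a square compressed texture with mips."""
    return side * side * BYTES_PER_TEXEL * (1 + MIP_OVERHEAD) / 2**20

def textures_that_fit(pool_gib: float, side: int) -> int:
    """How many such textures fit in a texture pool of the given size."""
    return int(pool_gib * 1024 / texture_mib(side))

print(f"4K texture: ~{texture_mib(4096):.1f} MiB")
print(f"12 GiB pool: ~{textures_that_fit(12, 4096)} resident 4K textures")
```

Under those assumptions a 4096x4096 texture is around 21 MiB, so even a 12 GiB pool keeps several hundred full-resolution 4K textures resident at once, which is why shrinking the cache one notch is so hard to spot.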
While VRAM is somewhat of a concern, the real elephant in the room is that the 5080 is really only a modern 60/70 class card. If a 3090, which was the top chip at the time, is already unable to run the latest games at an acceptable frame rate, then what's going to happen to the 5080 in 4 years?
 
VRAM is very cheap for the manufacturers, and with the money they are charging for GPUs it's quite clearly planned to make you need to upgrade sooner in the future.

That's the point: a £1400 (realistically) 5080 with VRAM that can almost be filled right now, never mind in a few years, is wrong. Don't justify that crap.

That. But I would argue it is used more to upsell. It bloody worked wonders with the 3080 and 3090 :cry:
 
Still it's a bit nuts either way.

I remember my uncle bought a Packard Bell Pentium 60MHz PC in '95. It was 'top of the range' for its time (at least that was what he was told; I was only 10 and knew nothing at the time) and cost him circa £1600, and he got it on tick from Currys. By the time he paid it off, he already had a complete new setup and this computer was relegated to the utility room off the kitchen for me and my cousin to mess about on. My point is, it's stupid to take a loan for PC parts unless it's a short-term one, because by halfway through paying it off, you're likely to want to upgrade again.

For those that might be interested, it looked like this (this is not actually it; I stole this pic off the web):
309311_a2hp0h3w7ujy.jpg

Man, a part of me really misses those computers. Even my first proper PC was a tower in 99. Never had those cases but they were all like that at school.

Oh and that printout reminds me of school days :D
 
While VRAM is somewhat of a concern, the real elephant in the room is that the 5080 is really only a modern 60/70 class card. If a 3090, which was the top chip at the time, is already unable to run the latest games at an acceptable frame rate, then what's going to happen to the 5080 in 4 years?

The 3090 can still run Indy at 90+ FPS. Just not at max settings.
 