And so it begins, haha. So much for it being fixed, and there are hardly any out there yet. It's only going to get worse at 575W.
That's half the problem, but I suspect it may happen more at these wattages unless Nvidia doesn't ship as many as the 40-series. 3rd-party cables < 1st-party cables, even if it's been fine for 2 years on the last card. Just get the official cable with the PSU or use the new, better Nvidia adapter.
I can see a lot of discussion about 12VHPWR vs 12V-2x6, and many comments correctly identify that the change in spec does NOT affect the cable or the cable-side plugs. The changes are on the female connectors on the device side. That means the GPU and, perhaps more importantly in this case, the PSU as well.
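For some rough context, here's the back-of-envelope maths I've been using (assuming the commonly quoted figures of six 12V pins and a 9.5A per-pin contact rating, and assuming the current splits evenly across pins, which the melting reports suggest it often doesn't):

```python
# Rough per-pin current for a 16-pin (12VHPWR / 12V-2x6) connector.
# Assumptions: 6 current-carrying 12V pins, a 9.5 A per-pin contact rating,
# and a perfectly even split across pins (real cards may not share current
# evenly, which is exactly what the melting reports point at).

def per_pin_current(board_power_w, rail_v=12.0, power_pins=6):
    total_current = board_power_w / rail_v  # total amps on the 12V rail
    return total_current / power_pins       # amps per pin if shared evenly

for watts in (450, 575, 600):
    amps = per_pin_current(watts)
    print(f"{watts} W -> {amps:.2f} A per pin ({9.5 - amps:.2f} A below the 9.5 A rating)")
```

On paper the even-split numbers look fine; the worry is what happens when the split isn't even.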
OK, he was using a Moddiy cable, but it was still certified. It's just too close to the edge of the power limit, IMO.
To be honest that's the reason I settled for the 5080 as I was concerned about the power draw on the 5090.
I mean, I think both can be true. They should have reconsidered the connector, but using third-party connectors is always a risk. Everyone will blame the user or the cable, when we all know 90-series cards are pulling too much power through these cables.
Sure, I wouldn't do it myself, but there's so much gaslighting of buyers that this is their fault. It's pretty obvious this spec is a problem.
The gauge of the cable might be different? (I haven't looked into it, as I haven't got the card.) By his own admission he was using this:
ATX 3.0 PCIe 5.0 600W 12VHPWR 16 Pin to 16 Pin PCIE Gen 5 Power Cable (www.moddiy.com)
Really, he should have been using this:
ATX 3.1 PCIe 5.1 H++ 12V-2X6 675W 12VHPWR 16 Pin Power Cable (www.moddiy.com)
It seems silly to enable issues like this by using such similar connectors. The whole ‘yeah it’s all backwards compatible’ stuff is confusing / misleading.
Some manufacturers are suggesting that the H+ and H++ cables are identical and it's only the GPU and PSU ends that have changed. But if that were true, why are they selling two different cables with two different power ratings?
… confusing!
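If nothing else, the ratings themselves show how thin the margin is. Taking the 575W board-power figure and the two listed cable ratings at face value (and ignoring transient spikes, which would eat into this further):

```python
# Headroom of a ~575 W card against the two advertised cable ratings above.
# These are purely the advertised numbers taken at face value; transient
# power spikes would reduce the real-world margin further.

CARD_POWER_W = 575  # quoted maximum board power

for label, cable_rating_w in (("600 W (ATX 3.0 / H+)", 600),
                              ("675 W (ATX 3.1 / H++)", 675)):
    headroom_w = cable_rating_w - CARD_POWER_W
    headroom_pct = 100 * headroom_w / cable_rating_w
    print(f"{label}: {headroom_w} W spare ({headroom_pct:.1f}% of the cable's rating)")
```

25W of headroom on the 600W cable versus 100W on the 675W one, which is presumably why they bother selling both.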
Yeah, I was using my old Corsair cable, which was the first aftermarket cable they did, and it served my 4090 well. Then someone in this thread recommended using the adapter that came with the card, so I switched over. At least if anything goes wrong I've got the warranty.
While VRAM is somewhat of a concern, the real elephant in the room is that the 5080 is really only a modern 60/70-class chip. If a 3090, which was the top chip at the time, is already unable to run the latest games at an acceptable frame rate, then what's going to happen to the 5080 in 4 years?

The only game I own that has a setting requiring more than 16GB is Indiana Jones. With a 5080 you need to reduce the texture pool setting to Ultra (reduce to Ultra... read that again, lol), which just means the cache size is smaller, but it's still a massive cache. In practice this means higher-res textures are drawn in closer to the camera, but reviewers have tested this and they all say you can't see the difference. The textures still look identical; it doesn't change the texture quality or anything like that. It's just a cache thing. You could argue that the Supreme cache setting is actually completely pointless.
A 3090 can't run Indiana Jones at an acceptable frame rate at the highest settings, even with Lossless Scaling, especially in the Thailand level where it's just unplayable. A 5080 can.
Another use case might be a heavily modded Skyrim. Pretty niche I would say.
The whole 16GB VRAM thing is massively overblown, especially when you look at the reality of it in games.
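Rough numbers on why even a reduced texture pool still goes a long way (my own ballpark maths, assuming BC7-compressed 4K textures at roughly 1 byte per texel plus about a third extra for mipmaps, not anything measured from the game):

```python
# Ballpark: how many 4K (4096x4096) textures fit in a given texture pool.
# Assumptions: BC7 block compression (~1 byte per texel) and ~33% extra for
# the mipmap chain. Purely illustrative, not measured from any actual game.

def texture_size_mb(resolution=4096, bytes_per_texel=1.0, mip_overhead=1.33):
    return resolution * resolution * bytes_per_texel * mip_overhead / (1024 ** 2)

tex_mb = texture_size_mb()
for pool_gb in (8, 12, 16):
    count = int(pool_gb * 1024 / tex_mb)
    print(f"{pool_gb} GB pool -> roughly {count} 4K textures ({tex_mb:.0f} MB each)")
```

Which fits with what the reviewers found: shrinking the pool a notch mostly changes how far out the top mip gets streamed in, not what the textures look like up close.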
VRAM is very cheap for the manufacturers, and with the money they're charging for GPUs, it's quite clearly planned to make you need to upgrade sooner.
That's the point: a £1,400 (realistically) 5080 with VRAM that can almost be filled right now, never mind in a few years, is wrong. Don't justify that crap.
Still, it's a bit nuts either way.
I remember my uncle bought a Packard Bell Pentium 60MHz PC in '95. It was 'top of the range' for its time (at least, that's what he was told; I was only 10 and knew nothing at the time) and cost him circa £1,600, which he got on tick from Currys. By the time he'd paid it off, he already had a complete new setup, and this computer was relegated to the utility room off the kitchen for me and my cousin to mess about on. My point is, it's stupid to take out a loan for PC parts unless it's a short-term one, because by halfway through paying it off you're likely to want to upgrade again.
For those that might be interested, it looked like this (this is not actually it; I stole the pic off the web).