
Everything wrong with RTX 50 series launch [complete list]

Apart from everything else that's wrong with this launch -- and there's a lot -- the thing that gets me is that nV obviously, 100% knew about the ROP issue and still chose to ship those chips. The chance of the defects being missed in rounds of very precise validation testing is so incredibly, infinitesimally small it's not even worth considering. Someone at nV (and at the AIBs) consciously decided, after seeing these chips were defective, to say "we're shipping them to consumers anyway".
 
I've decided to wait it out but honestly, what is there to wait for?

  1. 5080 Ti? Maybe, but it's probably just going to be higher clocks that I could apply myself. More VRAM maybe, but at an even worse price.
  2. Prices go down? Sure, a bit once supply catches up, but probably not by much.
  3. Better power delivery? The best we can hope for is AIBs adding better shunt safety mechanisms.
  4. Higher-quality AIB models? This is probable given how rushed these launch cards are.

A 5080 Ti with 3x 8-pin for £1000 and full ROPs. Yeah, right :cry:
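On the "better shunt safety mechanisms" point: here's a rough sketch of what per-pin current monitoring could look like in principle. All the names, thresholds and values below are hypothetical, not any AIB's actual firmware; only the 9.5 A per-pin rating comes from the 12V-2x6 spec.

```python
# Hypothetical sketch of per-pin current monitoring that could catch the
# "all current through one wire" failure mode. Thresholds are made up.

PIN_RATING_A = 9.5      # 12V-2x6 spec rating per 12V pin
IMBALANCE_LIMIT = 1.5   # flag if one pin carries 1.5x its fair share

def check_pins(pin_currents_a):
    """Return a list of warnings for a set of measured per-pin currents."""
    warnings = []
    total = sum(pin_currents_a)
    fair_share = total / len(pin_currents_a)
    for i, amps in enumerate(pin_currents_a):
        if amps > PIN_RATING_A:
            warnings.append(f"pin {i}: {amps:.1f} A exceeds {PIN_RATING_A} A rating")
        elif fair_share > 0 and amps > IMBALANCE_LIMIT * fair_share:
            warnings.append(f"pin {i}: {amps:.1f} A is {amps / fair_share:.1f}x fair share")
    return warnings

# A balanced ~575 W load (~8 A per pin) raises no warnings,
# while one pin hogging nearly all the current does:
balanced = check_pins([8.0] * 6)
skewed = check_pins([46.0, 0.5, 0.5, 0.5, 0.5, 0.0])
```

The point of the sketch is just that the sensing is cheap: six shunt readings and a comparison would be enough to throttle or shut down before a meltdown.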
 

Steve says he's seen the software Nvidia and AIBs use internally to test every single GPU that gets made, and it absolutely would have alerted them about the missing ROPs. So Nvidia made a decision to sell damaged hardware anyway and hope no one notices.

That's a very bad look for Nvidia: it means you cannot even trust that the specs of the product they sell you are accurate. Nvidia is stooping to the lowest of lows and is quickly turning into Intel. This kind of profit-over-everything culture comes from the top down, and a lack of competition combined with a lack of care for customer satisfaction is leading Nvidia to make choices that hurt its long-term future. Nvidia is trying to make as much profit as quickly as possible for investors before the bubble pops and the company dives into the gutter.
 

Like they did with the 3.5GB of VRAM?

It's amazing what companies will do when they have no competition and they know they're the only option
 
Updated the list with:
  • new info on the missing ROPs
  • the hot-spot sensor value removed from Nvidia's API (which I originally forgot to mention)
  • various Nvidia driver issues, including compatibility with certain VR headsets
  • an Astral 5090 catching fire and blowing up
  • my complaining about coil whine and why it's still a problem on the 50 series
 
Now that Nvidia have admitted the mistake with the missing ROPs is their fault, I think you need to make that number 1 in your list, OP.
I wonder what they are going to do with all the cards that are returned?
A sinister outlook would be that they'll try to sell them again in the hope that the next customer won't notice :D Btw, I agree that the missing ROPs (maybe together with the fire hazard issues) are sort of the most important items on the list; however, the list is not sorted by importance, and the order of items is not meant to change over time.
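To put the missing-ROPs numbers in perspective, a quick back-of-the-envelope check. The 176-ROP figure for a full 5090 is the commonly reported spec, so treat it as an assumption:

```python
# Back-of-the-envelope check on the missing-ROPs numbers.
# Assumption: a full 5090 has 176 ROPs (22 units x 8 ROPs each);
# affected cards report 168, i.e. one ROP unit disabled.
FULL_ROPS = 176
ROPS_PER_UNIT = 8

missing_fraction = ROPS_PER_UNIT / FULL_ROPS
print(f"ROPs lost: {missing_fraction:.1%}")  # -> ROPs lost: 4.5%
```

So Nvidia's "average 4% impact" claim roughly matches the hardware lost on a 5090, but only in ROP-limited scenarios, which is presumably why worst cases (and smaller dies, where 8 ROPs is a bigger fraction of the total) can drop by more than 10%.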
 
A quick summary of everything that went wrong. I thought I'd summarize it so we don't forget, since there is so much of it and most of it is completely outrageous.
  1. The RTX 5070 being presented as having similar performance to the RTX 4090 thanks to "AI". This is obviously false in many ways: FG and MFG introduce latency over native rendering, MFG 4x has far more artifacts than 2x (not to mention native), and the 5070 has only half the VRAM of the 4090.
  2. RTX 50 cards being artificially VRAM-limited, i.e. relative to their compute performance and to the VRAM the competition offers. This will no doubt become crippling in the future, just as it currently is for e.g. the 3070 in certain games.
  3. The 50 series was presented as having 3x AI, 2x RT and 1.5x shader speedups relative to the 40 series, but somehow none of this shows up in benchmarks. For example, there seems to be no RT speedup relative to the raster speedup on the 5090 and 5080. The consensus nowadays seems to be that Blackwell is basically the same architecture as Ada with minor tweaks, and yet competitors were able to deliver, for example, a meaningful improvement in RT relative to raster despite facing similar node limitations as Nvidia.
  4. The naming scheme, i.e. what 5080, 5070 Ti etc. are supposed to mean, also seems quite inflated relative to what it used to: the 5080 is only a small improvement over the 4080 Super, and the 5070 Ti is basically a 1:1 performance replacement for the 4080.
  5. We were promised "great availability", yet the cards are nowhere to be found. It is by far the most paper of Nvidia's paper launches in history.
  6. MSRP prices are nowhere to be found either (as expected). The 5070 Ti launch caused yet another controversy: since there is no FE version of the card, Nvidia sent Asus Prime cards to some reviewers as the "MSRP" variant, yet Asus had the same card listed at 20% over MSRP. So depending on who you asked (Nvidia or Asus), the same card had a completely different value proposition.
  7. RTX 40 series cards are also nowhere to be found, creating artificial scarcity and driving the prices of all Nvidia GPUs up. A scenario orchestrated by Nvidia either on purpose or through sheer incompetence (judging by lessons from history, the former is more likely).
  8. Despite the problems with melting 12VHPWR connectors on RTX 4090s, Nvidia didn't make any design improvements and is now using the connector on 5090s, which for certain AIB variants can exceed 600W. The 4090 was actually a design downgrade from the 3090 Ti, which had certain protections built in: you couldn't cut 5 out of 6 cables and keep the card running. On the 40 and 50 series this is completely possible, i.e. delivering all the current via a single cable until meltdown. It is now the consensus of most electrical engineers that Nvidia's design is inexplicable, as it leaves no safety margin, so even a slight deviation from the ideal scenario can create a meltdown or a potential fire hazard. The tactic of blaming customers for "improper" setup is a completely inappropriate excuse: even if the meltdowns were caused by suboptimal handling, it is quite certain they wouldn't have happened if the cards used standard PCIe cable connectors. And they could just as easily be caused by imperfections in the connectors, the cables and their respective manufacturing processes rather than by customer error.
  9. Nvidia enforces the use of 12VHPWR connectors even on their AIB partners, for completely unknown reasons, while the competition keeps using PCIe connectors and doesn't seem to have any problems powering GPUs with TDPs comparable to Nvidia's cards.
  10. Blackwell has stability issues when PCIe 5.0 is enabled on certain (or maybe most?) motherboards, and the problem can be further exacerbated by early driver versions. Supposedly some cards have been bricked by these problems, but usually it is enough to downgrade PCIe to 4.0 in the motherboard BIOS. Still, this is a pretty big oversight, and there is no explanation from Nvidia as to whether it is hardware related or could be fixed by a BIOS update. Some PCIe riser cables also don't work even after downgrading to 4.0 speed.
  11. In their flashy presentation, Nvidia forgot to mention that support for 32-bit PhysX is discontinued on Blackwell. Affected games include most of the Batman Arkham series, most of the older Metro series, etc. If you want to play these games on a Blackwell GPU, you can either emulate PhysX on the CPU (where with an 8-core CPU you can expect around 40ish fps on your 5090) or buy a separate, older Nvidia GPU just for PhysX. Isn't that nice and convenient? And there's no word from Nvidia about an emulation layer that could translate the 32-bit calls to 64-bit.
  12. Update 1: Missing ROPs on 5090, 5090D and 5070 Ti cards. I wonder when the last time something like this happened was. At least there is an official statement from Nvidia: "We have identified a rare issue affecting less than 0.5% (half a percent) of GeForce RTX 5090 / 5090D and 5070 Ti GPUs which have one fewer ROP than specified. The average graphical performance impact is 4%, with no impact on AI and Compute workloads. Affected consumers can contact the board manufacturer for a replacement. The production anomaly has been corrected."
    It will be interesting to see whether truly only <0.5% of GPUs are affected, since at this point I am a bit skeptical of anything Nvidia says. Also note that by "one ROP" they mean one ROP unit, which consists of 8 ROPs (just to reconcile the reported 8 missing ROPs with the 1 missing ROP in Nvidia's statement).
    Update 2: Missing ROPs in some 5080s have now been confirmed. It has also been shown that the worst-case performance impact can be much more significant than the average 4% claimed by Nvidia, i.e. a >10% performance drop in some cases, especially on lower-end cards. There are also reports that the 50 series laptops are now being delayed because of a missing-ROPs investigation on the mobile chips. And according to Gamers Nexus, Nvidia most likely knew the chips were defective and decided to ship them anyway. So this is most likely how it works: lots of people who are not tech-savvy won't notice the missing ROPs, and Nvidia will thus still be able to make money on these defective chips.
  13. Update 3: For unknown reasons, which from the outside can only be characterized as anti-consumer practice, Nvidia decided to remove the hot-spot value from its API. I originally forgot to mention it, but now it's here.
  14. Update 4: Various Nvidia driver issues, not only related to PCIe 5.0 stability, have been confirmed, including compatibility problems with certain VR headsets, problems with HDMI audio, and supposedly stability issues on some older-gen GPUs after installing the latest Nvidia drivers.
  15. Update 5: At least one report of an Astral 5090 catching fire and blowing up, taking the motherboard down with it. Since it's a singular occurrence so far, maybe it was just bad luck. However, it will be insane if this happens again.
  16. Update 6: Coil whine still isn't fixed. Even on the 5090 FE it can routinely be quite terrible, and AIB models have problems too. This is of course not specific to Nvidia, but it absolutely escapes me why AIBs invest in high-quality quiet fans only to ruin it with sometimes terrible coil whine, and that's on very expensive high-end products which, as far as I am concerned, should be close to flawless for such an amount of money. Could we at least have one model with guaranteed very low coil whine, so it wouldn't feel like such a lottery?
There is so much of it that it's still possible I forgot something. Anyway, since I need Nvidia for work, I don't have a choice. And yes, Nvidia is still the best, but I nevertheless hope the incredible arrogance they have displayed will one day lead them down a path similar to Intel's, as this seems highly reminiscent of Intel a couple of years ago (except maybe even worse): 4 cores as the desktop standard forever, minuscule 5% improvements between generations, shady tactics with other hardware manufacturers, total market domination (especially in the server market), etc. All reasons why I am not sorry for what happened to them (even though most of my CPUs were Intel).

PS: I may update this list if new facts emerge
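The "no safety margin" claim in point 8 can be sanity-checked with simple arithmetic. The pin ratings and wattages below are the commonly cited figures (9.5 A per 12V pin for 12V-2x6/12VHPWR, roughly 8 A per pin for a classic 8-pin), so treat them as approximate:

```python
# Approximate per-pin headroom: 12VHPWR at 600 W over 6 pins vs a
# classic PCIe 8-pin at its 150 W spec over 3 pins. Ratings are
# commonly cited figures, not measured values.

def per_pin_margin(watts, pins, pin_rating_a, volts=12.0):
    amps_per_pin = watts / volts / pins   # assumes a perfectly shared load
    return pin_rating_a / amps_per_pin    # >1.0 means headroom

margin_12vhpwr = per_pin_margin(600, pins=6, pin_rating_a=9.5)
margin_8pin = per_pin_margin(150, pins=3, pin_rating_a=8.0)

print(f"12VHPWR headroom: {margin_12vhpwr:.2f}x")     # ~1.14x
print(f"PCIe 8-pin headroom: {margin_8pin:.2f}x")     # ~1.92x
```

Note that even the thin ~1.14x figure assumes the current splits perfectly across all six pins; any imbalance from a worn or poorly seated connector eats straight into it, which is exactly the failure mode people keep reporting.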
I think it would be faster to list what went right rather than what went wrong. Let's start.
 
But then he would not have anything to post :)
I mean, it hasn't spontaneously gained sentience and demanded human sacrifice… yet. And it hasn't kicked a puppy, but it may have emotionally devastated a cat.
 
The only thing truly wrong with the launch is the number of idiots buying them.

PC Gamers are their own worst enemy. Even when being blatantly scammed, the FOMO is all powerful.
Well, the problem is that the new Doom: The Dark Ages is around the corner, and it's the only game I am really concerned about. It should support full path tracing, and if I want to play it at max fidelity there seems to be no option other than Nvidia. The 9070 XT did really poorly in Indiana Jones with full path tracing enabled, and since that game is based on the previous generation of idTech, we can most likely extrapolate similar performance characteristics for the new Doom. Not to mention that path tracing is currently so heavy that you need the best possible upscaling to make it barely playable at 4K, that is, if you still want to maintain OK picture quality. For this use case, Nvidia unfortunately seems clearly far superior to AMD. So anyone who wants to experience the new Doom at launch at max fidelity (as I do) basically doesn't have any good options. I am actually starting to worry whether market supply will be somewhat fixed by at least the beginning of May, or whether I will have to buy some overpriced GPU, or just suck it up on my RTX 3080 until market normality is hopefully restored.
 
Well, I think there are at least a few things that went well:
  1. DLSS4, or at least the upscaling & ray reconstruction part of it. It has generally received high praise and is widely considered a generational leap over DLSS3 upscaling.
  2. The media engines: better encoding quality and higher throughput.
  3. The tech demos of neural texture rendering, neural radiance cache, megageometry etc. look kind of cool, but it will take quite some time before they can make a significant impact in actual games (Alan Wake 2's megageometry implementation is sort of nice but not a game changer on 50 series cards).
There may be something else I forgot about, for which I apologize :D
 
Yeah, DLSS4 looks great. I couldn't care less about the frame gen, but the upscaling looks great. I'm meh about the media engine: it's nice, but it's not great. And neural rendering etc. is a nothingburger right now, since nothing uses it, but I'm sure in two years, when the 6000 series is out, it will be important.
 
Yeh, DLSS4 is good.
 
If you want path tracing you’ll need to look at a 5090.
 
My biggest issue, and the one I can't get over, is the connector. My PC runs 24/7 and I normally have no issue leaving it like that, but with these new cards (at least at the high end) I'd worry about leaving it running overnight.

Unfortunately there's not going to be a fix for it this gen, so the most I'd look at is a 5070 Ti.

why do you need to leave your PC on 24/7?
 