NVIDIA RTX 50 SERIES - Technical/General Discussion

der8auer saying Nvidia should take the blame, and that more people need to point the finger at them and less so at the scalpers. I agree.

100%. They've been treating AIBs like dirt due to their obsession with plugging leaks. Who cares if details leak, when it ends up being a terrible GPU with no availability anyway?
Tbf, Nvidia planned it well from their perspective: keep the real performance hidden so that most folks don't find out about the lacklustre improvements before buying, which would have reduced sales.
People still get excited for the Switch 2, despite it seeming to be the most leaked hardware in history.

Guys,

Any advice on which 5070 Ti to go for?


Or another that's not on that list?

Many thanks

As someone else said, the best choice is to wait: either to see whether the 9070 XT is competitive, or for the stock situation to get sorted, so you aren't paying scalper prices for another underwhelming 5000 series GPU.

Otherwise, MSI Gaming used to be a good pick, but I'm not sure anymore now that they have so many variants of the same thing.
 
I read around $290 for the die, but I did hear the VRAM was much more expensive when they were producing GDDR6X for the 40 series, whereas VRAM costs are supposedly a LOT cheaper this time around.

I mean, it's a slightly bigger die, but it's still the same process, so you'd expect that to be cheaper too, unless TSMC is hiking costs for their good-yield 4nm/5nm. And as for R&D, let's be honest: other than software R&D they haven't done anything for two years except throw more power at the thing lol. It just doesn't add up.

But yeah, unless we start getting some good competition, whether from AMD or Intel, I can't see the 60 series being priced any fairer either.
TSMC lost a bunch of wafers in the last earthquake, with losses running into many millions - likely insured, but such events increase prices and wait times for components (could be one of the reasons for the low stock of 5k series cards). I suspect newer nodes are even more expensive, which is possibly why NVIDIA decided to stay on the same one, to avoid driving prices of these cards even higher. Also, the 5090 GPU die is already partially disabled to account for manufacturing defects, otherwise the GPU price itself would be considerably higher again. All in all, TSMC is getting more and more expensive with each new process, not cheaper. Older processes don't fall in price much over time anymore, VRAM apparently gets more expensive too, and the 512-bit bus complicates the PCB, requiring more layers and more expensive materials in production. The bigger and more complex these graphics cards become, the more expensive every single part of their production becomes, and it quickly adds up.

The 4090's GPU is not just a bit smaller, it's quite a bit smaller than the 5090's: we can fit 89 of them on one wafer instead of 69 for the 5090, with better yields (fewer defects per chip). Add to that a less complex PCB, VRM, VRAM etc. and there's quite a bit of BOM difference. A rough calculation puts the BOM of a 4090 card now at about $700. That's roughly $400 less than the 5090. It kind of fits the price increase, then.

Now, looking at the 5080 chip, it's less than half the size of the 5090's; there should be about 150 of them fitting on one wafer (vs 69 for the 5090), with a much smaller chance of defects per chip, which means much better yields. The GPU itself should be around $150 then (assuming $350 for the 5090's) - that price includes the cost of testing them and preparing them for further assembly, hence it's higher than just printing the chip. Half the RAM, a less complex PCB, smaller VRM, smaller cooler, fewer components etc. The BOM would likely be around $450 (probably less, as NVIDIA buys components in large numbers and gets discounts).
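
For anyone who wants to sanity-check those numbers, here's a minimal sketch of the dies-per-wafer and cost-per-good-die maths; the wafer price and defect density are illustrative guesses, not confirmed figures, and the die sizes are the commonly reported ones:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard approximation: wafer area / die area, minus edge losses."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: bigger dies are more likely to catch a defect."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

WAFER_COST = 17_000.0  # assumed price of a 4N-class wafer, USD (guess)
D0 = 0.05              # assumed defect density, defects/cm^2 (guess)

# Commonly reported die sizes: GB202 (5090) ~750 mm^2,
# AD102 (4090) ~609 mm^2, GB203 (5080) ~378 mm^2.
for name, area in [("5090", 750), ("4090", 609), ("5080", 378)]:
    n = dies_per_wafer(area)
    y = poisson_yield(area, D0)
    cost = WAFER_COST / (n * y)  # per fully-good die, before test/packaging
    print(f"{name}: {n} dies/wafer, {y:.0%} yield, ~${cost:.0f} per good die")
```

With those guesses it lands close to the figures above (69 dies and ~$360 for the 5090, ~150 dies and ~$135 for the 5080 before testing costs), and since partially defective 5090 dies can still be sold with sections disabled, the effective yield is even better than the model suggests.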

They did quite a bit of R&D, but it's all AI-related, with nothing for raster performance and only slight improvements to RT, it seems. Still, they've always charged gamers for R&D that ends up being used in enterprise cards. :/ Competition would push all manufacturers to take less of a cut from each graphics card, and they could easily do that and still be profitable, but shareholders would complain, so it's not going to happen without a big push. Nothing from AMD or Intel suggests such a push in the high end; we'll see how it goes in the lower tiers.
 
Also, there is no way the memory costs $300+, as someone else said.
Just quickly searching around: "Back in February 2022, an 8Gb GDDR6 chip (1GB) cost about $13 on the spot market" + "Based on industry trends and reports, it is estimated that GDDR7 memory is around 10-20% more expensive than GDDR6 memory." Let's say $15 per 1GB chip. We have 32GB on the 5090, which by simple calculation would be $480. Of course, NVIDIA gets big discounts, so putting it at about $350 sounds reasonable.
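
Spelling out that arithmetic (the 15% premium is just the midpoint of the quoted 10-20% range, and the bulk-discount factor is an assumption backed out of the ~$350 estimate):

```python
GDDR6_SPOT_PER_GB = 13.00  # quoted Feb 2022 spot price, USD per 1GB chip
GDDR7_PREMIUM = 0.15       # assumed midpoint of the 10-20% premium
CAPACITY_GB = 32           # 5090 VRAM
BULK_DISCOUNT = 0.73       # assumed factor backed out of the ~$350 figure

per_gb = round(GDDR6_SPOT_PER_GB * (1 + GDDR7_PREMIUM))  # ~$15/GB
spot_total = per_gb * CAPACITY_GB                        # $480 at spot prices
bulk_total = round(spot_total * BULK_DISCOUNT, -1)       # ~$350 in volume
print(f"${per_gb}/GB -> ${spot_total} at spot, ~${bulk_total:.0f} for NVIDIA in bulk")
```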
 
No issues from me atm on the 5080 FE, hope it's gonna stay like this.

 
TSMC lost a bunch of wafers in the last earthquake, with losses running into many millions... All in all, TSMC is getting more and more expensive with each new process, not cheaper. [...]
I mean, I think that's kind of the other half of the issue. It's not just a lack of competition in terms of the GPUs themselves; there also isn't much competition over the chips inside them.
 
TSMC lost a bunch of wafers in the last earthquake, with losses running into many millions... All in all, TSMC is getting more and more expensive with each new process, not cheaper. [...]

I mean, TSMC has no competition either, so they're a monopoly in themselves if companies want to use the latest silicon. You can sort of understand a price increase for the 5090, but for the 5080? No chance. They're deliberately segmenting the lineup in a way that forces anyone with a high-end GPU to go for the 5090 to get any sort of meaningful upgrade path; otherwise it's dead, which imo it already is. There was no reason, at the price they are asking for the 5080, not to have put 20 or 24GB on those GPUs, but yes, it's all about the £££ and the shareholders.

It's looking bleak even for the next upgrade, unless there's some sort of miracle breakthrough from AMD or Intel, should they decide to push harder at the higher end of the market while staying competitively priced. Intel can afford to start a price war to gain GPU market share, and Nvidia wouldn't like that one bit.
 
der8auer saying Nvidia should take the blame, and that more people need to point the finger at them and less so at the scalpers. I agree.

The discussion on timelines and lack of testing is very interesting. I would not want to be beta testing a £2500 graphics card that has only been tested for a week by MSI, for example. No thank you.
 
I mean, I think that's kind of the other half of the issue. It's not just a lack of competition in terms of the GPUs themselves; there also isn't much competition over the chips inside them.
Or rather, not enough production capacity. There are mobile chips, GPUs, CPUs, and all kinds of other chips produced there as well. They need more factories, really.
 
I mean, TSMC has no competition either, so they're a monopoly in themselves if companies want to use the latest silicon. You can sort of understand a price increase for the 5090, but for the 5080? No chance. They're deliberately segmenting the lineup in a way that forces anyone with a high-end GPU to go for the 5090 to get any sort of meaningful upgrade path; otherwise it's dead, which imo it already is. There was no reason, at the price they are asking for the 5080, not to have put 20 or 24GB on those GPUs, but yes, it's all about the £££ and the shareholders.
Oh, fully agreed. There should also be a chip sized between the 5090 and 5080 ones; nothing stopped NVIDIA from designing and producing one. They decided not to on purpose. Maybe, as someone said earlier, to gather all the partially faulty 5090 dies, cut them down to a working level and sell them as a 5080 Ti or similar later, saving loads of money... Good for them, definitely not good for gamers.
It's looking bleak even for the next upgrade, unless there's some sort of miracle breakthrough from AMD or Intel, should they decide to push harder at the higher end of the market while staying competitively priced. Intel can afford to start a price war to gain GPU market share, and Nvidia wouldn't like that one bit.
I am mostly counting on AMD pushing ahead with the plans from their cooperation with Sony - that could easily scale up from the console world to the PC world, and would be proper new tech that doesn't rely on just AI. Will that happen? We'll see. But at least they have some plans beyond just saying "AI!" all the time. That aside, if the 6k series is just another bleak upgrade and things don't progress much from here, I might keep my 4090 even longer and wait for the 7k series... I bought a 1080 Ti back in the day already used, kept it for three more years, then sold it for more than I paid. :D The 4090 is shaping up to be a similar kind of GPU for me, so far.
 