NVIDIA 4000 Series

The thing we still have no idea about is just how much better the 40xx could potentially be in RT, as we have yet to see any games using all the new optimisations specific to Nvidia/40xx...

Portal RTX and the Cyberpunk 2077 Overdrive RT mode this month should give us a rough idea though. I look forward to all the "nvidia gimping Ampere" posts :D

I think if the difference is quite big, that will be me sold.
 
I assure you, 0.05%-0.1% is most certainly not irrelevant - that's 125 to 250 4090s of (supposedly) the 250,000 Nvidia claims to have sold.
250 GPUs out of 250,000. I think it's irrelevant because the real defect rate is much lower than that: two of the three ways to get it to melt were to not plug it in all the way, or to not plug it in and do an east-to-west bend. Even then, in both cases the connector had to be left unseated by a huge margin. In a north-to-south bend, the cable didn't melt despite not being plugged all the way in.

The only real defect is the presence of foreign debris, and that's exceedingly rare - likely meeting Six Sigma standards if the 0.05% figure includes most cases of user error. So nothing out of the ordinary.
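
For anyone wanting to sanity-check the arithmetic here, a quick sketch (the 250,000-unit figure and the failure-rate range are just the numbers quoted in this thread, not confirmed data):

```python
# Sanity check on the melting-connector numbers quoted above. The
# 250,000-unit figure and the 0.05%-0.1% range come from this thread,
# not from any confirmed Nvidia data.
units_sold = 250_000

for rate in (0.0005, 0.001):  # 0.05% and 0.1%
    print(f"{rate:.2%} of {units_sold:,} units = {units_sold * rate:,.0f} cards")
```

Either way you slice it, that's 125-250 cards, not thousands.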
 
250 GPUs out of 250,000. I think it's irrelevant because the real defect rate is much lower than that: two of the three ways to get it to melt were to not plug it in all the way, or to not plug it in and do an east-to-west bend. Even then, in both cases the connector had to be left unseated by a huge margin. In a north-to-south bend, the cable didn't melt despite not being plugged all the way in.

The only real defect is the presence of foreign debris, and that's exceedingly rare - likely meeting Six Sigma standards if the 0.05% figure includes most cases of user error. So nothing out of the ordinary.
Yeah, it's probably impossible to make the odds better than that even if each one were hand-checked.
If those are the true numbers, anyway.
 
Given the number of cases on Reddit alone, though, that's a worrying number of people not correctly connecting the power cable, which would seem to indicate the cable itself is inappropriate for a 'consumer' product.

On a slightly related note, the 3x8-pin -> 12-pin cable that came with my 3090 Ti had the latch snap off the first time I had to remove the cable - I had to run it press-fit only (nothing securing it) until MSI sent me a new one.
 
Given the number of cases on Reddit alone, though, that's a worrying number of people not correctly connecting the power cable, which would seem to indicate the cable itself is inappropriate for a 'consumer' product.

On a slightly related note, the 3x8-pin -> 12-pin cable that came with my 3090 Ti had the latch snap off the first time I had to remove the cable - I had to run it press-fit only (nothing securing it) until MSI sent me a new one.
Don't get me wrong, it's a rubbish cable design - almost like an afterthought.
 
3080 was Samsung 8nm, 4080 is TSMC 4nm (which is actually 5nm). Costs are wayyy higher for the leading edge TSMC process.
TSMC 5nm is not leading edge though, as it's been out for two years; also, the die size is only just over half, so you're likely getting twice as many dies per wafer with fewer defects.
 
3080 was Samsung 8nm, 4080 is TSMC 4nm (which is actually 5nm). Costs are wayyy higher for the leading edge TSMC process.

Well, if that is the case, why is the 4090 only £1699? GA102 was Samsung 8nm and 628mm², AD102 is TSMC 5nm and 608mm², yet the MSRP has only gone up £100.

The costs don't add up when you look at the 4090, as yields on the slightly smaller die would also be better - see the rough numbers below.
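
A back-of-the-envelope sketch of the wafer economics being argued here. The GA102 and AD102 die areas are the figures quoted above; AD103 (the 4080 die) at roughly 379mm² is an assumption from public die-shot measurements, the defect density is an invented illustrative value, and the dies-per-wafer expression is just the standard rough approximation:

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300mm wafer
DEFECT_DENSITY = 0.1     # defects per cm^2 -- invented, purely illustrative

def dies_per_wafer(die_area_mm2):
    """Standard rough approximation for whole dies on a circular wafer."""
    d = WAFER_DIAMETER_MM
    return (math.pi * (d / 2) ** 2 / die_area_mm2
            - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, d0=DEFECT_DENSITY):
    """Simple Poisson yield model: larger dies catch more defects."""
    return math.exp(-(die_area_mm2 / 100) * d0)  # mm^2 -> cm^2

dies = {"GA102 (3080/3090, 628mm^2)": 628,
        "AD102 (4090, 608mm^2)": 608,
        "AD103 (4080, ~379mm^2)": 379}

for name, area in dies.items():
    dpw, y = dies_per_wafer(area), poisson_yield(area)
    print(f"{name}: ~{dpw:.0f} dies/wafer, ~{y:.0%} yield, ~{dpw * y:.0f} good dies")
```

On those rough numbers an AD103-sized die gets you roughly twice as many good dies per wafer as GA102, while AD102 is barely better than GA102 - so the 4090's cost story really is about the TSMC wafer price, not die count.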
 

Q3'23 results just out... to all the folk complaining about pricing, look at the gross margin: 65.2% this time last year, 53.6% this quarter. There's more to it than just Ada pricing, but the idea that NVDA is price gouging is just not correct.

Haters gon hate, I guess.

EDIT - again, why I am sat here at nearly 10pm posting this is beyond me...but if anyone is interested in understanding if Nvidia are squeezing customers....

Gross margin
- 2021 = 65.6%
- 2022 = 64.9%
- 2023 = 60.6% (expected)

Data centre as % of sales
- 2021 = 40.2%
- 2022 = 39.4%
- 2023 = 56.3% (expected)

(Nvidia has a January 31 year end, hence FY'23 is Jan 22 to Jan 23)

...datacentre chips are way higher margin, and datacentre as a percentage of sales is growing. If anything, gamers are piggybacking off datacentre volumes and getting cheaper GPUs than they should be.

I'm as sorry as anyone that a 4090/80 costs as much as a used car, but it's not Nvidia's fault!
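
To see how the mix shift alone moves the blended number, here's a toy calculation. The per-segment margins are invented for illustration (Nvidia doesn't disclose gross margin by segment); only the sales-mix percentages come from the figures above:

```python
# Toy blended-margin calculation. The per-segment margins are invented --
# Nvidia does not disclose gross margin by segment. Only the sales-mix
# percentages come from the figures quoted above.
def blended_margin(mix, margins):
    return sum(mix[seg] * margins[seg] for seg in mix)

margins = {"datacentre": 0.75, "everything else": 0.55}  # hypothetical

fy21_mix = {"datacentre": 0.402, "everything else": 0.598}
fy23_mix = {"datacentre": 0.563, "everything else": 0.437}

print(f"FY21 blended margin: {blended_margin(fy21_mix, margins):.1%}")  # ~63.0%
print(f"FY23 blended margin: {blended_margin(fy23_mix, margins):.1%}")  # ~66.3%
```

Holding those hypothetical segment margins fixed, the blended figure rises as the datacentre share grows - which is why the same headline margin can hide a lower-margin gaming business underneath.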
 
Gaming
• Third-quarter revenue was $1.57 billion, down 51% from a year ago and down 23% from the previous quarter.

 
A fool and his money...
Why does that make me a fool? I am just putting out a thought. I don't agree with the prices, but I'm saying there is a saving to be made if you want a new card. I don't think that makes me a fool. Nvidia are not going to drop the prices, and even if they did, how much of that do you think the shops would shave off?
 
Why does that make me a fool? I am just putting out a thought. I don't agree with the prices, but I'm saying there is a saving to be made if you want a new card. I don't think that makes me a fool. Nvidia are not going to drop the prices, and even if they did, how much of that do you think the shops would shave off?

Jensen dropped his pants for the likes of the 3090 and 3090 Ti months back; when sales suck, he will drop prices.
 

Q3'23 results just out... to all the folk complaining about pricing, look at the gross margin: 65.2% this time last year, 53.6% this quarter. There's more to it than just Ada pricing, but the idea that NVDA is price gouging is just not correct.

Haters gon hate, I guess.

EDIT - again, why I am sat here at nearly 10pm posting this is beyond me...but if anyone is interested in understanding if Nvidia are squeezing customers....

Gross margin
- 2021 = 65.6%
- 2022 = 64.9%
- 2023 = 60.6% (expected)

Data centre as % of sales
- 2021 = 40.2%
- 2022 = 39.4%
- 2023 = 56.3% (expected)

(Nvidia has a January 31 year end, hence FY'23 is Jan 22 to Jan 23)

...datacentre chips are way higher margin, and datacentre as a percentage of sales is growing. If anything, gamers are piggybacking off datacentre volumes and getting cheaper GPUs than they should be.

I'm as sorry as anyone that a 4090/80 costs as much as a used car, but it's not Nvidia's fault!


Here is the problem: gross margin is sales minus COGS, divided by sales. COGS includes some costs that are significant to this industry: fixed costs attributable to the goods sold, and inventory on hand.

So just because Nvidia's gross margin goes down doesn't mean buyers are not getting screwed over; it could simply mean that 1) Nvidia is sitting on a lot of unsold inventory and 2) its fixed costs are being spread over a smaller amount of revenue.

When you factor this in, it's no surprise that a sudden, significant quarter-to-quarter or year-to-year fall in sales volume, combined with an increase in inventory on hand, can cause sharp drops in gross margin even when the price you sell your product for and the variable cost of producing it are unchanged. So just because gross margin is down doesn't mean the cost to make a GPU went up, and it doesn't mean prices came down.
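
To make that concrete, a minimal sketch with invented numbers showing how an inventory charge booked to COGS drags gross margin down even though the selling price and the per-unit production cost haven't moved:

```python
# Minimal sketch with invented numbers: an inventory write-down lands in
# COGS, so gross margin falls even though selling price and per-unit
# production cost are unchanged.
def gross_margin(revenue, cogs):
    return (revenue - cogs) / revenue

units, price, unit_cost = 1_000, 1200.0, 420.0
revenue, base_cogs = units * price, units * unit_cost

print(f"Normal quarter:  {gross_margin(revenue, base_cogs):.1%}")  # 65.0%

write_down = 150_000.0  # charge for unsold inventory, also booked to COGS
print(f"With write-down: {gross_margin(revenue, base_cogs + write_down):.1%}")  # 52.5%
```

Coincidentally close to the 65% -> 53% move in the reported numbers - the point being that a drop like that doesn't require any change to what a customer pays.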
 
Nvidia are not going to drop the prices,
What makes you so sure? They dropped Ampere prices, especially the 3090 Ti, a month or two after launch because they weren't selling, and Nvidia has just seen a further 23% drop in revenue while its cards gather dust on shelves.
 
Here is the problem: gross margin is sales minus COGS, divided by sales. COGS includes some costs that are significant to this industry: fixed costs attributable to the goods sold, and inventory on hand.

So just because Nvidia's gross margin goes down doesn't mean buyers are not getting screwed over; it could simply mean that 1) Nvidia is sitting on a lot of unsold inventory and 2) its fixed costs are being spread over a smaller amount of revenue.

When you factor this in, it's no surprise that a sudden, significant quarter-to-quarter or year-to-year fall in sales volume, combined with an increase in inventory on hand, can cause sharp drops in gross margin even when the price you sell your product for and the variable cost of producing it are unchanged. So just because gross margin is down doesn't mean the cost to make a GPU went up, and it doesn't mean prices came down.
Fixed costs get captured in EBITDA, and fixed costs plus depreciation/amortisation of investment are captured in EBIT. But you're right, inventory build is captured in gross margin - and NVDA ordered way too many Ampere chips before they realised the economic picture was changing, hence the drop in GM in the year to Jan '23. FY'24 (i.e. Jan 23 to Jan 24) GM is expected at around 65% again... broadly speaking, they target a 65% GM subject to competitive position/demand/macro etc. If things don't go wrong, Nvidia will just go back next year to making the same margins they did a few years ago.
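
For reference, a sketch of where each cost bucket lands, following the breakdown above (the numbers are purely illustrative):

```python
# Purely illustrative numbers showing which margin level each cost
# bucket hits, per the breakdown above.
revenue   = 100.0
cogs      = 40.0   # variable production cost + any inventory charges
opex      = 25.0   # fixed operating costs (R&D, SG&A) -> captured in EBITDA
dep_amort = 10.0   # depreciation/amortisation of investment -> captured in EBIT

gross_profit = revenue - cogs        # gross margin level
ebitda = gross_profit - opex         # fixed costs show up here
ebit = ebitda - dep_amort            # ...and D&A shows up here

print(f"Gross margin:  {gross_profit / revenue:.0%}")  # 60%
print(f"EBITDA margin: {ebitda / revenue:.0%}")        # 35%
print(f"EBIT margin:   {ebit / revenue:.0%}")          # 25%
```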

The point I'm trying to get across is that a lot of people are getting really mad about next-gen pricing when it's just a reflection of the cost of building tiny transistors. It's not anyone profiteering or price gouging... this is the going rate for leading edge nodes and the R&D required to design chips, given the current market structure.

I'm just glad poor Jensen isn't here to see all these people hating on him....

EDIT.

Have a look at how ASML EUV works. It's ******* incredible. But it's ******* expensive...it's a great video

 
Fixed costs get captured in EBITDA, and fixed costs plus depreciation/amortisation of investment are captured in EBIT. But you're right, inventory build is captured in gross margin - and NVDA ordered way too many Ampere chips before they realised the economic picture was changing, hence the drop in GM in the year to Jan '23. FY'24 (i.e. Jan 23 to Jan 24) GM is expected at around 65% again... broadly speaking, they target a 65% GM subject to competitive position/demand/macro etc. If things don't go wrong, Nvidia will just go back next year to making the same margins they did a few years ago.

The point I'm trying to get across is that a lot of people are getting really mad about next-gen pricing when it's just a reflection of the cost of building tiny transistors. It's not anyone profiteering or price gouging... this is the going rate for leading edge nodes and the R&D required to design chips, given the current market structure.

I'm just glad Jensen isn't here to see all these people hating on him....

It is profiteering though. They've been pushing prices up generation after generation.

The margin on a 4080 or 4090 at these price points is massive.
 
@Wintermute2 I'll warrant that the dies are likely stupid expensive due to the new node and current economic conditions, but the rest of the card is a reused 3090 Ti board and the same GDDR6X they've been using for the 30-series - the dies are top-shelf, everything else is Ampere. In the case of the 4080, the boards are 50% empty - these things have been cost-reduced in every way possible except the dies and the coolers (even to the point that presumably it's cheaper to re-use the 4090 cooler on the 4080 than it would have been to make a new one that matches the TDP - economies of scale I guess).

[Attached image: front.jpg]
 
@Wintermute2 I'll warrant that the dies are likely stupid expensive due to the new node and current economic conditions, but the rest of the card is a reused 3090 Ti board and the same GDDR6X they've been using for the 30-series - the dies are top-shelf, everything else is Ampere. In the case of the 4080, the boards are 50% empty - these things have been cost-reduced in every way possible except the dies and the coolers (even to the point that presumably it's cheaper to re-use the 4090 cooler on the 4080 than it would have been to make a new one that matches the TDP - economies of scale I guess).

[Attached image: front.jpg]
Thanks for the info @Aegis - the only bit I follow for work (at a very high level) is leading edge semiconductor design and manufacture... would love to know more about PCBs, VRMs, caps, MOSFETs etc. but would need to find the time! With the caveat that this isn't my area, my understanding is that the design of the chip (Nvidia), the manufacturing of the chip (TSMC) and the equipment used to manufacture the chip (ASML... and perhaps some suppliers to ASML) are all leading edge tech where intellectual property rights mean others can't replicate, which is a way of justifying the massive amount of R&D required. I'd guess that the other stuff is relatively commoditised and low margin, because anyone can do it... hence why you've got a single producer of EUV lithography equipment (ASML), one fab at the leading edge (TSMC) with a couple of others (Intel and Samsung) who are also able to spend the cash to be able to manufacture, and a few chip designers (Nvidia, AMD, Intel) who are able to spend on R&D to create the designs.

I'd guess that the cost of the chip itself relative to the rest of the board is pretty high.
 