
When will GPU prices go down?

Okay, taking $17,000 per wafer and making that column a bit narrower (everyone knows what "pw" means, right?), and adding AD107 (I've never gone that low before, but this gen the x107 dies are the x060 cards :()

[image: per-die cost comparison table]

Plenty of margins either way.
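For anyone wanting to sanity-check numbers like these, here's a rough sketch of the per-die cost maths. The $17,000 wafer price is taken from the post above; the die areas are the commonly quoted figures for the Ada dies, and the function is the standard dies-per-wafer approximation (square dice assumed, no yield losses) — a back-of-envelope sketch, not TSMC's actual pricing:

```python
import math

WAFER_COST = 17_000.0   # USD per 5nm wafer, as assumed above
WAFER_DIAMETER = 300.0  # mm

def dies_per_wafer(die_area_mm2: float) -> int:
    """Classic dies-per-wafer approximation: usable area minus an
    edge-loss correction term. Assumes a roughly square die."""
    d = WAFER_DIAMETER
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_die(die_area_mm2: float) -> float:
    """Raw silicon cost per die candidate, before yield losses."""
    return WAFER_COST / dies_per_wafer(die_area_mm2)

# Commonly quoted Ada die areas (mm², approximate)
for name, area in [("AD102", 609), ("AD103", 379),
                   ("AD104", 295), ("AD107", 159)]:
    print(f"{name}: {dies_per_wafer(area)} dies/wafer, "
          f"${cost_per_die(area):.0f}/die")
```

Even the biggest die comes out around $200 of raw silicon per candidate, which is where the "plenty of margin" reading comes from.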
R&D, masks, etc. have gone up, but the biggest mystery is precisely that: AMD's R&D and other fixed costs have gone up too*, yet after those huge fixed costs they seem totally unwilling to go for volume. It's almost like they are playing in the dGPU market just to get experience for the next console update.

The choice to have only Navi 33 on 6nm, and not to go for a ~400 mm² 6nm monolith as a true volume part, is very puzzling from my armchair silicon strategist's PoV!

*Yes, chiplets might save something there.

Where did you get those yield figures from?

Here.
 
The yield for large dies, even with a great defect density (and 0.07 defects per cm² on a 5nm node can be considered great), can be pretty poor.
Hence the holy grail of chiplets.
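The yield claim above can be sketched with the simplest Poisson defect model — an assumption on my part, since real fabs typically fit negative-binomial or Murphy models, which are somewhat more forgiving to large dies:

```python
import math

def poisson_yield(die_area_mm2: float, d0_per_cm2: float) -> float:
    """Fraction of dies with zero defects under the Poisson model:
    Y = exp(-A * D0). Assumes independent, uniformly spread defects."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * d0_per_cm2)

# At D0 = 0.07 defects/cm² (a very good 5nm number), a big die
# still loses a lot of candidates to defects:
for area in (159, 295, 379, 609):
    print(f"{area} mm²: {poisson_yield(area, 0.07):.1%} defect-free")
```

A ~159 mm² die comes out around 90% defect-free, while a ~609 mm² die drops to roughly 65% — which is exactly why splitting area across chiplets is so attractive.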

Regarding binning, there is such a thing as binning good parts for clocks, voltages etc. as well.

While some things cannot be tested until after packaging, I am sure they can test some things before too.

Then, between all the various test rigs at their packaging plant, I am sure they will find uses for many of the "bad" dies.
Some will be complete failures where the defect was in a part without redundancy, but I would expect the throwaway rate to be 5-10%.

TPU lists AD102 as an 18,432-shader part, whereas the 4090 only uses 16,384, so about 89%.
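Just to check that harvesting ratio (shader counts as quoted above):

```python
# Enabled-shader fraction for the RTX 4090 harvested from AD102
full_shaders = 18432     # full AD102 config per TPU
enabled_shaders = 16384  # RTX 4090 config
ratio = enabled_shaders / full_shaders
print(f"{ratio:.1%}")  # 88.9%
```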
 
Advantage how? I've been trying to think of what the advantage of increasing market share is, and maybe I'm just not getting it, but I can't think of any.
The best I can come up with: a dev is less likely to skip properly optimizing for you, and you have a higher chance of getting your tech adopted (like TressFX). Plus bigger mindshare means you're more likely to get sales even with lesser products.
The thing is, though, look at some of the data I linked to, which showed Nvidia had billions of USD in unsold stock and parts earlier in the year.

Even for Nvidia it makes no sense, especially as consumer dGPU sales are terrible overall.

As I said, it's because the stock market is obsessed with margins and margin growth (or even slower margin decline). Even with subprime it was the same issue, which led to risky things being done.

After all, why have Western tech companies handed the profitable mainstream and entry-level phone markets to China? It's how Japanese car companies got to where they are: Western car companies were more concerned with higher-end markets, so sold trash at the lower end.

It's also why so many manufacturing jobs moved abroad. It's not that you can't make things profitably enough over here; it's that you can't push high margin growth YoY doing it here.

It's also the reason why seemingly profitable companies try to squeeze the pay of employees. It's all about margins in the short term.
The old infinite increase in a finite market.
Has anyone got any evidence as to how much the latest cards cost to manufacture vs the retail pricing, so we actually know the profits involved? Surely Nvidia have much more benefit in terms of economies of scale, and so are likely to be profiting more, especially as their fan base will pay the asking price regardless of how stupid it is. You can't blame AMD for following the pricing trend, as their development and manufacturing costs are likely to be greater, so they need to make hay while the sun shines. But without some idea of the actual facts and figures, especially given increasing overheads and threatened restrictions on silicon, it's difficult to reach an informed understanding of the profiteering (if any) involved.

Thing is, the cards themselves have become more complex and power hungry. The GTX 970 was a 145 W TDP part; the 4080 (which is more like a 70-class card) is already 320 W TDP. The cooler is great, but massive (my old 2080 seems like a toy next to it). Now consider all the other parts that have been built to serve the higher TDP, plus 16 GB of VRAM vs 3.5 GB fast + 512 MB slow memory (4 GB in total for the GTX 970).

And then there's greed! :))
 

I can see defensible reasons for higher-cost cards: more complex tech and manufacturing techniques, more raw hardware and silicon, general inflation, bigger boards, more secondary hardware, and cooling solutions (as you say). But in 2014 a GTX 970 was under £300 (I think I paid about £270 for mine), and in 2022 a 4080 is £1200 and a 4070 Ti is £800+. Depending on which one you regard as the current-gen equivalent, that's a 3x or 4x increase in 8 years! Double I think would be fair, but 3-4 times... mostly greed.
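As a back-of-envelope check of how much of that jump inflation actually explains — the ~25% cumulative UK inflation figure for 2014-2022 is my assumption here, not an official number:

```python
# Hedged back-of-envelope: how much of the GTX 970 -> RTX 4080 price
# jump does inflation explain? Assumes ~25% cumulative UK inflation
# over 2014-2022 (an assumption, not an official CPI figure).
gtx970_2014 = 270.0    # GBP, the price quoted in the post above
rtx4080_2022 = 1200.0  # GBP, 4080 launch pricing
inflation = 1.25

inflation_adjusted = gtx970_2014 * inflation
multiple = rtx4080_2022 / inflation_adjusted
print(f"Inflation-adjusted 970 price: £{inflation_adjusted:.0f}")
print(f"4080 is {multiple:.1f}x that")  # still well over 3x
```

So even after inflation, you're looking at roughly a 3.5x real-terms increase.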
 

100% it's greed. That card should have been called a 4070 Ti and cost £749 at most.
 
The downgrade in die size and the ridiculous price rise are what make the 4080 doubly disappointing. One or the other and Nvidia might have just about got away with it, but doing both has killed off the 80 series this gen. That's a shame, as the 3080 was one of the best cards last gen, if not the best, offering high-end performance at a reasonable price; the 4080 is the polar opposite.
 
I think the 4080 name is legit because it *performs* almost 50% better than last gen's 3080 (and not just at lower resolutions). The problem is that they are charging way too much for it.

My thought is they want silly money for the 4080 name; they should have named it a 4070 Ti and charged a fair price. Not £1200...
 
The Ada performance increase somewhat masks the die downgrade, but we still should have got the full +70%, especially considering the price.
 
I'd say they've killed its original appeal. At current prices, and keeping in mind what else is out there as alternatives, it isn't great, but it isn't untouchable either.
 
Despite the CEO saying he wouldn't shut the company down when EVGA left the GPU market, it seems EVGA has effectively been shut down, as it's emerged that all employees have just resigned.


It's obvious there was no money in making PSUs & accessories.
 
I have a feeling they jacked up the price of the 40 series because they wouldn't be able to keep up with demand and supply GPUs to gamers once they got massive AI orders to fulfil and needed that foundry capacity for them.

I had the same thought on another thread. If they can't supply enough to meet typical demand, then stretch as big a margin as possible from what you can make.
 