An 80-class Ti GPU cost £420 (in 2014),
which makes you wonder where that additional £700 (adjusted for inflation) goes in Turing.
Not defending Nvidia here...
It comes down to margin: the difference between the cost price and the selling price.
Nvidia is pulling ~60% margin across consumer GPUs.
Memory is obviously contentious.
The only data I can find priced 14 Gbps GDDR6 at ~$11 per GB late last year.
GDDR6X at 19 Gbps is top tier, so it's going to cost more; let's assume $20 per GB.
Put 10GB of GDDR6X onto an 80-series card and you're at a $200 BOM cost for the memory alone.
Now Nvidia need to make their magic margin, so they'll charge you $500 just for the memory chips to get their juicy 60%.
Then the distributor and retailer want their cut, so that bumps the price a little more.
Even if you dropped to a slower GDDR6X, say ~16 Gbps at $15 per GB (speculating here, as I don't have a price list),
you'd still have a BOM of $150, or a consumer price of $375, for that 80-series card. Just for the memory.
Add another $375 of VRAM to an 80-series card to get 20GB and you're almost at Ti prices, albeit with more memory.
That's why a 12GB GDDR6X 2090 is likely to be $1400 and a 24GB version $2000.
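The arithmetic above boils down to one formula. Here's a quick sketch of it; the per-GB prices are the same assumptions as above, not a real price list:

```python
# Retail price needed so that margin = (price - cost) / price.
# At a 60% margin, price = BOM cost / (1 - 0.60) = BOM / 0.4.
def retail_memory_price(gb, price_per_gb, margin=0.60):
    bom = gb * price_per_gb
    return bom / (1 - margin)

# 10GB of GDDR6X at an assumed $20/GB, 60% margin:
print(round(retail_memory_price(10, 20), 2))  # $200 BOM -> $500 retail

# 10GB of slower GDDR6X at an assumed $15/GB:
print(round(retail_memory_price(10, 15), 2))  # $150 BOM -> $375 retail
```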
That's why we're getting 8-12GB of VRAM:
margin, margin, margin.
It's a key metric for shareholders.
The only way we get better prices is if AMD launches competitive cards. If AMD uses a wider memory bus, say 512-bit, they can hit the same bandwidth with slower, cheaper memory and give us more of it.
The only way Nvidia cuts margins is if they end up with excess inventory... and that will only happen if AMD is competitive this time.
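The bus-width point is just bandwidth arithmetic: bandwidth = bus width (bits) × per-pin data rate (Gbps) ÷ 8. A hypothetical comparison, using the same speed grades assumed above:

```python
# Memory bandwidth in GB/s from bus width and per-pin data rate.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

# Hypothetical 512-bit bus with cheaper 14 Gbps GDDR6:
print(bandwidth_gbs(512, 14))  # 896.0 GB/s
# vs a narrower 320-bit bus needing pricier 19 Gbps GDDR6X:
print(bandwidth_gbs(320, 19))  # 760.0 GB/s
```

So a wide bus lets the cheaper chips match (or beat) the expensive ones on total bandwidth.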