
NVIDIA 5000 SERIES

Maybe they want everyone to cave and buy a 5080 before the market is flooded with second hand 4090s :D
Yep. nV also know most enthusiasts understand the 5080 is an anemically specced card with an unbalanced config. It's an xx60 Ti-level card with a smaller chip/less silicon and a wide margin of error in manufacture. If they sell at £1200, the profit margin will be high.

I would not be surprised at all if they launch the 5080 first to take advantage of the initial hype to move more units.
 
Don't think it works like that; it must be a real supply issue... but the launch has already been delayed, which doesn't bode well.
It 100% works like that: release your baseline, less desirable (but higher-margin) card first, as over-eager fans will buy it. Later, release your higher-end (but lower-margin) card. Moreover, launching in this sequence allows building up more supply of the card you know is going to sell more. Moreover, moreover, this allows a softer landing with less balking from consumers and the press if you're planning on pricing your highest-end card high.

It's straight out of Apple, Sony, and nearly every performance automaker's playbook.
 
Yeah, I must admit it doesn't make too much sense. People after a 5080 or 5090 should be guided to make the most expensive purchase first. Also, the whole point of an early launch was to get the 5090 around Trump's import tariffs.

If the 5080 is really bad value, it needs to go first artificially, since nearly all reviews will say to wait a few weeks. If it's really good value, then it's going to eat into 5090 sales.
 
Not in this case. What will 4090 users be upgrading to? Nobody would willingly delay revenues, and it's not like the 5080 is a viable/possible upgrade path for 4090 users.

I doubt nVidia really cared too much about the 1% of customers who have the 4090 to upgrade from when deciding to release the 5080 first.

Pushing the 5080 out first makes perfect sense from a profit perspective; this all assumes the leaks are accurate, or at least semi-accurate.
The 5080 is about 50% of the die size of the 5090 - but that doesn't mean 50% of the cost for nVidia, given that yields will likely be much higher for the smaller die, bringing their cost down; but let's roll with it being 50% of the cost for now. Because I'm lazy, I'm just going to normalize the rest of the component costs between the two models to a flat value, except for the vRAM, which I'll just say costs nVidia an extra $100 on the 5090.
Then let's assume their pricing will be somewhere along the lines of $1200 for the 5080 and $2000 for the 5090 (not too far removed from reality, I suspect)... You'll note that at these prices the 5090 actually offers better value for the customer while also providing less profit to nVidia and placing higher demand on the lower-supply product, meaning the lower-demand product with the naturally higher supply sells fewer units - obviously less than ideal for them. By releasing the 5080 first, nVidia mentally sets the reference level in consumers' heads (the first product they see becomes the default, and other units get compared to it), but it also means nVidia gets a huge surge in sales of the more profitable and more available product before the 5090 can start cannibalising demand.

Feel free to correct any bad assumptions I made btw; I'm spitballing based on my limited knowledge of manufacture and the little info that's leaked.
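For what it's worth, the margin argument above can be put into a quick back-of-envelope script. Every dollar figure below is a made-up placeholder, not a leak; the only structure taken from the post is the ~2x die cost, the flat normalized cost for everything else, the $100 vRAM premium on the 5090, and the $1200/$2000 prices:

```python
# Hypothetical bill-of-materials sketch; all figures are illustrative guesses.
die_5080 = 300              # assumed cost of the smaller die
die_5090 = 2 * die_5080     # ~2x the die area -> roughly 2x the cost (ignoring yield)
other = 200                 # remaining components, normalized flat across both cards
vram_extra_5090 = 100       # the post's assumed extra vRAM cost on the 5090

cost_5080 = die_5080 + other                      # = 500
cost_5090 = die_5090 + other + vram_extra_5090    # = 900

price_5080, price_5090 = 1200, 2000

for name, price, cost in [("5080", price_5080, cost_5080),
                          ("5090", price_5090, cost_5090)]:
    margin = price - cost
    print(f"{name}: cost ${cost}, price ${price}, "
          f"gross margin ${margin} ({margin / price:.0%})")
```

With these placeholder numbers the 5080 carries the higher percentage margin while the 5090 gives the buyer twice the silicon for only 1.67x the price, which matches the "better value for the customer, less profit for nVidia" framing in the post.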
 
I doubt nVidia really cared too much about the 1% of customers who have the 4090 to upgrade from when deciding to release the 5080 first.

Pushing the 5080 out first makes perfect sense from a profit perspective; this all assumes the leaks are accurate, or at least semi-accurate.
The 5080 is about 50% of the die size of the 5090 - but that doesn't mean 50% of the cost for nVidia, given that yields will likely be much higher for the smaller die, bringing their cost down; but let's roll with it being 50% of the cost for now. Because I'm lazy, I'm just going to normalize the rest of the component costs between the two models to a flat value, except for the vRAM, which I'll just say costs nVidia an extra $100 on the 5090.
Then let's assume their pricing will be somewhere along the lines of $1200 for the 5080 and $2000 for the 5090 (not too far removed from reality, I suspect)... You'll note that at these prices the 5090 actually offers better value for the customer while also providing less profit to nVidia and placing higher demand on the lower-supply product, meaning the lower-demand product with the naturally higher supply sells fewer units - obviously less than ideal for them. By releasing the 5080 first, nVidia mentally sets the reference level in consumers' heads (the first product they see becomes the default, and other units get compared to it), but it also means nVidia gets a huge surge in sales of the more profitable and more available product before the 5090 can start cannibalising demand.

Feel free to correct any bad assumptions I made btw; I'm spitballing based on my limited knowledge of manufacture and the little info that's leaked.
I don't see any assumptions here, only good reasoning. The move to hard launch with the 5080, with the 5090 coming later, is an obvious play to maximize sales of their higher-margin card, one that has been specifically designed and placed to reduce hardware costs and allow for further segmentation later in the cycle. Surprised that, with all the talk of how badly cut down the 5080 is and how unbalanced its config is, we didn't see this coming sooner.
 
Don't think it works like that; it must be a real supply issue... but the launch has already been delayed, which doesn't bode well.
They had to delay it because of bugs in the chip's design - new masks cost them $40+ million per set, and from the start of production it takes a minimum of three months for the first cards to appear. Hence, it doesn't seem like stock will be high at release, depending on which chips have the bug.
 
It's straight out of Apple, Sony, and nearly every performance automaker's playbook.
There's one problem with your predictions - unlike the other companies mentioned, Nvidia is effectively a monopoly and has no competition to worry about currently. Hence, I wouldn't expect typical healthy-market behaviour from them. They can do nearly anything they want now.
 
They had to delay it because of bugs in the chip's design - new masks cost them $40+ million per set, and from the start of production it takes a minimum of three months for the first cards to appear. Hence, it doesn't seem like stock will be high at release, depending on which chips have the bug.

Source? First I've heard about this.
 
I doubt nVidia really cared too much about the 1% of customers who have the 4090 to upgrade from when deciding to release the 5080 first.

Pushing the 5080 out first makes perfect sense from a profit perspective; this all assumes the leaks are accurate, or at least semi-accurate.
The 5080 is about 50% of the die size of the 5090 - but that doesn't mean 50% of the cost for nVidia, given that yields will likely be much higher for the smaller die, bringing their cost down; but let's roll with it being 50% of the cost for now. Because I'm lazy, I'm just going to normalize the rest of the component costs between the two models to a flat value, except for the vRAM, which I'll just say costs nVidia an extra $100 on the 5090.
Then let's assume their pricing will be somewhere along the lines of $1200 for the 5080 and $2000 for the 5090 (not too far removed from reality, I suspect)... You'll note that at these prices the 5090 actually offers better value for the customer while also providing less profit to nVidia and placing higher demand on the lower-supply product, meaning the lower-demand product with the naturally higher supply sells fewer units - obviously less than ideal for them. By releasing the 5080 first, nVidia mentally sets the reference level in consumers' heads (the first product they see becomes the default, and other units get compared to it), but it also means nVidia gets a huge surge in sales of the more profitable and more available product before the 5090 can start cannibalising demand.

Feel free to correct any bad assumptions I made btw; I'm spitballing based on my limited knowledge of manufacture and the little info that's leaked.
They will have comparable yields on both products, because both chips will be cut down accordingly to match the same level of defect incidence; in fact, they will be pushing the borders on the 5080 compared to the 5090 because of the available headroom. Cannibalisation seems debatable, because in an upsell situation a 5090 isn't cannibalising the 5080; it should be the other way around.
Also, it isn't like the situation where PS4 Pro users upgrade to a PS5 and then to a PS5 Pro mid-cycle; 4090s will not be upgraded to 5080s, it's just not viable.
That's broadly my thought process.
 
Source? First I've heard about this.
Example, from August (just the 1st result in a search): https://www.tomshardware.com/pc-com...ll-gpus-allegedly-delayed-due-to-design-flaws Sure, it's not specifically about gaming GPUs, but they're all related. Considering they need about a month to fix and test things, then restart production, which then takes a minimum of three months to get the first chips from the factory, confirm all is good, and then produce enough for release - yeah, January seems about the earliest they could do a release, though I expect stock numbers to be relatively small initially. Then again, per some other sources the issue wasn't in the chip design itself but in the packaging by TSMC, which would likely mean a whole batch of chips was damaged in production and not usable (depending on the exact issue), and they had to wait for a new batch to be delivered - that would introduce delays to resolve it, but release stock could be bigger than with fully restarting the whole production run.
 
They will have comparable yields on both products, because both chips will be cut down accordingly to match the same level of defect incidence; in fact, they will be pushing the borders on the 5080 compared to the 5090 because of the available headroom. Cannibalisation seems debatable, because in an upsell situation a 5090 isn't cannibalising the 5080; it should be the other way around.
Also, it isn't like the situation where PS4 Pro users upgrade to a PS5 and then to a PS5 Pro mid-cycle; 4090s will not be upgraded to 5080s, it's just not viable.
That's broadly my thought process.

I'm not saying you're wrong here; you may well be right, so I'll just explain how I thought it worked so we can see where I've made the error (I'm going to simplify it for demonstration's sake).

So in chip manufacture they create a huge wafer with, say, 10,000 'cores'.
Each core has, say, a 1% failure rate.
You're cutting the wafer into 2 models - one with 100 cores and one with 200 cores.
On average you'd expect 1% of those 100-core models to have failures, and 2% of the models with 200 cores -- double the failure rate.

I assume they try to account for this by cutting to, say, 105 cores for the 100-core units and 210 cores for the 200-core units to minimize chip failures? Does this fully negate the effect of the larger die size's failure rate?
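The simplified model in this post can be checked directly. A sketch using the post's hypothetical numbers (a 1% independent per-core failure rate) and the binomial distribution: a die is sellable only if no more than its spare-core count of cores are defective.

```python
from math import comb

def chip_yield(cores_cut, cores_needed, p_fail=0.01):
    """P(die is sellable): at most (cores_cut - cores_needed) cores may be defective."""
    spares = cores_cut - cores_needed
    return sum(comb(cores_cut, k) * p_fail**k * (1 - p_fail)**(cores_cut - k)
               for k in range(spares + 1))

# Without spares, a single bad core kills the die, and bigger dies suffer more:
print(f"100 cores, 0 spares: {chip_yield(100, 100):.1%}")
print(f"200 cores, 0 spares: {chip_yield(200, 200):.1%}")

# A handful of fused-off spare cores recovers almost all of the loss:
print(f"105 cut / 100 enabled: {chip_yield(105, 100):.1%}")
print(f"210 cut / 200 enabled: {chip_yield(210, 200):.1%}")
```

So the spare cores don't just reduce the big-die penalty; with enough of them, they nearly erase it, at the cost of cutting a bit more silicon per die.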
 
I'm not saying you're wrong here; you may well be right, so I'll just explain how I thought it worked so we can see where I've made the error (I'm going to simplify it for demonstration's sake).

So in chip manufacture they create a huge wafer with, say, 10,000 'cores'.
Each core has, say, a 1% failure rate.
You're cutting the wafer into 2 models - one with 100 cores and one with 200 cores.
On average you'd expect 1% of those 100-core models to have failures, and 2% of the models with 200 cores -- double the failure rate.

I assume they try to account for this by cutting to, say, 105 cores for the 100-core units and 210 cores for the 200-core units to minimize chip failures? Does this fully negate the effect of the larger die size's failure rate?
I remember this from my OR class...
What they have as a parameter is the average number of defects per unit area, d; then, to estimate the defect probability, you assume the defects follow a Poisson distribution with parameter d*A (A being the chip area).
So the defect probability is 1 - exp(-dA). It doesn't scale linearly with area, so they balance it by cutting a much larger area from the larger chip; in the end, they cut just enough area from both chips to deliver the same level of reliability, so that the scale of the chip does not put a constraint on supply - without this there would be a much bigger hit on profitability and lost revenue opportunity (it's an exponential response, after all).
Also, purely from a business strategy standpoint, they definitely need 4090 users to upgrade, and the sooner the better.
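A minimal numeric sketch of the Poisson model described above. The defect density and die areas are illustrative guesses (loosely on the scale of a ~380 mm² vs ~750 mm² die), not real foundry numbers:

```python
from math import exp

def defect_free_prob(d, area):
    """Poisson defect model: defect count ~ Poisson(d * A), so P(0 defects) = exp(-d*A)."""
    return exp(-d * area)

d = 0.1                      # assumed defect density, defects per cm^2 (illustrative)
a_small, a_big = 3.8, 7.5    # rough die areas in cm^2

p_small = 1 - defect_free_prob(d, a_small)   # P(at least one defect), small die
p_big   = 1 - defect_free_prob(d, a_big)     # P(at least one defect), big die

print(f"small die: {p_small:.1%} chance of at least one defect")
print(f"big die:   {p_big:.1%}")
# 1 - exp(-dA) grows sublinearly in A: the big die is worse off, but not
# area-ratio worse, and disabling defective units recovers most dies anyway.
print(f"defect-prob ratio {p_big / p_small:.2f} vs area ratio {a_big / a_small:.2f}")
```

Because 1 - exp(-dA) is concave in A, the defect probability ratio between the two dies always comes out below the raw area ratio, which is the "better than linear" behaviour discussed in the thread.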
 
I remember this from my OR class...
What they have as a parameter is the average number of defects per unit area, d; then, to estimate the defect probability, you assume the defects follow a Poisson distribution with parameter d*A (A being the chip area).
So the defect probability is 1 - exp(-dA). It doesn't scale linearly with area, so they balance it by cutting a much larger area from the larger chip; in the end, they cut just enough area from both chips to deliver the same level of reliability, so that the scale of the chip does not put a constraint on supply - without this there would be a much bigger hit on profitability and lost revenue opportunity (it's an exponential response, after all).
Also, purely from a business strategy standpoint, they definitely need 4090 users to upgrade, and the sooner the better.
You could have just said, "Yep Howl, they do that" :P
Man, I haven't even heard of the Poisson distribution for at least 20 years though... all that yummy maths statistics we had to learn (well, technically I chose maths, so it's my own fault... I preferred Pure Maths and Mechanics though) and totally use every day.... xD
 
You could have just said, "Yep Howl, they do that" :P
Man, I haven't even heard of the Poisson distribution for at least 20 years though... all that yummy maths statistics we had to learn (well, technically I chose maths, so it's my own fault... I preferred Pure Maths and Mechanics though) and totally use every day.... xD
I too have a hazy recollection there... the exponent is -dA, with d > 0, so the function is increasing at a decreasing rate, lol, so the response is better than linear...
 
Pretty much all of them. Cyberpunk with PT, IJ, etc. HUB showed all these examples very well in their video too, so I can refer you there for now :)
Well yes, there are some issues, but not that big. Just RT in CP77 is pretty spotless (almost); Chernobylite too.
It kind of depends per scene; with some tricks you can fix a lot of issues. Some are just "bugs" that require more polish.

All in all, to me it's worth it. The downsides are smaller than those of "pure" raster.
 