
NVIDIA 4000 Series

I have to wonder what EVGA is playing at with their 4090. They didn't just send Jay the card; the card is fully working, but they can't legally call it a 4090, and now EVGA is writing custom software and a BIOS for Jay to install.

Like, why?

For the benchmark titles. They want to rub salt into Nvidia's open wounds.
 
They do have a monopoly on nodes at the moment, but the wafer is not why GPUs cost this much, and that's a fact. There was a report out recently that Nvidia's margin on the 4090 is 70% pure profit. A 4090 literally costs about £400-500 to produce at the absolute most, and that's everything; the components cost a pittance. They are making massive profits, and not just Nvidia: AMD's cards are even cheaper to make.
The BOM doesn't include the vast R&D cost to design and develop the chips. I'm sure they're making a good profit, just not 300%.
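Worth noting that a "70% margin" and a "300% markup" describe the same profit from two directions, which is why the numbers in this thread look so far apart. Here's a quick sketch using the thread's rough figures; the £450 BOM and £1,599 retail price are assumptions for illustration, not confirmed numbers:

```python
# Hypothetical figures: real BOMs are not public.
price = 1599.0  # assumed UK retail price of a 4090
bom = 450.0     # the thread's rough bill-of-materials estimate

# Gross margin: profit as a share of the *selling price*.
gross_margin_pct = (price - bom) / price * 100

# Markup: the same profit as a share of the *cost*,
# which always produces a bigger-looking number.
markup_pct = (price - bom) / bom * 100

print(f"gross margin: {gross_margin_pct:.0f}%")  # about 72%
print(f"markup:       {markup_pct:.0f}%")        # about 255%
```

So on these assumed numbers, the "70% margin" claim and the "not 300%" reply are both roughly consistent; they just measure profit against different bases. Neither figure accounts for R&D, marketing, or distribution.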
 
They do have a monopoly on nodes at the moment, but the wafer is not why GPUs cost this much, and that's a fact. There was a report out recently that Nvidia's margin on the 4090 is 70% pure profit. A 4090 literally costs about £400-500 to produce at the absolute most, and that's everything; the components cost a pittance. They are making massive profits, and not just Nvidia: AMD's cards are even cheaper to make.
yeah that's about right....gross margin at Nvidia is usually about 65%. They ordered a load of extra Ampere chips from Samsung in the first half of 2022, and Russia/Ukraine/inflation/interest rates disrupted consumer demand, so they're looking at about 60% gross margin for the year to 31 Jan 2023 (with a massive dip in Q2 and probably Q3) and about 66-67% gross margin in the year to 31 Jan 2024, which is when most of the Ada Lovelace chips will be sold...before coming back to 65%-ish in the year to 31 Jan 2025 as they clear Ada stock and get ready for the next gen (all subject to how harsh the coming recession is).

Hopefully Intel coming back in as a leading-edge fab will create a bit more competition post 2025, but with US/China tension building at the moment all the new fabs are breaking ground in the US, where people cost wayyyy more than in Taiwan. Maybe RDNA3's MCM/chiplet designs do for GPU what AMD did in CPU with Zen and we see way better yields, which could lower manufacturing prices? Also Raja Koduri (the guy who led the Navi/RDNA project at AMD) is now over at Intel, and their next gen (Battlemage?) could push more competition at the top.

It's not just the cost to manufacture, market and distribute though (i.e. the costs reflected in gross margin)...Nvidia and AMD are spending $billions in R&D for each generation as well. And there are only so many gaming nerds and datacentres that are willing to pay up for the chips they're designing.

Don't get me wrong...on a personal level I'm ****** that top end GPUs cost more than a lot of people's cars nowadays. The engineering is amazing though....I'd venture Ada/Hopper chips are the most advanced thing mankind has ever built and a lot of the spend in improving performance is justified by AI/ML applications. It's great that the same tech can be used to chase higher FPS in video games. What a time to be alive ;)
 
Yes it does, as do cards and boxes of those times from all companies. They used to make box art that made you look at them; now they're boring boxes.

Being honest though, a lot of the box art was utter pish: robots with frog heads and that kind of mad stuff from Palit and others, aliens from Sapphire, etc.
 
The BOM doesn't include the vast R&D cost to design and develop the chips. I'm sure they're making a good profit, just not 300%.
Yes, of course there is R&D and so on, but that's the same for any company, and the groundwork is already there, so to speak.

It's not to say they don't deserve some reward for their efforts, of course they do, but the margins they're making compared to the BOM are astronomical.
 
yeah that's about right....gross margin at Nvidia is usually about 65%. They ordered a load of extra Ampere chips from Samsung in the first half of 2022, and Russia/Ukraine/inflation/interest rates disrupted consumer demand, so they're looking at about 60% gross margin for the year to 31 Jan 2023 (with a massive dip in Q2 and probably Q3) and about 66-67% gross margin in the year to 31 Jan 2024, which is when most of the Ada Lovelace chips will be sold...before coming back to 65%-ish in the year to 31 Jan 2025 as they clear Ada stock and get ready for the next gen (all subject to how harsh the coming recession is).

Hopefully Intel coming back in as a leading-edge fab will create a bit more competition post 2025, but with US/China tension building at the moment all the new fabs are breaking ground in the US, where people cost wayyyy more than in Taiwan. Maybe RDNA3's MCM/chiplet designs do for GPU what AMD did in CPU with Zen and we see way better yields, which could lower manufacturing prices? Also Raja Koduri (the guy who led the Navi/RDNA project at AMD) is now over at Intel, and their next gen (Battlemage?) could push more competition at the top.

It's not just the cost to manufacture, market and distribute though (i.e. the costs reflected in gross margin)...Nvidia and AMD are spending $billions in R&D for each generation as well. And there are only so many gaming nerds and datacentres that are willing to pay up for the chips they're designing.

Don't get me wrong...on a personal level I'm ****** that top end GPUs cost more than a lot of people's cars nowadays. The engineering is amazing though....I'd venture Ada/Hopper chips are the most advanced thing mankind has ever built and a lot of the spend in improving performance is justified by AI/ML applications. It's great that the same tech can be used to chase higher FPS in video games. What a time to be alive ;)
I think they're also charging more this time because they ordered a **** ton of wafers, still banking on the mining boom, and that backfired massively.

They are fantastic products, but when you look at them as a whole, in all honesty they're no different to any other consumer-level electronics. It's just entertainment, I guess, and yes, the professional tier of these GPUs is really what they were designed for, I imagine.

What people should be hoping for is that once Intel creates competition in global fab processes, it will be massive in creating competition for the likes of TSMC. Things will change for the better once that happens, as far as I'm concerned.
 