It may not be what we wanted, but it's what we are willing to go with; that, and cards with not enough VRAM.
Have you given up on the other thread bug?
I paid 660€ for my RX 6800, so I wouldn't mind if the RX 7800 XT were near that, or at least below 800€.

I wonder if a 7800 XT or non-XT will be approx £600.
The last time there was a serious wafer shortage, AMD did what they were contracted to do (that is, produce lots of console chips at ~10% margins), then allocated most of the rest to CPUs, which have far greater margins than GPUs. If Sony or Microsoft want a 5nm refresh, I hope AMD told them they are at the back of the queue this time.
Well, I did say "what they were contracted [to do]", as no matter how cautious their forecasting, they can't have been happy that the PS5 used so many wafers for wafer-thin* margins (*sorry, couldn't resist!).

Are you talking about 2020/2021 or a different wafer shortage? If they did prioritise anything, it would have been their CPUs, especially for the Enterprise. I don't see anything in their financial reports to suggest they did otherwise, so I'm just wondering: where did you get this info?
And I would be very surprised if their margins on GPUs were lower than their margins on CPUs.
With the wafers they had left, I'm sure most went to CCDs, with as many as possible going to EPYC. That's a given. My point was more that the console makers distorted things.
As for the margins of CPUs vs GPUs, the die sizes answer that. Take one 7nm wafer: it can produce around 650 Zen 3 CCDs, or around 62 Navi 21 dies. Even if the CCDs all went into the 5600X (retail $300 × 650 = $195,000) and all the Navi 21s went into the 6900 XT (retail $1,000 × 62 = $62,000), the CPU wafer brings in about three times the revenue. Both Zen 3 and Navi need more than just the die, but the GPU needs a lot more parts (VRAM, cooler, PCB, VRMs), whereas desktop Zen 3 only needs the IO die and packaging. The raw margin difference is probably bigger than 3x between the two.
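To make that arithmetic explicit, here's a minimal Python sketch using the rough die counts and retail prices above (the per-wafer yields are this post's estimates, not exact figures):

```python
# Rough revenue-per-wafer comparison for one 7nm wafer.
# Die counts are the approximate figures from the post, not real yield data.
zen3_ccds_per_wafer = 650      # small ~80 mm^2 CCDs
navi21_dies_per_wafer = 62     # large ~520 mm^2 GPU dies

ryzen_5600x_retail = 300       # $ per CPU (one CCD each)
rx_6900xt_retail = 1000        # $ per card

cpu_wafer_revenue = zen3_ccds_per_wafer * ryzen_5600x_retail
gpu_wafer_revenue = navi21_dies_per_wafer * rx_6900xt_retail

print(f"CPU revenue per wafer: ${cpu_wafer_revenue:,}")        # $195,000
print(f"GPU revenue per wafer: ${gpu_wafer_revenue:,}")        # $62,000
print(f"Ratio: {cpu_wafer_revenue / gpu_wafer_revenue:.1f}x")  # ~3.1x
```

And that's before the GPU's extra BOM (VRAM, cooler, PCB, VRMs) eats further into its share.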
They would have agreed numbers before any chip shortage was known about. Sorry, but check out the financial reports: the revenue brought in by the Computing and Graphics division is always higher than the revenue from the Enterprise, Embedded and Semi-Custom side of things. And 2021 was an amazing year for AMD's EPYC processors (which sit in the Enterprise, Embedded and Semi-Custom segment).
AMD would have done what any company would have done and concentrated on their core business, which is CPUs, both home and business.
Not sure I agree with you on the margins, but that's harder for us to figure out. There is also the mix of AMD just selling the chip to the AIBs versus using the chip to make the cards themselves. I would think CPU margins are tighter because a CPU is a necessity for every computer, and integrated GPUs are fine for most people, whereas discrete GPUs are a luxury, so AMD would want to be making more of a margin on them. I would be very, very surprised if GPUs were making less margin than CPUs.
Over on the AnandTech forum thread about AMD's results, a user called Vattila put together a chart of the various margins:
News - AMD's Q2 2022, yet another quarter of revenue growth (page 2) - forums.anandtech.com
(AFAIK this is only possible because AMD changed the way they report on divisions.)
Note that the Operating Margins are as the report states them, and I'm sure AMD have an interest in distributing costs so as to pay less tax, but the new Gaming division has pretty poor margins. The theory is that they've lumped GPUs and consoles in there (I think consoles used to be under Embedded, but that's now mostly Xilinx).
That still doesn't tell us the whole story, as GPU prices have come down a lot and consoles are still in there. But Data Centre and Client should be where most of the CPUs are. And while AMD sell few cheap CPUs, the Data Centre operating margin is only 32%. Hard to tell, but it is possible that Intel have been underbidding AMD, as Intel's server chips are rather poor now.
What a turn of events.

On your last sentence:
In the same quarter, Intel earned $4,700 million in revenue from data centre; operating income from that was $200 million. That's about 4%.
Yes, Intel are undercutting AMD like crazy. They have to: if they don't, AMD may push them out of what was previously Intel's most profitable market. Intel's job is to stop, or at least try to stop, AMD from getting established in the data centre, and Intel are failing at that.
In 2016, Intel earned $19,000 million from data centre, with operating income of $8,500 million, about 45%. Now they are on the brink of getting pushed out.
And you know what? I have zero sympathy for Intel.
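A quick sketch of that margin arithmetic, using only the round figures quoted above:

```python
# Operating margin = operating income / revenue.
def op_margin(revenue_m: float, operating_income_m: float) -> float:
    """Return operating margin as a percentage."""
    return operating_income_m / revenue_m * 100

# Intel data centre, Q2 2022: $4,700M revenue, $200M operating income.
print(f"Intel DC, Q2 2022: {op_margin(4_700, 200):.1f}%")     # ~4.3%

# Intel data centre, 2016: $19,000M revenue, $8,500M operating income.
print(f"Intel DC, 2016:    {op_margin(19_000, 8_500):.1f}%")  # ~44.7%
```

So by those numbers, Intel's data-centre margins have collapsed from roughly 45% to roughly 4% in six years.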
Of course, not only did little AMD manage to out-engineer them with Zen, but a lot of Intel's big cloud clients have also begun rolling their own ARM server chips.
And most of that happened when only those with big pockets could afford to run a chip-design team, which is why so far it's mostly been giants like AWS. But now ARM themselves actually do reference server chips. Those ~45% margins may be gone forever. Are Nvidia the only huge-margin vendor left?
They need to be in it for the long game, or they are not really in it at all, imho.

Intel, who are currently evaluating the profitability of Arc, will be asking themselves these same questions. In the past they would have been arrogant enough to think they could just steal the market from AMD; they are not so sure now.
A 5950X amounts to about 350mm^2 of die space; it has no fans, no shroud, no cooler, no memory ICs, and little in the way of a PCB.
Right now it's £500: https://www.overclockers.co.uk/amd-...hz-socket-am4-processor-retail-cp-3c9-am.html
This RX 6700 XT, https://www.overclockers.co.uk/powe...ddr6-pci-express-graphics-card-gx-1a3-pc.html, is 335mm^2, and with everything the 5950X doesn't have, it's also £500.
They can sell the 5950X into the supply chain for £400, who sell it to OCUK for £450, who sell it to you for £500.
When Powercolor are selling that GPU to you for the same money, after adding all the ancillaries, building the card, and selling it into a supply chain and then to OCUK before it gets to you, how much do we think AMD sold Powercolor the chip for? It's not £400.
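Here's a purely illustrative back-of-the-envelope of that chain; every margin and BOM figure below is an invented assumption, just to show why the GPU die can't fetch anywhere near the whole £500:

```python
# Hypothetical: back out what AMD might charge Powercolor for the GPU die
# if the finished card retails at £500. All percentages and costs are
# made-up assumptions for illustration, not actual figures.
retail_price = 500         # £, what you pay OCUK
retailer_margin = 0.10     # assume OCUK keeps ~10%
distributor_margin = 0.10  # assume the supply chain keeps ~10%
aib_margin = 0.10          # assume Powercolor keeps ~10%
bom_excluding_gpu = 150    # £, assumed: VRAM, PCB, VRMs, cooler, assembly

price_ocuk_pays = retail_price * (1 - retailer_margin)              # £450
price_powercolor_gets = price_ocuk_pays * (1 - distributor_margin)  # £405
powercolor_cost_budget = price_powercolor_gets * (1 - aib_margin)   # £364.50
implied_chip_price = powercolor_cost_budget - bom_excluding_gpu     # £214.50

print(f"Implied price AMD gets for the chip: ~£{implied_chip_price:.2f}")
```

With those (made-up) numbers, the 335mm^2 GPU die brings AMD a bit over £200, roughly half of what the similarly sized 5950X silicon does at £400 into the supply chain.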
Regarding the bolded statement: a CPU as sold is fairly "simple", while a GPU is kind of like a mini prebuilt PC. You have your cooling system, "mother"board and "V"RAM, so I'm not convinced that at any point in the product stack (when comparing like for like) the percentage margin of a GPU is more than a CPU's.

Just seen this post; not sure why I missed it yesterday.
I don't agree with your calculations, mainly because you picked the top CPU and a midrange GPU. If you were comparing like with like, then you would be comparing with maybe the 5600X; which one would make the most margin then?
But think about it another way. Intel deal mainly in CPUs; Nvidia deal mainly in GPUs. Intel's margins always hover around the mid 40s. Now, go back to Pascal and Kepler for Nvidia: their margins back then were mid 50s, and remember, with Kepler most of the GPUs were under £500.
That would lead me to believe that there are higher margins in GPUs.
And I would be very surprised if it's not the same with AMD: that overall, GPUs carry a higher margin than CPUs.
Also, I just want to comment on your post about AMD getting out of GPUs. LOL, not a hope. Even with the tiny market share they had, GPUs were bringing in money. Now AMD are finally getting their act together, and the gaming market is still huge.
But it's not only the gaming market now; GPUs are bringing in more and more revenue as they become a bigger part of datacentres.
About the evidence you presented concerning Pascal and Kepler margins: there has to be more to the story that is not being accounted for.
I think for both, most of their sales are OEMs/prebuilts; it's just that the lowest-value prebuilts don't come with a dedicated GPU. I do agree with you about everyone "neglecting" their CPUs in favour of a more powerful GPU.

I am only using the Kepler and Pascal margins as they are the last generations of GPUs before prices skyrocketed due to other influences, like mining etc. Nvidia's margins have been over 60% since then.
It's pretty simple: CPU margins are smaller because the majority of their sales are to OEMs etc. Even most gamers will buy the cheapest CPU they can get away with, because for most gamers, spending more on the GPU will have a bigger impact.