
Do AMD provide any benefit to the retail GPU segment?

Given the current state of the GPU market, I have been thinking a lot about this lately.

I'm inspired to some extent to make this post by this video, which makes many good and valid points, but its conclusion seems to be born of complete denial of what's been happening for at least a decade.


I will try to condense this down as much as I can, as I don't want people to be put off by a wall of text. So please excuse the shorthand nature of it.

He is right, IMO, that AMD have no interest in competing for market share. Or rather, perhaps they lack the confidence to try: they have no reason to believe it would work for them, but plenty of reasons to believe it wouldn't. They have been steadily losing market share for a decade or more, despite having tried to compete for it during that time.

These are not exact figures, so without watching the video again to get those, this is close enough.
AMD segment their reported revenue; one of those segments is Gaming, which consists of GPUs and consoles.
AMD's 2022 revenue for the Gaming segment was about $6.5 billion. Roughly $3.5 billion of that was from Sony (consoles like the PS5) and about $2 billion from Microsoft (the Xbox). The remaining $1.5 billion was from GPUs and other assorted consoles, like the Steam Deck and its clones, so probably about $1 billion of revenue for the whole of 2022 from GPUs.
That's nothing. Against a total revenue for that year of about $24 billion, it's about 4% of AMD's revenue. And that's revenue, not profit, which matters because developing GPUs is very expensive: the profit on that $6.5 billion Gaming segment was less than $1 billion, about 16%. If those are also the margins on GPUs, that is about $160 million of profit on GPUs for 2022.

If it costs $500 million per year to keep up GPU development, then AMD are losing around $340 million per year to stay in this game. Intel spent $3.5 billion over three or four years on Arc, and it is still underdeveloped, so I don't know how much AMD spend on GPU R&D, but I'm willing to bet the consoles and Ryzen are propping it up heavily.
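The arithmetic above can be sketched quickly. All figures are this post's rough estimates, not AMD's reported numbers, and the $500 million yearly R&D cost is an assumption:

```python
# Back-of-envelope maths using the rough figures quoted above.
# None of these are AMD's official numbers; they are the post's estimates.
gaming_revenue = 6.5e9   # 2022 "Gaming" segment revenue
total_revenue = 24e9     # 2022 total AMD revenue
gpu_revenue = 1.0e9      # estimated dGPU slice of the Gaming segment

gpu_share = gpu_revenue / total_revenue
print(f"dGPU share of total revenue: {gpu_share:.1%}")     # ~4%

segment_margin = 0.16                  # ~16% margin on the Gaming segment
gpu_profit = gpu_revenue * segment_margin
print(f"Estimated dGPU profit: ${gpu_profit / 1e6:.0f}M")  # $160M

annual_rnd = 500e6                     # assumed yearly GPU R&D cost
shortfall = annual_rnd - gpu_profit
print(f"Shortfall vs R&D: ${shortfall / 1e6:.0f}M")        # $340M
```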

How much longer are AMD willing to go on like that, and can they even invest more to fight Nvidia harder? AMD do not want to take R&D away from where they are successful; they can't do it all, and they have to stay ahead of the curve in the segments they are winning.
It's also about volume. The irony is that if AMD had 25 or 30% market share they would be better placed to fight Nvidia, because they would be selling a lot more GPUs than they actually are, bringing in more money, and with that the task would be less extreme and difficult than it actually is.

That low volume also presents another problem: because it costs so much to develop GPUs and you sell so few, you have no room to reduce your margins. You almost have to go low volume, high margin, or you're just burning money keeping up with the R&D cycle.
So it's no good saying AMD have to be 30% cheaper than Nvidia because RT isn't as good and they don't have DLSS. If that's how you see it, good luck to you, as Nvidia have you right where they want you. At that point there is no reason for AMD to be in this game at all, as they don't have the market share for low margins.

So it's up to us; it always has been. Tech tubers need to get angry at Nvidia instead of sighing and moaning that AMD aren't cheaper in order to make Nvidia cheaper. Existing as the thing that makes the competition cheaper is exactly what did ATI in, and it was AMD who bailed them out, so they aren't going to go down that road. If you don't buy their cards they will just stop making them, and probably be glad of it.

AMD are a business; they will make decisions that are best for them. If they think they can get from 8% to 50% market share by being significantly cheaper than Nvidia, that is what they will do. In the same way, if Nvidia think they can maintain 90% market share with £1,300 ##80-class cards that are really ##70-class cards... that is exactly what they will do.

As for Intel, IMO they have realised they do not want to get tied up, as AMD have, with this massive mindshare monster that is Nvidia, and that is something we created. Putting someone on a very, very high pedestal just gives them greater height to spit on you from. Nvidia feel absolutely untouchable, like they can do no wrong, because where else are you going to go? AMD? Yeah, didn't think so...

Honestly, I don't know why AMD don't just throw in the towel; I think if it wasn't for the consoles they would, ironically. But the fact that they are still in this game, despite everything, perhaps indicates that on some level they do care.


A slight digression from this, but I have noticed inexplicable frustration from a lot of reviewers covering AMD's latest CPUs, ranting about practices that have been going on for years just to put a negative slant on what should be a positive for AMD. They complain about existing trends that AMD are only now starting to follow, trends no one ever complained about when AMD weren't competitive. Now that AMD are, they seem upset about it, perhaps because while AMD's boot is on Intel's neck they are not helping to rein Nvidia in, as if that's AMD's job rather than the job of these same tech journalists, who seem to go out of their way not to upset Nvidia too much.
When you farm out your own responsibilities, what you get is what we now have. My own rant over, sorry :)
 
AMD are keeping RTG alive for the following reasons:
1.) Data centre dGPUs, which can be sold with their data centre platforms in supercomputers, etc. These are evolved Vega designs.
2.) IGPs for their APUs. It's telling that they kept TSMC 4 nm for their new Zen 4 APUs instead of using it for Navi 31. These are used in laptops and OEM desktops, with older ones used in industrial embedded systems in aircraft, etc.
3.) The ability to design SoCs for semi-custom areas such as consoles. This is partially helped by Sony and MS bankrolling R&D costs.
4.) The ability to license their GPU uarch. This is happening with Samsung.
5.) The ability to bundle dGPUs with their laptops. Again, you can see Navi 33 being designed for low cost on TSMC 6 nm.

Basically, RTG now really exists to support the AMD CPU division in selling servers, supercomputers, laptops and semi-custom products, whilst also generating licensing fees. Sadly, it also means higher-end gaming dGPUs will be an afterthought at best, and that gaming dGPU design will be mostly driven by consoles, IGPs and laptops.

I suspect Navi 31 is a bit like the original Fiji design: testing out new technology, in this case chiplets. I suspect AMD really did want to get closer to the RTX 4090, but the design probably needed a respin like the ATI X1800 did, and AMD didn't want to throw money at it. The fact that they had access to TSMC 4 nm like Nvidia did, and would rather make APUs with it, is telling. It was the same during the pandemic, when they would rather make more console APUs than dGPUs.

Why waste silicon on dGPUs when laptop vendors are apparently never satisfied with how much they are getting?
 
If it was me, I would keep RTG alive but knock all this on the head, and instead use all those resources to keep HP, Lenovo, Asus... as happy as I can.

It's too resource intensive, and you get nothing but grief for it.
 
CPUs are much higher margin for AMD than dGPUs ever were. Even when ATI/AMD were competing top to bottom in dGPU performance and had 30% to 50% market share, they made very little money compared to Nvidia, because people bought rubbish like the Nvidia FX at a premium in droves over the Radeon 9000 series, or bought Fermi over the HD 5000 series.

Conversely, when AMD have hit it out of the park with CPUs, it showed in revenues and profits. AMD will continue to make higher-end gaming dGPUs, but I don't think they will throw resources at them. So when AMD does compete at the high end, it's more about Nvidia making a misstep, IMHO. An example: the GA102 misstep meant AMD were a process node and a half ahead with Navi Mk2. This time Nvidia is on the slightly better process node.

AMD would have made Navi 31 on TSMC 4 nm and used stacked cache on the memory controller chiplets if they truly wanted to make a proper AD102 competitor. At the least it would have shown the AD103 a clean pair of heels. Chiplets are more about dropping costs, i.e. yields increase, so they are themselves a value play. OTOH, they carry certain power consumption penalties due to the extra I/O required, as well as needing more transistors for that extra I/O. Instead, yet again, they made an economy 5 nm/6 nm design and hoped it would compete against a monolithic 4 nm AD102. The GCD itself is smaller than the dGPU used in the RX 6700 XT.

It is smaller, yes: 304 mm² for the 7900 XTX logic die vs 335 mm² for the 6700 XT / 6750 XT. With the memory dies it's 530 mm², but those are on 6 nm; with all of it on 5 nm and without the extra interconnect logic it would probably be around 450 mm², still about 150 mm² smaller than the 4090, which is also on 4 nm.
It is a smaller GPU that could have been a lot larger, as Jim pointed out, but they chose not to do it, even though larger ones were apparently designed.

Who knows what the margins are on Navi 31; they could be higher than Navi 21, which was 520 mm² on 7 nm. I would not be surprised if AMD deliberately cancelled a larger, more 4090-region Navi and simply made the smaller one the $1,000 6900 XT replacement to get more margin from it, once AMD realised the 4080 was going to be a $1,200 GPU.
Yeah, absolutely. That is cynical and disrespectful, yet at the same time the situation is such that this card at $1,000 still looks half decent, and there is nothing AMD can do about that. So, as a business, they are going to make a business decision, in the same way that Nvidia did in making the 4080 a $1,200 card.
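A quick sanity check on the die-size figures quoted above. The mm² numbers are taken from the posts; the AD102 size is only implied by the "150 mm² smaller" claim, not quoted directly:

```python
# Die areas quoted in the posts above, in mm^2.
navi31_gcd = 304        # 7900 XTX graphics die, 5 nm
navi22 = 335            # 6700 XT / 6750 XT die
navi31_total = 530      # GCD plus the 6 nm memory-cache dies
monolithic_est = 450    # post's estimate for an all-5nm monolithic Navi 31

# How much area the chiplet interconnect and 6 nm dies add.
overhead = navi31_total - monolithic_est
print(f"Chiplet/process overhead: ~{overhead} mm^2")   # ~80 mm^2

# "Still 150 mm^2 smaller than the 4090" implies roughly this AD102 size.
implied_ad102 = monolithic_est + 150
print(f"Implied AD102 area: ~{implied_ad102} mm^2")    # ~600 mm^2
```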

Consoles are probably the bigger threat to Nvidia than AMD, oh... wait.
 
The last bit is also part of what Jim was inferring in the video. If AMD were a bit more price competitive, it wouldn't matter so much if they "only" competed with second-tier Nvidia dGPUs. But the reality is they stopped caring, and essentially just price a bit below Nvidia or plonk extra VRAM on their dGPUs and call it a day. Honestly, I wish AMD would just concentrate on decent mainstream and entry-level dGPUs and forget about the high end, Polaris and Navi Mk1 being examples.

It's quite clear Nvidia have given up in that area, like Apple did with cheaper smartphones, and even if AMD used a lagging process node there to cut costs it would make more sense. At least AMD can bundle these dGPUs with their laptops as part of a package. At the "high end" it's quite clear people are more brand-focussed anyway.

I completely agree with that. A 7700 XT type card with 12GB of VRAM; it's not necessary to put the fastest memory ICs on it. Good card, inexpensive, job done.

Let Nvidia have the high end and ignore the inevitable "it's AMD's fault for not competing" moaning. Those people will have to learn to stand their own ground, and if they never do, why should AMD care?
 
The biggest problem with people talking about these things, including dummies like ATV, is that they keep framing this discussion in terms of simple moves, as if AMD is a bakery and they should just tweak the dough a bit, not understanding the time horizon of this industry and the way these moves are made. The reality is AMD made the moves we see today many years ago, and so whatever adjustments to the strategy they will make we will only see long after they are made, so any discussions of what they "should" do from us the masses is pointless - we have incomplete information and even that only years later; yeah, good luck making assessments with that! Ultimately this all ends up as nothing more than yapping.

As for the title question, the answer is: yes, obviously. AMD give you the option of a cheaper GPU with more VRAM and overall better value if you don't care about RT, as well as providing a counterweight to Nvidia's monopolistic tendencies. If you think things are bad now, just imagine if Nvidia DIDN'T have to think about AMD at all. So that's a clear benefit.

Yeah, agreed.
Exactly this. The idea of Nvidia having absolutely no competition is awful.

AMD haven't been this competitive for years. I don't know what people are expecting; maybe they are looking at different reviews to me, because AMD are highly competitive and have a ton of features nowadays. I've got a 4090 and a 7900 XT, and have also had a 6950 XT, 6900 XT, 6800 XT and a 6700 XT! They have all been superb cards with an excellent ownership experience. I don't get why everyone is so negative about AMD in the GPU space. They offer a feature-rich alternative. The 7900 XTX is a powerhouse with tons of VRAM at a comparable price to a 4080.

That in itself is excellent for everyone, including Nvidia.

RDNA 2 is by far the best architecture AMD have made in many, many years, if you ignore its RT performance, which you should on anything from the 3080 down (to use Nvidia's own lineup as the example), because in terms of usable performance it's just a token feature at that point.

The best card from that generation is the RTX 3090; I think we can all agree on that. A great card, but a bit pricey, and that's fine: it's a halo card designed for people for whom money is no object. OK, good.

I think the best cards for the mass market from that generation are the RX 6600 XT / 6650 XT, RX 6700 XT / 6750 XT and the RX 6800. That last one is the one I really like, but AMD have pretty much EOL'd it; it's almost impossible to find. They are not hugely better than the RTX 3060 <> 3070 Ti, but they are IMO better for a couple of reasons, and cheaper, which is exactly what we all argue AMD should be doing. This is AMD doing that, so how's it going? The Steam Hardware Survey has the RTX 3060 at 9x the RX 6600 XT and 6700 XT put together; going from that, Nvidia sold 20x more 3060s than AMD sold RX 6600 XTs, which is AMD's second-highest-ranking RDNA 2 GPU, the highest being the 6700 XT.
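Those two ratios are consistent with each other. The percentages below are hypothetical, chosen only to match the 9x and 20x figures in the post, not the actual Steam survey numbers:

```python
# Hypothetical Steam Hardware Survey shares (percent of surveyed systems),
# picked to illustrate the post's 9x and 20x ratios; not real survey data.
rtx_3060  = 4.50
rx_6600xt = 0.225
rx_6700xt = 0.275   # slightly ahead of the 6600 XT, per the post

print(rtx_3060 / (rx_6600xt + rx_6700xt))  # 9x the two Radeons combined
print(rtx_3060 / rx_6600xt)                # 20x the 6600 XT alone
```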

So where AMD should be at their best, and in fact are at their best, they have 5% market share. Every single 3000-series card, other than the RTX 3050 Ti, which I didn't even know was a thing, is ranked higher.

What more do they have to do? And AMD can see these comparisons better than we can.
 
The 6600xt suffers from severe bandwidth issues, in recent games the 3060 is much faster than the 6600xt (forspoken / tlou / RT hogwarts / deadspace etc.)

Erm?

All of those games are in this and no.... that's simply not true.

 
Man, the majority if not all of those sites are just fake (testing games / mikes benching / nj tech etc.).

Yes, it is true; check Hardware Unboxed or TechPowerUp. For example:

[TechPowerUp relative performance charts: 1920×1080 raster and 1920×1080 ray tracing]
Don't tell me that everything but TPU is fake just because they don't agree with you; do you think I'm that stupid?
You're not going to derail this thread by throwing troll grenades and then getting into slide wars when someone posts something to counter your argument.
 
They are the same architecture, from the same vendor and generation, yet the lower-end card with less bandwidth but more VRAM is faster than the higher-end card with more bandwidth but less VRAM.

What is the logical explanation for that discrepancy?
 
What difference does it make whether it's bandwidth or the amount of VRAM itself that causes the problems? It's a fact that there is an issue there, and the 3060 (the one you compared the 6600xt to) doesn't have that issue.

Well, because you said it was bandwidth related. It isn't; the 3070 has more bandwidth than all of them.
 
He didn't say 6600xt..


He said 3060 12GB vs 3070 8GB.

This is what he said.

The 6600xt suffers from severe bandwidth issues, in recent games the 3060 is much faster than the 6600xt (forspoken / tlou / RT hogwarts / deadspace etc.)

Post #31...

 
I find threads like this funny.

AMD vs Nvidia, AMD vs Intel. People expect the same competition from a company fighting on two fronts, both CPU and GPU.

Until recently, Intel hadn't even dipped their toe into the GPU market in any major way; why not ask the same question of them vs Nvidia?

The more competition the better, IMO, but don't expect companies with other major focuses to compete with the top end of Nvidia (not an area I've ever bothered with anyway). If it wasn't profitable they wouldn't do it.

I'm coming around to thinking this way.

As others have pointed out, AMD are a CPU business who also make GPUs; they are perhaps not that interested in fighting Nvidia for market share. AMD's rival is Intel, not Nvidia.
 