AMD RDNA 4 thread

I have a hunch that AMD's market share could see a sizeable jump over the next few months, judging by the anecdotal reports coming out of China.

I doubt it. Every AMD GPU launch since 2015 has been underwhelming. Sure, RDNA has gone a long way towards fixing AMD's woeful performance-per-watt issue from the Vega days, but the performance just hasn't been there, and they never push the envelope in terms of how much silicon we get for our money.

Faster than a 4090 in many games and resolutions, yet somehow "the performance isn't there"? A comment like that is just ridiculous.
Before you start posting various benchmarks, remember this thread is about RDNA4: you're projecting past events onto something that hasn't happened yet, which sets you up to never experience reality, leading to disappointment and fear.

RDNA4 is interesting, as the tech AMD has is innovative.
 
I have a hunch that AMD's market share could see a sizeable jump over the next few months, judging by the anecdotal reports coming out of China.

I doubt it. Every AMD GPU launch since 2015 has been underwhelming. Sure, RDNA has gone a long way towards fixing AMD's woeful performance-per-watt issue from the Vega days, but the performance just hasn't been there, and they never push the envelope in terms of how much silicon we get for our money.

I don't agree. The RX 6000 series was decent, and IMO the 7900 XT is better than the 4070 Ti, the 7800 XT better than the 4070 and the 7700 XT better than the 4060 Ti, yet Nvidia sell 10X more of those.

With one tenth of the budget, AMD are expected to develop GPUs that are as good as or better than Nvidia's in every way, for 30% or more cheaper.

That's unsustainable. If AMD didn't have the consoles they wouldn't be making dGPUs; they would have given up on that years ago.

Nvidia sold far more RTX 3050s at $400 than AMD sold of the far better RX 6600 at $330.
 

The RX 6000 series was sabotaged by AMD pushing lots of wafers to consoles, which led to stock shortages everywhere (the UK being an example, where no RRP cards were sold after 2020).
 

AMD don't manufacture the console chips and they don't come out of AMD's wafer allocation; Sony and Microsoft source their own supply and manufacture those chips, and pay AMD a licensing fee.
---------------------

We all want cheaper GPUs, but it's AMD that tech channels keep criticising for not being cheap enough to force Nvidia to be cheaper.

With that, AMD get a bad rep even when they do make good GPUs, so not only do Nvidia keep getting away with ever more expensive crap, but people are also left with the impression that AMD are not worth buying at all, only Nvidia.

I have had Nvidia GPUs for the past 9 years. I'm looking at the 4070 thinking it's much too expensive and the 7800 XT is just a much better GPU; I class it more as a 4070 Ti competitor. But that's me. I think tech reviewers are stupid, at best; as I said, Nvidia sell 10X more 4070s.
 


I get exactly the same problem he does in that second video with Shadowplay. It's a piece of ____; I've been trying to fix it for weeks, and every time I do, something else goes wrong with it and it ______ up again.
 
AMD don't manufacture the console chips and they don't come out of AMD's wafer allocation; Sony and Microsoft source their own supply and manufacture those chips, and pay AMD a licensing fee.
---------------------

AMD makes the chips for those companies and it does come out of their wafer allocations. Even the original contracts had AMD supplying finished chips. That means consoles fight for the same wafers as everything else and dGPUs come last. Nvidia has for many years supplied far more dGPUs to OEMs, and unless AMD can supply enough chips they will continue to not have much market share.

Most PCs sold in the world are prebuilt systems. This is why I would prefer AMD to concentrate on the mainstream market and actually build decent dGPUs that can make it into prebuilt systems. They also need to work more on the software side, e.g. as shown by the RX 7900M, which had buggy drivers leading to high idle power consumption.

Instead they concentrate on large, overly complex designs which cost too much to make and which, unless they take the performance crown from Nvidia, are a waste of money IMHO. The self-built PC market is too fickle.

This is why the RTX 3060 and GTX 1650 are in the top 5 - so as much as everyone talks about the RTX 4090, those are the bread-and-butter consumer dGPUs for Nvidia. The RX 6600 should have been that, but the supply simply wasn't good enough and it launched late.

I hope that with RDNA4 they properly integrate things in their laptops to provide a package for OEMs to sell. It's annoying that AMD laptops are selling with RTX 4060 dGPUs!

They need volume to sustain RTG. Even @KompuKare has talked about this.

Even when ATI had subpar series such as the HD 2000/HD 3000, they supplied a lot of dGPUs for prebuilt systems. This is why their market share was still above 30% IIRC.

The agreement with Samsung is a licensing deal where Samsung build their own chips at their own fabs. I am hoping AMD can try to build commodity dGPUs at Samsung, because TSMC is currently overbooked.
 

The RX 6600 XT was one of the few GPUs regularly in stock during the mining craze.
 
They aren't going to make 5 million GPUs if they can't even sell half of them; they made that mistake once before, when they had competitive GPUs.

All the RX 7000 series GPUs have good stock levels; if they weren't making enough of them there wouldn't be any stock. They are just not selling. They still aren't even showing up in the Steam charts.
 
The RX 6600 XT was one of the few GPUs regularly in stock during the mining craze.

Not in most markets, as AMD never launched them in any volume. They had hardly any traction with OEMs, even on AMD's own platforms. They launched many months late and simply couldn't commit enough volume for larger OEMs.

You need to stop looking just at DIY PCs - it's prebuilt systems where a lot of consumer sales are. It's not only in the UK that you barely see them; it was the same when I visited abroad. There is no excuse now that AMD sells a decent number of CPUs - it shows that RTG and the CPU division are not working closely enough together.

The CPU division should be pushing for integration of AMD dGPUs into laptops, etc., on their own platforms. But it seems that, at least on the consumer side, it hasn't been a priority. Just look at the RX 7900M launch - the first reviews come out, the launch drivers are buggy and it has high idle power consumption, yet a previous driver is fine. It's been a year since the RDNA3 chiplet dGPUs launched, and they can't even get a flagship product to launch with the correct drivers? I have heard plenty of other things from elsewhere too, which are not inspiring either.

In the days of ATI it was much more common to have ATI cards in prebuilt systems. When ATI was bought by AMD, they ignored the OEM market. For example, ATI had built up a decent amount of laptop share, but with the HD 7000 series AMD didn't even implement proper graphics switching reliably, while Nvidia did. The previous generation was fine.

The bugginess meant lots of OEMs shifted to Nvidia, despite many of the entry-level and mainstream HD 7000 parts being efficient. Then their obsession with the DIY market meant Nvidia crept in with cards such as the GTX 750 Ti, which was ideal for OEMs and laptops. AMD concentrated way too much on the prebuilt desktop market; Nvidia didn't.

ATI understood this - the HD4770 being an example of such a product.

So I honestly hope that with RDNA4 they stop chasing after unicorns and concentrate on mass-market integration on their own platforms. If it means desktop power users have to get Nvidia, then I would rather have that than Intel eventually overtaking them in sales. Plus, if they really want to sell high-end products for decent money, they have to be competitive with Nvidia/Intel in RT, upscaling software and other features. These are far more important to desktop power users.
 
ATI's Catalyst drivers were ______ atrocious. It's where a lot of the "bad AMD drivers" hang-ups come from. In reality Nvidia have just as many, if not more, driver problems than AMD, but they are always ignored, and when it does get to the point where people are screaming at tech journos "why aren't you covering this?" they always say "Nvidia say they are working on fixing the problem, so it's nothing to worry about".

I do, however, agree that RTG need to get their _____ together, but it's not the reason for AMD's 15% market share, and OEMs not wanting them also has little to do with that. AMD's APUs are in every conceivable way better than Intel's, and have been for many years, yet few OEMs have any interest in using them at all; those that do put them in one or two models and that's it, even with massive demand for them, which there is.
 
Either AMD stick to 60-80 series cards and price them to sell or they really up their game. Even then they will need to be better than Nvidia and cheaper, a tall order. They need something like chiplets to give them the ability to increase performance whilst being able to sell at a keen price profitably. They still have to keep that up for years to win market share.
 

The only reason AMD are able to do what they are doing to Intel is the MCM technology; if they were monolithic they wouldn't be able to make these CPUs, let alone at this cost.

AMD are trying to do the same with GPUs vs Nvidia, but I feel it may be a lost cause, and I get the sense AMD are already losing their own confidence in it; the fact that no matter what AMD do (short of becoming unprofitable) it's never enough keeps getting proved over and over again.

AMD had a large following for CPUs over many, many years before Bulldozer, so they had existing mindshare for their CPUs, which helped. They have never had that for GPUs; from day one it was "AMD suck, Nvidia good", and with people now only wanting AMD to compete so they can get cheaper Nvidia cards (even HUB admitted they know that's what people want, and they play to it)... AMD have no chance whatsoever.
 
It is a pretty stiff challenge to make both CPUs and GPUs and fight both Intel and Nvidia at the same time!
 
Also, just putting a huge gob of cache on CPUs will not work for that gaming performance. The reason is that a large cache has latency; it's why L1 is so much faster than L2, and L2 so much faster than L3.

It only works because the dies are stacked on top of each other, so the access path is vertical and this (---) long, as opposed to horizontal and this (---------------------) long.
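
Not from the thread, but to make the cache-latency point concrete, here is a minimal C sketch (the working-set sizes and step count below are assumptions, not measurements of any particular CPU). It pointer-chases through progressively larger buffers, so the average time per access jumps as the data spills out of L1, then L2, then L3 and finally into DRAM:

```c
/* Hypothetical sketch: measure average memory access latency vs. working-set
 * size with a pointer chase. Each load depends on the previous one, so the
 * CPU cannot overlap or prefetch them and we see the latency of whichever
 * cache level the working set fits in. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Walk the cycle 'steps' times and return nanoseconds per access. */
static double chase(const size_t *ring, size_t steps)
{
    struct timespec t0, t1;
    volatile size_t idx = 0;   /* volatile stops the compiler deleting the loop */

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < steps; i++)
        idx = ring[idx];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    return ns / (double)steps;
}

int main(void)
{
    /* Assumed sizes chosen to straddle typical L1 / L2 / L3 / DRAM capacities. */
    const size_t kib[] = { 16, 256, 4096, 65536 };
    const size_t steps = 20u * 1000u * 1000u;

    for (size_t s = 0; s < sizeof kib / sizeof kib[0]; s++) {
        size_t n = kib[s] * 1024 / sizeof(size_t);
        size_t *ring = malloc(n * sizeof(size_t));
        if (!ring) return 1;

        /* Sattolo's algorithm: shuffle the indices into a single random cycle,
         * so the chase visits the whole buffer in an unpredictable order. */
        for (size_t i = 0; i < n; i++) ring[i] = i;
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = (size_t)rand() % i;
            size_t tmp = ring[i]; ring[i] = ring[j]; ring[j] = tmp;
        }

        printf("%6zu KiB working set: %6.2f ns per access\n", kib[s], chase(ring, steps));
        free(ring);
    }
    return 0;
}
```

Compile with something like cc -O2 (any file name will do); because every load depends on the previous one, the prefetcher and out-of-order execution can't hide the latency, which is exactly why a bigger-but-slower cache level only pays off when the data wouldn't have fit in the level above it.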

It is a pretty stiff challenge to make both CPUs and GPUs and fight both Intel and Nvidia at the same time!

That too, yes. I sometimes wonder if AMD would be better off dropping dGPUs altogether and reinvesting in CPUs. AMD are the best engineers when it comes to difficult and complex technologies; I would love to see the sort of stuff they could come up with if they had an R&D budget like Nvidia's.

But I too am selfish and don't want to pay £1000 for a ##60 class card.
 
I'd love to know what AMD could do with dGPUs given an R&D budget like Nvidia's.

We are backing the wrong horse, imo.
 
I don't agree. The RX 6000 series was decent, and IMO the 7900 XT is better than the 4070 Ti, the 7800 XT better than the 4070 and the 7700 XT better than the 4060 Ti, yet Nvidia sell 10X more of those.

The 6000 series was pretty good; the 7000 series is poor though. Just being 10% faster or 10% cheaper wasn't good enough when Nvidia were already overpriced by 50%+ this gen.
 
ATI's Catalyst drivers were ______ atrocious. It's where a lot of the "bad AMD drivers" hang-ups come from. In reality Nvidia have just as many, if not more, driver problems than AMD, but they are always ignored, and when it does get to the point where people are screaming at tech journos "why aren't you covering this?" they always say "Nvidia say they are working on fixing the problem, so it's nothing to worry about".

I do, however, agree that RTG need to get their _____ together, but it's not the reason for AMD's 15% market share, and OEMs not wanting them also has little to do with that. AMD's APUs are in every conceivable way better than Intel's, and have been for many years, yet few OEMs have any interest in using them at all; those that do put them in one or two models and that's it, even with massive demand for them, which there is.

The driver issues were mostly before the ATI 9000 series, but Nvidia's marketing is brilliant (especially on social media) and they emphasised the fact. WRT their APUs, Intel still has more overall volume in laptops than AMD, so I am not surprised. But remember, all Intel consumer desktop CPUs are APUs, also used in laptops, whereas with AMD it was split between the chiplet CPUs and the APUs, and the latter had reduced connectivity (such as being stuck with PCI-E 3.0). So if you wanted to use an AMD CPU in a prebuilt system, you needed a dGPU. And who makes the most dGPUs... Nvidia.

This is why Zen 4 has a basic IGP. At the same time, I think splitting AMD into the CPU division and RTG was a mistake. It would have been better to have them under one umbrella, so they could integrate things better. I think this is now happening.

But the reason I go on about volume is that it is free marketing for Nvidia. If most prebuilt systems have Nvidia badges, that's all people are going to be aware of. AMD have to focus on a narrower spread of products, and get the launches done properly and in good time. Ideally they would even line up CPU and dGPU launches, and try to pre-empt Nvidia.

During the Polaris era their market share was much higher than it is now, despite literally only two products. Their big nose-dive happened when Maxwell released and got a lot of sales in prebuilt systems, and after Polaris they never seemed to find the same success again. But Polaris was launched before the GTX 1060.

Their biggest advantage is a very competitive CPU platform, and they need to use this to drive sales of dGPUs in complete systems.

On the APU side, Strix Halo and Strix Point look to have decent IGP performance:
 