Intel ARC and the Latest Drivers: Does this bode well for ARC's future?

In fact, said roadmap specifies another ~225W TDP part, so it certainly won't be a 4090 competitor unless Intel perform an actual miracle. It'll be another product aimed at the mid-range.

Didn't Intel say somewhere they were targeting $300?

Oh, I sort of found it:

Q) That mass-market approach, would that mean that you primarily focus on the mid- and lower-tier SKUs first and then push out high-end ones?

A) Raja Koduri: High-end has no limit right now. What is the definition of high-end? Is it 600 Watts? Obviously our partners and our customers want some halo SKUs for bragging rights, and we always like to figure out ways to enable that. But my priority at this point is getting that core audience, with one power connector. And that can get you up to 200-225W. If you nail that, and something a little above and a little below, all that falls into the sweet spot.

Q) Xe scales from integrated GPUs to supercomputers. Is that going to be the way forward now that you have had this experience? Do you have any changed priorities for the next generation of Xe? And on the software side will OneAPI also filter down to the consumer space?

A) Raja Koduri: We haven't broken our vision. Architecture is a hardware-software contract. Will we need to break it? I'm not sure at this point. If we do, it will be for a good reason, like if I find a path to getting [Nvidia GeForce RTX] 4090-class performance at, say 200W – if only! That's the level of benefit I need to see before I break it. OneAPI is supported all the way from the smallest integrated chip to the biggest exascale machines.

 

Raja will step down as Intel Chief Architect and leave Intel after five years at the end of March 2023. Raja will create a new software startup focused on generative AI for gaming, media & entertainment.
 
Well, performance in Destiny 2 certainly is interesting with the latest drivers.

On my BiFrost, performance in my usual testing spot (Throne World loop) is ever so slightly better, but performance in the central area of Neptune is terrible. The FPS counter is reading in the mid-50s but it feels lower in practice, even with VRR. Compared to my Legion's 3080M (no overclock or tweaks, just max TGP), which averages around 100 FPS, it seems to be a step backwards. For context, I was starting to see the gap between the two closing in my previous testing.

Now, I have moved over to AM5 from Alder Lake, but that shouldn't be causing me any issues, especially not at 3440x1440 and with a CPU hitting 5.15 GHz in game (7600 with decent DDR5). Initially I thought ReBAR was off, but no, it is on in the BIOS. Need to do more testing, but so far it's quite the mixed bag.

It is possible that the changes to the game itself are causing some kind of performance conflict on ARC, but that isn't something I can logically test, as there is no "old" version of the game to compare against.

Moving over to AM5 hasn't caused any obvious regression in any other game I test, so it's only D2 that is the outlier. Saying that, it took months for Bungie to resolve issues with RDNA2 underperforming massively, so I won't hold my breath.

EDIT: Just did a bit of digging and it looks like there are lots of reports of D2 performing worse than it used to across a spectrum of PCs.
 

Raja will step down as Intel Chief Architect and leave Intel after five years at the end of March 2023. Raja will create a new software startup focused on generative AI for gaming, media & entertainment.
Well, no real surprise there.

Pushed out or jumped?

I suspect pushed, although in best corporate style, it may have been that Gelsinger merely led Koduri to the right corner of the roof and blocked all other routes!

Wonder why exactly, though? Over-promising and under-delivering is something he's done many times before, so maybe this time it was that he did so to the board and CEO. You can usually tell consumers any fib without consequences.

People mistakenly look at the A770's performance versus Nvidia and AMD, for instance showing it beating a 6700 XT at ray tracing. However, considering the number of transistors used and the size of the die (and what Intel have to pay TSMC for a die that size), the A770 was never meant to have been hanging out with the 3060 or 6600; it should have been competing with the 3070 Ti and 6800.

ARC's relatively good showing at ray tracing merely shows that Intel were willing to spend transistors there, or in other words that AMD are not willing to spend more transistors there than the absolute minimum.
 

In transistor count the ACM-G10 seems halfway between Navi 22 and Navi 21, and I would argue drivers and support in games are what is holding back performance. Supposedly most of the driver development team being in Russia caused big problems too. Just look at some of the Chinese dGPUs: the hardware looks capable enough, but again it seems drivers and support in games are the problem.

This is a problem with Intel IGPs, even predating Raja being there. At least AMD has the excuse of being a relatively small company; what excuse did Intel have for over a decade?

AMD, despite being in consoles and having a long history of involvement with games devs, has the same issues compared to Nvidia too, so it will be even harder for Intel in this regard, but they had the resources. Nvidia has always been pushing games to favour its own features and strengths or capitalising on AMD uarch weaknesses, i.e. think tessellation or heavier use of RT reflection effects. Even further back, lack of DX10.1 support.

Just look at Vega after a few years:

Look where the Vega 56 was: barely beating a GTX 1070, with the GTX 1080 and RTX 2060 beating it. It is now pretty much matching a GTX 1080/RTX 2060, and sometimes it can match a GTX 1080 Ti:

It was always a design intended for commercial usage; the first Vega cards were commercial cards. AMD repurposed it because they had no other dGPUs to use above the RX 480 8GB. But it still lives on in CDNA, which seems relatively successful. Vega also did pretty well in AMD IGPs until relatively recently.

Also, people forget it was Raja who was involved with Polaris, and Vega was developed during a time with very little R&D money at AMD. IIRC, over 70% of AMD R&D was being pushed towards designing Zen. When AMD nearly went bankrupt a few years ago (the share price started tanking), they massively cut R&D spend due to Bulldozer being a disaster and AMD having locked themselves into the GlobalFoundries contracts, which meant they had to pay penalties for not hitting order requirements. IIRC, it was the period after the R9 290 series was launched too. RDNA/RDNA2 was started under his watch too.

RDNA3, OTOH, might have been the first uarch which he had no involvement with.
 
Heh, thought I was on to something with YouTube hogging 3D resources on my Legion, and thinking that might have been the case on the A770. Nope, D2 still runs like a pig in comparison on Neptune. :cry:

In other news, the i9 12900H / 3080 (max TGP) config in the Flow Z13 is noticeably faster than the 5800H / 3080 (max TGP) in D2. Obviously the faster CPU is helping out quite a bit there. Didn't really think it would be that different, but it's around a 10-15 FPS uplift. Interesting...
 
OK, interesting. Downloaded D2 via the Xbox store / portal and it runs better than via Steam, at least in my Neptune testing area. Was not expecting that. It is still noticeably behind the Legion / Flow, but the delta to the same area on Steam is in the region of 10-15%, so more than run-to-run variance.

Now I need to see if the same is true on Legion / Flow.
 
they massively cut R&D spend due to Bulldozer being a disaster

You mean "faildozer".

I remember going to this AMD hardware convention thing in London at the time.

At one of the stands the AMD guys were overclocking one of those CPUs using a copper cup and pouring liquid nitrogen into it, to get it past 5 GHz.

When I said to him I could get very close to that on my Sandy Bridge i7-2600K using just an air cooler, he really had no comeback.
 
Had my 2700K running at 5 GHz on a D14 IIRC, lovely CPU that.

Well, I think this is part of it if I'm honest; yes, the "faildozer" wasn't great, but those Sandy Bridge CPUs were phenomenal.

I was using mine for 10 years and it was still doing OK; it's still being used today in my son's PC and still performing.

They were, in my opinion, the best regular consumer-market CPUs ever made to date; AMD had no chance, and even less so given the faildozer.

Edit: and I say that as a long-term AMD fan!
 

Polaris was the final iteration of GCN.

Vega was all Raja, and ARC has all the same odd behaviours as Vega, right down to which games it does well in and which games it does badly in; it's a copy of... with a few tweaks, I'm sure.

I am going to blow my own trumpet here: I was pretty much alone on this forum in predicting that a Raja Koduri involved with ARC would go badly and end badly.
I was ridiculed: Raja is a genius, it's all AMD's fault, they didn't give him enough money, he would be awesome at Intel, they will give him all the money he needs, RIP AMD, they said.... AMD were glad to get the _____ rid of him, I guarantee you that.

Raja at Intel

AMD: you know those piles of cash you keep comparing to ours..... wave them bye bye.

I was bang on, and if I was to say "RIP Intel" now I would be very wrong, but still far more accurate than that ^^^^ was ever going to be. I was right about that too.... Jon Peddie? Pft... these people couldn't build an accurate analysis if their lives depended on it; they don't have the instinct for it and you can't teach that.

While I'm at it.... I was also right about Nvidia. I should do this sort of analysis for a living; many people who are paid 50X more than I am keep getting it wrong where I get it right.
 

4K8K, is that you? :cry:
 

He was a manager, with good experience of the software side and some knowledge of the hardware side. But you need to have a clue about software to target what the hardware can be good at. He was at S3 Graphics, ATI, AMD, Intel and Apple over 30 years. He can't be that bad if so many leading companies hired him.

Raja Koduri was director of advanced technology development at ATI from 2001 to 2009. Many of us older folk know of him because of that. He joined ATI to work on the Radeon R300, aka the 9700 series, and the follow-up X800 series. The reality is that when he was at ATI they made decent hardware most of the time (OK, the X1800 had some issues and the R600 was a flop), but the 9000 series, X800 series, X1900 series, HD 4000 series and HD 5000 series were developed when he was there. The R300 was the DX9 graphics hardware standard for years.

He was at Apple, and was behind the push to Retina displays and probably their move to their own dGPU designs.

When he came back a second time to AMD you had Polaris, Vega and RDNA1/RDNA2, and Polaris and Vega had to be made on a subpar GF 14nm, because the people at AMD before Rory Read put the company into a lose/lose scenario to stave off bankruptcy. Just look at the R&D budget for graphics at the time against Nvidia: 70% of AMD's R&D was put towards CPUs, and their total R&D was still less than Nvidia's. Vega was released firstly as a compute card called Radeon Instinct; those Vega cards were not originally made for gaming. Only two Polaris designs were made. AMD had no real budget for developing new GPUs at the time.

Because of all the moaning AMD fans, they went and re-released these cards to consumers, and had to take a loss on them. That is one of the biggest mistakes AMD made; they should have just not bothered, as it was a loss-leader card.

Yet Polaris wasn't a flop in the end, and the same goes for Vega. It had relevance for six years as a uarch, especially since it was designed on a shoestring budget. Vega IGPs are still part of the Ryzen 5000 series APUs, and they are still beating what Intel has. CDNA1 and CDNA2 are Vega derivatives, and RDNA1 was developed when he was there, which by extension means RDNA2 has influences from it. RDNA3 will probably be the first design he had zero interaction with.

It takes years to develop new designs and, more importantly, to get the design teams together.

The guy has worked at S3 Graphics, ATI, AMD and Intel since 1996. Have people not noticed he likes jumping into challenging scenarios? S3 was already having issues back when he joined, and ATI, when he joined in 2001, was a massive underdog compared to Nvidia. The R300 was the design which made ATI's name in the industry.

He joined Apple as "Director, Graphics Architecture", meaning he had a hand in that too. So, like Keller, he stepped into something different. He rejoined AMD literally as they were trying to stave off bankruptcy, went to Intel to try and push into a mature market with two incumbents, and is now going to found an AI gaming software startup.

If he was so bad, then he would have been out of the industry a long time ago, not still in it nearly 30 years later. Even Jim Keller only stays for a few years at many places, yet gets all the credit for those companies' achievements years later. Jim Keller had good things to say about him, IIRC.

Raja Koduri gets dunked on, even when a number of the designs he was involved with have stood the test of time.

Plus, Jim Keller joined Intel only in 2018 and left two years later. Who do you think got Jim Keller on board at Intel? Raja Koduri. Are people blaming him for the current Intel design problems? Why did he leave so quickly? He spent more time at AMD. Intel is having problems left, right and centre across almost all their units. They let go of thousands of older, more experienced workers:

CPU, NAND, GPU, etc. The company has had too many people butting heads inside for the last 5~8 years. Intel have no clear vision of anything. Just look at their advanced packaging technologies, stuff such as L4 cache, etc. Intel had all the technology to out-AMD AMD years ago. They had a process node advantage over AMD until 2019, and it only took until Zen 3 for AMD to finally field a faster core. They had decades to improve the software side on the IGPs.

It's not only the hardware but the whole software stack for hundreds if not thousands of games. If it was so easy, then look at the numerous Chinese dGPUs, some made on 7nm, which have horrible performance due to poor drivers and a lack of gaming support. Apple has taken years to move to its own GPU uarch (they started moving towards it in 2014), and even then look at the gaming performance, and that is with a company that controls its whole ecosystem. I remember back in the late 90s and early 2000s, when we had more dGPU companies out there, how software could make or break things.

People talk about AMD "Fine Wine", but we all know that is AMD drivers and game support catching up with the hardware. ATI took years to get within striking distance of Nvidia in this regard. Intel is starting from a much worse situation, especially as they didn't care about gaming for decades.

Lisa Su made the correct decision to push more budget to CPU R&D. When AMD was competitive, people just bought more Nvidia graphics cards anyway, irrespective of drivers, power consumption, etc.

Yet where is all the moaning that Nvidia is now ahead again with Ada Lovelace? This time R&D spend is much higher, and AMD went back onto TSMC full time. They had access to TSMC 4nm (their new APUs are made on it). Why not make their new dGPUs on TSMC 4nm, like Nvidia? They clearly think APUs are more important. The reality is that dGPUs are just a second-line priority for AMD, while Nvidia throws money at them.
 

The 5700 XT wasn't out until two years after Raja left.
Right after he left, Dr Lisa Su pushed some of the Ryzen team onto iGPUs; in two months they increased the performance and performance per watt of Vega, by 50% (actual) for the latter. Then they relaunched those APUs on the same node...

He makes OK GPUs, sometimes good GPUs; once, perhaps, he made great GPUs. But when you're up against someone like Nvidia, who make nothing but GPUs, it's their whole life and they are fanatical about being the best at any cost, you can't just be good, because the GPUs they knock out are ###### phenomenal; with their entire R&D being nothing but GPUs, they had better be.
We are spoiled by modern GPUs; we don't know just how good they actually are, because other than AMD vs Nvidia we have nothing to compare them to, and while we think of AMD's as slightly lesser, they are also ###### awesome.

That's the level Nvidia and AMD operate at; nothing short of that will cut it.
 

It takes 3~5 years to design something new. Even when Jim Keller was at AMD the first time, he was there until late 1999, and he was lead architect on the K8/Athlon 64, which came out over three years later. Raja Koduri left in late 2017, and Navi samples were already floating about in 2018, so he was involved. Vega was a compute uarch made for commercial usage; those cards were only released as loss leaders because AMD didn't want to develop a bigger Polaris design. Even the smaller Polaris was probably funded by Apple (they were talking about the die thinning used to make it fit in MacBooks). Vega as a design seems to have lasted well; CDNA is an evolution of it.

The big problem here is that AMD is a CPU company which dabbles in GPUs. So a lot of the development of new consumer-orientated uarchs is to help their IGPs, laptops and semi-custom partners; this is what is funding their dGPU designs. So this means area-optimised and lower-power designs, not the larger high-performance dGPUs we all want. A lot of the talk about RDNA2 was about it maximising throughput for consoles. Nvidia goes for the best performance even if it means larger chips and more power, or having to use the best nodes they can get access to.

It's all about budget and what the priority is. CPUs are the bread and butter of AMD. You can see that with RDNA3: AMD has TSMC 4nm volume, and they could have made the RX 7900 XTX compute die on TSMC 4nm, which most likely would have been somewhat faster or more efficient. But instead they think their Zen 4 APUs are a more worthy use of that volume. Then going back to RDNA2, AMD pushed most of its 7nm volume to CPUs and consoles.

With Intel, the big problem is software and a history of not really being good at it. It will take a lot of money to overtake the decades of investment Nvidia, let alone AMD, have already made. There is also the fact they used a lot of Russian engineers as part of the driver development team:

The problem this time is that key parts of the drivers for this GPU, specifically the shader compiler and related key performance pieces, were being done by the team in Russia.

It's why they have big problems with older games: they need to do all of it from scratch, as Intel didn't care about games for decades. Another instance of sheer stupidity, as they have the most IGPs out there and it would have cost them relatively little to do. But with all the infighting at Intel over the last 5~8 years, instead of sticking to a plan, they seem to be assigning blame to each other.
 

It takes 3~5 years to design something new

So he wasn't there for most of its development.

Raja Koduri left in late 2017, and Navi samples were already floating about in 2018, so he was involved

The 5700 XT was launched in mid-2019. Involved, perhaps, but a lot of development went on without him; the only GPUs we know he was involved in from start to finish post-Apple were Vega and ARC, and both are bad. You can't get away from that.

The big problem here is that AMD is a CPU company which dabbles in GPUs. So a lot of the development of new consumer-orientated uarchs is to help their IGPs, laptops and semi-custom partners; this is what is funding their dGPU designs. So this means area-optimised and lower-power designs, not the larger high-performance dGPUs we all want

Was the 6900 XT really that bad?
It's as fast as Nvidia's flagship of the same generation, and it's also more power efficient. RT was not as good, but that is 1st gen vs 2nd gen.
It's also much smaller, so cheaper to make. It had a lot going for it: a very good design, and overall, IMO, a very good card by any true measure.

Is the 7900 XTX so bad?
It's less competitive in terms of performance than its predecessor was.
It's still a very fast card.
RT is an improvement per core vs 1st gen, though still not as good as Nvidia's 3rd gen.
It's the first MCM GPU, and that's ground-breaking. How much R&D do you think that cost?
The logic die is half the size of its competitor's because of the ground-breaking arch, which means the six chiplets can also be made on a much cheaper node. All of that keeps costs down; it's the same price as the card it replaced while maintaining high margins.
It's more power efficient than its competitor.

Two generations of very good GPUs, 3.5 and 5.5 years after Raja left.



They could have made the RX 7900 XTX compute die on TSMC 4nm, which most likely would have been somewhat faster or more efficient. But instead they think their Zen 4 APUs are a more worthy use of that volume

Zen 4 and RDNA3 are on the same TSMC 5nm node.
Nvidia paid "me first, all mine" prices to get on 4nm; AMD will not be using it at all, and are going straight to 3nm, as TSMC want.

It's all about budget and what the priority is. CPUs are the bread and butter of AMD.

Yes it is, but they have stuck with it for 20 years and they continue to try, more than Intel ever will; you can be sure of that.

With Intel, the big problem is software and a history of not really being good at it. It will take a lot of money to overtake the decades of investment Nvidia, let alone AMD, have already made. There is also the fact they used a lot of Russian engineers as part of the driver development team:

All I have heard since the reveal of ARC is excuses, not just from Intel themselves, and there is plenty of that going round, but every white-knight internet hopeful has spent the last two years playing whack-a-mole with anyone who points out the blindingly obvious with ARC: it has exactly the same problems Raja's previous arch had, and those were never fixed.

We are in this dire situation not because AMD wouldn't step up; they tried that several times, and we played whack-a-mole trying to push them down again. That's why we are where we are. What is wrong with us? Even when Intel showed their true colours by pushing that broken crap out at more than the price of an overpriced RTX 3060, for a card far worse, we even found excuses for that; I'm sure I'll hear them again.
No, I don't include myself in that "we".

I'm not angry; I'm bitterly disappointed, and tired of all of that ^^^
 