Intel ARC and the Latest Drivers: Does This Bode Well for ARC's Future?

So he wasn't there for most of its development.

Well, he was. He left in late 2017. If RDNA1 took 3–5 years of development before its early-2019 debut, that means work started around 2014–2016. Plus, RTG sent him an RX 6800 years after he left AMD:

That pretty much means he was involved with it.

Koduri left his top position at the Radeon Technology Group back in 2017, following a short sabbatical from the role in the days following Vega's launch. It was always said that the Navi architecture (before we knew it as RDNA) was Koduri's pet project, so perhaps it's only fitting that the now-Intel GPU engineer would receive an RDNA 2 card in the mail.

You can't just dunk on one implementation (Vega 10) when the uarch lasted for years. It's like saying GCN 2.0 was a failure because it launched in a rubbish state (poor cooler, overvolted). In reality, GCN 2.0 showed real longevity: it went into the consoles, and the cards stayed relevant for years.

The question is: would you have preferred AMD to throw R&D at the time at a larger Polaris, or at a rejigged "gaming" Vega card made at TSMC? Or spend that money on developing Zen? AMD's total R&D budget was smaller than Nvidia's.

Sadly, AMD had to make a choice. They only developed two Polaris designs and one Vega design for commercial use.


The 5700 XT launched in mid-2019. Involved, perhaps, but a lot of development went on without him; the only GPUs we know he was involved in from start to finish post-Apple are Vega and ARC, and both are bad. You can't get away from that.

He was involved start to finish on:
1.) Polaris (he joined three years before it came out)
2.) Vega
3.) Most of RDNA1, and it appears part of RDNA2 too

The 14nm and 7nm Vega cards were commercial compute cards and went into supercomputers (Radeon Instinct). They released as Radeon Instinct cards first. AMD literally took dGPUs with huge compute performance, overclocked them, re-released them as a bone thrown to AMD fans, and lost money on them. Nvidia fans got angry that AMD wasn't competitive enough to make their Nvidia cards cheaper. It has become a badge of honour amongst PCMR to dunk on the guy 24/7.

This was a period when around 70% of AMD's R&D was going towards developing Zen. That means Nvidia was outspending AMD on dGPUs several times over.

He is the one who brought Jim Keller onboard at Intel. Keller left very quickly, especially compared to his previous stints at AMD and Apple. Are we blaming Jim Keller for things not working out at Intel's CPU division? The reality is there was a leadership race for the company, and all you had was people trying to screw over their potential rivals.

This is why EVERY Intel roadmap is in a mess: CPUs, GPUs, process nodes. NAND development ended. It is all happening at the same time.


Was the 6900 XT really that bad?
It's as fast as Nvidia's flagship of the same generation, and it's also more power efficient. RT was not as good, but that's 1st gen vs 2nd gen.
It's also a much smaller die, so cheaper to make. It had a lot going for it: a very good design. Overall, IMO, a very good card by any true measure.

It wasn't bad in any way, but AMD was on a true 7nm process node while Nvidia was on a 10nm-class node, which caused them problems. But Nvidia had the ability to throw money at the problem.

AMD was in the same situation just a few years earlier, when Raja was there, but with no money to throw.

AMD had been locked into using GF 14nm for their dGPUs, and GF had zero experience with larger-die products. The consoles were made on TSMC 16nm. GF's 14nm was worse than the Samsung 14nm it was licensed from, as GF's own 14nm effort had failed. Look at how Polaris drank power when pushed a bit too far (RX 580); now imagine it on TSMC 16nm. On top of that, SK Hynix HBM2 had real problems, meaning AMD had to source it from Samsung at greater cost (and IIRC it might have been slower).

So AMD was not only fighting a lack of R&D; they were fighting an Nvidia that was on a better process node AND didn't need to repurpose products for dual duties. I told you people back then, but just like with Beast Mode Polaris, nobody wanted to listen.

Vega as a uarch can't be a disaster if CDNA1 and CDNA2 are literally evolved Vega designs and Vega IGPs beat off Intel for years. But the reality is that Vega 10 should never have been released as a consumer product; none of the Vega dGPUs were gaming-optimised cards. Nvidia never released Volta as one either, but didn't need to, as they had high-end "gaming" Pascal designs to use.

People forget Pascal to Turing was a two-generation improvement in uarch (with Volta in between), not one.

Is the 7900 XTX so bad?
It's less competitive in terms of performance than its predecessor was.
It's still a very fast card.
RT is an improvement per core vs 1st gen, though still not as good as Nvidia's 3rd gen.
It's the first MCM consumer GPU; that's groundbreaking.
The logic die is half the size of its competitor's because of the groundbreaking arch, which means the six chiplets can also be made on a much cheaper node. All of that keeps costs down: it's the same price as the card it replaced while maintaining high margins (see the cost sketch below).
It's more power efficient than its competitor.
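To see why a smaller logic die plus cheap-node chiplets keeps costs down, here is a minimal sketch using the standard dies-per-wafer approximation and a simple Poisson yield model. The ~304mm² GCD and ~37mm² MCD sizes are the reported Navi 31 figures; the wafer prices and defect densities are purely illustrative assumptions, not AMD's actual numbers.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic dies-per-wafer approximation (ignores scribe lines and edge exclusion)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model: Y = exp(-area_cm2 * defect_density)."""
    return math.exp(-(die_area_mm2 / 100) * defects_per_cm2)

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defects_per_cm2):
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2, defects_per_cm2)
    return wafer_cost_usd / good_dies

# Wafer prices and defect densities below are illustrative guesses only.
big_mono = cost_per_good_die(600, wafer_cost_usd=17000, defects_per_cm2=0.10)  # hypothetical monolithic die
gcd      = cost_per_good_die(304, wafer_cost_usd=17000, defects_per_cm2=0.10)  # ~Navi 31 GCD, leading node
mcd      = cost_per_good_die(37,  wafer_cost_usd=9000,  defects_per_cm2=0.08)  # ~Navi 31 MCD, cheaper node

print(f"hypothetical 600mm2 monolithic: ${big_mono:.0f}")
print(f"1 GCD + 6 MCDs:                 ${gcd + 6 * mcd:.0f}")
```

With these made-up inputs the chiplet package works out at less than half the silicon cost of a big monolithic die, because small dies both pack a wafer more efficiently and yield far better.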

The RX 7900 XTX is as fast as the second-tier Nvidia dGPU, i.e. the AD103, which uses fewer transistors and less memory bandwidth (and consumes less power) but is on a better process node. Vega 10 was as fast as the GTX 1080, which was likewise the second-tier Pascal dGPU, consumed less power, and was on a much better process node. If this were a Raja design, people would all be moaning at the guy and having a go at the design team.

Yes, we can see AMD testing out chiplet designs, and there is a risk to that, of course. It is still a decent dGPU.

Two generations of very good GPUs, 3.5 and 5.5 years after Raja left.

RDNA1 and RDNA2 were started when he was there. That is the point: all people can do is bash him over ONE IMPLEMENTATION of the design and say Vega was a disaster.

Yet the same Vega did very well in commercial areas, i.e. the compute Vega cards and the CDNA1/CDNA2 cards. Then it did well in IGPs. And everyone but him gets praised for that. People can't dunk on Vega as a design, then praise other implementations of it.

Zen 4 and RDNA3 are on the same TSMC 5nm node.
Nvidia paid "me first, all mine" prices to get on 4nm. AMD will not be using it at all, and is going straight to 3nm, as TSMC wants.

AMD has 4nm volume: the Zen 4 APUs are on 4nm! The reality is AMD could have pushed the RX 7900 XTX onto 4nm but chose not to, which goes back to the last few years of their dGPU development. AMD is not throwing money at dGPU development or production.

The last time they did that was with GCN 2.0, IMHO, and that was just before they started reducing dGPU R&D significantly.

Yes it is, but they have stuck with it for 20 years and they continue to try; more than Intel ever will, you can be sure of that.

Yes, they will, and again that is the big problem with Intel. Outside their core CPU business in Windows PCs and Windows servers, look at all the other stuff they've dipped into and then abandoned.

All I have heard since the reveal of ARC is excuses, and not just from Intel themselves, though there is plenty of that going round. Every white-knight internet hopeful has spent the last three years playing whack-a-mole with anyone who points out the blindingly obvious about ARC: it has exactly the same problems Raja's previous arch had, and those were never fixed.

We are in this dire situation not because AMD wouldn't step up; they tried that several times, and we played whack-a-mole trying to push them down again. That's why we are where we are. What is wrong with us? Even when Intel showed their true colours by pushing the broken **** out at more than the price of an overpriced RTX 3060, for a card far worse, we found excuses for that too; I'm sure I'll hear them again.
No, I don't include myself in that "we".

But if it were just dGPUs having a problem, it would be one thing. It's everything: CPUs, process nodes, etc. This tells me there is a real problem at the heart of the company. Too many cooks spoil the broth, IMHO!

When Zen came out, Intel was on a better process node and could easily have done something to address it. They had better packaging technology, L4 cache, etc. Instead, look at the denial they were in for years, until AMD finally surpassed them in many ways. Even Jim Keller left very quickly, and that was after Raja Koduri got him onboard.

AMD wouldn't step up because, even when AMD/ATI were ahead, people still bought more Nvidia cards at higher prices.

If AMD, after decades of investment in hardware, game devs, consoles and software, still has problems with Nvidia, how the heck can Intel do it in a few years? If the leadership at Intel couldn't see that software and getting devs onboard would take time and money, they are foolish. It took until 2002/2003 for people to take ATI seriously against Nvidia, and that took years of head-to-head competition.

There are literally Chinese companies trying to do the same, backed by decent money, and looking at reviews from LTT, it's all about the software side.

IMHO, a lot of Arc's problems are the drivers and getting devs onboard. That needs money, and one has to question whether the current CEO is interested in spending it over several years. The reality is they are not, which is pretty much what Intel does every time. Take away their core consumer x86 CPUs and their server CPUs, and they have either failed miserably or given up everywhere else.

Maybe we need to agree to disagree, but I think once the internet gets its claws into people, there is no reverse gear.
 
More on the rest later, but that PCGame article is exactly the sort of thing I'm talking about.

Raja tweets that AMD sent him a GPU, with no context other than to say thank you.

Everything in that article is made up; it reads like this:

AMD sent Raja the card he made before he left, because Raja is such a genius and AMD miss him so much... RDNA2 is all his, see, he's brilliant, he makes great GPUs, that's why he's in charge at Intel now, we love you Raja.

It's really no less sycophantic than that, and it's oozing with cope.
 
Also, AMD, if you're going to send one of your ex-employees one of your products as a gift, at least make it the highest-end one you make, not just the mid-level one. That's like saying "you're at the mid level of our appreciation; you're average, like a sales manager". It's a bit rude...
 

How often do you see a company send its latest product to someone who left for a competitor? The reality is AMD sent it to him (probably after asking the CEO's permission) because he was involved with it in some way.

The fact that the guy has spent the better part of 30 years at S3 Graphics, ATI, Apple, Nvidia and AMD can't mean he is as bad as random people on forums think he is.
 
An RX 6800, not even the XT, read above.
 

But they literally sent him a card, and you wouldn't send a card unless there was some involvement. So Polaris, Vega, CDNA1, CDNA2, RDNA1 and RDNA2 all have his involvement in some way. The PS5 literally has a hybrid RDNA1/RDNA2 design, and PS5 development was already a few years old by then:


RDNA development was most likely closely linked with Sony/MS and their requirements. GCN 2.0 was probably the same, as the PS4 had a GCN 2.0 design.

So Polaris was fine. The desktop Vega implementations were shoehorned compute cards, and AMD sold them at a loss (just to appear to have something). The Vega uarch worked fine in laptops. CDNA1/CDNA2 kept AMD relevant in the commercial space. RDNA1 was good enough that AMD renamed it from an RX 680/RX 690 and probably jacked the pricing up to meet the RTX 2070.

Just like the R300, R400, R520, R580, R600, RV670, R700 and Evergreen when he was at ATI. R300, R400, R700 and Evergreen were solid. R520 was a miss due to clock-speed issues, and the fixed R580 was much better (it was also a shader-heavy design, so it held up better than the Nvidia 7900 series over the long term). The R580 inspired the ATI Xenos, which was the first gaming GPU with unified shaders. R600 was a disaster, partially salvaged by the RV670.

This is why I said we'd have to agree to disagree on this. He wouldn't have spent nearly 30 years at major companies in the graphics arena (hardly a huge field, if you think about it) if he were as terrible as the forum PCMR crowd thinks he is. He certainly wouldn't have worked 13 years at ATI/AMD over two stints! I think all we are hearing is unverified social media stuff that also attacks him as a person.
He seems to be good friends with Jim Keller, for example:
 
I think they probably know they have to just stick at it. Developing a good GPU and, more importantly, good drivers takes time. They have done really well. Let's just hope they keep going.
 
The new driver is out; it improves performance in most games by between 4% and 17%.

Here are some specific examples (a short sketch of what these percentages mean in frame-rate terms follows the list):

Dead Space Remake (DX12)

  • Up to 55% uplift at 1080p with Ultra settings on Arc A750
  • Up to 63% uplift at 1440p with High settings on Arc A750
F1 22 (DX12)

  • Up to 6% uplift at 1440p with High settings on Arc A770
  • Up to 7% uplift at 1440p with High settings on Arc A750
  • Up to 17% uplift at 1080p with Ultra High Ray Tracing settings on Arc A750
Dying Light 2 Stay Human (DX12)

  • Up to 6% uplift at 1080p with High Ray Tracing settings preset on Arc A770
  • Up to 7% uplift at 1440p with High Ray Tracing settings preset on Arc A770
Dirt 5 (DX12)

  • Up to 8% uplift at 1080p with Ultra High Ray Tracing settings on Arc A750
  • Up to 4% uplift at 1440p with Ultra High Ray Tracing settings on Arc A750
Deathloop (DX12)

  • Up to 4% uplift at 1080p with Very High and Ray Tracing Performance settings on Arc A750
  • Up to 6% uplift at 1440p with Very High and Ray Tracing Performance settings on Arc A750
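As a rough illustration of what "up to X% uplift" means in practice, here is a minimal Python sketch converting a few of the uplifts above into frame rates and frame times. The baseline FPS figures are hypothetical, picked purely for illustration; only the percentages come from the release notes.

```python
# Baseline FPS values below are hypothetical; the uplift fractions are
# taken from the driver release notes quoted above.
uplifts = {
    "Dead Space Remake (1080p Ultra, A750)": (60, 0.55),
    "F1 22 (1440p High, A750)": (90, 0.07),
    "Dirt 5 (1080p Ultra RT, A750)": (70, 0.08),
}

for game, (baseline_fps, uplift) in uplifts.items():
    new_fps = baseline_fps * (1 + uplift)
    # Frame time is what you actually feel in-game.
    old_ms, new_ms = 1000 / baseline_fps, 1000 / new_fps
    print(f"{game}: {baseline_fps} -> {new_fps:.0f} fps "
          f"({old_ms:.1f} -> {new_ms:.1f} ms per frame)")
```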
 
Told you Jim Keller and Raja Koduri must get on well:

Just two weeks after retiring from Intel, Raja Koduri joined the board of directors of Tenstorrent, an AI and high-performance RISC-V CPU company that intends to challenge the blue giant in the coming years, as noticed by @SquashBionic. A renowned developer of GPUs for graphics and compute, Raja Koduri will be joining his former colleagues from AMD, ATI, and Intel.

Tenstorrent is developing datacenter solutions comprising RISC-V-based AI/ML accelerators as well as high-performance general-purpose RISC-V processors. The company was established in 2016 by Ljubisa Bajic and is presently led by Jim Keller, a renowned CPU architect who has spearheaded the creation of revolutionary processor architectures at companies such as Apple, AMD, and DEC. (We recently outlined Tenstorrent's short- and mid-term plans.)

Truth be told, a seat on the board of directors does not mean that Raja will be able to influence the development of CPUs or AI accelerators. Yet he will have his say in setting Tenstorrent's strategic goals and roadmap, which is particularly important. As a developer of AI accelerators and high-performance RISC-V CPUs, Tenstorrent is essentially travelling in uncharted waters, since AI-related technologies are changing rapidly and the open RISC-V ISA is itself poised to evolve quickly.
 
  • Assassin’s Creed Unity (DX11)
    • Up to 271% uplift at 1080p with Very High settings
    • Up to 313% uplift at 1440p with High settings
  • F1 22 (DX12)
    • Up to 36% uplift at 1080p with High settings
    • Up to 20% uplift at 1440p with High settings
    • Up to 10% uplift at 1080p with all Ray Tracing settings on
  • Deathloop (DX12)
    • Up to 10% uplift at 1080p with Ultra settings
    • Up to 8% uplift at 1440p with Very High settings
 
Apparently only for recent cards and still has issues.
Officially, probably. I'm fairly sure I've seen older cards running the update on third-party drivers and getting a decent uplift. I didn't know about the issues, though; that makes me sad.
I don't recall anyone mentioning anything about Intel/OpenGL, so maybe they don't have any problems with it. Interested to know too, TBH; it would make AMD look proper **** if Intel have no issues (and good on Intel if they've put the work in).

 