Intel Arc series unveiled with the Alchemist dGPU to arrive in Q1 2022

Intel revealed some more details of its first Arc uarch dGPUs, the Alchemist series, which is arriving in Q1 2022:
https://www.anandtech.com/show/16886/intel-video-cards-get-a-brand-name-arc-coming-q1-2022
https://videocardz.com/press-releas...s-based-xe-hpg-architecture-coming-early-2022
https://videocardz.com/newz/intel-d...map-and-development-of-ai-super-sampling-tech

Upcoming Intel Arc graphics products are based on the Xe-HPG microarchitecture, a convergence of Intel’s Xe-LP, HP and HPC microarchitectures, that will deliver scalability and compute efficiency with advanced graphics features. Alchemist, the first generation of Intel Arc products, will feature hardware-based ray tracing and artificial intelligence-driven super sampling, and offer full support for DirectX 12 Ultimate.

Has a DLSS competitor:

Intel: AI-based Super Sampling Tech Under Development
Alongside today's announcement also comes a small mention from Intel that they will be throwing their hat into the ring for image upscaling technologies. As part of the Arc announcement, Intel has confirmed that the company is working on "artificial intelligence-driven super sampling". For the last couple of years NVIDIA and more recently AMD have been making waves with their respective DLSS and FSR technologies, and it’s fair to say that Intel doesn’t want to be left out – after all, they intend to become an equal competitor in the discrete GPU space.

Unfortunately, today’s announcement from Intel is just throwing their hat into the ring, so anyone looking for a release date or information on if Intel's in-development tech is purely spatial or spatial + temporal will be waiting a bit longer. But, for the moment, we at least have confirmation that Intel is working on their own take on the tech.
 
Intel doesn't even have a high performance bar, let alone a price, to beat. They could literally put out an RTX 3070 equivalent, at RTX 3070-level pricing, but actually available in quantity at under £500, and people would buy it. The bigger issue is the drivers.
 
Here's hoping team blue put out some decent drivers for both Windows and Linux... don't hold my breath. I get there will be initial teething issues, so fingers crossed that over 6-12 months they will solidify.

Hopefully Intel actually wants to aggressively grab some market share, and TBF AMD/Nvidia have made it much easier with their opportunistic price increases in 2021, setting the bar relatively low for Intel. I do think you are right, there are no doubt going to be problems, but we do need more players in this area. It seems AMD/Nvidia have gotten very comfortable.
 
I'm optimistic about Xe. Raja makes some interesting designs, he should just leave the marketing to someone else :p. Are there any other high-profile peeps on the Xe team? Does anyone know of any noteworthy souls who have been working on it?

TBF, he was involved with the ATI R300, which was the first GPU they made that actually beat Nvidia. I do think with Vega, AMD basically didn't have enough money and was shoehorning the product into too many areas.
 
Agreed, so fingers crossed Intel makes good on the opportunities in play. Beat Nvidia with a big stick and we might see some proper market and performance shifts in the coming year(s).

I think both Intel and AMD have settled into not competing too hard with each other now. You can literally see them price products around each other.

For this reason, with the drivers thing, I think I'll be skipping Intel's first one or two GPU releases just to see how their software support holds up medium to long term.

You would think with so many IGPs out there, Intel should have had enough real-world testing of their drivers!! :P
 
Okay, but my comment was also about the economics. Going by transistors per cost, DG2-512 will cost Intel close to 470 mm² of silicon versus 336 mm² for Navi 22 or 393 mm² for GA104 (on a cheaper Samsung process).

Intel probably don't have any choice but to make less margin, at least for the first few years.
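To put rough numbers on the die-size point, here's a minimal back-of-the-envelope sketch in Python. The wafer prices and the dies-per-wafer approximation are illustrative assumptions of mine, not reported figures:

```python
import math

# Rough dies-per-wafer estimate for a 300 mm wafer, ignoring defect yield.
WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2):
    """Standard approximation for usable dies on a circular wafer."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Die areas from the post above; wafer prices are illustrative guesses only.
chips = {
    "DG2-512 (TSMC N6)":   (470, 10000),
    "Navi 22 (TSMC N7)":   (336, 9000),
    "GA104 (Samsung 8nm)": (393, 6000),
}

for name, (area, price) in chips.items():
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${price / n:.0f} per die (pre-yield)")
```

Whatever the real wafer prices are, the bigger die at a similar selling price is what squeezes Intel's margin.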

Considering the mega-margins AMD/Nvidia are making on their GPUs, I expect the margins will still be OK!
 
As new entrants, they might be willing to go for console margins. If their design is inefficient or has inconsistent performance they may not have a choice.

The bad news for Intel is that they are not manufacturing these themselves.

The good news for Intel is that they are not manufacturing these themselves!

So, they don't have to do the calculations AMD has to do with whatever they can get from TSMC: monster margins on Zen 3 CCDs, okay margins on APUs (which also please some laptop OEMs), some margin on GPUs, or using most of the wafers at almost giveaway prices for MS/Sony because of 'contracts'.

Although dGPUs will cut into Intel's TSMC wafers for high-margin HPC projects. Plus, I think Intel are very keen to destroy some of Nvidia's high HPC margins just for "business is war" reasons.

Maybe, but considering Intel threw billions of USD at contra-revenue for Atom, etc., they have no issue losing a few billion USD in the short term. However, the only reason they have a chance is because Nvidia/AMD have gotten greedy and formed a sort of cartel. It reminds me of Apple and Samsung doing the same with smartphones, until Chinese companies started competing better.
 
Intel's margins start at 55% (lower than Nvidia @ 65%, but higher than AMD @ 45%).

Intel makes mostly CPUs, which are higher-margin products, so it shows you how much Nvidia is making. AMD is selling a ton of consoles, which are lower margin - it wouldn't surprise me if their margins are pretty decent once you discount consoles. Once they get more supply into non-console products, I can see their margins going past 50% easily IMHO (and their net margins have gone up a decent amount). 7nm and 8nm are lagging nodes now, unlike two years ago.
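As a quick sanity check on that "discount consoles" point, here's a minimal sketch of un-blending a gross margin. The console revenue share and console margin are illustrative assumptions of mine, not AMD's reported figures:

```python
# Un-blend a company-wide gross margin into console vs non-console segments.
# All inputs are illustrative assumptions, not AMD's actual numbers.
blended_margin = 0.45    # overall gross margin (the ~45% figure quoted above)
console_share = 0.30     # assumed share of revenue from console SoCs
console_margin = 0.20    # assumed gross margin on console SoCs

# blended = share * console + (1 - share) * rest  ->  solve for rest
rest_margin = (blended_margin - console_share * console_margin) / (1 - console_share)
print(f"Implied non-console gross margin: {rest_margin:.1%}")  # -> 55.7%
```

Under those assumptions, the non-console business is already well past 50%, which is the point being made.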
 
It's very challenging to find a proper breakdown by product, but AMD's margins are coming from server CPUs (that's from either TTP or GN, can't remember). Intel's margins are on the up; they were at 45% just two years ago. 14nm has really hurt them, but they have supply which AMD (at TSMC) doesn't have. The news, though, is that Intel won't be buying up GloFo, and the rumour is they are heading to Samsung's 8nm process under licence.

I would imagine 7nm is in much better shape in terms of yields and pricing than it was two years ago when Navi 10 and Zen 2 were launched. But the thing is, Nvidia still makes over 60% of its revenue from gaming GPUs. It's why people shouldn't be paying significantly over RRP for Nvidia GPUs, as even at RRP they are making more than enough money IMHO! All the AIB partners complaining life is hard are having decent years too.

The only gamers not being ripped off are console gamers, IMHO. Even scalped consoles go for less than scalped dGPUs of slower or similar speed. Plus, a system like the Xbox Series S has an RX 580-level GPU and a Ryzen 7 4800H-level CPU, but goes for well under £300.
 
It's why AMD will stick to N6 for its next production run (CPU), let Apple pay for the risk on N5, then jump in in 2022. Intel are going straight to N6 for Arc. I do wonder if N6 will also be the test run for Navi + Zen 2 in the APU.

What I am hoping is that it means 7nm production capacity is freed up for CPUs/GPUs. I think Zen 4 is going to be quite expensive with its use of DDR5, so I can see AM4 still being around for a while. The same goes for mainstream dGPUs: if 7nm supply gets better, perhaps a rejigged Navi 22/Navi 23 with RDNA3 features, but made on 7nm, would make more sense. I can see N6 being reserved for the higher-end dGPUs at the start.
 
That's the consensus - 6600 XT > 6700 XT (and maybe 6800) being on N7 (+ or P), sliding down the stack, with the high end on N6. What is interesting is AMD's Van Gogh - it looks like it'll be an RX 6500 with Zen 2 on-die, the first of the 'new' IGP core models, along with LPDDR5; it does follow the pattern of slowly maturing new technologies.

I think it makes sense - RDNA2 is quite energy efficient on 7nm, so with some tweaks, keeping the mainstream on a lagging node makes some sense.

If it means more supply, I am all for it!!

But what if TSMC is really keen* to get customers to move to 6nm?

Although the rumour is that transistors per dollar haven't moved for 6nm vs 7nm, throughput has (maybe the EUV steps take less time than the previous DUV ones)?

So if TSMC is keen to move their clients to 6nm to get better throughput they might have to offer some incentives.

The other part is that while TSMC are not Intel and tend to keep some older nodes running, the key word might be 'some'. They may want to re-tool some 7nm lines for 6nm if it increases their total wafers-per-month output.

* Maybe with rebates or offers, but the Taiwan press is reporting wafer prices are going up on just about all nodes:
https://www.computerbase.de/2021-08/tsmc-umc-smic-preissteigerungen-bei-foundrys-fuer-mehr-gewinn/
(https://translate.google.com/transl...oundrys-fuer-mehr-gewinn/&prev=search&pto=aue)

EDIT:
Seems Hexus has picked that story up too:
https://hexus.net/business/news/com...say-tsmc-decided-raise-prices-10-20-per-cent/


I will just repeat what I said about this on Hexus.

At this point, after seeing Nvidia making massive amounts of dosh, it makes me wonder whether many of these "leaked" reports are actually real, especially in light of the fact that many larger customers of TSMC would have signed long-term contracts. It seems mighty convenient that once all these reports come out (mostly not officially confirmed), companies can make an excuse to increase prices way past any actual cost increase. Then, lo and behold, these companies post record net and gross margins. It almost sounds like some form of market manipulation.

An example was what happened to hard drive prices after those floods many years ago. Companies "were forced" to jack up prices due to the problems, but for years afterwards prices per TB remained high, and they did well out of it. It's also like when NAND becomes too cheap: all of a sudden some "disaster" occurs, like a slightly inebriated pelican flying into the factory for 30 seconds and causing six months of supply to suddenly sublimate into thin air. Even Gamersnexus in their 6600 XT review pointed this out to some degree, i.e., just because the price of copper goes up 2x, it might not mean much if the device only uses $2 worth of copper.
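To put rough numbers on that point, here's a minimal sketch; every figure in it is an illustrative assumption of mine, not a reported cost. Even the top end of the rumoured 10-20% wafer price hike barely moves the retail price if the die is a modest share of the bill of materials:

```python
# BOM sensitivity sketch - all figures are illustrative assumptions.
die_cost = 80.0       # assumed silicon cost per GPU die (USD)
retail_price = 500.0  # assumed retail price of the finished card (USD)

wafer_hike = 0.20     # rumoured 10-20% wafer price rise, taken at the top end
extra_cost = die_cost * wafer_hike

print(f"Extra cost per card: ${extra_cost:.0f}")                        # -> $16
print(f"As a share of retail price: {extra_cost / retail_price:.1%}")   # -> 3.2%
```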
 
The issue I have is that Nvidia, etc. are making massive margins despite "massive cost increases". It reminds me of what happened before the first GeForce Titan came out at nearly £1,000, in an era of flagship GPUs costing closer to half that. All of a sudden, tech sites posted a "leaked Nvidia document" detailing "how expensive" TSMC 28nm was, and people tried to justify the cost increases due to "increased" costs. Yet this was the starting point of Nvidia's rise to the money-making machine it is today.

Remember, a while back it was all about substrates, GDDR6 costs, etc. Yet Sony can make the PS5 - with a 300+ mm² 7nm SoC, 16GB of GDDR6, a PCI-E 4.0 SSD and a Blu-ray optical drive - for £450 with retailer margins and not make a loss. AMD then released that weird 8C motherboard with no IGP and 16GB of GDDR6, and I doubt that will cost more than a Ryzen 7 5700G with DDR4. I doubt OEMs are paying megabucks for that either. Then you go and look on HUKD: plenty of laptops with an RTX 3060 and a half-decent 6C CPU going for well under £1,000. The laptop RTX 3060 is a full GA106 GPU, unlike the salvaged part used in the desktop RTX 3060 dGPU. I doubt the large system integrators like Dell and Lenovo will be paying anything more than the minimum they can get away with either.

The leaking of all this news seems very good for Nvidia, AMD, Intel, AIB partners, etc., who can nicely justify jacking up their prices. Yet lots of other computer parts seem to be relatively easy to get at or below RRP, despite being affected by the same factors.

It distinctly reminds me of the few times RAM went up to silly prices, where we found out years later that the companies had a gentlemen's agreement not to bother competing with each other.
 
I'm quite sure there's a huge amount of market manipulation and cartel-like behaviour going on. These companies are better served by agreeing to keep prices high than by competing with each other, and there are multiple excuses used for why prices go high and remain high for months or years afterwards. It's broken capitalism, and it's quite widespread, especially in markets with fewer players, where it serves all manufacturers to cooperate rather than compete. Then you get the army of paid undercover shills who argue that the new high price is somehow justified because of 'reasons', few of which are significant long term but which can be used as an excuse to keep milking the consumer. The phone market was like this with Nokia and then Apple, until we got more players on the scene (from China) who were willing to compete rather than collude and cheat their customers.

Agreed and sadly true! :(
 
Well, Intel have had ages to sort out some drivers.

The AT thread on this has now been renamed as "Intel to develop discrete GPUs - Almost 5 years later, cards are here!"

Plus there is the historical thing: Intel have so many outstanding GPU driver issues which they have never fixed.

Fine wine might play a part, but from the look of it, we'd have to go back a long time to find a card which was released with so many obvious issues.

And these aren't new games, where there is dirty play all the time with "game ready" drivers, often involving sponsorships, and one company or the other - although it has been the green team far more often than the red - going out of their way to make some, possibly last-minute, changes to make the other look bad.

No, these are established titles where the performance of this Intel card is so far away from what the same card shows in synthetics that it really looks like their driver team spent far too much effort optimising for synthetics, which is always something we should find suspicious.

Wonder if, instead of people moaning about AMD drivers, they will now moan about Intel ones?! :cry:
 
Due to Intel having been so profitable for so many decades, it is easy to forget that Intel have a long history of spending $billions and having little to show for it.
Pricing all these adventures is hard (plus it would have to be adjusted for inflation), but a short list includes:
  1. Entry into networking when they were worried about Cisco and Juniper etc. Some of the work did lead somewhere.
  2. Larrabee aka x86 everywhere. A minor spin-off did launch as Knights Landing etc.
  3. Contra revenue to try to save Atom. Dumping is illegal in some markets and segments, but this wasn't quite the same.
  4. 4G and 5G modems
  5. McAfee
  6. Some other things I've forgotten?
So yes, they have made so much profit over the decades, but being the size they are also means they have wasted so much money that it is hard to keep track. The above must be close to $10 billion in today's money.

Well, not if Intel do their usual thing and give up on their entry into dGPUs!
It's a bit of a shame the A380 drivers are so poor ATM. The 6GB of VRAM and good media capabilities are something the competition does not have in that range currently.
 