Intel Arc series unveiled with the Alchemist dGPU to arrive in Q1 2022

Soldato
Joined
9 Nov 2009
Posts
24,841
Location
Planet Earth
Intel has revealed some more details of its first Arc uarch dGPUs, the Alchemist series, which are arriving in Q1 2022:
https://www.anandtech.com/show/16886/intel-video-cards-get-a-brand-name-arc-coming-q1-2022
https://videocardz.com/press-releas...s-based-xe-hpg-architecture-coming-early-2022
https://videocardz.com/newz/intel-d...map-and-development-of-ai-super-sampling-tech

Upcoming Intel Arc graphics products are based on the Xe-HPG microarchitecture, a convergence of Intel’s Xe-LP, HP and HPC microarchitectures, that will deliver scalability and compute efficiency with advanced graphics features. Alchemist, the first generation of Intel Arc products, will feature hardware-based ray tracing and artificial intelligence-driven super sampling, and offer full support for DirectX 12 Ultimate.

Has a DLSS competitor:

Intel: AI-based Super Sampling Tech Under Development
Alongside today's announcement also comes a small mention from Intel that they will be throwing their hat into the ring for image upscaling technologies. As part of the Arc announcement, Intel has confirmed that the company is working on "artificial intelligence-driven super sampling". For the last couple of years NVIDIA and more recently AMD have been making waves with their respective DLSS and FSR technologies, and it’s fair to say that Intel doesn’t want to be left out – after all, they intend to become an equal competitor in the discrete GPU space.

Unfortunately, today’s announcement from Intel is just throwing their hat into the ring, so anyone looking for a release date or information on if Intel's in-development tech is purely spatial or spatial + temporal will be waiting a bit longer. But, for the moment, we at least have confirmation that Intel is working on their own take on the tech.
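For anyone wondering what the "purely spatial or spatial + temporal" question above actually means, here is a rough toy sketch (purely my own illustration in Python/NumPy, nothing to do with how XeSS, DLSS or FSR are really implemented). A spatial upscaler only has the current low-resolution frame to work from, while a temporal approach also blends in samples from previous, slightly jittered frames to recover extra detail:

```python
# Toy illustration only: spatial upscaling vs temporal accumulation.
import numpy as np

def spatial_upscale(frame, factor=2):
    """Nearest-neighbour upscale: each output pixel comes from the current
    low-res frame alone, so detail lost at render time cannot be recovered."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

def temporal_accumulate(history, frame, factor=2, alpha=0.1):
    """Blend the upscaled current frame into an accumulated history buffer.
    With per-frame jitter, samples land on different sub-pixel positions over
    time, which is roughly how temporal methods reconstruct extra detail."""
    upscaled = spatial_upscale(frame, factor).astype(np.float64)
    return upscaled if history is None else (1 - alpha) * history + alpha * upscaled

history = None
for _ in range(8):                      # stand-ins for successive jittered frames
    low_res = np.random.rand(270, 480)  # pretend 480x270 render target
    history = temporal_accumulate(history, low_res)

print(history.shape)                    # (540, 960): a 2x output from 480x270 inputs
```

Whether Intel's tech ends up purely spatial or spatial + temporal is exactly the bit we don't know yet.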
 
Associate
Joined
31 Dec 2010
Posts
2,440
Location
Sussex
So while this gen Nvidia needs lots of current (Ampere), Intel are going wireless and will be Arc'ing their power!
Wonder if their PR renders are any indication of the actual chip layouts?
 
Soldato
Joined
28 Jun 2013
Posts
3,660
I can't wait, even though I think they are going to struggle unless they have a very aggressive price strategy or the cards are just straight up beastly.
 
Associate
Joined
31 Dec 2010
Posts
2,440
Location
Sussex
So Q1 2022 (and Intel generally mean March by that, not January) means mass production should start sometime before the end of the year.

Intel must have final silicon back by then.

If this is TSMC's 6nm then, with that being an EUV process, the cycle time could be less than 3 months, but if they want a hard launch they have to stockpile too. Volume production starting in October, then?
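A quick back-of-the-envelope version of that timeline, with the launch date and durations being my own guesses rather than anything Intel has confirmed:

```python
# Work backwards from an assumed hard launch at the start of March 2022.
from datetime import date, timedelta

launch           = date(2022, 3, 1)     # "Q1 2022", assuming Intel means March
fab_cycle        = timedelta(weeks=12)  # a bit under 3 months wafer-to-packaged-part on an EUV node
stockpile_window = timedelta(weeks=8)   # a couple of months building launch inventory

volume_production = launch - stockpile_window - fab_cycle
print(volume_production)                # 2021-10-12, i.e. volume production from around mid October
```

Which lands more or less on the October guess above.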

There are also rumours Apple want to make their own dGPU, though presumably not to sell stand-alone. And Apple Silicon M1 has no Thunderbolt, which would be the obvious way Apple would get big spenders to part with more money: a locked-down MacBook Air with an external GPU.

Edit: seems the M1 does have Thunderbolt. Still, even if Apple were to make Thunderbolt eGPUs, the drivers would presumably be M1 macOS only.
 
Associate
Joined
31 Dec 2010
Posts
2,440
Location
Sussex
Shades are a bit off, but maybe in the right light?

However, that surely is only the case if you can actually install all three cards and drivers in the same system?
I know that Intel iGPU and AMD dGPU can get along as I've run them in the same system with the Intel driving the secondary monitor, but AFAIR the last time I tried that with an Nvidia dGPU it didn't work.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
I wonder how far behind Arc will be in performance per watt compared to AMD's Navi 21, i.e. the Radeon RX 6800 XT 16GB?

Also, we would need a very deep analysis of the image quality on all cards in order to see whether it's a fair battle or whether someone might be cheating for performance.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
The only thing I'm excited about with this announcement is that their XeSS might be open-source. Otherwise it's all pretty bland so far, with any interesting card being many years away, and likewise for their driver support.
 
Associate
Joined
23 Oct 2019
Posts
687
Hmm, Q1 2022 is interesting, as most rumours point to Q3 at the earliest for Nvidia's and AMD's next cards. Depending on performance, might this accelerate AMD and Nvidia's plans next year?
 
Associate
Joined
31 Dec 2010
Posts
2,440
Location
Sussex
Hmm, Q1 2022 is interesting, as most rumours point to Q3 at the earliest for Nvidia's and AMD's next cards. Depending on performance, might this accelerate AMD and Nvidia's plans next year?
Designing a new generation of cards is such a long process that I don't see where they could even accelerate or rush parts of it.

Let's say the normal time from start to finish for planning, design, tape-out and mass release is between 2 and 3 years. The most they could pull a generation forward is probably around 3 months, and that would most likely be a noticeably rushed job: drivers which leave performance on the table due to lack of time to fully optimise, or boards rushed out with design flaws like excess power spikes, etc. AMD have tended to do the former, while Nvidia seemed to have done the latter with Ampere.


So I tend to think that rushing is unlikely from both AMD and Nvidia. Nvidia might hold back a new Ti type product to rain on Intel's launch, though.
 
Soldato
Joined
26 May 2014
Posts
2,953
Hmm, Q1 2022 is interesting, as most rumours point to Q3 at the earliest for Nvidia's and AMD's next cards. Depending on performance, might this accelerate AMD and Nvidia's plans next year?
Even the most optimistic predictions have put the top end of Intel's first generation of cards at ~3070 performance. I doubt AMD and Nvidia are worried at all, given they'll both be breezing well beyond that with their next cards. Possibly way, way beyond if some of the rumours pan out. Of course, 3070 performance is still pretty nice and more than enough for most people, so if the card is priced well then it'll sell.
 
Soldato
Joined
3 Aug 2010
Posts
3,038
Even the most optimistic predictions have put the top end of Intel's first generation of cards at ~3070 performance. I doubt AMD and Nvidia are worried at all, given they'll both be breezing well beyond that with their next cards. Possibly way, way beyond if some of the rumours pan out. Of course, 3070 performance is still pretty nice and more than enough for most people, so if the card is priced well then it'll sell.
3070 performance at the right price and availability would be very disruptive to the market.

However, we're talking about Intel here so highly unlikely they'll get it right.
 

Associate
Joined
3 Oct 2008
Posts
1,403
Insert generic poster comment about how this is going to be great for competition and hopefully make GPUs cheaper, but Intel will join the duopoly. :p
 
Soldato
Joined
28 Jun 2013
Posts
3,660
Insert generic poster comment about how this is going to be great for competition and hopefully make GPUs cheaper, but Intel will join the duopoly. :p

Can always dream. I think Intel will either undercut them or fall flat on their face.
 
Permabanned
Joined
20 Jan 2021
Posts
1,337
Intel might set their prices a little higher than similar-level GPUs. They might then start chucking the cards into their laptops and tablets.
 
Associate
Joined
23 Oct 2019
Posts
687
Intel might set their prices a little higher than similar-level GPUs. They might then start chucking the cards into their laptops and tablets.
Hmm, good point. I'm curious how much of a focus they'll have on mobile GPUs with this series. I'm hoping you're wrong about the pricing thing though!
 
Associate
Joined
23 Oct 2019
Posts
687
Designing a new generation of cards is such a long process that I don't see where they could even accelerate or rush parts of it.

Let's say the normal time from start to finish for planning, design, tape-out and mass release is between 2 and 3 years. The most they could pull a generation forward is probably around 3 months, and that would most likely be a noticeably rushed job: drivers which leave performance on the table due to lack of time to fully optimise, or boards rushed out with design flaws like excess power spikes, etc. AMD have tended to do the former, while Nvidia seemed to have done the latter with Ampere.


So I tend to think that rushing is unlikely from both AMD and Nvidia. Nvidia might hold back a new Ti type product to rain on Intel's launch, though.
Thinking maybe a Super refresh of the current gen. As said further down the thread, it doesn't sound like this Intel GPU gen is going to be particularly competitive at the high end though, so it might just prompt some price reductions on the AMD and Nvidia side... but hey, I'd take that too!
 