Intel Arc series unveiled with the Alchemist dGPU to arrive in Q1 2022

Glad to see Intel supplying dedicated GPUs to the market. Judging by Nvidia's recent announcement, hopefully this will at least keep the entry-to-mid-tier market competitive. The A770 appears to be a solid contender against Nvidia's 3060/3070 or AMD's RX 6600-6700 XT line, even if Intel's lineup looks at least a year behind. Better to have a decent selection of competitively priced midrange cards to put pressure on the inflated prices of both high-end and entry-level cards.

3060/6600 level, and only in DX12, with far worse drivers, for the same price as a better card (6650 XT). But it's something. Performance might well improve over time as the drivers become less bad: the potential performance of the hardware looks a bit higher than what the cards actually deliver, and we know the drivers are very bad indeed. I think the drivers will get better, and I think Intel will continue to develop Arc, both hardware and drivers. Not primarily to compete in the gaming dGPU market (though I think they'll do that too, or at least try to on a small scale) but for the future of Intel:

That [can't compete with established competition at a price that will make any sort of profit] might be true for Arc Alchemist, but Intel can't afford to bin Arc entirely. Not if they want to remain a major player in the industry. The Arc project isn't just (probably not even mainly) about graphics cards for PC gaming. There are bigger markets now for massively parallel processors. Alchemist is even worse placed in those markets, but Intel has nothing else that fits apart from Arc. Intel's choices are to develop Arc into something that can compete in those markets, to develop an entirely new architecture from scratch, or to give up on entering those markets. I think they'll go with developing Arc.

So I still think this is true:

So I think they'll do Alchemist on a small scale, partly to get the Arc brand name out there to some extent and partly to aid development of the next generation of Arc (Battlemage). Alchemist has taught Intel some valuable lessons on what they've done wrong. Well, it should have done. Maybe Intel will have properly functional drivers by the time Battlemage is ready, for example.

Intel do have Pro versions of Alchemist cards for those other markets. I don't think they'll sell many. Probably an even smaller share of the market than they'll get with the home gaming versions (which might sell fairly well at the budget end of prebuilts, especially if Intel sell to those companies at cost). But trying to look at Alchemist objectively, I think it's not rubbish. Not the hardware, anyway. I think it might serve to establish Intel as a possible future alternative in those markets. Buyers there might look at Alchemist and think along the lines of "Arc has potential, so it's worth keeping an eye on. Maybe the next generation will be worth buying."
 
Intel do have Pro versions of Alchemist cards for those other markets. I don't think they'll sell many.

Don't be so sure about that. If they get their GPUs certified for Maya or whatever - and that's non-trivial - that will be sufficient. They could well leverage their relationships with Dell, HP, and Lenovo to make their pro cards the default.
 

Shadow of the Tomb Raider: XeSS vs. DLSS Comparison Review




Introduction

Intel's Xe Super Sampling (XeSS) technology is finally available, debuting with the latest update to Shadow of the Tomb Raider. Announced earlier this spring, XeSS is Intel's performance-enhancement technology, rivaling NVIDIA DLSS and AMD FSR 2.0, and it lets you improve framerates at minimal loss to image quality. XeSS, DLSS and FSR 2.0 all work on the same principle: the game renders everything except the HUD and post-processing effects at a lower resolution than the display output, then sophisticated upscaling algorithms make the result look as if it had been rendered at native resolution. Depending on the game, there are subtle differences in the implementations of Intel's Xe Super Sampling (XeSS) and NVIDIA's Deep Learning Super Sampling (DLSS), so we are keen to have a look at both in this title.
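As a rough illustration of that principle, here is a minimal sketch that computes the internal render resolution from the output resolution and a per-axis render scale. The scale factors listed are the commonly cited DLSS ones and are assumptions for illustration, not values taken from this review; XeSS uses its own, broadly similar, factors per quality mode.

```python
# Minimal sketch: internal render resolution for a temporal upscaler,
# given an output resolution and a per-axis render scale.
# The scale factors below are the commonly cited DLSS ones and are
# assumptions for illustration only.
SCALE_BY_MODE = {
    "Quality": 0.667,      # roughly two-thirds of output resolution per axis
    "Balanced": 0.58,
    "Performance": 0.5,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the (width, height) the game actually renders at."""
    scale = SCALE_BY_MODE[mode]
    return round(out_w * scale), round(out_h * scale)

if __name__ == "__main__":
    for mode in SCALE_BY_MODE:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode}: renders at {w}x{h}")
```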

Below, you will find comparison screenshots at 4K, 1440p, 1080p, and in different XeSS and DLSS quality modes. For those who want to see how DLSS and XeSS perform in motion, watch our side-by-side comparison video. The video can help uncover issues like shimmering or temporal instability, which are inherently not visible in screenshots.

All tests were performed on a GeForce RTX 3080 at Ultra graphics settings with ray tracing enabled; motion blur and depth of field were disabled for better image viewing. DLSS was manually updated to version 2.4.12 by swapping the DLL file.
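For reference, the DLSS version swap mentioned above is just a file replacement. Here is a minimal sketch of that step; nvngx_dlss.dll is the standard DLSS runtime filename, but the install and download paths shown are hypothetical assumptions, not the paths used in this review.

```python
# Minimal sketch of swapping the DLSS runtime DLL in a game's install folder.
# Both paths below are hypothetical examples.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\Shadow of the Tomb Raider")      # hypothetical install path
new_dll = Path(r"C:\Downloads\dlss_2.4.12\nvngx_dlss.dll")  # hypothetical newer DLL

target = game_dir / "nvngx_dlss.dll"
shutil.copy2(target, target.with_name(target.name + ".bak"))  # back up the shipped DLL first
shutil.copy2(new_dll, target)                                  # drop in the newer DLSS build
print(f"Replaced {target} with {new_dll}")
```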



Conclusion

In Shadow of the Tomb Raider, none of the anti-aliasing and upscaling solutions use sharpening filters in the render path. You can still enable AMD FidelityFX CAS when TAA is enabled, but for our testing we left it disabled. It's also important to note that Shadow of the Tomb Raider can be launched in either DirectX 11 or DirectX 12 mode, and XeSS only supports the DirectX 12 API in this game. If you have been playing in DirectX 11 mode, you will have to switch to DirectX 12 to use XeSS.

Compared to native TAA, XeSS image quality is a very noticeable upgrade across all resolutions. The in-game TAA solution produces a very blurry overall image at everything below 4K and renders small object detail (tree leaves or fishing nets, for example) very poorly. All of these issues are resolved with XeSS. Compared to DLSS, XeSS image quality is very close to what DLSS can output, with some differences in temporal stability. The most noticeable difference is how water puddles render: with XeSS they appear at a noticeably reduced resolution and look very jittery, which may be quite distracting for some people. This puddle jitter is visible even in 4K XeSS Quality mode, and the lower the internal resolution, the more visible it becomes. The second most noticeable difference is hair rendering, which with XeSS looks pixelated in motion and can be distracting. There are also some differences in how XeSS handles ghosting compared to DLSS: at 1440p and above XeSS behaves similarly to DLSS, while at 1080p it shows more ghosting on small objects such as falling leaves or distant walking NPCs.

Interestingly, the performance gains from XeSS differ noticeably from those of DLSS or FSR 2.0, which have delivered essentially equal gains in most games. Because we are testing XeSS on an RTX 3080, which lacks the XMX units that accelerate XeSS on Intel's Arc GPUs, the gains are smaller than what we can expect on Arc hardware, so keep that in mind. That said, the difference in performance gain between XeSS and DLSS is about 10% at 4K Quality mode, in favor of DLSS. Still, compared to native 4K, XeSS manages to deliver up to 40% more performance while using the DP4a instruction path that is compatible with all GPU architectures, which is a quite decent uplift.
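To put those percentages into concrete numbers, here is a quick sketch with a hypothetical native framerate; the 40% XeSS uplift is the figure from the review, while applying the roughly 10% XeSS-vs-DLSS gap on top of it is an interpretation rather than a measured result.

```python
# Worked example of the uplift figures above, using a hypothetical
# native-4K framerate as the baseline.
native_fps = 60.0                           # hypothetical baseline at native 4K
xess_quality_fps = native_fps * 1.40        # up to ~40% faster with XeSS Quality (DP4a path)
dlss_quality_fps = xess_quality_fps * 1.10  # DLSS roughly 10% ahead of XeSS at 4K Quality

print(f"Native 4K:    {native_fps:.0f} fps")
print(f"XeSS Quality: {xess_quality_fps:.0f} fps")
print(f"DLSS Quality: {dlss_quality_fps:.0f} fps")
```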
 

Not bad at all :D , this will keep many budget gamers happy and spare them being ripped off by the usual duopoly. Let's see if Intel can provide the goods at the price with good drivers, because they should be onto a winner at these prices.


EDIT: May actually grab an A770 16GB Limited Edition when it's out to play with; it seems a very sensible price even if performance is lacking compared to the rest. 16GB is very useful for certain apps, and it may make a nice HTPC card too.
 
Yup, always have been keen on the Intel entry as I'm not a top-end gamer, so I'm quite looking forward to seeing how the A770 plays out and am likely to grab one at those prices, depending on where it sits.
 
So currently at $1.11 to £1, we're looking at roughly (plus VAT):

$349 = £377 A770_16
$329 = £356 A770_8
$289 = £312 A750

However, the prices for Nvidia FE cards didn't follow that kind of logic; in practice it's rarely 'take the dollar cost, convert it to pounds, then add 20% to find the UK MSRP', and it's normally cheaper.

Using the poor $1.11 = £1 rate we're at right now:
3060ti = $399 = £360, + 20% = £431, but they're up for £369
3070ti = $599 = £539, + 20% = £647, but they're up for £549 etc etc

Retailers add their percentage on and call it something else, so hopefully the LEs should end up at roughly their dollar value for us too. £340 / £320 / £280 would be logical to see, which is where the pricing is meant to sit: 'A770 ~ 3060 Ti in DX12 and should be just below its price' was what they said.
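For what it's worth, the naive conversion used above is easy to reproduce. Here is a small sketch using the exchange rate and MSRPs quoted in this post, with UK VAT at 20%.

```python
# Sketch of the naive 'dollar MSRP -> pounds -> add 20% VAT' conversion
# used above. Exchange rate and MSRPs are the figures quoted in the post.
USD_PER_GBP = 1.11
VAT = 0.20

msrp_usd = {"A770 16GB": 349, "A770 8GB": 329, "A750": 289}

for card, usd in msrp_usd.items():
    gbp_ex_vat = usd / USD_PER_GBP
    gbp_inc_vat = gbp_ex_vat * (1 + VAT)
    print(f"{card}: ${usd} -> £{gbp_ex_vat:.0f} ex VAT -> £{gbp_inc_vat:.0f} inc VAT")
```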
 
I reckon that even if Intel price the A750 at £289 inc VAT they'll be too expensive. MSRP for the RTX 3060 is £10 more. And the RTX 3060 has mature drivers and Nvidia has mindshare, so if you can pay £10 more for the real deal then that's a no-brainer. And then there's the RTX 3060 Ti at £369 MSRP; granted the cheapest price on OCUK is £440 but Nvidia have the stock to flood the market.

Reduce everything by £50 and the decision becomes more complex.
 
In news that will excite nobody here, Intel have launched the A310.


I doubt if it will excite anyone anywhere. What's the intended market? Builders of desktop PCs for which the GPU performance is irrelevant will use CPUs with built-in graphics because that's the cheapest way to meet their requirements. If GPU performance is of any relevance, A310 is pointless. So the only possible market I can see is people seeking to replace a very old budget graphics card that has failed.
Yup, always have been keen on the Intel entry as I'm not a top-end gamer, so I'm quite looking forward to seeing how the A770 plays out and am likely to grab one at those prices, depending on where it sits.

I'm hanging around to see what plays out with Arc and rdna3. I'm thinking that I'm more likely to go with the latter, but I'm not ruling out A770. If full reviews show that Intel has managed to make the drivers at least adequate and if performance doesn't suffer badly from lack of support for anything older than DX12 and if AMD don't release anything better at a similar price, I might buy an A770. The 16GB variant if the price difference is only ~£20. What would be the point of the 8GB variant if they're otherwise the same and the price difference is only ~£20?

But I think I'd probably rather have a 6650 XT at that price point, even if RDNA 3 doesn't bring anything new to the table there. Probably. I'll wait and see.
 
I doubt if it will excite anyone anywhere. What's the intended market? Builders of desktop PCs for which the GPU performance is irrelevant will use CPUs with built-in graphics because that's the cheapest way to meet their requirements. If GPU performance is of any relevance, A310 is pointless. So the only possible market I can see is people seeking to replace a very old budget graphics card that has failed.


I'm hanging around to see what plays out with Arc and rdna3. I'm thinking that I'm more likely to go with the latter, but I'm not ruling out A770. If full reviews show that Intel has managed to make the drivers at least adequate and if performance doesn't suffer badly from lack of support for anything older than DX12 and if AMD don't release anything better at a similar price, I might buy an A770. The 16GB variant if the price difference is only ~£20. What would be the point of the 8GB variant if they're otherwise the same and the price difference is only ~£20?

But I think I'd probably rather have a 6650 XT at that price point, even if RDNA 3 doesn't bring anything new to the table there. Probably. I'll wait and see.
No clue why the difference between the two is only £20; no reason to buy the 8GB at all IMO either, agreed.
 
Tell me, other than out of curiosity, would you buy an A750 for £289 when you can get an RX 6600 for £10 less?
 
Apparently the A750, not the A770, matches or betters the RTX 3060 in DX12 and Vulkan.


So the A770 might be 3060 Ti level? Not too bad I guess, but I think they might struggle to gain traction if the UK RRP is near £400.

I think if they can get 3060ti performance for £300 or under they would likely sell a lot. That would be a reasonable amount of money, and a lot of performance for 1080p/1440p gaming.
 
Who is actually going to be selling these Intel GPUs?

I don't see them listed anywhere; you'd have thought there'd at least be placeholder pages ready to view... somewhere.

Are they a direct purchase from Intel only or something?
 
Who is actually going to be selling these Intel GPUs?

I don't see them listed anywhere; you'd have thought there'd at least be placeholder pages ready to view... somewhere.

Are they a direct purchase from Intel only or something?

I'd say loads of them are going to the likes of Dell et al.
 