
Intel ARC and the Latest Drivers: Does this bode Well for ARC's future?

Intel made a big song and dance about how they are for the people, like some pound-shop Chairman Mao, and then dumped the A770 on us for £450, more expensive than the RTX 3060 was at the time. Even then, these mainstream idiots couldn't bring themselves to call Intel out for being worse than Nvidia.

These people are either incredibly idiotic or they have their own agenda and there is nothing "socially progressive" about that agenda.
 
If Intel can't make these GPUs profitable at prices that make sense against the competition, then they should still be selling them at competitive prices; it's what AMD did when they were stuck with Raja's Vega.

Or pack it up and walk away, because as it is right now Intel make even Nvidia look good, and we don't need that.
 
£300 for the A770 (so a price drop from where it is now) and £250 for the A750 is competitive if the games you want to play perform well. The exact same can be said for the 6650 XT if the games you play prefer the architecture (Hitman being a good example of favouring AMD, versus Destiny 2, not one of its best, which noticeably favours Nvidia hardware to the point where my 3080 laptop GPU (~3070 desktop performance) could match an RX 6800 @ 1440p UW). Nvidia don't even get a look-in with the 3060, given it is bested by both AMD's and Intel's offerings.

The initial launch prices were stupid, especially the A770 (although £450 was utterly moronic given the LE was "only" £389).

Hopefully they learn from this for Battlemage.
 
The question for me is not what I would buy, but what I would recommend to a friend.

I would buy an RX 6800 over a 3070/Ti in a heartbeat; in fact I already have my mind set on it. However, I recently recommended an RTX 3070.
You see, I'm willing to agree that historically AMD has had more problems than Nvidia. I think those problems have been exaggerated and overblown, but nonetheless I do agree.
I'm a skilled PC enthusiast and I have a degree of patience for problem solving. To my friends, their PC is not a toy, it's a tool, or rather a gateway to playing games; they just want it to work like a household white-goods appliance. If there is even a 1% chance the Nvidia GPU serves them more reliably, I owe it to them to be honest, because it's not about me, and believe me, I would like to see Nvidia taken down a few pegs!

For tech journalists I think it's even more their responsibility to be completely honest, especially these days, when they have moved from just telling you the hard facts and leaving you to make an informed decision to actively telling people what and what not to buy, as if they have a vested interest, holding up a £450 A770 posturing "buy this". No, Linus, don't scam ordinary people into bolstering your shares; that's ###### disgusting.

The truth is, chances are that if you were to do a blind test, going through a 150-game Steam library for a few weeks with similarly performing AMD and Nvidia GPUs, 9 out of 10 people wouldn't be able to tell the difference between those two GPUs.

You can't say the same for Intel. That doesn't mean they don't have value; they do. But be honest, be reasonable.
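
The blind test above is easy to formalise: randomise which of the two cards drives each gaming session, record verdicts blind, and only reveal the mapping afterwards. A toy sketch of the bookkeeping (the function names and card labels here are made up for illustration, not any real benchmarking tool):

```python
import random

def blind_schedule(games, seed=None):
    """Assign each game a hidden label ('A' or 'B') so the tester
    records a verdict without knowing which card was actually used."""
    rng = random.Random(seed)
    return {game: rng.choice(["A", "B"]) for game in games}

def unblind(schedule, key):
    """After all verdicts are in, reveal which real card each
    hidden label mapped to."""
    return {game: key[label] for game, label in schedule.items()}
```

For a 150-game library you would build the schedule once with a fixed seed, play through it, then call `unblind` with something like `{"A": "RX 6800", "B": "RTX 3070"}` to score the guesses.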
 
The 150-game Steam test (or maybe a blend across Steam, Game Pass, Epic, Uplay etc.) would actually be quite interesting from a general point of view. As far as I know, it's only LTT that did a general gaming test, not long after Arc launched, and even then only using a 3060 as a comparison. It would be interesting to get an updated take with the new driver.
 
So would I, TBH.
 
The Arc A770 16GB seems to be better than the 4070 Ti, with RT and high settings, in games that require lots of VRAM and a decent memory bus, such as Hogwarts Legacy. Yes, the 4070 Ti has higher FPS, but the A770's 16GB of VRAM and 256-bit bus eliminate a great deal of the stuttering that the limited memory bus on the 4070 Ti causes, making it more playable. Similar results were seen when tested against a 3090.
 
I will be honest here and say I don't think they are as bad as that. The fact is, at the die size they are making them, and with that much VRAM, I think it is fair to say they expected them to be a lot faster. The production cost on them would not have been cheap; from a tech-spec standpoint these are not cheap GPUs to make.

With the chip crisis and everything else I don't think they had much choice. I am also pretty sure that they are selling the reduced price 750 at a loss, too.

But yeah, the 780, is it? Or 770, IDK? It should have been up there with the 2080 Ti/3070 in performance terms, as that is what all of the leaks predicted, and thus, at £450, it would have been excellent. Now, whether it will ever catch a 3070 when the drivers mature? No idea. But right now it's up there with the 3060 and 3060 Ti, which again really isn't as bad as people think it is.

The problem is the way people think. They just love to go all out on a brand out of love and push everything to the extreme, which ends up being bad for consumers. Like, I asked around for ages about the best-value gaming CPU, as I had a couple of rigs on older workstation CPUs with low clock speeds. "3600 and 5600!" I was told, and absolutely nothing else to consider. Yeah, right. I ended up getting an 11400F and a Z590i board for less than a 5600X, and it overclocks itself on the thermal turbo to 4.4 GHz and beats a 5600X.

In fact I was so impressed with it I got another one, and a previously £540 board for £199. It absolutely decimates my own AMD 3950X for gaming. But that is the problem. AMD GPUs have also improved so much lately, especially after they finally got their head around the core bug on Vega that carried through to the 57xx GPUs, where it would black-screen, and actually fixed it. But people just seem so hard set on one brand that they can't see the wood for the trees.

If I had a bit of bad luck and one died now? I would get an A750. More than enough for PUBG, and I would happily reduce settings in other stuff. I'm done paying over £700 for a GPU.
 
I will say this, RE: @Author_25 above: I like, and am impressed by, them offering a 16GB variant in the mid-range class. Brilliant, much like AMD do. It's the main reason I have written an RTX 3070 or 3080 out of my thinking completely.

As some of you may know :D one of the biggest problems I have with Nvidia is them maximising profits at the expense of a large enough VRAM buffer to give you consistent performance. They do it also because they know tech journalists are largely idiots who don't care about it, and they have an army of disciples running around putting out fires for them.
It's also a good way to build in obsolescence, for reasons I don't need to explain here.

So kudos to Intel, good job.
 
Also, my choice was a 10900K for £500 or a Ryzen 5800X for £450; the latter is better in every conceivable way, and the motherboard was only £165, a damned good one.

It depends on when exactly you're buying.

Six-core CPUs are all about Intel now, and that is only because of competition. However, you would be amazed how many fanboys and brainwashed people AMD now have a hold over. I watched someone do an Intel mod build the other day and it was like going back several years: comments like "I can't believe you didn't go AMD" and "The only thing wrong with it is the Intel CPU" and so on.

I wish people would grow up and realise that it's that silly mentality that causes a monopoly, and actually do what they should and blame themselves for the situation they find themselves in. Yes, the Intel drivers are dodgy. They will never, ever get better until more people adopt them; it is as simple as that. Ironically, you can say the same thing about literally any new technology. I saw loads of people having issues with 7000-series AMD CPUs, from DOA chips to certain memory kits not working at all. Yet all of a sudden AMD are where Intel used to be, ffs. And you are right, tech journalists are usually completely dumb.
 
I have always built with Intel CPUs, but the one thing I cannot stand is the short life of the CPU socket, and I will commend AMD for the long life of their socket types, making upgrades and replacements cheaper and more accessible.
 
TBH a lot of that was launch bravado, and because they wanted market share. As soon as they were given a chance to show their true colours, it changed. Quite a few times they tried to wriggle out of supporting older boards, and the more time goes on, the more they will do that. Their absolutely hideous X670 board prices demonstrate just that; they single-handedly managed to put off nearly all potential buyers with their insane board prices. The way they completely fluffed a very clear chance of total victory in the GPU space demonstrates their intent even more. We need Intel to give both of them a reality slap, and that won't happen while people circulate fake rumours that they are giving up, or continually bang on about the drivers. It takes me MERE seconds to use a search engine and find out about an absolutely enormous Nvidia CPU driver bug. And this isn't the first time, either.

BTW Humbug, I meant a true, extremely high-end board, like the one I got for £199.

The AMD board I could have got for that much money looks like it came out of a 10p plastic ring machine.

The problem here, and what irks me the most, is that everyone keeps complaining about GPU prices, yet seems to want to laugh off the competition and rubbish it so that it will never become competition. Kinda like they have given in, bought an Nvidia GPU, and must defend their own silly mindset at any cost. I dunno man, humans.
 
Motherboard prices are getting ridiculous, almost as bad as GPUs, and with fewer features.

I got lucky when building my new PC last year, got the MSI Z690 Tomahawk DDR4, at £194, so saved a bit.
 
Yeah, it's mad how something like an X58 Rampage used to cost about £250 and that was expensive AF, yet here we are with boards costing a grand. AMD said that the traces for PCIe 5 were the reason for the extortionate cost, yet Intel are proving otherwise. So they are just being greedy AF right now.

The only company who can actually take on Nvidia is Intel; AMD will never be big enough for that, TBH. All they have done so far is cosy up to Nvidia with their pricing, providing absolutely no competition at all. This is why, when "influencers" go around bad-mouthing Intel GPUs, it winds me up so badly, and then in their very next video they whine about GPU prices. I mean, at least Linus has been behind Intel so far.

But yeah, a "cheap" AMD board now, at £250, looks like one you could get for about £70 in every other generation. It's absolutely ridiculous.
 

ADLINK Puts Intel Arc A-series GPUs on MXM Form Factor


by GFreeman

After GUNNIR showed the same product back in January, ADLINK is now offering both Intel Arc A-series GPUs in the MXM form factor. MXM (Mobile PCI Express Module) is a standardised form factor used mostly in laptops and some small-form-factor PCs. Product pages confirm that ADLINK offers both the Intel Arc A370M and the Intel Arc A350M in MXM form factor.

According to the specifications, the ADLINK MXM-AXe, as the product is called, is an MXM 3.1 Type A module based on an Intel Arc GPU, packing 8 Xe-cores, 128 Execution Units, 4 GB of GDDR6 memory, and a TDP of 35-50 W, which is pretty much standard for the Arc A370M. The company also offers the same product with the A350M GPU at a TDP of 25-35 W. With decent power efficiency, full AV1 hardware encoding, and support for up to four 4K displays, such a GPU would be perfect for SFF builds, and could even be a decent upgrade for some laptops.
 
If you took the best Arc, what's the equivalent Nvidia card, pls? Also, in the opposite direction, is Intel currently or imminently releasing (if they've released the info) a card that's equivalent to the 4090, pls?
 
The A750 is equivalent to the 3060 (though slower at 1080p), the A770 16GB is roughly equivalent to the 3060 Ti, but the 3060 Ti is generally faster, as there's not that much of a gap (or not as much as you'd expect) between the A750 & A770 in most games.
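
When people work out these "roughly equivalent" rankings from per-game benchmarks, the usual tool is the geometric mean of the FPS ratios, since it treats a 2x win and a 2x loss symmetrically. A minimal sketch of that calculation (the FPS figures below are invented placeholders, not real benchmark data):

```python
import math

def geomean_ratio(fps_a, fps_b):
    """Geometric mean of per-game FPS ratios (card A vs card B).
    A result above 1.0 means A is faster on average; a 2x win in one
    game and a 2x loss in another cancel out exactly."""
    ratios = [a / b for a, b in zip(fps_a, fps_b)]
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

# Invented placeholder numbers for three hypothetical games.
a770 = [60, 90, 45]
rtx_3060ti = [66, 85, 58]
print(f"A770 vs 3060 Ti: {geomean_ratio(a770, rtx_3060ti):.2f}x")
```

This is why a card can "win" several individual games yet still land below its rival on average, exactly the pattern described above for the A770 against the 3060 Ti.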
 
It really does swing wildly depending on what you're playing. At its very best, the A770 can perform on the level of a 3070. However, that's in outlier titles (Red Dead Redemption 2 for example), and on average it sits somewhere between a 3060 and 3060 Ti. At its worst (generally when running pre-DX12/Vulkan games), it's way below a 3060 or even a 2060. Performance is just all over the place right now. There's really no way to judge it other than looking up benchmarks for exactly the games you want to play, albeit keeping in mind that it's likely to be fine in anything newer using DX12/Vulkan. DXVK is also worth looking into, as that translates older DirectX APIs to Vulkan, which is much more friendly to Arc's architecture and can massively boost performance. Intel are using it as part of their driver package for certain problem DX9 titles like CS:GO, but not as standard.
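
For anyone curious how DXVK is typically deployed per-game on Windows: it ships replacement DLLs (d3d9.dll for DX9, d3d11.dll plus dxgi.dll for DX10/11) that you drop next to the game's executable. A minimal sketch of that copy step, assuming you've downloaded a DXVK release somewhere; the function name and directory layout here are hypothetical, so check the DXVK documentation for your specific game:

```python
import shutil
from pathlib import Path

# Which DXVK DLLs a game needs depends on the API it uses.
DXVK_DLLS = {
    "d3d9": ["d3d9.dll"],
    "d3d11": ["d3d11.dll", "dxgi.dll"],
}

def install_dxvk(dxvk_dir, game_dir, api):
    """Copy the DXVK DLLs for the given API into the game directory,
    backing up any originals first. Returns the copied paths."""
    copied = []
    for name in DXVK_DLLS[api]:
        src = Path(dxvk_dir) / name
        dst = Path(game_dir) / name
        if dst.exists():
            # Keep the original DLL so the install can be reverted.
            shutil.copy2(dst, dst.with_name(dst.name + ".bak"))
        shutil.copy2(src, dst)
        copied.append(dst)
    return copied
```

Removing the copied DLLs (and restoring any `.bak` files) reverts the game to the native Windows D3D runtime, which makes it easy to A/B test whether DXVK actually helps a given title on Arc.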

As for the future, no, a 4090 competitor is almost certainly a long way off. The next series of Arc cards, Battlemage, is expected in Q2 24 according to the latest Intel roadmap, so over a year away. And of course there's no guarantee it'll reach 4090 performance, or what AMD and Nvidia will be cooking up by then. In fact, looking at said roadmap it specifies another ~225W TDP part, so it certainly won't be a 4090 competitor unless Intel perform an actual miracle. It'll be another product aimed at the mid-range.

 