Intel Arc series unveiled with the Alchemist dGPU to arrive in Q1 2022

OK, first game tested, and the most important one to me as it's my most played game: Destiny 2 (DX11 IIRC) runs fine. I don't have a direct comparison, but based on the delta to my wife's laptop (desktop RTX 3070 level performance from the 160W RTX 3080 Mobile), I would put the A750 around RTX 3060 performance. Maybe a little above or maybe a little below, hard to tell. Need to run a "benchmark" between the two to get exact numbers. It's a lot faster than the OG Titan so........ :D


EDIT: One thing I have noticed is that my second monitor (Alienware 1080p 240Hz IPS panel) only goes up to 144Hz. I am also missing a number of the sync options in the Arc Control panel. I need to do more digging on this, but it's a tad odd.

EDIT 2: Managed an entirely unscientific "benchmark" in D2 by standing in the exact same place in the Throne World and looking at the exact same spot on both machines. Resolution of 1440p and all max settings, HDAO and motion blur off:

A750 (2.5GHz core) - 75-80FPS
RTX 3080 (160W Mobile = RTX 3070/Ti / RX 6800 in D2 specifically) - 100-110FPS

So around a 30-35% performance gap on average, with up to 40% comparing the high and low ends. Possibly more to squeeze from the Arc card, as I noticed performance had some random drops before returning to normal a few seconds later. Overall it feels like it would be matching an RTX 3060 / RX 6600XT as it stands right now (note the latter is down to the game generally performing better on Nvidia hardware, from my own testing).

EDIT 3: If you happen to buy an Arc card, DO NOT set the GPU performance boost straight to 100. I thought this was like a 100MHz overclock, but my system started lagging out and then froze. I don't know exactly what the value translates to, but at around 25 it gives another 100MHz or so on the core. I am guessing that 100 would be around 2.8-2.9GHz, hence why the system locked up. Another area Intel need to work on: driver crashes.
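For the curious, here's the back-of-envelope maths behind that guess, as a minimal sketch assuming (purely my assumption from the one data point above) that the boost slider maps linearly onto a core clock offset:

```python
# Back-of-envelope estimate of Arc core clock vs. the GPU Performance Boost
# slider. Assumes a linear mapping - a guess based on a single observation
# (slider ~25 gave roughly +100MHz on a 2.5GHz core).
BASE_CLOCK_GHZ = 2.5
MHZ_PER_POINT = 100 / 25  # ~4MHz per slider point, extrapolated

def estimated_clock_ghz(slider: int) -> float:
    """Estimated core clock in GHz for a boost slider value (0-100)."""
    return BASE_CLOCK_GHZ + slider * MHZ_PER_POINT / 1000

for slider in (0, 25, 50, 100):
    print(f"slider {slider:3d} -> ~{estimated_clock_ghz(slider):.2f}GHz")
# slider 100 -> ~2.90GHz, which would explain the lock-up
```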
 
Those interested in Intel Arc may want to check out this livestream that is starting shortly.

TLDW: they had big problems getting video capture to work, wasting an hour. There was a long discussion about AV1 encoding not actually being useful right now, as Twitch doesn't support it and the OBS beta does not support it on Intel GPUs. The card performed decently in benchmarks. They tried two games, Spiderman and Deep Rock Galactic. It crashed in the former (though they noted that the game was flaky) and performed flawlessly in the latter.

They really hated that Arc's control software is an overlay, but it didn't crash.
 
It was an interesting stream, especially as they were essentially experiencing it at the same time as me. I had forgotten that the 770 has RGB whereas my 750 just has little silver rings around the fans. One thing that did slightly annoy me was that Graham mentioned having both a 3060 and a 6650XT in the shop, yet they didn't try to use either as a performance comparison. Nothing crazy, but just a Superposition run would have sufficed.

The capture issues were strange, although I do recall Linus having weird chroma subsampling / gamma issues when doing their game test stream. Something odd going on with HDCP, maybe?

The control panel being an overlay is weird, as on first load it doesn't look like it's an overlay. I kept trying to move it around and then realised why it wouldn't move.

Just had some really weird performance issues in CP2077. The in-game benchmark was giving me 60FPS @ 1440p Ultra yet only 47FPS @ 1440p High. Then in game at Ultra settings it refused to go above 35FPS....

On the plus side, when it is working, High @ 1440p with Quality FSR achieves a decent 60FPS lock, so the game is quite playable. Ray tracing seems unplayable, at least at 1440p. Not really sure if I was expecting too much or there is a driver issue (given the above experience, I am not ruling out the latter).

No crashes yet that weren't caused by me being stupid with the Control panel settings.

Additionally ran Timespy and Superposition and scored:

Timespy - 13,141
Superposition (1080p Extreme) - 7,955

Both compare favourably to the 3080M (3070) so there is certainly potential for the card / architecture to perform well.

How much of the weird performance swings can be fixed in software is somewhat unknown, but it will certainly be interesting to see what they can do (again, I would not recommend buying one on the basis that it *might* get more in line with a 3060Ti / 6700(XT) in the future - DX9 titles excluded, given the architectural limitations, of course).

EDIT: One last test before I really need to get some sleep!

Tried out the DX9 Unigine Heaven and it scored 2595 (Avg 103 FPS) with the 1080p Extreme preset as per the benchmark thread. Whilst that is around 40% slower than the 3080M, it isn't as bad as I was expecting. :)
 
GN say the Cyberpunk presets are broken and only work when changing every setting manually.
 
OK, so I have picked up a 3060Ti FE to act as my primary gaming card until the next gen launches, mainly so I don't get too annoyed when I come across something that Arc refuses to run. Thought for a bit of fun (and science!) I would compare it to the A750. Will probably also throw in a few RTX 3080M comparisons along the way, because why not. I know Intel don't position the 750 against the Ti, but the pricing isn't that far off, and you could always pick up the 6700 non-XT for Ti-like performance for around A750 money, so it's not too daft either. Note - I went Ti as my primary display is G-Sync only.

First quick taster results are in and, well, it is a real mixed bag for the A750:

FH5 - High - 1440p

ARC A750 - 101FPS
RTX 3060Ti - 147FPS

Destiny 2 - Highest (mix) - 1440p

ARC A750 - 91FPS
RTX 3060Ti - circa 120FPS - horrendous stuttering, which I think was caused by remnants of the Arc driver hanging around (it was fine in FH5).


Superposition - 1080p Extreme

ARC A750 - 7,955
RTX 3060Ti - 7,064

Quite interesting results so far. Superposition highlights the theoretical performance that Arc does have, pulling in RTX 3070 matching performance. However, FH5 reins it back in with a 45% delta to the 3060Ti. Destiny 2 is DX11, so I was expecting the Ti to dominate here, but I was quite surprised by how much. I need to do more testing on this one, as I am sure the FPS is also noticeably higher than the 3080M on my wife's laptop. I *think* this might be down to the stronger CPU (12700K vs 5800H), but it's hard to say without digging deeper.
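For transparency, the deltas I'm quoting are just the simple ratio between the two results, give or take rounding; a quick sketch of the sums (the Destiny 2 figure for the Ti is approximate given the stuttering):

```python
# Percentage gap between the two cards in each test, computed as
# (higher / lower - 1) * 100. The Destiny 2 3060Ti figure is approximate.
results = {  # test: (A750, 3060Ti)
    "FH5 1440p High (FPS)": (101, 147),
    "Destiny 2 1440p (FPS)": (91, 120),
    "Superposition 1080p Extreme (score)": (7955, 7064),
}

for test, (a750, ti) in results.items():
    leader = "3060Ti" if ti > a750 else "A750"
    delta = (max(a750, ti) / min(a750, ti) - 1) * 100
    print(f"{test}: {leader} ahead by {delta:.0f}%")
```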

Saying all of that, subjectively the performance between the two cards was quite hard to tell apart with the FPS counter off and just enjoying the game. In all the games I have tested so far (more than the above), the A750 has not come away with an unplayable result or crashed for reasons that weren't caused by overclocking.

I now need to carry on doing more testing (tomorrow now due to the time). :)
 
Asrock should be in stock in around a couple of weeks. I am still questioning the pricing as it seems extremely high to me, especially on the 750 and 770, to the point that if it does not improve we shall be increasing pricing on those two lines:



I personally would stick with AMD or NVIDIA, but for those happy to play and experiment, well, we have a third player at the table. :)
 
I am still questioning the pricing as it seems extremely high to me especially on the 750 and 770,

Yes, I think you are right to question. The A750 and A770 are at least £100 - if not £150 - too expensive. And the Asrock A770 is £50 more than the FE card available from a competitor (which again is at least £100 too much).

Then again, I paid you a £100 excess for my RTX 4090 :D :D :D

BTW the text of your listings doesn't mention that they are Asrock cards - you have to look at the pictures quite closely.

Can someone give DCS World a whirl with the A770? Would be great.

Ping the Adamant IT2 and Grim Reapers Youtube channels.
 
Spoken to Asrock; it seems Intel is supporting launch MSRPs, willing to do a lower price but only for a limited amount of stock at launch. As such, the prices for Asrock on our site are pretty much correct as their long-term price; if they are reduced, it will be for a limited quantity at launch.
 
Spoken to Asrock; it seems Intel is playing the fake MSRP game: willing to do a lower price, but only for a limited amount of stock at launch. As such, the prices for Asrock on our site are pretty much correct as their long-term price; if they are reduced, it will be for a limited quantity at launch.

Can't stand this dodgy MSRP business of winning reviews off the back of artificially low launch pricing. Just launch at a price and make sure that is still the price in six months' time. So if our price drops, BUY. The talks so far seem to be around doing a special supported price for just 5-10 units sold. I will be bluntly honest: absolute joke!

How come you're about 40 quid more than other places currently? Is it because you didn't have stock at launch?

I would have bought day 1 at MSRP, but it's a shame the days of 1:1 US-to-UK pricing are gone (obviously not your fault).
 
Could be; we're a little fed up of playing the fake MSRP launch game, so right now the prices on our website are on something like a 6-8% margin based on the TRUE cost of the product, not a supported/rebated cost for a select quantity.

My advice to anyone wanting an Arc card is to buy whilst they are somewhat cheaper, because the prices you see on our site will be the norm, at least for Asrock anyway. That is not to say it's an Asrock issue, but considering it's such a small range of products there seems to be quite a bit of confusion.
 
Spoken to Asrock, it seems Intel is playing the fake MSRP game. Willing to do a lower price but only for a limited amount of stock at launch

Then Intel are being foolish: on a price / performance basis the A750 and A770 are simply overpriced. The A750 should be noticeably cheaper than the Radeon RX 6600 (i.e. sub £260) and the A770 noticeably cheaper than the RX 6600XT (i.e. sub £350).

I hope you sell loads but with competitor stock readily available I think only the experimenters are going to bite at those prices.
 
Doubt we will sell many; I firmly agree with you. I'd personally buy the AMD 6600/6600XT. :)
 
Definitely something odd going on when it comes to driver CPU overhead.

Continuing my testing, this time with Hitman 3, 1080p/1440p Ultra settings, built-in benchmark.

1080p

Dartmoor

A750 - 101.41
3060Ti - 148.06 (+46%)

Dubai

A750 - 158.18
3060Ti - 198.34 (+25%)

1440p

Dartmoor

A750 - 78.71
3060Ti - 117.01 (+48%)

Dubai

A750 - 107.86
3060Ti - 127.19 (+18%)

Performance loss from 1080p to 1440p

A750 - 22% / 32%
3060Ti - 21% / 36%


Some interesting results, which highlight that there is possible CPU overhead in the drivers holding the Arc cards back. The performance in the more CPU-bound Dartmoor test is playable but well behind the 3060Ti. In the GPU-bound Dubai test the delta is way lower, even at 1080p where both cards are well north of 120FPS. The delta between cards is halved switching from the CPU-bound to the GPU-bound test. I am using a 12700K, so I would expect minimal if any CPU limitation for these cards, especially at 1440p.

The A750 also loses less performance jumping from 1080p to 1440p in this GPU-bound scenario. I can only guess that this performance-loss delta would be greater against the 3060 / RX 6600 class cards given their weaker back ends.
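As a sanity check, here's a quick sketch of how those percentages fall out of the raw FPS numbers (rounding may differ by a point here and there):

```python
# Reproduce the Hitman 3 deltas above: the 3060Ti's lead per scene and
# resolution, plus each card's FPS loss moving from 1080p to 1440p.
fps = {  # (card, scene): (1080p FPS, 1440p FPS)
    ("A750", "Dartmoor"): (101.41, 78.71),
    ("3060Ti", "Dartmoor"): (148.06, 117.01),
    ("A750", "Dubai"): (158.18, 107.86),
    ("3060Ti", "Dubai"): (198.34, 127.19),
}

for scene in ("Dartmoor", "Dubai"):
    for i, res in enumerate(("1080p", "1440p")):
        lead = (fps[("3060Ti", scene)][i] / fps[("A750", scene)][i] - 1) * 100
        print(f"{scene} {res}: 3060Ti ahead by {lead:.0f}%")

for (card, scene), (at_1080p, at_1440p) in fps.items():
    loss = (1 - at_1440p / at_1080p) * 100
    print(f"{card} {scene}: -{loss:.0f}% from 1080p to 1440p")
```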

Now, as a side note, I had to do a double take at the 1440p Dartmoor results. Going by Techspot / HUB, my A750 is performing in line with their results, but my 3060Ti is outperforming the 3070Ti in their table. Now, they helpfully don't list where the results are taken from, but I was kind of assuming that the 3060Ti would align given the A750 does.
I have also checked whether the new Nvidia driver is making the difference, and according to the TPU article it makes only a minor difference in Hitman 3. I have re-run the tests multiple times and get results within margin of error each time.

Whatever is causing the performance issues on the Arc card(s), it is quite fascinating seeing the numbers and how close / far away they can be from a similar class card. Now fingers crossed Intel can keep the driver updates coming, which will hopefully release at least some of that potential.


EDIT: Just seen there is a beta driver a week newer than the stable release driver. Will give that a whirl later today (was going to say tomorrow, then saw the time... probably should get some sleep!!).
 
Latest beta driver has made no noticeable difference in either Destiny 2 or Hitman 3, at least not in terms of performance. I do now appear to have the correct refresh rates displaying for my second screen, which is nice. It was either a bug with the original driver, or it might have been resolved by simply reinstalling that driver. Either way, it is now correctly displaying 240Hz.

Done some more tests around driver overhead by trying to rule out a CPU bottleneck in isolation. Turned up the wick on my 12700K so it now pushes 5.3/5.2/4.0 (efficiency) depending on core load, and it made a grand total of no difference in either game tested. Next thing I will try is ensuring the game can only run on the P cores, and maybe turning off the E cores altogether, to see if there is some kind of weird core load going on in the background. Doesn't cost anything beyond time, so no harm in trying. :)
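For anyone wanting to try the same thing, a minimal sketch of the affinity approach using psutil. The big assumption: on a 12700K the sixteen hyperthreaded P-core logical CPUs usually enumerate as 0-15 ahead of the four E-cores (16-19), but verify that on your own system first, and the process name below is just a placeholder:

```python
# Minimal sketch: restrict a running game to the P-cores on a 12700K.
# ASSUMPTION: logical CPUs 0-15 are the eight hyperthreaded P-cores and
# 16-19 are the E-cores - check the enumeration on your own system.
import psutil

def pin_to_p_cores(process_name: str, p_cores=range(16)) -> None:
    """Set CPU affinity to the given cores for every matching process."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == process_name.lower():
            proc.cpu_affinity(list(p_cores))
            print(f"Pinned PID {proc.pid} to CPUs {list(p_cores)}")

pin_to_p_cores("destiny2.exe")  # hypothetical process name
```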
 
Grim, I paid £389 for the 16GB Intel one.
 
Well, I got impatient; my A770 arrived this morning and I've put it in my backup system, an Intel i5-8700 (non-K), and got 11,097 on Timespy.

I also ran the XeSS tests, and with a display resolution of 4K I got 36.8 FPS in performance mode.
 