
The Intel Arc owners thread

Soldato · Joined 1 Apr 2014 · 19,072 posts · Aberdeen
I have an Intel Arc A770 and I believe @oweneades has one too. Anyone else?

To get things rolling here are some results (copied from this thread):

Running the benchmark in Tiny Tina's Wonderlands now at 4K, medium settings, and getting an average of 57 fps with occasional judder. The judder is removed by turning texture streaming down from High to Medium, which also raises the average to 61 fps. I'm using an SSD rather than an NVMe drive, which may account for the streaming issue. For comparison, I get a solid 110+ fps on my 4090 at Badass settings - everything cranked.

Here are my settings:

[screenshot: Tiny Tina's Wonderlands settings (HJRIL76.jpg)]

Horizon Zero Dawn gets over 50 fps at Medium Settings with motion blur turned off.

[screenshot: Horizon Zero Dawn result (TxZlC2s.jpg)]

The Forza Horizon 4 benchmark gives over 60 fps at Medium and High settings at 4k.

[screenshot: Forza Horizon 4 benchmark (awbqhrC.png)]

Superposition managed about 70 fps at 4K.

[screenshot: Superposition result (TiB941u.png)]
 
How you guys finding it so far?

I mean for the price it feels like stonking good value and the results from what you've posted so far seem good.
 

I think it's about £100-£150 too expensive as a gaming card: the A770 needs to be about £50 cheaper than its competitor, the RX 6600 XT (maybe the RX 6650 XT). That said, fine wine will almost certainly apply if Intel commit the resources, especially with DX11 and - particularly - DX9 games. DX9 games currently go through a translation layer, which kills performance; ship a native DX9 path and performance there will increase massively. Hell, I used to play Freespace 2 until recently (these days I just watch videos) and that's DirectX 6 or 7! We'll see.

I've only given mine very limited usage - Tiny Tina's Wonderlands, mainly. I have also only used DX12 games and benchmarks.

My initial view, based on a whole 1.5 days' usage, is that Intel have made a fine start, and if they stay the course their next cards will compete with Nvidia and AMD at the top level.

I will also point out that Intel have used the same 3/5/7 naming scheme as their CPUs which means that there may be an Arc A9xx in reserve.
 
Remind me, which CPU do you have? If you have one that is much better than my 8700 then it would be interesting to see how much difference the CPU makes.

Currently a 12700K @ 5.3/5.2/4.0 (efficiency) so it should be making a difference in CPU bound areas.

Per the other thread, I think there is definitely a driver overhead issue coming into play. Its relative performance in more CPU-heavy scenarios, such as the Hitman 3 Dartmoor benchmark or the Destiny 2 Tower (notorious on older Ryzen systems for being bad), is a noticeable step down versus when it can stretch its legs.

Going from 48% behind to 17% behind the 3060 Ti in Hitman 3 does point to this. I am not really CPU bound on the 3060 Ti (at 1440p), so the Arc card shouldn't suffer such a performance hit in the same benchmark on the same CPU when it is relatively strong in the other benchmark.
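For anyone wondering how those "percent behind" figures are worked out, it's just (reference − Arc) / reference. The fps numbers in this sketch are made up purely for illustration, not my actual benchmark results:

```python
def percent_behind(arc_fps: float, reference_fps: float) -> float:
    """How far the Arc result trails the reference card, as a percentage."""
    return (reference_fps - arc_fps) / reference_fps * 100

# Illustrative numbers only: a CPU-heavy scene where driver overhead bites...
print(round(percent_behind(52, 100)))  # 48
# ...versus a GPU-bound scene where the card can stretch its legs.
print(round(percent_behind(83, 100)))  # 17
```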
 
@Quartz @oweneades
Why did you pick up an ARC GPU? Did you just want to test it out, or was there a feature it has that was useful for you?
Do you plan to have the ARC GPU as the only GPU in your system? Or in future will you stick it alongside an Nvidia or AMD GPU?

If memory serves me right, it is pretty good in Blender, so I could have some use for it as a secondary GPU in my system. (Need to upgrade the 970 first though. :D )
 

For me it was the tinkering aspect. I really wanted to try one out for myself in the games I actually play to get a proper feel for it, plus delve into different scenarios more deeply than reviews do.


I picked up a 3060 Ti FE as my day-to-day GPU mainly, so I have something with reliable per-game performance. My main monitor is G-Sync only, otherwise I would have got the non-XT 6700. I am also using it as a reference point for the Arc GPU: based on the on-paper specs, the A7xx should be around GA104 in terms of performance. In my own testing it gets there in synthetics and the odd game test, but falls behind in general, especially at lower resolutions.
 
I've installed the latest beta driver and have noticed a slight degradation in performance in Tiny Tina's Wonderlands - a bit of stuttering - but that may have been TTW auto-adjusting settings until I set them back.
 
Also tried the driver and noticed a consistent performance uplift in Destiny 2 (around 5 fps in the Throne World) and a *slight* uplift in Hitman 3, although that could just be run-to-run variance. Not had time to test anything else at the moment.
 
Ended up picking up an A750 from OcUK for my work machine. It's a good upgrade from a GTX 970, and my exports in After Effects and Lightroom are noticeably quicker.

Tried 3DMark 11 and got 17k with the 970 and 25k with the A750.
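For what it's worth, that 17k to 25k jump works out at roughly a 47% uplift over the 970; a quick sanity check:

```python
old_score, new_score = 17_000, 25_000  # 3DMark 11: GTX 970 vs A750
uplift = (new_score - old_score) / old_score * 100
print(f"3DMark 11 uplift: {uplift:.0f}%")  # 3DMark 11 uplift: 47%
```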

Tried to run 3DMark Time Spy, but I get a D3D adapter error. Currently running the beta drivers.
 
My A770 worked fine in Time Spy, weird. It has the worst coil whine in history though, really not joking: at high frame rates you can hear it outside the room with the door closed.
 