The Intel Arc owners thread

Anyone had any experience with Arc and Plex transcoding in Windows?

I'm thinking of getting an A770 for this purpose as the drivers have improved a lot recently... but it must work with Plex.
 
Anyone with first-hand experience of how the current drivers are doing? The A770 is looking tempting now that the 16GB model is at 300 quid.
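
A quick way to sanity-check the card's media engine outside Plex is a short hardware transcode through ffmpeg's Quick Sync (QSV) path, which is what Arc's encoders expose. A minimal sketch, assuming an ffmpeg build with QSV support on the PATH; the file names are placeholders:

    # Hypothetical sanity check: push a short clip through the Arc's media
    # engine via ffmpeg's Quick Sync (QSV) decode/encode path, outside Plex.
    # Assumes an ffmpeg build with QSV support on the PATH; "sample.mkv" is
    # a placeholder for any local test file.
    import subprocess

    result = subprocess.run(
        [
            "ffmpeg", "-y",
            "-hwaccel", "qsv",       # decode on the GPU's media engine
            "-i", "sample.mkv",
            "-t", "10",              # only transcode the first 10 seconds
            "-c:v", "hevc_qsv",      # encode with the QSV HEVC encoder
            "-c:a", "copy",
            "out.mkv",
        ],
        capture_output=True,
        text=True,
    )
    print("QSV transcode OK" if result.returncode == 0 else result.stderr[-500:])

If that completes cleanly, the driver side should have everything Plex's hardware transcoding needs (bearing in mind hardware transcoding in Plex requires a Plex Pass).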

It does still depend on the games you want to play, but with the most recent beta drivers the performance is finally fixed in Cyberpunk 2077 (along with work, no doubt, from CD Projekt Red).

Overall performance in the latest titles isn't that far off my Legion's 3080M (which clocks in at desktop 3070-tier performance, but with 16GB of VRAM, so none of its desktop counterparts' problems...).

For example, at the same settings in Hogwarts Legacy (3440x1440, high with a couple of settings on medium, DLSS Quality / XeSS Quality; the game reports the same render resolution), the Legion is in the 80-90 FPS range whilst the A770 is in the 70-80 range. It is still behind in something like Destiny, but half of the blame there lies with Bungie: based on my own experience, if you aren't running Nvidia then GTFO (seriously, it took months for RDNA2 to be remotely playable).

So overall, both my A750 and A770 have come on in leaps and bounds compared to the original launch drivers. There is still some sort of driver overhead stopping performance from being quite where it should be, but given the gains so far, it's hopefully only a matter of time before we see more. At the very least it bodes well for Battlemage (as long as the price is right, of course).
 
That sounds very promising.
 
Even when it's the game developer's fault, I couldn't deal with a game simply not working, or having major problems, on an Arc card while watching everyone else play it just fine.

I remember when I had a Clevo laptop with a Radeon 7970M and the GTA V PC release simply would not work. It was many, many months before it was fixed, and that put me right off buying anything that wasn't 'mainstream'.

Also, looking at the benchmarks, the A770 seems really slow :confused:
 
Do these Arc cards have adaptive vsync like Nvidia cards? As in, they lock to vsync at 60fps, but when the card can't do 60fps they automatically turn vsync off to limit the frame drop that vsync would have caused if it had stayed on.
 
FreeSync via DisplayPort. They call it Smooth Sync.
That's not what I'm on about. I'm on about vsync, not FreeSync or G-Sync.

Vsync works on any monitor or TV and stops image tearing, but if you can't maintain 60fps and can only do, e.g., 58fps, then the framerate will drop to the next vsync point, which is 30fps (or 45 if triple buffering is available).
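
To put numbers on that "next vsync point": with double buffering, every frame has to occupy a whole number of vblank intervals, so the displayed rate snaps to refresh / ceil(refresh / fps). A minimal sketch of the arithmetic, assuming a 60Hz refresh:

    # Rough sketch of double-buffered vsync quantization at a fixed refresh:
    # a frame that misses a vblank waits for the next one, so the displayed
    # rate snaps to refresh / (whole number of vblank intervals per frame).
    import math

    def vsync_rate(render_fps: float, refresh_hz: float = 60.0) -> float:
        # Vblank intervals each frame occupies, rounded up to a whole interval.
        intervals = math.ceil(refresh_hz / min(render_fps, refresh_hz))
        return refresh_hz / intervals

    print(vsync_rate(58))   # 30.0 -- just missing 60fps halves the framerate
    print(vsync_rate(75))   # 60.0 -- capped at the refresh rate
    print(vsync_rate(29))   # 20.0 -- the next step down is 60/3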

I'm guessing this adaptive vsync is an Nvidia-only feature that keeps vsync on while you're at 60fps and automatically turns it off if you drop below 60fps. Major selling point for me, since I always play with vsync on and can't stand tearing, but I don't want to fork out for a screen that supports other sync options: I use a TV as a monitor and the missus uses it as a TV when I'm not on the computer.
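
The behaviour described boils down to a per-frame check along these lines. A hypothetical sketch, not Nvidia's actual driver logic: vsync stays on while frames finish inside one vblank interval, and turns off the moment they don't, trading a little tearing for the snap down to 30fps:

    # Hypothetical per-frame adaptive vsync decision (not Nvidia's driver code):
    # keep vsync on while the renderer holds the refresh rate, switch it off
    # the moment it can't, so the framerate sags to ~58fps instead of snapping
    # down to the next vsync point at 30fps.
    def adaptive_vsync_enabled(frame_time_ms: float, refresh_hz: float = 60.0) -> bool:
        vblank_interval_ms = 1000.0 / refresh_hz   # ~16.67ms at 60Hz
        # Frame finished within one vblank interval -> safe to wait for vsync.
        return frame_time_ms <= vblank_interval_ms

    print(adaptive_vsync_enabled(15.0))   # True  -- ~66fps pace, vsync stays on
    print(adaptive_vsync_enabled(17.2))   # False -- ~58fps pace, vsync drops off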

Screenshot of the option in the Nvidia control panel:

[vsync.jpg]
 