
Intel Arc series unveiled with the Alchemist dGPU to arrive in Q1 2022

I'm not sure how much can be read into this; I just had a quick scan, but it looked interesting anyway.



Further reading here: https://www.pcgamer.com/overclocker-gives-intels-arc-a380-gpu-more-power-and-sees-huge-gains/

Tom's Hardware should really know better; either they didn't even bother to look at it, or they really are scraping the bottom of the barrel for clickbait revenue.

Look at the clocks.

So a 48MHz overclock gave him 40%? Yeah right.......

Blyat!

Give me some of that green Intel, the ruble is worth about 12p right now.... PLEASE!!!!!!
 
If I have a bottleneck in the system my frame rates will be lower, and how much power my GPU draws will also be lower.

If I remove that bottleneck my frame rates will be higher, along with the GPU's power consumption; the clocks the GPU is running at aren't necessarily going to be any different.

It is not a 35 watt GPU.

These Intel GPUs have a problem with ReBAR: if you don't enable it, performance is significantly down.

It does not gain 40% from overclocking; the whole review is complete nonsense. Tom's Hardware should know better, and it should be obvious to anyone who understands what they are looking at.
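
Just to put numbers on how implausible the headline is, here's a quick back-of-the-envelope check. The ~2000 MHz stock boost clock is an assumption on my part, not a figure from the review:

Code:
# Sanity check on the claimed "40% from overclocking" figure.
# stock_clock_mhz is an assumed ballpark for the A380's boost clock,
# not a number taken from the review.

stock_clock_mhz = 2000
overclock_mhz = 48

clock_gain = overclock_mhz / stock_clock_mhz
print(f"Clock increase: {clock_gain:.1%}")  # roughly 2.4%

# Even with perfect scaling (which never happens), a ~2.4% clock bump
# caps the gain at ~2.4%. The rest has to come from somewhere else,
# i.e. the raised power limit and enabling ReBAR.
claimed_gain = 0.40
print(f"Claimed gain is ~{claimed_gain / clock_gain:.0f}x the theoretical "
      f"maximum from the clock bump alone")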
 
CB have another article.
They got ReBAR working on their Ryzen testbed despite Intel themselves saying it doesn't work.


They also got idle fan stop to work with Ryzen which hadn't worked on the Intel platforms.

The problem is that because the card idles at around 20W, the GPU quickly heats up. The fan then comes on and cools the GPU down, but not by enough, so although it turns off it comes back on again a few seconds later.
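
If anyone's wondering why that plays out as on/off cycling rather than the fan settling at a low speed, here's a toy model of fan hysteresis against a constant ~20W idle load. Every number in it is made up for illustration; none of them are Arc's real firmware values:

Code:
# Toy model of idle fan cycling: a constant idle heat load plus simple
# on/off hysteresis thresholds. All values are invented for illustration.

idle_heat_w = 20.0      # assumed idle power draw
fan_on_temp = 60.0      # fan spins up at this temperature (assumed)
fan_off_temp = 50.0     # fan stops at this temperature (assumed)
heat_rate = 0.05        # degrees C gained per second per watt, fan off (assumed)
fan_cooling = 1.5       # degrees C removed per second while the fan runs (assumed)

temp, fan_on = 45.0, False
for second in range(1, 301):                 # five simulated minutes
    temp += idle_heat_w * heat_rate          # heat soak from the idle power draw
    if fan_on:
        temp -= fan_cooling                  # fan pulls the temperature back down
    if temp >= fan_on_temp:
        fan_on = True
    elif temp <= fan_off_temp:
        fan_on = False
    if second % 10 == 0:
        print(f"{second:3d}s  {temp:5.1f}C  fan {'on' if fan_on else 'off'}")

# Because the idle load never goes away, the temperature climbs straight back
# to the fan-on threshold after every fan-off event, so the fan keeps cycling.

The fix would presumably be either a lower idle power draw or different fan thresholds, which is the sort of thing a driver or firmware update could change.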
 
Don't forget that Intel have an extra setting in their tool that alters the power draw limit for the card, separate from voltage and frequency etc. - it's new compared to the AMD/Nvidia options. That is what was increased. The ReBAR setting was also altered and was shown near the start to increase wattage and FPS, but then it was shown with that slider turned up. I believe the slider was shown and briefly explained in Linus's demo.
 
Okay.... then I look forward to 40% overclocks.

I mean, we will soon find out if it's BS, Intel.
 
It's hard to know exactly what's going on - in the Linus video, Tom specifically said you can't trust what the on-screen overlays are showing, because they tend not to work properly for Arc yet.
 
TPU have a review of the A380 out too:
Ray tracing losses are a bit less than AMD's. Well, a lot less if sticking to the 6400, but then maybe due to the 4GB those cards often cannot run with RT on.
Perf/watt is also poor in TPU's tests:
[TPU performance-per-watt chart]
Conclusions:

  • The Intel Arc A380 is only available in China at this time, at a price of around $190. The Intel MSRP is $150.
Pros:
  • Discrete graphics from a new player
  • Decent entry-level performance
  • Idle fan stop
  • Low temperatures
  • PCI-Express 4.0 x8 (unlike PCIe 4.0 x4 on RX 6400)
  • Support for DirectX 12 and hardware-accelerated ray tracing
  • Better RT performance than AMD, worse than NVIDIA
  • Support for DisplayPort 2.0
  • Support for AV1 hardware encode and decode
  • 6 nanometer production process
Cons:
  • Low overall performance
  • Expensive for the performance offered
  • Completely unusable without resizable BAR due to terrible stuttering
  • Immature drivers and software
  • High idle power consumption
  • Only available in China
  • No memory overclocking
  • Power monitoring inaccurate
  • Fan keeps stopping and starting during long periods of idle
  • Games don't run with latest non-Beta driver (some fixed in latest beta)
 
Maybe all those extra transistors were used for something after all?
Nvidia have only used GA107 for mobile 3050s so far, TPU's GPU spec database doesn't have a die size for GA107, and I can't find any references. In theory it should be something like 2560/3840 of GA106, but that's a bit simplistic as some things don't scale. Anyway, 2560/3840 of 12 billion is around 8 billion, which is pretty close to the A380's 7.4 billion, for whatever that is worth.
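
For what it's worth, the scaling guess in numbers (purely back-of-the-envelope, using the figures from the post above; real dies don't scale linearly with core count):

Code:
# Back-of-the-envelope GA107 transistor estimate from GA106, scaled by
# CUDA core count. Figures are the ones quoted in the post above.

ga106_transistors = 12e9   # GA106
ga106_cores = 3840
ga107_cores = 2560         # GA107 (mobile RTX 3050)

ga107_estimate = ga106_transistors * ga107_cores / ga106_cores
print(f"GA107 estimate: ~{ga107_estimate / 1e9:.1f} billion transistors")

a380_transistors = 7.4e9   # A380 figure quoted above
print(f"A380 is ~{a380_transistors / ga107_estimate:.0%} of that estimate")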
In the TPU ray tracing tests, the A380 only won at 4K. Obviously these cards are far too slow for 4K. The RTX 3050 has more memory bandwidth, so something else is at play. Maybe the RT hardware has caches or something?
 
The A380 even beat an RTX 2060 in a couple of RT games; should be interesting in the higher end stuff.

Only at 4K; in everything else it got humiliated.

That could be down to the RTX 2060 only having 6GB vs 8GB on the A380.

The RX 6500XT only has 4GB.

I wouldn't get too excited about Intel's RT performance; most of the cards it was up against seem to be hampered by a lack of VRAM rather than RT performance.
 
humbug is certainly living up to his name; every post in here is some negative, biased take..

It's about time Nvidia stopped skimping on RAM.
 
The rumour mill has been doing the rounds on the future of Intel's dedicated graphics cards, and it's been reported that upper management are not happy with the late release of Alchemist and the poor state of the software, and are seriously considering discontinuing future releases. I do hope that's not the case; even with Intel's deep pockets it was always going to be a tall order for them to hit a home run with their first proper launch.
 

GPUs is hard, yo..... it doesn't matter how deep your pockets are; money doesn't buy you IP and experience.

*Sigh*

They can't launch this crap; forget Arc and try again with something else, don't waste any more time on it.

 
Also, Intel's pockets are not that deep anymore.

Client Computing (that's the Core series CPUs): operating income is down 73%.
Datacentre (Xeon etc.): down 90%.

They have just quit on Optane.

Margins are now 46%, down from 50% last quarter; 2016 margins were 63%.

In the last week Intel have lost 25% of their market value.

The last slide: basically Intel are giving away their Xeon CPUs at cost.



[Intel earnings slides]
 