
Intel Arc series unveiled with the Alchemist dGPU to arrive in Q1 2022

AMD can split an x86 CPU into its constituent components and stack them as individual chips, both in 2D and 3D; Intel can't, they are still monolithic. AMD were the first, and are still the only ones, with this ability in x86: 5 years and counting for 2D stacking.
AMD's x86 CPUs are also architecturally more efficient: much smaller, more power-efficient cores for similar per-core performance. It's why Intel are losing the hyperscale data-centre battle; they just cannot compete with AMD's offerings.

TSMC are on 6nm, 5nm and now 3nm; Intel 7 is equivalent to TSMC 7nm.
Who's got the fastest ST performance? The fastest MT performance?
Who's got the best gaming CPU?

No one measures nodes on an nm scale any more; the node names are both just marketing terms.
Intel 7 > 4 > 3

TSMC don't even launch 3nm until 2023, and isn't that the same year Intel is meant to be ready with Intel 3?
So how is it 3 generations? Seems more like 1, or 0.5, by the time they catch up.


People will suddenly decide power efficiency has no meaning once AMD fall behind at it, or if their GPUs need more power than Nvidia's. Also, in gaming, who was more power efficient again? Alder Lake.
 

I'm not denying Alder Lake is a good CPU, it is, but it's very much 'brute force', and that just won't do in the professional space; AMD's EPYC line of CPUs is very much better.

Who has the best gaming CPU is debatable. In a lot of edge cases (MS Flight Sim, Stellaris, Star Citizen... to name a few) the 5800X3D is significantly better than any Alder Lake chip even with the best DDR5 memory, and in mainstream games Zen 3D and Alder Lake are very even.

You can point at a few slides where Alder Lake has better performance per watt in selected games, and you can reverse that argument by pointing at other slides. Different CPUs have different performance levels in different games, and games use the CPU differently; every different test you run can give you a positive or a negative result. Some people like to pick out the slides that agree with their argument under the pretence that the slide they are citing is a global truth. It isn't.

The fact remains that global corporations testing these CPUs, with a view to buying tens of thousands of them, agree that EPYC CPUs are far more efficient.
Reputable reviewers testing these CPUs in high-load, high-stress scenarios also agree: Ryzen is far more efficient than Alder Lake.

Power efficiency has meaning now more than ever, as everyone is looking at their energy costs.

PS: Intel will be on 5nm when?
 
Because Intel have been so profitable for so many decades, it is easy to forget that they have a long history of spending billions and having little to show for it.
Pricing all these adventures is hard (plus it would have to be adjusted for inflation), but a short list includes:
  1. Entry into networking when they were worried about Cisco and Juniper etc. Some of that work did lead somewhere.
  2. Larrabee, aka x86 everywhere. A minor spin-off did launch as Knights Landing etc.
  3. Contra revenue to try to save Atom. Dumping is illegal in some markets and segments, but this wasn't quite the same.
  4. 4G and 5G modems.
  5. McAfee.
  6. Some other things I've forgotten?
So yes, they have made so much profit over the decades, but being the size they are also means they have wasted so much money it is hard to keep track. The above must be close to $10 billion in today's money.

Well, not if Intel do their usual thing and give up on their entry into dGPUs!
It's a bit of a shame the A380 drivers are so poor ATM. The 6GB of VRAM and good media capabilities are something the competition does not offer in that range currently.
 
Seems Intel have now released some comparisons of the A770M as well. That seems to be based on the full DG2-512, but TPU seems to think the A730M is a heavily cut-down version of that chip too:
Just about the same speed as an 85W RTX 3060 Mobile.
To get there they have used a 406mm² TSMC 6nm chip (although TPU does say the A730M is only a 3072-out-of-4096-shader part), with a transistor budget of 21.7 billion. GA106 is 276mm² on the far worse Samsung 8nm and has about 12 billion transistors. Mobile Navi 23 is about the same performance as the RTX 3060 Mobile, with 11 billion transistors on 237mm² of TSMC 7nm.
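For anyone who wants to see the density gap spelled out, here is a rough sketch using the figures quoted above (the chip numbers are as reported in the posts, not independently verified):

```python
# Transistor density (millions of transistors per mm^2) from the
# figures quoted above: DG2-512, GA106 and Navi 23.
chips = {
    "DG2-512 (TSMC 6nm)": (21.7e9, 406),
    "GA106 (Samsung 8nm)": (12.0e9, 276),
    "Navi 23 (TSMC 7nm)": (11.0e9, 237),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6  # MTr/mm^2
    print(f"{name}: {density:.1f} MTr/mm^2")
```

Roughly 53 MTr/mm² for DG2-512 versus about 43 and 46 for GA106 and Navi 23, so Intel also have the density advantage of TSMC 6nm, which makes the performance result look even worse.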

So pretty unimpressive. Power wasn't really mentioned, but in the Tom's article they say Intel put the A770M between 130W and 150W. That's worse than an RTX 3080 Mobile (about 115W) or a Radeon 6800M (about 145W), but without the performance.

EDIT: good post over on AT going over perf/watt, perf/area etc., although if I were making that comparison I would stick to mobile parts only. Still not looking good, despite the usual Raja hype.
 

I'm going off the RX 6600XT because that is the closest thing to it:

237mm²
11,037 million transistors
Performance between RTX 3060 and RTX 3060 Ti
180 watts
TSMC 7nm

The DG2-512 is twice the size, with twice as many transistors, on the better TSMC 6nm node; it's not far off the size of an RX 6900XT, and while TPU say it's cut down, at 130+ watts all it can manage is to keep up with an RTX 3060 Mobile?

With all that in mind, yeah, it's pretty bad; it looks like another Raja Koduri 'poor Volta' special. I'd like to see if the full-fat version can even beat the RX 6600XT, which in a few months is going to be a low-end class card.
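As a quick sanity check on the "twice the size" claim, here are the ratios worked out from the figures quoted in this thread (a rough sketch; the numbers are the posters', not independently verified):

```python
# DG2-512 vs Navi 23, using the die sizes and transistor counts
# quoted earlier in the thread.
dg2_area_mm2, dg2_transistors = 406, 21.7e9  # TSMC 6nm
n23_area_mm2, n23_transistors = 237, 11.0e9  # TSMC 7nm

print(f"area ratio: {dg2_area_mm2 / n23_area_mm2:.2f}x")
print(f"transistor ratio: {dg2_transistors / n23_transistors:.2f}x")
```

So roughly 2x the transistors but about 1.7x the area, which is consistent with the denser node; either way, performance per transistor looks poor.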

If this is true, it's DOA.
 
Yeah it's not looking good for Intel, from our perspective.

It will still sell though, no matter what we think, and that's because system integrators and OEMs will lap this **** up.
 

No one wants slow, high-energy GPUs in their systems; Intel will have to pay OEMs to use them, and they probably will. Which is fine: do it, see if Nvidia or AMD care. With the way Intel are going, neither of them cares about Intel up-turning their coffers now.

What was Intel thinking, hiring the "Poor Volta" guy....
 
But how cheaply can Intel afford to sell a 406mm² TSMC 6nm chip?

In theory, with perfect yields, they should eventually be able to re-spin the A730M as a DG2-256 chip, which might only be 260mm² or so, but still.
 
It may appear in a few OEM boxes, but honestly I don't think we will ever actually see it on retail shelves.

Who would buy a 250-watt RX 6600XT-level card with probably bad drivers, unless it's about £150?
 
Self-promotion is a skill. Not sure it should be the main reason to hire someone, but still...

There was always a question about certain Radeon products: what would a given launch have been like if AMD had had a bigger budget?
"Near CEO material" Raja has now answered that, and it doesn't look good.
 

Yeah.

Raja tried to siphon off AMD's GPU division for himself; as a compromise, AMD gave him full control of the newly named "Radeon Technologies Group". But his political ambition didn't match his skill: Vega was large, power-hungry, slow and problematic.
OK, so AMD didn't have a lot of money to give him; sure, that could have been the problem. But after AMD sacked him, and this was a corporate sacking, AMD marched into RTG with a few Ryzen engineers, and a few months later relaunched the Vega architecture with a 50% performance-per-watt improvement on the same node. Several more iterations of it were the staple of AMD's integrated graphics.

All AMD did was tweak it, and it turned from something objectively bad into something half decent. To me that said it all, and Arc looks like another Vega, doesn't it?
 
That's the thing: if Intel are offloading these to OEMs on the cheap, Nvidia and AMD will lose the OEM sector, which in turn means they turn to gamers and push prices up even more to compensate for the lost OEM business. That's my thinking, anyway.
 

It's Intel's MO to operate like this; it's effectively using their cash reserves to deny sales to a much smaller competitor who doesn't have those reserves. During the late 1990s to early 2000s AMD were frankly kicking Intel's backside in performance and performance per watt on their x86 CPUs, to the point where AMD's market share hit 50%. Intel could see the writing on the wall and acted: AMD had spent the 1990s out-developing Intel, but they were still tiny in terms of wealth compared with Intel, so Intel literally bought up the market share. For example, they gave their CPUs to Dell for free and on top of that paid them $850 million per year not to use any AMD products; they did a similar thing with HP and Compaq. It almost bankrupted AMD.

The market in those days was very much smaller than it is today, and AMD have a lot more cash reserves now. If Intel tried to do that today, they would go bankrupt long before AMD just from not getting sales, let alone before Nvidia.

So, Intel can play their silly little games, and IMO they still are, but it's not working. There is a limit to how long, and to what extent, even Intel can go on spending billions developing bad products only to then give them away to maintain market share (which they still aren't doing). Eventually they have to take Nvidia and AMD head-on with actually competitive products. This is not 1996.
 
Never mind 1996.
Wasn't Bobcat/Jaguar vs Atom (at the time) a technological loss for Intel? But contra revenue meant there was no money in it for AMD. Ironically, thanks to last-gen console sales, those Cat cores are possibly the most common x86 chips ever.
 

lol yes, that's true. I think AMD had one tablet with those cores, and a similar sort of handheld tablet PC from Acer; the W500P was one model, and it was good, but yeah, that line of CPUs was never developed further.

This was the Acer one... can't remember what the other one was.

 

Pretty sure this is an older tablet.
I had two of this specific tablet: one for me and one for my brother.

They were Windows 7 at launch.

I don't think I ever saw a Jaguar tablet.
 

The C-50 is a dual-core Bobcat chip.


 