Intel Arc series unveiled with the Alchemist dGPU to arrive in Q1 2022

Maybe, but how much money do they want to lose, and for how long? Intel aren't what they used to be; the big money bags are gone.

If Intel don't get a massively parallel processor architecture bedded in as a serious competitor, all the money bags will be gone. The PC isn't what it used to be either: consoles, mobile devices and game streaming services have taken most of the gaming market, and existing PC hardware is already overpowered for most of the other things people do on PCs. Intel's existing x86 architecture isn't all that great for workstations or datacentres either. I think they need Arc to succeed. Maybe not by much, and maybe not as a mass-market product, but enough to establish the name as at least a mediocre third-place product and improve the chances of the next generation (Battlemage) being an actual success.

But even if they sell Alchemist at a loss, they've got a huge amount of work to do on drivers. If they're going for the pro market (which they are), the awful state of their current drivers is a complete dealbreaker; kit wouldn't sell in that market for 99p with drivers that bad. Gamers might put up with instability, semi-functional control panels and uncertain performance if the hardware is cheap enough, has potential and the drivers are being worked on. Professional users, who need their kit working properly and reliably, can't afford to.
 
It's hard not to see that the writing is on the wall for Arc at this point. Intel have admitted that they're losing money and market share in the high-margin CPU segments of server/datacentre and won't be competitive there until 2025 or '26, which means less 'spare' cash to go around for projects that won't generate revenue, let alone profit, for a good while.

It's a real shame for the consumer GPU space as well, as a third company properly involved would have been a good catalyst for natural price competition and further innovation, which is only just starting to return after many years of being absent.
 
That's the same source though (MLID). We need to see it independently verified.
It's 2022; we don't need anything verified. Any random rumour on the internet is gospel. The old "it's on the internet, it must be true" has never been more true than now.

I don't think the human race has ever been more gullible than it is now.


All these GPU leaks are more than likely just numbers made up in someone's basement too... No one ever calls BS; it's always "the specs have changed", as if Nvidia or whoever had decided to change them. Nothing is ever verified!

The truth no longer matters; as Ricky Gervais recently put it, it's who's saying it that matters.
 
Intel can still do it. All they need to do is price their cards competitively. Price the A380 at £99 and the A770 at £250 or less and they're set.
They already proved with the A380 that they're not interested in aggressive pricing to carve out market share. Everything about the upcoming launches just screams damage limitation and cutting losses:

- The "limited edition" reference card that will only be available in select countries, with no mention of AIB models.
- The bizarre apology tour from Ryan Shrout and Tom Petersen, detailing the architecture's many failings and how they missed their target of competing with the RTX 3070.
- Admitting the drivers are half-finished (at best), with Pat Gelsinger publicly lamenting decisions they've made on that front.
- Outright telling you not to bother with Arc and to buy a competitor's card if you don't have ReBAR support, because they'll perform like crap.
- Tempering expectations on how these cards will age and warning not to expect any major performance increases from future driver updates.
- Delaying the launch to the point that they didn't just miss the crypto bubble, but will end up kneecapped by AMD's and Nvidia's announcements of a brand-new generation of products.

Frankly, I wouldn't be surprised if MLID is right. Nothing here suggests that Arc is a product aimed at seriously competing with AMD and Nvidia. It seems more like an embarrassment to them that they're releasing out of obligation at this point, with a limited production run to sell what they have to curious hardware enthusiasts and collectors, before quietly sweeping the whole thing under the rug.
 
Let's say the A380 costs $110 to manufacture and they sell it for $140; that's $30 profit on each one.

Let's say they sell 1 million of them, which is extremely optimistic. That's $30 million, which really is nothing at all.

And that's before development costs. I think it was Intel who said it has cost $3.5 billion to date to develop; subtract that $30 million and you're still $3.47 billion in the red.
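To put those numbers in one place, here's a minimal back-of-the-envelope sketch (Python, chosen just for illustration); the cost, price, volume and development-spend figures are the assumptions above, not confirmed Intel data:

```python
# Back-of-the-envelope Arc economics, using the assumed figures
# from this post (not real Intel numbers).

unit_cost = 110            # assumed manufacturing cost per A380, $
unit_price = 140           # assumed selling price, $
units_sold = 1_000_000     # "extremely optimistic" sales volume
dev_cost = 3_500_000_000   # claimed development spend to date, $

margin = unit_price - unit_cost         # $30 per card
gross_profit = margin * units_sold      # $30M across 1M cards
net_position = gross_profit - dev_cost  # about -$3.47B, deep in the red
break_even_units = dev_cost / margin    # cards needed to recoup dev spend

print(f"Margin per card:     ${margin}")
print(f"Gross on 1M cards:   ${gross_profit:,}")
print(f"Net after dev costs: ${net_position:,}")
print(f"Break-even volume:   {break_even_units:,.0f} cards")
```

At a $30 margin it would take roughly 117 million cards just to recoup the claimed development spend, which underlines the point: volume sales of a cheap card can't dig them out of that hole.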

Intel don't have that sort of money to burn any more; they burned it all trying (and failing) to keep AMD from establishing themselves in the datacentre, and on getting their failed fabs working.

The only logical conclusion is that, for graphics at least, it has no future.

DX-capable graphics is hard, yo, and expensive.
 
Seems like madness for them to quit when they are 80% of the way there.

They must be in some real trouble to have pumped in that amount of cash only to "cut their losses" at this point.

Can they even pick up their research at a later date? It's probably all obsolete in a few years.

What a waste.
 
This tech will be used in a lot more than gaming: they'll use it in iGPUs, datacentre cards and so on. Even if it's a loss in gaming, they'll still get money from other areas. They've proved they can outsell AMD even with terrible products, and the big OEMs will probably sell lots of them; they just need stable drivers. The first gen was always going to be like this, and it's actually a lot better than I thought it would be. They could also try to get into the next consoles.
 
The question is, will they get it stable enough?

Is it just drivers/software or, as has been rumoured before, are there some fundamental hardware issues? And if such issues are present, are they also present in Battlemage, which is well into development?

If they've spent $3.5bn so far and it's entirely wasted, I could see them cancelling it. How many more billions would it take to get something vaguely sellable?
 
I remember hearing about that. It was reported/announced on MLID, and the story sank. And stank. No one else could corroborate it, and games seemed to work just fine when tested by the usual sites, just slowly.
 
Haven't there been various flickering/artifact issues with some of the refresh-rate sync options? Also the ReBAR requirement and the pre-DX12/Vulkan stuff. Who knows; it's unlikely, but I don't think it's entirely ruled out.

If it were just drivers, why isn't it properly released yet? They've already given reviewers enough access to earn the bad press of "the drivers are shockingly bad", which would be the only reason to delay the release if drivers were the only issue... There's got to be something more to it.
 
OEMs are not going to put broken tech in their systems; Jon Peddie made this same point. Millions will buy it just because it's Intel? No, those days are gone.
 
In fact, when it comes to OEM graphics, millions buy it just because it's Nvidia, and increasingly because it's also got a Ryzen sticker on it. I know people who buy hardware and know nothing about it; they don't know who AMD are, but they do know what Ryzen is.

AMD? Oh... Ryzen, yeah, they make good chips.
 
OEMs would put a block of cheese in if the price was right and it was stable. I don't think Intel will just drop out of the GPU market because their first GPU has issues; having both a CPU and a GPU line is a big deal, and they need to keep going. First-gen R&D costs will be much higher than second-gen, since they can then fix and refine instead of building everything from scratch. Also, we don't know it's broken hardware; it could be drivers, as drivers are very difficult to write in general and GPU drivers are among the most difficult. Even when the hardware has issues, software patches can often work around them. It doesn't need to be the fastest, but it does need to be stable.
 