The Intel Arc owners thread


I wonder what gaming performance will be like?
 
This adds some context to the events leading up to the launch of CD, and the initial exclusion of Intel GPUs...

 
Not able to see the vid right now, but on my HTPC setup I've swapped out the case to get my dormant A770 back into play. Seemed stupid to spend so much on the cooler, low-profile card when I could just get a bigger case. :D
I think once Arc support is sorted out, I'll buy the game. It's been a real negative how they handled it all initially, but I'm sure AMD's initial cash injection helped a lot with the decision.

I'd likely still play on the AMD card too. I just don't want to be supporting devs who lock out a whole vendor like they did.

Edit: Not sure if I have some sort of Arc Stockholm syndrome. It's just I have mostly had good experiences, compared to the internet's "I've never used one but..." nonsense.
 
The truth is it does cost the vendor money to get their card specifically optimised for, and especially to have their features integrated.

It costs the game developer money to do this, and if they do it, it's not going to be free.

It was Nvidia who started this trend of going to game developers to have their cards specifically optimised for. That was all a good thing then, right? It didn't matter that AMD initially weren't doing the same thing and were having optimisation issues with some titles, nor did it matter when AMD features were left out of some titles. I notice Jay only cited AMD doing this to Nvidia, which they have a couple of times, but he failed to include Nvidia doing this to AMD. As always, it's clear to see who he's more afraid of upsetting and how carefully he chooses what he says. It makes me feel nauseous.

This is a thing now, it's been a thing for a long time, and it's way too late to complain about it. That should have been done long before this sort of crap built the Nvidia monopoly we have today. Your constant past Nvidia boot-licking, Jay, is why we are where we are, so now, just like AMD, Intel also have to cough up the cash to get a look in.

This is how it works now... it's not a choice AMD have; they are forced into it by a trend Nvidia set. You're a fool for not foreseeing it, and I find it incredible that you still don't understand what is going on. Yes, it's about who has the most money, and that's not AMD. Fool.
 
Yes, it's about who has the most money, and that's not AMD. Fool.
I don't know why either one of them would bother. How many cards have Intel sold? 2? Doesn't seem worth the effort. Nvidia's market/mindshare is so big they could give AMD a 100% optimisation bonus and still sell more cards.
 
I don't know why either one of them would bother. How many cards have Intel sold? 2? Doesn't seem worth the effort. Nvidia's market/mindshare is so big they could give AMD a 100% optimisation bonus and still sell more cards.

I can see why AMD still try, and to some extent Intel.

AMD do have a professional GPU segment; not all of those dies are professional standard, and it's better to sell them somewhere than write them off. They also own the entire console segment and handheld gaming devices... so even if not directly through discrete gaming GPUs, there is a lot of worthwhile gaming IP to invest in.
Frankly though, when it comes to discrete gaming GPUs as an isolated product... I think AMD care much less than they used to. They have accepted defeat to Nvidia, and frankly I don't blame them; Nvidia spend more on R&D than AMD earn, and it's not possible to keep up with that no matter how good you are with much less cash.

Intel are trying to break into the professional GPU market. They also see the threat from AMD having much better integrated graphics for mobile devices and are trying to combat that, because if they don't they will lose laptops etc... and that is all they have left that is of any significance. So for Intel it's survival, I think.

The problem is Intel don't have a lot of money to play with. Having to pay every other game developer millions and millions of dollars to get some work from them on their architecture is not something they thought to add to their equations, nor can they afford it, especially while selling their discrete GPUs at cost or even at a loss just to try and get a foot in the door.
You see, the cost of that is baked into the product, the GPU you buy; Intel never thought of that.
 
With their Battlematrix technology and the dual B60 cards they’re trying to break into the AI market. I wonder how soon we’ll see a dual B70 card?
Hopefully soon... but I also hope Intel are clever about it when it comes to investing in AI, because I can see the whole thing coming crashing down in 2028 or even 2027, and those who invested too much are going to feel it the most.

With any luck it will take Nvidia down a few notches.
 
First B70 news!


Half the price of the competition is a good start.

US price is $949 so perhaps £950?

Looking at the pictures, it seems the B70 is ~50% faster than the B60, so if we transferred that uplift to the B580 in gaming terms, it would put a putative B770 modestly ahead of the RX 9060 XT, which is half the price. So I think we can kiss the B770 goodbye unless the VRAM is 50% of the cost. Of course, I await actual benchmarks!
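
For what it's worth, here's the back-of-envelope maths behind that guess as a quick sketch. The only inputs taken from this post are the $949 US price, the ~50% B70-over-B60 uplift, and the idea of transferring that uplift to the B580; everything else (exchange rate, VAT, the baseline B580 and RX 9060 XT figures) is a placeholder assumption, not a benchmark.

```python
# Rough sketch of the guesses above. Every constant is a placeholder
# assumption for illustration only -- none of these are real benchmark numbers.

US_PRICE_USD = 949          # B70 US price from the announcement
USD_TO_GBP = 0.79           # assumed exchange rate
UK_VAT = 0.20               # UK VAT

# US prices exclude sales tax, so a naive UK estimate adds VAT on top:
uk_price_estimate = US_PRICE_USD * USD_TO_GBP * (1 + UK_VAT)
print(f"Rough UK price estimate: £{uk_price_estimate:.0f}")  # ~£900, so "£950" is plausible

# Performance guess: the pictures suggest the B70 is ~50% faster than the B60.
# Transferring the same uplift to the B580 gives a putative B770 figure.
b580_fps = 60.0             # hypothetical B580 average FPS in some game
b70_over_b60 = 1.50         # ~50% uplift, per the pictures
b770_fps_guess = b580_fps * b70_over_b60

rx_9060_xt_fps = 85.0       # hypothetical RX 9060 XT figure for the same game
print(f"Putative B770: {b770_fps_guess:.0f} FPS vs RX 9060 XT: {rx_9060_xt_fps:.0f} FPS "
      f"({b770_fps_guess / rx_9060_xt_fps:.0%})")
```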
 
Allegedly (from a comment on Videocardz) the B70 has the gaming performance of the RTX 5060 Ti. So for a putative 16 GB B770 to be competitive it would have to be well under £400, more like around £350, and even then it would be only marginally cheaper than the RX 9060 XT. I don't think 16 GB of VRAM costs £600.

Roll on Celestial.
 
First gaming benchmarks are out: 45% faster than a B60.


Edit: this is on the default power configuration, but the power can go up to 330 W whereas the default is 230 W, so we could see a considerable performance increase on top. A better cooler is likely needed at 330 W.
 
I can’t help but feel a lot of people are going to be disappointed by their B70s unless Intel pull their finger out and get *much* better support sorted out. It definitely happened with the B60: you’re stuck using older versions of software, without support for newer models. Sure, LM Studio can be used, but performance is much worse than using OpenVINO.
It’s an “if you build it, they will come” situation, but I think Intel fired all the people who could make the software experience any good.
 