Intel Battlemage Live & Available to order from OcUK

Bit of a shame about this driver overhead thing. The B580 looked like a solid budget gaming pick, but it's going to be a hard sell for budget builds if the issue isn't fixable, particularly as the price over here is the same as the 4060's.
 
Bit of a shame about this driver overhead thing. The B580 looked like a solid budget gaming pick, but it's going to be a hard sell for budget builds if the issue isn't fixable, particularly as the price over here is the same as the 4060's.
The 4060 is ******* and you're likely to have more issues with running out of VRAM in newer games than with any slight driver overhead there is. You'll also notice that these tests were only conducted on 'one' game, which everyone knows isn't best optimised for Intel yet. I dunno, it's like a content creator has tried to fabricate a controversial narrative for likes and clicks.
 
The 4060 is ******* and you're likely to have more issues with running out of VRAM in newer games than with any slight driver overhead there is. You'll also notice that these tests were only conducted on 'one' game, which everyone knows isn't best optimised for Intel yet. I dunno, it's like a content creator has tried to fabricate a controversial narrative for likes and clicks.
Hardware Canucks found it originally; HUB then verified it and have since made another video testing multiple games and CPUs. They're going to re-review it fully, by the looks of it.
 
The 4060 is ******* and you're likely to have more issues with running out of VRAM in newer games than with any slight driver overhead there is. You'll also notice that these tests were only conducted on 'one' game, which everyone knows isn't best optimised for Intel yet. I dunno, it's like a content creator has tried to fabricate a controversial narrative for likes and clicks.
Take into account having to upgrade your CPU/mobo to a new X3D and I bet the 4070 is a much, much better buy. This is a brutal misstep, just when it seemed they had done something decent. Unbuyable. You can't buy this; it's buying something you know has massive broken drawbacks. It's a failed product. If it seems like I'm being harsh, I'm not. If you were on a deserted island and it was the only GPU in the world, then yes, fine. But if you're buying a broken product to... support a corporation, that's mental illness.

Although to be frank, if I was in this tier I'd be hunting for a 6800 XT or GRE or 6750 XT or 10GB 3080 or something, IDK, I haven't looked into that stuff. I saw a 500 CAD refurbished 3080 on Newegg yesterday; I'd go for that if I was in the market for a budget GPU, tbh. Since nowadays, a budget GPU is anything under 3,000 dollars.
 
So, reading about this driver overhead issue: I still have mine on pre-order and my CPU is a 3700X. Am I better off cancelling and getting a 6700XT instead?
 
So, reading about this driver overhead issue: I still have mine on pre-order and my CPU is a 3700X. Am I better off cancelling and getting a 6700XT instead?
I don't see how anyone can trust Intel for anything outside of their laptop CPUs. I've read everything older than the 7600X runs into problems.

Go check Reddit - r/intel or r/hardware - to see what people are saying. I don't know if that specific CPU is particularly affected, but I imagine it is, since apparently the 5600 is hit by this.

I've seen technical people who seem to know what they're talking about claim that this is baked into how Intel's whole graphics processing stack is designed and can't just be patched out, but I'm not gonna pretend I know what that might entail or how accurate it might be.
 
So, reading about this driver overhead issue: I still have mine on pre-order and my CPU is a 3700X. Am I better off cancelling and getting a 6700XT instead?
I would certainly wait until more info comes out. Best case scenario, this gets sorted in a driver update.
 
Go check Reddit - r/intel or r/hardware - to see what people are saying. I don't know if that specific CPU is particularly affected, but I imagine it is, since apparently the 5600 is hit by this.
I mean, having looked at Reddit yesterday and today, this is just terrible advice. Just a load of people who've never used an Arc card parroting the same Spider-Man screenshot and spouting comical dramatics.
Best thing to do is just wait for the real videos to come out. HUB's video was a quick bandwagon-jumping one after seeing the Hardware Canucks video. They'll definitely get a longer video out, I'm sure Gamers Nexus will, and maybe Hardware Canucks will do a follow-up (not sure if I can stomach looking at their charts again though :D ). We can see how bad it really is then.

Think I'll get my A770 back in a system that can take the Ryzen 1400 again this week and run some stuff at 1080p, just so we can get some laughs at those numbers. I never use Afterburner, so I'll need to work that out.
What games would be good to test? I'll high-seas them just for this purpose. :D
 
The 4060 is ******* and you're likely to have more issues with running out of VRAM in newer games than with any slight driver overhead there is. You'll also notice that these tests were only conducted on 'one' game, which everyone knows isn't best optimised for Intel yet. I dunno, it's like a content creator has tried to fabricate a controversial narrative for likes and clicks.

It's surprising how many people in this thread didn't watch the HUB video...

There were four games demonstrated, not one:

Warhammer 40,000: Space Marine 2
Starfield
Hogwarts Legacy
Marvel's Spider-Man Remastered

I'll say it again, the point isn't "if you play the above four games with a slow CPU, the B580 is trash".

How many games do you have in your Steam/Epic/GOG library? Dozens (or hundreds ;) ). Some of them will probably be CPU intensive. How many games yet to be released will you buy during the lifespan of the B580? Some of them will probably be CPU intensive.

Already one guy in here is using a 3700X with the B580; people will use slow CPUs because the B580 is a cheap GPU.

Since people still won't watch the video, I'll copy the results here for the four games above so you can make your own mind up:

Warhammer 40,000: Space Marine 2
B580 - 3600 - 48fps
4060 - 3600 - 68fps

B580 - 5600 - 53fps
4060 - 5600 - 74fps

Starfield
B580 - 3600 - 34fps
4060 - 3600 - 49fps

B580 - 5600 - 36fps
4060 - 5600 - 50fps

Hogwarts Legacy
B580 - 3600 - 53fps
4060 - 3600 - 63fps

B580 - 5600 - 64fps
4060 - 5600 - 67fps

Marvel's Spider-Man Remastered
B580 - 3600 - 68fps
4060 - 3600 - 97fps

B580 - 5600 - 76fps
4060 - 5600 - 111fps
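If anyone would rather see those gaps as percentages than eyeball the fps figures, here's a quick Python sketch using only the numbers quoted above (the "gap" is simply how much faster the 4060 is than the B580 on the same CPU):

```python
# fps figures from the HUB results quoted above: game -> CPU -> (B580 fps, 4060 fps)
results = {
    "Space Marine 2":        {"3600": (48, 68), "5600": (53, 74)},
    "Starfield":             {"3600": (34, 49), "5600": (36, 50)},
    "Hogwarts Legacy":       {"3600": (53, 63), "5600": (64, 67)},
    "Spider-Man Remastered": {"3600": (68, 97), "5600": (76, 111)},
}

for game, cpus in results.items():
    for cpu, (b580, rtx4060) in cpus.items():
        # how much faster the 4060 is than the B580 on this CPU, as a percentage
        gap = (rtx4060 / b580 - 1) * 100
        print(f"{game:22} + {cpu}: 4060 ahead by {gap:.1f}%")
```

On those numbers the 4060's lead runs from about 5% (Hogwarts Legacy on the 5600) to around 46% (Spider-Man on the 5600), so the overhead clearly bites much harder in some games than others.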
 
So, reading about this driver overhead issue: I still have mine on pre-order and my CPU is a 3700X. Am I better off cancelling and getting a 6700XT instead?
Depends on what games you'll play, what settings you'll play them at, and what you'll be using it for. If playing at 1440p, this CPU overhead issue has less of an impact and the B580 doesn't lose as much performance. I wouldn't have much trouble on my 7800X3D or even using it for hardware encoding on my 10YO Z97 server build.

That said, not sure where one might find decent deals on 6700XTs right now, most of them are out of stock or sky-high EOL pricing. You might be lucky enough to find a 6750XT, but that's costing almost £100 more than the B580. If wanting to buy new, could potentially be worth waiting a few days to see what Nvidia and AMD have up their sleeves for next gen.

Otherwise, as others have said, Intel's driver team has been making great progress and even if there is currently a CPU overhead issue, it's reasonably likely that there will eventually be driver updates to fix it. Same way they did for the original Alchemist release. So there is the option to keep it too (and put up with lower performance at 1080p for a bit).
 
Depends on what games you'll play, what settings you'll play them at, and what you'll be using it for. If playing at 1440p, this CPU overhead issue has less of an impact and the B580 doesn't lose as much performance. I wouldn't have much trouble on my 7800X3D or even using it for hardware encoding on my 10YO Z97 server build.

That said, not sure where one might find decent deals on 6700XTs right now, most of them are out of stock or sky-high EOL pricing. You might be lucky enough to find a 6750XT, but that's costing almost £100 more than the B580. If wanting to buy new, could potentially be worth waiting a few days to see what Nvidia and AMD have up their sleeves for next gen.

Otherwise, as others have said, Intel's driver team has been making great progress and even if there is currently a CPU overhead issue, it's reasonably likely that there will eventually be driver updates to fix it. Same way they did for the original Alchemist release. So there is the option to keep it too (and put up with lower performance at 1080p for a bit).

I have a Ryzen 7 7700 running at 1440p. Looking for a new GPU, I'd thought the B580 might be the one. I'll leave it until Intel releases a fix and see what AMD/Nvidia release.
 
I have a Ryzen 7 7700 running at 1440p. Looking for a new GPU, I'd thought the B580 might be the one. I'll leave it until Intel releases a fix and see what AMD/Nvidia release.
If playing at 1440p, you might be okay as the numbers from HWUB showed that there wasn't as much of an issue at 1440p.

That said, yes, waiting for a few days to see what AMD and Nvidia are going to be releasing is certainly the best idea.
I just hope they mention some of the more budget options too, as they usually start off by only showing details for their top cards.
 
I mean, having looked at Reddit yesterday and today, this is just terrible advice. Just a load of people who've never used an Arc card parroting the same Spider-Man screenshot and spouting comical dramatics.
Best thing to do is just wait for the real videos to come out. HUB's video was a quick bandwagon-jumping one after seeing the Hardware Canucks video. They'll definitely get a longer video out, I'm sure Gamers Nexus will, and maybe Hardware Canucks will do a follow-up (not sure if I can stomach looking at their charts again though :D ). We can see how bad it really is then.

Think I'll get my A770 back in a system that can take the Ryzen 1400 again this week and run some stuff at 1080p, just so we can get some laughs at those numbers. I never use Afterburner, so I'll need to work that out.
What games would be good to test? I'll high-seas them just for this purpose. :D
You don't think you might be biased a bit? Cmoooon. Reddit might be full of virgins, but they at least have some neat information at their fingertips, like how many draw calls the B580 is making - fewer than an RX 580, and not even in the ballpark of modern cards.

And that video wasn't just the Spider-Man game. It looks like this is just a... feature of the GPU. I'm sure it'll work fine with a new CPU at 1440p in most games, but buying a gimped GPU to support a broken company is the kind of terrible consumer decision that shouldn't be defended publicly. Of course I'm always on the side of more information, which is why I told the guy to go to different forums, but it was 4/4 games tested where it works this way with any CPU older than a 7600, and yeah, in one of those games anything less than the top CPU in the world was hemorrhaging performance.

There's no reason to buy this. The 4060 is a terrible card, but at least it's consistently mediocre in all use cases. IDK, this is the sorriest GPU market in history. There must be 6800s or 6750 XTs or 3070s out there - just something that isn't broken tech.

I still think people should just save a bit more and get a 4070, because it's almost like a real card. Or wait a few months and see what the 5060 is about, or what the 50 series does to 40- and 30-series prices. You might be a few months from being able to buy $300 3080s. I saw Newegg with a 500 CAD refurbished one. I'm too lazy to convert that to pounds sterling or USD or schmeckels or galleons or whatever you mangy gash mingers in Londonistan use, but by the grace of Allah it can't be that much.
 
All this testing by HUB and HC is in some ways misleading.

HC used old, unsupported hardware and were then surprised they had issues.

Intel supplies minimum specs they chose to ignore, and even then those specs start at what is five-year-old hardware.

It's pointless testing on the Ryzen 2000 series, as that only had PCIe Gen 3 on the IO die.

Why test at 1080p when it's marketed as a 1440p GPU?

I've had my B580 since before Xmas and run it on an 8600G at 1440p with no issues.

HUB can't even seem to get their bench runs consistent, as the 4060's performance in Spider-Man is very different in the new testing vs their B580 review at 1080p very high settings: 90fps vs 127fps, both on the 9800X3D.

At least provide both 1080p and 1440p results.
 
It's surprising how many people in this thread didn't watch the HUB video...
Thank you. That's exactly my point about buying broken stuff. If you have to check whether it 'works' for what you're getting up to, it's already red-flagged.

I don't fault anyone for buying it, because it looked promising enough, and nobody expects such an established tech mainstay to keep pulling amateur nonsense like Intel has been. Although I was already wondering why people would buy something on the eve of new GPUs being released.
 
Intel supplies minimum specs they chose to ignore, and even then those specs start at what is five-year-old hardware.
The issue is present on the 5600, which is less than 3 years old.

It's an odd stance to take that people buying budget cards won't be running them at 1080p. If I buy a card that runs well at 4K settings, then I'd expect it to run even better on my 1440p display, not worse.
 
You don't think you might be biased a bit? Cmoooon. Reddit might be full of virgins, but they at least have some neat information at their fingertips, like how many draw calls the B580 is making - fewer than an RX 580, and not even in the ballpark of modern cards.
I saw the draw calls post, but it wasn't backed up by anything. It was a "I saw someone say this". We need something a bit more credible than the doomsayers. You can check the recent posts of a lot of the people in the IntelArc section and it's 1-3 pages of wetting the bed about HUB's B580 video.
I do have a positive Intel Arc bias, because I've used one for over a year now. In that time I've heard many times how it doesn't work and is truly useless, and that's just not been my experience. I'll remain on the positive side until things are clarified.
We've seen plenty of positive user experiences with the B580 over the last couple of weeks (and some issues). The HUB video comes out and suddenly it's all over, and all those positives must have just been in people's imagination.
 