Intel Battlemage Live & Available to order from OcUK

All this testing by HUB and HC is in some ways misleading

HC used old, unsupported hardware and were then surprised they had issues

Intel supplies minimum specs, which they chose to ignore, and even then those specs start at what is five-year-old hardware

Testing on the Ryzen 2000 series is pointless, as that only had PCIe Gen 3 on the IO die

Why test at 1080p when it's marketed as a 1440p GPU?

I have had my B580 since before Xmas and run it on an 8600G at 1440p with no issues

HUB can't even seem to get their bench runs consistent, as the 4060's performance in Spiderman is very different in the new testing vs their B580 review: at 1080p very high settings it's 90fps vs 127fps, both on the 9800X3D

At least provide both 1080p and 1440p results

Don't understand the "it's for 1440p", "test at 1440p" argument. Why?

Let's break it down a bit and isolate one result:

Warhammer 40,000: Space Marine, 1080p
Screenshot-2025-01-05-134439.png


If we run the test at 1440p, will the B580 get more than 53fps? No.

Will the 4060 get less than 74fps? Yes, could even be less than 53fps.

Does that make the B580 "better"? In my opinion, no, being limited to 53fps is bad regardless of resolution.

All testing at 1440p will do is hide the problem that's being explained...

I guess ignorance is bliss :cry:
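The argument above can be sketched with a toy frame-time model: each frame takes at least as long as the slower of the CPU and GPU stages, so a CPU/driver-side cap (the 53fps here) survives a resolution bump, while a GPU-bound card slows down and can fall below it. Only the 53fps and 74fps figures come from the chart; every per-stage millisecond value below is a made-up illustration.

```python
# Toy model: a frame takes as long as the slower of the CPU and GPU stages.
# The 53fps (B580) and 74fps (4060) figures are from the Space Marine 1080p
# chart above; the CPU/GPU time splits are hypothetical illustrations.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU stages overlap per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

b580_cpu_ms = 1000.0 / 53    # B580 CPU/driver-limited at ~53fps
print(round(fps(b580_cpu_ms, 10.0)))  # 1080p: 53 - CPU bound
print(round(fps(b580_cpu_ms, 16.0)))  # 1440p: 53 - still CPU bound

n4060_cpu_ms = 9.0           # hypothetical: lower driver overhead
print(round(fps(n4060_cpu_ms, 1000.0 / 74)))  # 1080p: 74 - GPU bound
print(round(fps(n4060_cpu_ms, 20.0)))         # 1440p: 50 - below the B580's ceiling
```

The point of the sketch: raising the resolution only grows `gpu_ms`, so it can never lift the B580 past its 53fps cap, it can only make the 4060's number look worse by comparison.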
 
I have had my B580 since before Xmas and run it on an 8600G at 1440p with no issues


I do have positive Intel Arc bias, because I’ve used one for over a year now.

You both have an 8600G and a 5700X3D respectively, fast enough to (mostly) avoid the problem.

How hard is it to understand that HardwareCanucks and Hardware Unboxed are warning people with CPUs significantly slower than yours to think twice before buying the B580?

We’ve seen plenty of positive user experiences people have had with the B580 over the last couple weeks (and some issues). HUB video comes out and suddenly it’s just all over, and all those positives must have just been in people’s imagination.

No one is saying that, but it's disingenuous to think that someone with a Ryzen 3600, who could be getting half the fps you're getting, is having a "positive" experience and doesn't need to know why they're getting such low performance.

Maybe that B580 owner keeps the card, upgrades to a 5700X3D and enjoys many years of gaming with Intel Arc. Without this knowledge and data, they could have thought "wow, Arc is crap, let's go back to NVIDIA".
 
The issue is present on the 5600, which is less than 3 years old.

It's an odd stance to take that people buying budget cards will not be running them at 1080p. If I buy a card that runs well at 4K settings, then I would expect it to run even better on my 1440p display, not worse.
We don't see the results from 1440p testing on the 5600 to give an indication of the loss at 1080p

Performance being worse at 1080p is nothing new, though; AMD has shown the same thing in the past

With the price of 1440p panels these days, the B580 is an ideal pairing, at its proper RRP of course
 
I saw the draw calls post, but it wasn't backed up by anything. It was an "I saw someone say this". Need something a bit more credible than the doomsayers. You can check the recent posting of a lot of people who're in the IntelArc section, and it's 1-3 pages of wetting the bed about HUB's B580 video.
I do have positive Intel Arc bias, because I’ve used one for over a year now. In this time I’ve heard many times how it doesn’t work and is truly useless, and it’s just not been my experience. I’ll remain on the positive side, until things are clarified.
We’ve seen plenty of positive user experiences people have had with the B580 over the last couple weeks (and some issues). HUB video comes out and suddenly it’s just all over, and all those positives must have just been in people’s imagination.
I saw someone state that Intel does a lot of shader recompiling on the fly, so this might go some way to explaining why processors with higher IPC are not affected as much
 
Don't understand the "it's for 1440p", "test at 1440p" argument. Why?

Let's break it down a bit and isolate one result:

Warhammer 40,000: Space Marine, 1080p
Screenshot-2025-01-05-134439.png


If we run the test at 1440p, will the B580 get more than 53fps? No.

Will the 4060 get less than 74fps? Yes, could even be less than 53fps.

Does that make the B580 "better"? In my opinion, no, being limited to 53fps is bad regardless of resolution.

All testing at 1440p will do is hide the problem that's being explained...

I guess ignorance is bliss

Why does it hide the problem when it's promoted as a 1440p card though?

Like I said elsewhere, we have seen other GPUs in the past perform better at higher resolutions for various reasons

If Intel came out and said "here is our new 1080p gaming GPU" then it would of course be more valid

I had zero expectations for the B580, but I watched the deep tech dives and thought that for £248 it was worth a look, to see what Intel are offering, and at 1440p it's surprised me, which is something a GPU hasn't done for years.

Of course it's not perfect, but Intel can only improve things

It's a shame none of the tech channels tested this at launch though

I don't call it ignorance, as I can only base things on my own experience
 
How many games do you have in your Steam/Epic/GOG library? Dozens (or hundreds ;) ). Some of them will probably be CPU intensive. How many games yet to be released will you buy during the lifespan of the B580? Some of them will probably be CPU intensive.

Point of order: it has yet to be proven that the problem is with games that are CPU intensive. If you look at the HUB video, CPU usage figures are pointedly absent. We don't know what the problem is, only that there is a problem.

Further, the problem is not across all games. These games are cherry-picked as having problems. This will help Intel pinpoint the problem. Are there any commonalities - the game engine, for instance? The issue may be entirely unrelated to the GPU - remember how AMD CPUs got a massive improvement when Windows' process scheduler was updated? Time will tell.
 
I've just seen a post on Reddit saying that using OBS 'destroys the 1% and 0.1% lows' so I wonder if there's an element of a false positive here?

A key update by the poster:

The issue is on the Intel issue tracker, where older-than-10th-gen Intel CPUs are affected. It's not present on a 10600K, but a 9700K is affected. The QSV H264 media encoder overloads and makes the game and recording stutter really badly. QSV AV1 does the exact same thing. Even the video file is affected, as it's not played back properly. In OBS, just having the preview window open causes stuttering.
 
So Royal Mail did their thing and delivered my Steel Legend card today with absolutely zero notification of collection or proposed delivery date/time. Luckily I was home, as it was a 10am delivery...
 