
The Intel Arc owners thread

I refuse to watch such videos, but the thumbnail shows neither card at playable frame rates, so I'm not sure it really matters.

At more sensible settings I imagine the 4060 will be around 10% faster.

Edit:
I imagine they'll have settings on Ultra to max out VRAM usage, whereas the reality is they are both budget cards aimed at mid-to-high 1080p gaming.
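
(A back-of-envelope sketch of why Ultra texture presets chew through VRAM; the texture count, sizes and compression below are illustrative assumptions, not figures from any particular game.)

```python
# Rough VRAM estimate for a streamed texture pool at different presets.
# Counts, resolutions and compression are illustrative assumptions,
# not measurements from any specific game.

def texture_pool_gib(resolution_px: int, texture_count: int,
                     bytes_per_pixel: float = 1.0) -> float:
    """Square textures, block-compressed (~1 byte/pixel for BC7),
    plus ~33% overhead for the full mipmap chain."""
    total = resolution_px ** 2 * bytes_per_pixel * texture_count * 1.33
    return total / 1024 ** 3

# Hypothetical presets: Ultra streams 4K textures, High 2K, Medium 1K.
for preset, res in [("Ultra", 4096), ("High", 2048), ("Medium", 1024)]:
    print(f"{preset:>6}: ~{texture_pool_gib(res, texture_count=500):.1f} GiB "
          "for 500 resident textures")
# Ultra lands around 10 GiB before the framebuffer and geometry are even
# counted -- past an 8GB card, comfortable on 12GB and up.
```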
 
The RTX 4060 is getting beaten by an RTX 3060:

In the new Indiana Jones game, the A770 16GB is beating the RTX 4060.

What's more interesting is the overall low performance of both cards in question. To get 60fps, settings have to drop significantly, to the point where VRAM is rarely a concern. Some games can't get to 60fps at all...
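
To put numbers on that 60fps point, here's a quick frame-time sketch (plain arithmetic; the 45fps starting point is just an example):

```python
# Frame-time budgets: what hitting a given fps target actually demands.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for target in (30, 60, 120):
    print(f"{target:>3} fps -> {frame_budget_ms(target):.2f} ms per frame")

# Example: a card averaging 45 fps (22.2 ms/frame) needs each frame to
# cost ~25% less (equivalently, ~33% more throughput) to hold 60 fps
# (16.7 ms) -- which is why the settings drops have to be so aggressive.
current_fps, target_fps = 45.0, 60.0
saving = 1 - frame_budget_ms(target_fps) / frame_budget_ms(current_fps)
print(f"{current_fps:.0f} -> {target_fps:.0f} fps needs ~{saving:.0%} "
      "less cost per frame")
```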

To be honest, I would consider none of these cards good enough for modern gaming. I would include the 3060, 4060, B580, A770, and the AMD equivalents. The raster performance is just too low.

The next tier up (e.g. 6750 XT/4060 Ti) or the tier after that (6800/7700 XT) should be the bare minimum (with at least 12GB VRAM, obviously).

Shows you how far the budget segment of GPUs has fallen :(
 

I have had an RTX 3060 Ti since 2021, which is a bit faster, and it's running out of VRAM now at reasonable settings (1440p) in newer games. I don't just max out games. My mate with an RX 6700 XT doesn't have that issue. We both had similar systems until recently (5700X + B450 motherboards).

This was on both a PCIe 3.0 (5700X) and a PCIe 5.0 (7800X3D) system, BTW. The newer system helps a bit more, probably because of the faster connection and DDR5. It manifests as worse 1% lows and texture pop-in. The RTX 4060 is a bit slower because it uses a PCIe x8 connection, so when it pages into system RAM there is even less PCIe bandwidth available.
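
For anyone who wants to sanity-check the bandwidth argument, a minimal sketch using the theoretical per-lane PCIe rates (actual sustained throughput will be somewhat lower):

```python
# Theoretical PCIe bandwidth per direction, after link encoding overhead.
# Per-lane rates in GB/s: gen3 runs 8 GT/s with 128b/130b encoding
# (~0.985 GB/s/lane); gen4 and gen5 each double the previous rate.
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def pcie_gbps(gen: int, lanes: int) -> float:
    return PER_LANE_GBPS[gen] * lanes

# An x8 card such as the RTX 4060 on a PCIe 3.0 board gets a quarter of
# the bandwidth of a 4.0 x16 link -- a small fraction of local VRAM
# bandwidth either way, so any spill into system RAM shows up as 1% lows.
for gen, lanes in [(3, 8), (3, 16), (4, 8), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_gbps(gen, lanes):.1f} GB/s")
```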

The RX 6700 XT/RX 6750 XT have been under £300 for the last year or more if you shop around. The RX 6800 16GB/RX 7700 XT 12GB have been available for almost £300 a few times over October and November, as AMD starts clearing out stock.

The new Indiana Jones game is Nvidia sponsored, but despite this, see what happens in the benchmarks.

It needs at least 12GB of VRAM to perform OK; the Nvidia cards with 12GB or more perform fine.


The RTX 4060 makes a lot of sense as an OEM card, because you can find complete desktops using it for as low as £600. But DIY pricing is a bit higher than it should be, IMHO.

With the consoles having between 10GB and 12GB of VRAM, 12GB is probably the entry level for newer, more PC-focused games, IMHO.

If the Intel benchmarks are true and the drivers are OK, the B580 12GB is basically an RX 6700 XT with better RT, lower power consumption and a more advanced media engine.

I hope it also means the RX 8600 and RTX 5060 ship with at least 12GB of VRAM next year.
 


The rumor mill says the 8600 is going to be 8GB again, but we'll have to wait and see.
 

TL;DW: the B580 is mostly slightly slower than an A770 and slightly faster than an A750. I didn't see any UE5 games there, though, where the B580 is supposed to support hardware acceleration.
 
Slightly slower than the A770 should be the expectation. Nice generational jump.

Also, the MSI Claw 8 is looking good, and cheaper than current Strix competitors.


I still do not understand the choice to go for high refresh rate 1200p displays at this size. You could save battery life and money, and improve performance, all at the same time by going for a 1080p screen at 60 or even 75Hz.
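
For what it's worth, the raster cost of the extra vertical resolution is simple pixel arithmetic (resolutions as mentioned above):

```python
# Pixel-count comparison: 1920x1200 vs 1920x1080.
res_1200p = 1920 * 1200   # 2,304,000 pixels
res_1080p = 1920 * 1080   # 2,073,600 pixels

extra = res_1200p / res_1080p - 1
print(f"1200p shades ~{extra:.0%} more pixels per frame than 1080p")
# ~11% more pixels every frame, before any refresh-rate headroom is
# considered -- hence the battery and performance argument for 1080p@60.
```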
 
What's more interesting is the overall low performance of both cards in question. To get 60fps, settings have to drop significantly, to the point where VRAM is rarely a concern. Some games can't get to 60fps at all...
A lot of the clickbait testing in these YouTube videos, and the people commenting online, is kinda useless, as you said. It's all well and good showing an 8GB card stuttering while the higher-VRAM card doesn't, but what's the point when the higher-VRAM card, or the cards higher up in tiers/VRAM, can't even maintain 60fps?
To be honest, I would consider none of these cards good enough for modern gaming. I would include the 3060, 4060, B580, A770, and the AMD equivalents. The raster performance is just too low.

The next tier up (e.g. 6750 XT/4060 Ti) or the tier after that (6800/7700 XT) should be the bare minimum (with at least 12GB VRAM, obviously).

Shows you how far the budget segment of GPUs has fallen :(
Yep, it's bad, as I've been saying throughout all of these debates: the whole low to lower-mid range needs to get a lot faster and gain more VRAM; doing one or the other isn't going to cut it. Nvidia and AMD know what they are doing and will happily charge more for more VRAM on the lower-end cards, while leaving the 4070/4070 Ti/6700 XT/7700 XT at 12GB.

Getting back on topic, I would think Battlemage should mostly be concerned with making itself viable in the OEM/prebuilt market and building off that.
 
Pity they didn't just fit 16GB. 4060 performance with 16GB for £250 would be a purchase: it might not be blazing fast, but it would have enough VRAM to last a while, and be affordable compared to other rip-off models.

Equal to a 4070 would be even better, and would be a purchase.
 
But it's totally pointless - see for example the 16GB Radeon 7600 XT.


 


Will have to wait for official reviews in a couple of days, but I feel slightly underwhelmed by this one.

With the 50 series and 8000 series cards due around spring (maybe), is this where Intel needs to be?
 
"What users should note, however, is that Intel GPUs tend to perform very well in synthetic benchmarks, particularly in DirectX12-based tests like Time Spy. This performance does not always correspond to gaming performance, so the real value is almost certainly below 30% for the B580 vs. the RTX 4060. In fact, Intel themselves only mentioned a 10% increase over the RTX 4060 at 1440p, and the gap may be even lower at 1080p."
 
Reviews are out.

Hard to believe it's done that well... good news finally I guess. :D
Some more reviews and discussion in the other thread:
 
Honestly, I'm kinda shocked by the performance. I was expecting 4060 level (but actually slightly better), and I thought that would've been OK, but I see them actually struggling to even manage that. Worst of all, the RT numbers vs AMD are also pretty bad, which is where I'd have thought they'd wipe the floor with them. And of course plenty of the problems typical of Arc are still there. Basically it has 12GB of VRAM going for it on a budget card, but that's it. There's no way they survive if they only put out low-end cards and still can't outcompete there (especially when they already have to use a much larger die just to do that!). Don't forget, RDNA 3 and Lovelace are the old cards; this will have to compete for much longer with Blackwell and RDNA 4!

I genuinely think this kills their GPU division.

 