
***THE BF4 BENCHMARKS THREAD***

So out of 100% that have tried Mantle, 80% love it while the other 20% suffer from PEBKAC. :D

Not quite.

3267 people who voted used Mantle, which is 45% of the vote.

23% of all voters loved it,

22% chose the other three options, which means roughly 51% of those 3267 Mantle users loved it, and 49% did not.
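The split quoted above can be sanity-checked with quick arithmetic (taking, as stated, 45% of all voters as Mantle users, 23% of all voters loving it and 22% picking the other options):

```python
# Poll arithmetic: shares are fractions of ALL voters, per the post above.
mantle_share = 0.45   # fraction of all voters who tried Mantle (3267 people)
loved_share = 0.23    # fraction of all voters who loved it
other_share = 0.22    # fraction who tried it but picked one of the other 3 options

# Rebase onto Mantle users only: divide each share by the Mantle share.
loved_pct = loved_share / mantle_share * 100   # ~51.1% of Mantle users loved it
other_pct = other_share / mantle_share * 100   # ~48.9% did not

print(round(loved_pct, 1), round(other_pct, 1))  # 51.1 48.9
```

So the true split is closer to 51/49 than 52/48, but the conclusion is the same: Mantle users were divided almost down the middle.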
 

Not quite. Those with Nvidia and those that are waiting for a more refined version have not even tried it. Their opinions don't really count, as they have not experienced Mantle. The people that used and loved it, along with the people that went back to DX, are the only people that have experienced Mantle. These are the only people qualified to judge the experience. Was a wild weekend, so won't be doing any number crunching lol.
 
Check the gain here. I would like to see some of you guys with Nvidia post some screens from this location.
:D

Settings and scene as posted by you:

171FPS
resort_c_dx_scene.png

At 140% Resolution Scaling:

131FPS
resort_c_dx_set140.png
resort_c_dx_scene140.png

Double 780Tis
 
I got between 155 and 160fps; that is not bad for stock 290s, just 11fps behind.

Don't even think I got 100% GPU usage in that scene at those settings. Anything above and beyond 120FPS is just lunacy though :p

Comparing like for like, DX vs DX, it's over a 60 FPS difference. Take from that what you will - which, tbh, I have no idea :D

Mantle giving your 2600k some headroom compared to DX perhaps.
 

I haven't tested, but Matt pointed out that BF4 DX11.1 on the 14.2 driver isn't giving good performance. If I wanted better DX performance for BF4 I'd use the 13.12 driver.
And I totally agree that going over 120fps is pointless; I lock mine to 120fps.
 

I've started locking my fps to 121 also. If anything it keeps temps down instead of ramming both cards at 98% plus with 180 FPS :D
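The caps discussed above (locking to 120 or 121 FPS) come down to enforcing a per-frame time budget: if a frame finishes early, the limiter sleeps instead of rendering another one, which is exactly why GPU usage and temperatures drop. A minimal sketch of that idea (illustrative only, not the game's actual limiter):

```python
import time

def frame_limiter(target_fps):
    """Generator that sleeps between yields so iterations run at most target_fps/sec."""
    budget = 1.0 / target_fps            # seconds per frame, ~8.33 ms at 120 FPS
    next_deadline = time.perf_counter()
    while True:
        yield                            # caller renders one frame here
        next_deadline += budget
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)        # idle instead of burning GPU on extra frames
        else:
            # Frame took longer than the budget; reset so we don't "catch up" in a burst.
            next_deadline = time.perf_counter()

# Usage: 12 "frames" at a 120 FPS cap should take roughly 11 * 8.33 ms ~ 0.09 s.
limiter = frame_limiter(120)
start = time.perf_counter()
for _ in range(12):
    next(limiter)
elapsed = time.perf_counter() - start
```

Real limiters (in-engine caps, RivaTuner, etc.) use higher-resolution waits, but the budget-and-sleep structure is the same.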
 
I'll be using the two x16 slots though. Just testing on this board that's going back; the space between the x16 slots is twice that of my P8P67 Deluxe, and that had a decent space between the GPUs.

I know this is off topic but, sorry!
I'm hoping that running 16x/16x gives some benefit to the 290s, as I'm running mine at 16x/16x PCIe 3.0 (basically, I don't think X79 + SB-E technically supports PCI-e 3.0, but it's running as GEN3 in BIOS). However, breaking it down I'd have to say I'm sceptical; here's my logic...

XDMA Crossfire is AMD tech. I believe the top AMD motherboard chipset is the 990FX. Its specs say it can run dual PCI-e 2.0 x16. It also says it supports 4-way CrossfireX, which would mean running PCI-e 2.0 8x/8x/8x/8x.
So if PCI-e 2.0 @ 8x isn't enough for XDMA Crossfire, then even AMD's top-of-the-range chipset would be unable to run their own hardware without issue/bottleneck (admittedly in extreme circumstances). Intel would be able to run them at PCI-e 3.0 8x/8x/8x/8x, which I believe is effectively PCI-e 2.0 @ 16x/16x/16x/16x. So XDMA Crossfire could in theory give Intel CPUs a performance boost. That doesn't seem like a great idea for AMD, especially with Mantle looking to make AMD CPUs more competitive from a gaming point of view.
Removing the CPU bottleneck from both platforms and then introducing a PCI-e bandwidth issue to your own platform but not your competitor's seems silly.
Interested to know if AMD have done this though, so I await your findings.
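The "3.0 x8 is effectively 2.0 x16" claim in the post above checks out from the published per-lane numbers: PCI-e 2.0 signals at 5 GT/s with 8b/10b encoding, PCI-e 3.0 at 8 GT/s with 128b/130b encoding.

```python
# Usable per-lane bandwidth in GB/s: raw transfer rate * encoding efficiency / 8 bits.
pcie2_lane = 5e9 * (8 / 10) / 8 / 1e9      # 0.5 GB/s per lane (5 GT/s, 8b/10b)
pcie3_lane = 8e9 * (128 / 130) / 8 / 1e9   # ~0.985 GB/s per lane (8 GT/s, 128b/130b)

gen2_x16 = pcie2_lane * 16   # 8.0 GB/s per slot
gen3_x8 = pcie3_lane * 8     # ~7.9 GB/s per slot

# Gen3 x8 carries roughly the same bandwidth as Gen2 x16, so an Intel board
# running 3.0 8x/8x/8x/8x really does match 2.0 16x/16x/16x/16x per card.
print(round(gen2_x16, 1), round(gen3_x8, 1))  # 8.0 7.9
```

Whether XDMA actually saturates a 2.0 x8 link in practice is a separate question, but the bandwidth equivalence itself holds.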
 

It wasn't that 8x wasn't enough; it was the NF200 chip not playing nicely with XDMA.
 
The NF200 is essentially a PCI-e switch, hence why XDMA doesn't play nicely with it - instead of having two true x16 ports you have one x16 link split in two, and it switches between them as needed/load-balanced. So one device can't truly talk to another in a different slot in the same clock cycle.

It's great for SLI however.
 