
BF4 Retail CPU scaling measured

Thing is, the game is going to be GPU bound.
No. The point is that the sequence they chose is badly GPU-bound. Imagine someone who doesn't know hardware all that well basing their purchase on that result and pairing an A10-5700 with a 290X. Yes, it might be all fine and dandy in the (traditionally boring) single-player campaign. However, once they start playing multiplayer, they're likely to see GPU usage on the 290X dip below 50% even on Ultra settings because of the CPU bottleneck.

Their comment that "As was the case with Battlefield 3, as long as your processor has four cores/threads, it shouldn't have a problem in EA's latest shooter" is simply ill-advised, because it is made as if single-player were the game's only mode and multiplayer didn't exist.

Nobody would blame them for not having multiplayer results, but at the very least they should warn people that multiplayer is more CPU-demanding than the sequence they benched, rather than make the comment above.
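The bottleneck described above can be sketched with a toy model. The fps figures here are hypothetical, not measurements: the point is that overall frame rate is capped by whichever side is slower, and GPU utilization collapses when the CPU can't feed it frames fast enough.

```python
# Toy CPU-bottleneck model (hypothetical numbers, not benchmark data).

def effective_fps(cpu_fps_limit, gpu_fps_limit):
    """Frame rate is capped by the slower of the two limits."""
    return min(cpu_fps_limit, gpu_fps_limit)

def gpu_utilization(cpu_fps_limit, gpu_fps_limit):
    """Fraction of the GPU's capacity actually used."""
    return effective_fps(cpu_fps_limit, gpu_fps_limit) / gpu_fps_limit

# Say an A10-class CPU can only prepare ~40 fps of multiplayer frames,
# while a 290X-class GPU could render ~90 fps at these settings:
print(effective_fps(40, 90))              # 40 fps overall
print(round(gpu_utilization(40, 90), 2))  # 0.44 -> GPU under 50% busy
```

The same model also shows why a canned single-player sequence hides this: raise the CPU-side limit above the GPU's and utilization goes back to 100%.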
 

If Mantle does what it says it does, it won't matter what CPU you have.

I'm definitely looking forward to the multiplayer results. I've got this feeling that stuff like the i3 and the locked i5 could well be left in the dust, but we'll see.
 

The CPU will still matter. Mantle (according to what we know so far) aims to use all eight threads more efficiently, so that it's far less reliant on single-thread performance, by moving much of the work currently piled onto a single core over to the GPU. The engine will still want to use eight threads, though.

Multiplayer benchmarks are near impossible, as every run will be completely different. The best approach is to find the most CPU-intensive parts of the single-player campaign to get an idea. Techspot clearly couldn't be bothered here.
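The claim that spreading submission work across threads reduces reliance on single-thread speed can be illustrated with a toy model. All the millisecond figures below are made up for illustration; this is not how Mantle actually works internally, just the arithmetic behind the argument.

```python
# Toy model: CPU-side frame submission cost split across worker threads
# (hypothetical numbers, purely to illustrate the threading argument).

def frame_time_ms(render_ms, submit_ms, threads):
    """Frame time is bounded by the slower of GPU render time and
    per-thread CPU submission time (assuming submission splits evenly)."""
    per_thread = submit_ms / threads
    return max(render_ms, per_thread)

# GPU needs 10 ms/frame; CPU submission work costs 24 ms in total.
print(frame_time_ms(10, 24, 1))  # one thread: 24 ms/frame, CPU-bound
print(frame_time_ms(10, 24, 8))  # eight threads: 10 ms/frame, GPU-bound
```

Under this (idealised) split, a slow CPU with eight usable threads can keep up with a fast GPU, which is exactly why the engine "will still want to use 8 threads".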
 
From what we've all read about Mantle, while AMD did say it would reduce (not completely remove) the stress placed on the CPU and help improve efficiency, to what extent is anyone's guess.

Even with the help of Mantle, realistically speaking I still think it would take a miracle for a low-end CPU (i.e. A10, FX-4) not to bottleneck high-end cards like a 290X or GTX 780, or SLI/Crossfire setups of 79xx or GTX 6x0 cards, in intensive multiplayer battles.
 

For years now GPUs have had more than enough hardware packed in to run a game with barely any help from the CPU. In technical terms, GPUs are light years ahead of CPUs. Yet we're being held back by nothing other than the API used to run the games.

Many years ago you could take a crap 1MB onboard GPU (one like the S3 ViRGE that cost around £20 as a PCI card), link it to a 4MB 3dfx Voodoo, and you could then run stuff like Quake on any old Pentium CPU and you'd be laughing.

IIRC it was Glide being used as the API for that.

Then M$ came along with DirectX, which was bloody awful. To run something like Daytona USA or Sega Rally you needed a seriously hefty PC, CPU and all. Compared to Screamer, which ran in DOS, they looked hardly any better, yet the performance hit from using DX was massive.

However, being unified, easy and popular made DX a success, even though it was god awful. It's called laziness. What DICE are doing with Mantle is hard work, just like making their game support multiple CPU architectures. Very seldom do companies even bother.

Quake was an exception to the rule as it used Glide. Few games supported the original Voodoo cards; most of them still ran in DOS.
 
Thing is, the game is going to be GPU bound. Which is why an AMD CPU makes more sense, simply because it's cheaper. I priced up two bundles today. One was:

AMD FX 6300 £84.87
Asus M5A97 R2.0 £59
Crucial Ballistix 2x4GB DDR3 £53

Total = £196.87. The 4670K alone costs £174, only about £23 less than the entire AMD bundle. The board I picked is full sized, 4+2 not 4+1, and has great VRM cooling. Onto the big hitter...

AMD FX 8320 £116.93
Asus M5A99X Evo R2.0 £95.98
Crucial Ballistix 2x4GB DDR3 £53

Total = £265.91. The i7 4770K costs £237.40. So again, for £28 or so more than the CPU alone you can have a full bundle, rather than just a CPU.

Then you can use the money you saved (well in excess of £100 on either system) to put towards your GPU. So instead of getting an R9 280X you can get an R9 290, and your money has gone where it matters, not on stupid snobbery.
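The price arithmetic in the comparison above can be checked quickly (prices as quoted in the post):

```python
# Sanity-check the bundle-vs-CPU price comparisons quoted above.
fx6300_bundle = 84.87 + 59.00 + 53.00    # FX 6300 + M5A97 R2.0 + 8GB RAM
fx8320_bundle = 116.93 + 95.98 + 53.00   # FX 8320 + M5A99X Evo R2.0 + 8GB RAM

i5_4670k = 174.00
i7_4770k = 237.40

print(round(fx6300_bundle, 2))             # 196.87
print(round(fx6300_bundle - i5_4670k, 2))  # 22.87 more than the i5 alone
print(round(fx8320_bundle, 2))             # 265.91
print(round(fx8320_bundle - i7_4770k, 2))  # 28.51 more than the i7 alone
```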

Are you calling me a snob? Because if you are, I will expect an apology.


What you failed to notice, or worse, chose to ignore, was that the point of that review was to show CPU scaling.
You cannot do that if the GPU is maxed out.

It was an epic fail on the part of the reviewer.
 
I think you are seriously getting ahead of yourself... not even AMD themselves have claimed, or even hinted, that Mantle would reduce the importance of the CPU to that extent.

It's all fine posting what you think for the sake of discussion, but posting it as if it were already proven, and lecturing others like it's fact, is a bit much.

The most I would tell people about Mantle at this point is its "potential"... as for actual execution and real-world performance, I would not present things that could turn out to be untrue as advice to others until actual reviews of Mantle's effect are out... I would suggest you do the same.
 
If GPUs are being run at their absolute 100%, and benchmarking programs cannot possibly push any more out of them at given clocks than they already do, how the heck could GPUs be "light years" ahead of CPUs?

Check PhysX, for example, Nvidia's lovechild. It can be run on either the CPU or the GPU, with a similar performance hit on both. Nothing to do with DirectX, by the way. And yes, PhysX is not an API, but it is a comparison of something that can be run on both CPU and GPU in a similar way.

You cannot possibly say the things you say without qualifying them as opinions.
 
AMD and Nvidia are always good at making noise, but we very rarely get anything substantial from it.

Take TressFX: it was a one-hit wonder (and IMO it sucked).

Mantle is completely different, but it's all too similar. It has potential, as Marine says, but until we see sustained results, that's all it has.

There's also noise about OpenGL getting low-level support at the same level as Mantle, from the software "genius" Carmack.

If that is the case, OpenGL being readily used would open up gaming massively: efficient, native Linux and Mac gaming.
 
With at least 15 games apparently using Mantle instead of DirectX, it's going to have an impact. The degree just isn't known yet.
 
One thing those results don't really show is how smooth the different CPUs feel. At least going by the BF4 beta, having had a play with a few different CPUs, there's quite a big difference in how smooth they feel even at the same framerates, i.e. 4-core CPUs tend to feel bogged down a lot more in play even compared to slower 6- or 8-core CPUs. (This is fairly unique to BF4.)
 

With MSI's latest beta, we can monitor CPU usage, frametimes etc., can't we?
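Once frametimes are logged (monitoring tools of this kind can dump them to a file), a short script can capture the "smoothness" point made above: two runs with the same average fps can feel completely different if one has spikes. The sample values below are made up for illustration.

```python
# Summarize a list of frametimes (ms) to gauge smoothness, not just average fps.

def summarize(frametimes_ms):
    times = sorted(frametimes_ms)
    avg_ms = sum(times) / len(times)
    p99_ms = times[int(0.99 * (len(times) - 1))]  # 99th-percentile frametime
    return {"avg_fps": 1000 / avg_ms, "p99_ms": p99_ms}

# Two made-up runs with near-identical average fps...
smooth = [16.7] * 100                    # steady ~60 fps
stuttery = [12.0] * 90 + [60.0] * 10     # mostly fast, but regular spikes

print(summarize(smooth))     # ~60 avg fps, p99 of 16.7 ms
print(summarize(stuttery))   # ~60 avg fps, p99 of 60 ms -> feels far worse
```

An average-fps bar chart would rate these two runs the same; the percentile frametime is what separates the "bogged down" feel from the smooth one.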
 

I will reserve judgement on Mantle until it's out and the benefits can be seen. Nevertheless, I am excited about it. In the early days of 3D cards, games often had a standard DirectX version and an optimised version for Glide, PowerVR or both. After a year or so the optimisations stopped because, I assume, games companies realised that people would continue to buy games even if only the DX version was available.
 