
Battlefield 4 Video Card Performance - AMD Vs Nvidia

AMD cards providing the better gameplay experience in Battlefield 4.


Apples To Apples


[Apples-to-apples benchmark charts]




Highest Playable Settings


[Highest playable settings charts]



From a gameplay performance perspective, the AMD Radeon R9 290X came out on top of the heap, finding itself playable at 2560x1600 with all in-game settings set to maximum values and 2X MSAA enabled. The GeForce GTX TITAN followed, playable at the same settings, but it produced an average frame rate about 10 FPS below that of the R9 290X.

Taking a step down from the top two performers, the AMD Radeon R9 290, AMD Radeon R9 280X and NVIDIA GeForce GTX 780 were all playable at 2560x1600 with all in-game settings set to maximum values and low FXAA enabled. The GeForce GTX 770 was able to play at 1920x1080 with all in-game settings set to maximum values and 2X MSAA enabled.

Rounding out the lineup, both the NVIDIA GeForce GTX 760 and the AMD Radeon R9 270X were capable of playing at 1920x1080 with all in-game settings set to maximum values and low FXAA enabled.

From an overall gameplay perspective, it seems that the Frostbite 3 engine has gone a long way toward preventing significant sudden frame rate dips from impacting the gaming experience. In Battlefield 3, these frame rate dips (for example, when taking out a tank) could easily make escaping the situation futile. With Battlefield 4, these dips do not feel like they bring the action to a halt.

What was more disconcerting was the inconsistent performance between AMD and NVIDIA cards. Overall, AMD cards tended to perform better than their NVIDIA counterparts at a particular price point, with the exception of the GeForce GTX 760 vs. the Radeon R9 270X.

With each NVIDIA card, we observed frame rates that were far more varied than with AMD-based cards over the course of playing the game. It almost seems that the performance concerns we had with NVIDIA cards during our Beta evaluation have only been partially fixed at this point.


Full Article
http://www.hardocp.com/article/2013...deo_card_performance_iq_review/1#.UopuzsTBT_E
 
I hate HardOCP. An example of why I hate them: the 770 has 2xMSAA whereas the others only have FXAA.

: /

All that "highest playable settings" stuff is rubbish.
 
I hate HardOCP. An example of why I hate them: the 770 has 2xMSAA whereas the others only have FXAA.

: /

All that "highest playable settings" stuff is rubbish.

Check out the apples to apples section then, Jono, and ignore the rest. Apples to apples = same settings for all cards. :)

The highest playable settings are variable settings, using the maximum possible for the card in question while maintaining what they deem playable fps.
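
For anyone unfamiliar with the method, here is a rough sketch of what that "highest playable settings" testing boils down to. The settings list, the 30 fps floor and the benchmark_run() helper are all my assumptions for illustration, not HardOCP's actual procedure (theirs is manual play-testing):

```python
from typing import Optional

# Candidate configurations, ordered from most to least demanding (assumed list).
CONFIGS = [
    {"resolution": "2560x1600", "quality": "Ultra", "aa": "4xMSAA"},
    {"resolution": "2560x1600", "quality": "Ultra", "aa": "2xMSAA"},
    {"resolution": "2560x1600", "quality": "Ultra", "aa": "FXAA Low"},
    {"resolution": "1920x1080", "quality": "Ultra", "aa": "2xMSAA"},
    {"resolution": "1920x1080", "quality": "Ultra", "aa": "FXAA Low"},
]

PLAYABLE_FLOOR = 30.0  # minimum average fps deemed "playable" (assumption)


def benchmark_run(card: str, config: dict) -> float:
    """Placeholder for a multiplayer run-through returning the average fps."""
    raise NotImplementedError("stand-in for an actual play-through")


def highest_playable(card: str) -> Optional[dict]:
    """Return the most demanding config the card keeps above the fps floor."""
    for config in CONFIGS:  # most demanding first
        if benchmark_run(card, config) >= PLAYABLE_FLOOR:
            return config
    return None  # nothing tested was playable
```

So each card ends up at different settings by design, which is why the charts aren't directly comparable card-to-card.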
 
I bet this thread lasts. Tried to do an OcUK user one but the mods moved it into the other thread.

HardOCP don't even use the same settings, so it's pointless doing a Vs.
 
Do they not have apples to apples of Titan/290X?

At 1080?

2560x1600 with 4xMSAA is a bit overkill on settings, personally, and is likely to be extremely taxing on memory bandwidth, so the 290X lead there is unsurprising. Would have been good to see an OC comparison too, but c'est la vie.

Cards are pretty close though, all things considered. The Titan's lead over the stock 780 is surprising.
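
On the memory bandwidth point, a back-of-the-envelope sketch of why 4xMSAA at that resolution is so heavy. My assumptions: 4 bytes of colour plus 4 bytes of depth/stencil per sample and no compression, so real figures on modern GPUs will be lower, but the ratio is the point:

```python
# Back-of-the-envelope framebuffer size at 2560x1600, with and without 4xMSAA.
def framebuffer_mb(width: int, height: int, samples: int,
                   bytes_per_sample: int = 8) -> float:
    return width * height * samples * bytes_per_sample / (1024 ** 2)

no_aa = framebuffer_mb(2560, 1600, samples=1)   # ~31 MB
msaa4 = framebuffer_mb(2560, 1600, samples=4)   # ~125 MB

print(f"No AA : {no_aa:.0f} MB")  # much of this is read/written every frame
print(f"4xMSAA: {msaa4:.0f} MB")
```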
 
Phew! Thanks for the confirmation on settings, HardOCP/Matt, I game at 2560x1600 ;) :cool:

It's good to know the 290X handles it, even before Mantle :D

Speaking of Mantle...

“[We] will definitely compete with the GTX 780 and Titan” ... “with Battlefield 4 running with Mantle (AMD’s new graphics API), the card will be able to ‘ridicule’ the Titan in terms of performance.”

Source
http://www.cinemablend.com/games/AMD-Says-R9-290X-Ridicule-Nvidia-Titan-Performance-59418.html
 
I bet this thread lasts. Tried to do an OcUK user one but the mods moved it into the other thread.

HardOCP don't even use the same settings, so it's pointless doing a Vs.

You know mate, your thread would have been an actually useful user comparison bench; this thread will just turn into another Purple vs Pink :D

Both colours I like; given the choice it would be purple though... maybe pink. Damn, hard choice!!
 
Thanks for posting. Two questions:

How come the 280X is within 10% of the 290X on average frames? 61.7 vs 66.9 fps. Shouldn't the 290X be more like 30% faster?

Also, why are the 290 and 290X min frames varying by about 50% when the average frames are so similar?

I have a feeling there is going to be some debate on these charts in general.
 
Thanks for posting. Two questions:

How come the 280X is within 10% of the 290X on average frames? 61.7 vs 66.9 fps. Shouldn't the 290X be more like 30% faster?

Also, why are the 290 and 290X min frames varying by about 50% when the average frames are so similar?

I have a feeling there is going to be some debate on these charts in general.

The inconsistencies of multiplayer benchmarks. My minimum fps can vary quite a bit depending on what happens.
 
Am I missing something? Why no 780Ti? Not giving it the time of day, TBH. You've only got to look at the posts in this section of the forum to know that AMD users aren't having a rosy experience either.

Honestly, coming at you as unbiased as I can - I wouldn't trust HardOCP to fine-tune the air in my tyres. And I say that knowing that 290X users seem to be out-framing Titan/780 users. It's just that HardOCP's testing layout/methods are about as useful as an impact parachute.
 
Thanks for posting. Two questions. How come the 280X is within 10% of the 290X on average frames? 61.7 vs 66.9 fps. Shouldn't the 290X be more like 30% faster?

Also, why are the 290 and 290X min frames varying by about 50% when the average frames are so similar?

Different resolutions, one being 1080 (280X), the other 1600 (290X).
As for the differences in minimums, no idea other than maybe server-side? (Are they even testing on MP?)
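
For what it's worth, the arithmetic behind that "within 10%" figure, and why the resolution difference matters:

```python
# Average fps quoted in the question above; per the reply, they were taken
# at different resolutions, so the ~8% gap is not a like-for-like comparison.
r9_290x_avg = 66.9  # fps at 2560x1600
r9_280x_avg = 61.7  # fps at 1920x1080

gap_pct = (r9_290x_avg / r9_280x_avg - 1) * 100
print(f"290X ahead by {gap_pct:.1f}% on average fps")  # ~8.4%
```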
 