
(gamegpu) Battlefield 1 Benchmarks

I noticed that DX12, whilst hurting the performance of the Nvidia GPUs, is less CPU limited. Compare the AMD FX results on the Nvidia GPU between DX11 and DX12. Something worth considering for those with AMD CPUs and Nvidia GPUs.
 
+2

I think too many people get hung up on maximum framerates and even averages.

I know I would much rather play a game on a card which gave a min of 40 and a max of 50fps with an average of 45, rather than another card which is technically 10% faster with an average of 50fps but has a max of 90fps and a min of 10.
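To put rough numbers on that, converting those fps figures to frame times shows how much wider the second card's swing really is. A quick Python sketch, using only the hypothetical card figures from the post above:

```python
# Hypothetical cards from the post: fps figures (min, avg, max).
card_a = {"min": 40, "avg": 45, "max": 50}
card_b = {"min": 10, "avg": 50, "max": 90}

def frame_time_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

def frame_time_swing(card):
    """Worst-case spread in frame delivery time, in ms.

    The slowest frame arrives at min fps, the fastest at max fps.
    """
    return frame_time_ms(card["min"]) - frame_time_ms(card["max"])

print(round(frame_time_swing(card_a), 1))  # card A: 25.0 - 20.0 = 5.0 ms swing
print(round(frame_time_swing(card_b), 1))  # card B: 100.0 - 11.1 = 88.9 ms swing
```

So the "10% faster" card can deliver frames anywhere across an ~89ms spread, while the slower card stays within 5ms of itself, which is what you actually feel in play.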

Exactly my point, especially as adaptive sync screens are more and more common, and especially if you're using Freesync, as the ranges vary a lot.

Some Freesync screens have really tight ranges too: some are 48-75Hz, while lots are 35/40-144Hz.

You might find your card handles 1080p fine at 60Hz on Ultra and runs over the 60fps limit all the time; then you go and buy a 27" Freesync screen that's 1440p and suddenly find you're only getting 60fps on that screen, which is fine, as you're sat in the Freesync range.

Then you fire up a new AAA title that has both DX11 and DX12. DX11 nets you 70fps on your new screen, but in some parts of the game this dips to 35fps; you start noticing freezes and hitching and wonder why Freesync is not working (you've dropped below the bottom of your Freesync range).

So you switch to DX12. You're only getting 65fps now :( but wait: in that part where DX11 was getting 35fps, you are now getting 42fps and it's still buttery smooth, wow...

I was worried about the hit I would take going from my 290 and 1080p 60Hz screen to my 27" 1440p 40-144Hz Freesync screen. At the time I was mainly playing The Division; I benched the game on Ultra settings at 1080p and got something like 80fps, then tested it again at 1440p and got around 60fps. I was well within the Freesync range, but I was worried about dipping out of it, so I adjusted some settings and brought my average back up to around 85fps, with minimums never going below roughly 55fps. I'm sat in the Freesync range and it's lovely. However, if this game had been DX12 I would probably have been able to leave the settings on Ultra without having to fiddle to keep frames in range.

This is the point of DX12 to me, enabling me to enjoy the game with more bells and whistles while keeping an acceptable frame rate.
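The scenario above boils down to a simple range check. A minimal Python sketch, assuming the example figures from the posts above (a 40-144Hz Freesync window, with the hypothetical DX11 and DX12 dips):

```python
def in_sync_range(fps, low, high):
    """True if the current frame rate sits inside the adaptive sync window."""
    return low <= fps <= high

FREESYNC_LOW, FREESYNC_HIGH = 40, 144  # example 40-144Hz window from the post

# DX11: 70fps normally, but dips to 35fps -> drops out of the window (hitching).
print(in_sync_range(35, FREESYNC_LOW, FREESYNC_HIGH))  # False
# DX12: lower peak (65fps), but the dip only goes to 42fps -> stays in range.
print(in_sync_range(42, FREESYNC_LOW, FREESYNC_HIGH))  # True
```

The function name and figures are just illustrative; the point is that the dip, not the average, decides whether adaptive sync keeps working.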
 
 
It's like "to hell with progress" because my Nvidia GPU does not have the capability, but when Nvidia do eventually get there, most likely with Volta, they will be shouting about DX12 from the rooftops.
 

This is all well and good but the gamegpu benches show that the minimums are worse in DX12 as well (for Nvidia) in this game.

It is a fair point - from the benches in the OP it looks like DX12 is useless for Nvidia owners (unless it adds anything graphically / makes the game look nicer, but it doesn't appear that it does).
 
It's like "to hell with progress" because my Nvidia GPU does not have the capability, but when Nvidia do eventually get there, most likely with Volta, they will be shouting about DX12 from the rooftops.

Lol, it's always been the same, hasn't it?

Pretty sure that by the time Nvidia uses something, they'll claim they invented it or were the first.
 
Oh god, how many times do we have to say it? It's not all about high fps numbers. How is having 100fps useful if your monitor is only 60Hz? How is having 200fps useful if your adaptive sync range is 40-144Hz?

What IS useful, and this is the part you're struggling with, is that DX12 brings up the minimums, so it lessens the variance and the spikes you get.

If I'm getting 100fps with dips to 20fps using DX11 on a 40-144Hz screen, but DX12 gives me 90fps with dips to 40fps, which should I use?

Come on, Sherlock, even you should be able to work that one out.

Clue: it's not just about high fps numbers, it's about a smoother experience.
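Put the two options side by side and the answer falls out: what matters is whether the whole fps span stays inside the 40-144Hz window, not the peak. A hedged Python sketch using just the numbers from the post (the function and dict names are made up for illustration):

```python
SYNC_LOW, SYNC_HIGH = 40, 144  # adaptive sync window from the post

# (min fps, max fps) for each API, per the example above.
options = {"DX11": (20, 100), "DX12": (40, 90)}

def stays_in_window(min_fps, max_fps, low=SYNC_LOW, high=SYNC_HIGH):
    """True if the entire fps span sits inside the sync window."""
    return min_fps >= low and max_fps <= high

for api, (lo, hi) in options.items():
    print(api, stays_in_window(lo, hi))
# DX11 fails (dips to 20fps fall below the 40Hz floor); DX12 passes.
```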

+5 :D
 
I have to ask: running an Asus GTX 970 Strix, is it worth a cross-grade to an RX 480 for BF1 for the DX12 support? The min/max on the first page would suggest it is...

Primarily play BF1, GTAV and Star Wars Battlefront...
 
This is all well and good but the gamegpu benches show that the minimums are worse in DX12 as well (for Nvidia) in this game.

It is a fair point - From the benches in the OP it looks like DX12 is useless for Nvidia owners (unless it adds anything graphically/makes the game look nicer but it doesn't appear that it does)

I think this points squarely at the hardware architecture. Nvidia rely on their software drivers to pull them ahead in DX11, but in DX12 a lot of that driver work is probably bypassed, unused or simply doesn't apply.

AMD suffer in DX11 because their drivers are nowhere near as good, or their hardware just can't run as well as Nvidia's, plus they are burdened with DX12-oriented hardware onboard going unused. But in DX12 this hardware gets used, and their drivers get a new lease of life as they can finally utilise it.

This, I think, is why most people say Nvidia currently do not have DX12 hardware and tackle everything in software; hence the poor gains in DX12 and the better showing in DX11, where it's mostly the reverse for AMD.
 
I find it a lot of fun that the laughing stock of 2013 (Hawaii XT) is the one proving to be the real tough old ******. It's heroic, like a 60-year-old Rocky Balboa still able to hold his own in a fight with a champ two generations younger.

After all the hate laid on it three years ago, I'd like to award it the Victoria Cross; it's a legend.

I love watching it bosh its younger rivals round the ears.
 
I have to ask: running an Asus GTX 970 Strix, is it worth a cross-grade to an RX 480 for BF1 for the DX12 support? The min/max on the first page would suggest it is...

Primarily play BF1, GTAV and Star Wars Battlefront...

If it were me? I would check benchmarks from multiple sources, if possible, for those two cards and those games, and use that to weigh my decision.

A lot depends on budget as well, and how long you intend to keep the card before upgrading.

You may find it better to just buy a 1070 or a 1080 now, or wait for Vega / Volta.

Personally, if you're not going to upgrade again for, say, three years and your budget's around £250, then the 480 is a good card: where the Nvidia card will not get optimised drivers for newer games, the AMD card will keep getting driver updates, as the hardware is almost the same across many of their GPUs.
 
I find it a lot of fun that the laughing stock of 2013 (Hawaii XT) is the one proving to be the real tough old ******. It's heroic, like a 60-year-old Rocky Balboa still able to hold his own in a fight with a champ two generations younger.

After all the hate laid on it three years ago, I'd like to award it the Victoria Cross; it's a legend.

Still rocking my 290, Bug. I've been tempted lately to buy two of those XFX GTR cards, or the HIS cards, and never look inside my case, lol.

But with rumours of a Polaris refresh already, and being so close to Vega, I'm resisting the itch to just buy them and am waiting to see what's coming early next year.
 
Oh god, how many times do we have to say it? It's not all about high fps numbers. How is having 100fps useful if your monitor is only 60Hz?

FPS != monitor refresh rate. They've got nothing to do with each other! Please stop spreading this nonsense once and for all. You can still experience a benefit from 300fps on a 60Hz monitor (think framebuffer rather than panel - the scan itself doesn't take 16.67ms; the monitor just updates the screen with whatever is in the GPU's framebuffer at the exact time of each refresh):
[image: frame delivery diagram]


Say you have an object the size of 1px moving at a speed of 1px per 8.33ms from left to right; this is what you'll experience at 60fps and 120fps on a 60Hz monitor, assuming your frame times are completely smooth:

[image: 60fps vs 120fps comparison, no tearing]


Now you also need to keep in mind that frame times aren't consistent, which can cause all sorts of trouble, like the same frame being shown twice (vsync). This is why minimum fps matters, but only if those frame time spikes happen often! It means you should aim for a system capable of delivering a consistent frame time lower than your screen's refresh interval (so on a 60Hz monitor you want a system that consistently outputs frames faster than 16.67ms). That is why charts showing min, avg and max fps are nearly useless; what you really want is a graph showing the time it takes the system to render each frame!

The lower the frame time, the newer the frame on your screen is going to be! At 240fps the difference between what is actually happening and what is drawn by your GPU will vary between 0 and 4.16ms, while it can vary between 0 and 16.67ms when you are running at 60fps. Combine this with a 60Hz monitor whose refresh isn't synced to your fps and you'll run into all sorts of trouble, especially if your frame times vary constantly.

So you can easily begin to see how a high frame rate is actually better even on a 60 Hz screen, especially when it comes to competitive games.

A higher refresh rate also reduces lag in the exact same way, even if you are running a frame rate that is lower than the refresh rate of the screen!
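The latency figures above follow from frame time alone: the frame the monitor scans out can be anywhere from brand new to one full frame interval old. A small Python sketch of that arithmetic (just the 60fps and 240fps cases from the post):

```python
def frame_interval_ms(fps):
    """Time between rendered frames in milliseconds."""
    return 1000.0 / fps

# Worst-case age of the frame being scanned out: one full frame interval.
for fps in (60, 240):
    print(f"{fps} fps: frame age varies between 0 and {frame_interval_ms(fps):.2f} ms")
# 60 fps  -> 0 to 16.67 ms
# 240 fps -> 0 to  4.17 ms
```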

What IS useful, and this is the part you're struggling with, is that DX12 brings up the minimums, so it lessens the variance and the spikes you get.

If I'm getting 100fps with dips to 20fps using DX11 on a 40-144Hz screen, but DX12 gives me 90fps with dips to 40fps, which should I use?

Come on, Sherlock, even you should be able to work that one out.

Clue: it's not just about high fps numbers, it's about a smoother experience.

Again, this doesn't matter if the minimum fps dips only happen for a few frames. Having a higher minimum fps doesn't automatically lead to a smoother experience, as I already explained; what you want is less variance!

I'll try to explain this best as I can:

r---r-r---r-r---r-r---r-r---r-r---r-r---r-r---r-r---r-r---r => avg. -- between r's, min. -, and max. ---

r--r--r--r--r--r--r--r--r--r--r--r--r-r---r-r--r--r--r--r-r => same avg. --, min. - and max. ---, yet this would feel a lot smoother because the time between r's is more consistent!
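The two timelines above can be expressed as frame-time lists with identical min, average and max but very different consistency. A Python sketch (the millisecond values are made up to mirror the diagrams, not measured data):

```python
from statistics import mean, pstdev

# Two frame-time traces (ms). Same min (10), same max (30), same average (20),
# but the first alternates wildly while the second is mostly steady.
spiky = [10, 30] * 10           # 10, 30, 10, 30, ...
steady = [20] * 18 + [10, 30]   # mostly 20 ms, with one dip and one spike

for trace in (spiky, steady):
    print(min(trace), round(mean(trace), 1), max(trace), round(pstdev(trace), 2))
# Both traces share min/avg/max of 10 / 20.0 / 30; only the standard
# deviation differs, and the deviation is what you feel as stutter.
```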

EDIT: Found a video that explains this better:

 
DX12 looks good for lower-end CPUs (especially AMD 6-8 thread), but the GPU bottleneck threshold comes down compared to DX11.

Edit: Although 4-thread FX chips seem to get almost exactly the same framerate in DX11 and DX12. Floating point limited, I wonder?
 
Lol Si, when Volta lands, the DX12 bashing will probably stop if they pull off a 480 (without the flaws) again. It's almost as bad as Nvidia missing DX11; thank **** there are hardly any titles, but more importantly, at least they can do some of it this time. :D
 
FPS != Monitor Refresh Rate. They've got nothing to do with each other! You can still experience a benefit from 300 fps on a 60 Hz monitor [...]

Again this doesn't matter if the minimum fps dips only happen for a few frames. Having a higher minimum FPS doesn't automatically lead to a smoother experience as I already explained - what you want is less variance!

This has nothing to do with displaying the frame rate. A 60Hz display can only display 60 frames per second, one every 16.67ms; what you do get, however, is more responsive input, because the frame rate isn't being capped.

Frame latency and input latency are two different things. People getting a better playing experience from an uncapped frame rate at 60Hz is all down to the more responsive controls making them believe they're seeing more frames.

Also, having a higher minimum frame rate makes games play much smoother, plus you get the added benefit of DX12's frame latency being more consistent = smoother gameplay. I do agree that the frame rate also needs to avoid jumping up and down in big swings for a game to play smoothly.

You don't want 100fps dropping to 60fps and back up again, as this will result in judder; your best bet is to cap at 60fps here.
 