I think too many people get hung up on maximum frame rates, and even averages.
I know I would much rather play a game on a card which gave a min of 40 and a max of 50 fps with an average of 45, rather than another card which is technically 10% faster with an average of 50 fps but a max of 90 and a min of 10.
but wait, in that DX11 part where you were getting 35fps you are now getting 42fps and it's still buttery smooth, wow...
	snip
It's like to hell with progress because my Nvidia GPU doesn't have the capability, but when Nvidia do eventually get there, most likely with Volta, they will be shouting about DX12 from the rooftops.
Oh god, how many times do we have to say it: it's not all about high fps numbers. How is having 100fps useful if your monitor is only 60Hz? How is having 200fps useful if your adaptive sync range is 40-144Hz?
What IS useful, and this is the part you're struggling with, is that DX12 brings up the minimums, so it evens out the variance and spikes you get.
If I'm getting 100fps with dips to 20fps using DX11 on a 40-144Hz screen, but DX12 gives me 90fps with dips to 40fps, which should I use?
Come on, Sherlock, even you should be able to work that one out.
Clue: it's not just about high fps numbers, it's about a smoother experience.

This is all well and good, but the GameGPU benches show that the minimums are also worse in DX12 (for Nvidia) in this game.
It is a fair point - from the benches in the OP it looks like DX12 is useless for Nvidia owners (unless it adds anything graphically/makes the game look nicer, but it doesn't appear that it does).
I have to ask, running an Asus GTX 970 Strix, is it worth a cross-grade to an RX 480 for BF1 for the DX12 support? The min/max on the first page would suggest it is...
Primarily play BF1, GTAV and Star Wars Battlefront...
I find it a lot of fun that the laughing stock of 2013 (Hawaii XT) is the one proving to be the real tough old ******. It's heroic, like a 60-year-old Rocky Balboa still able to hold his own in a fight with a champ two generations younger.
After all the hate laid on it 3 years ago I'd like to award it the Victoria Cross, it's a legend.
Oh god, how many times do we have to say it: it's not all about high fps numbers. How is having 100fps useful if your monitor is only 60Hz?

What IS useful, and this is the part you're struggling with, is that DX12 brings up the minimums, so it evens out the variance and spikes you get.
If I'm getting 100fps with dips to 20fps using DX11 on a 40-144Hz screen, but DX12 gives me 90fps with dips to 40fps, which should I use?
Come on, Sherlock, even you should be able to work that one out.
Clue: it's not just about high fps numbers, it's about a smoother experience.

FPS != Monitor Refresh Rate. They've got nothing to do with each other! Please stop spreading this nonsense once and for all. You can still experience a benefit from 300 fps on a 60 Hz monitor (think framebuffer rather than panel here - the scan itself doesn't take 16.67 ms; the monitor simply draws whatever is in the GPU's framebuffer at the exact moment of the refresh):
[image missing]
Say if you have an object the size of 1px moving at a speed of 1px/8.33ms from left to right, this is what you'll experience with 60fps and 120fps on a 60 Hz monitor, assuming your frame times are completely smooth:
[image missing]
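To make that moving-object example concrete, here's a minimal sketch of which position the newest completed frame shows at each 60 Hz refresh, for 60 fps vs 120 fps rendering. The 10 ms offset between the render clock and scan-out is a made-up value purely for illustration:

```python
# Sketch of the 1 px / 8.33 ms example above: at each 60 Hz refresh the
# monitor shows the object position captured by the newest *completed*
# frame. The 10 ms offset between the render clock and scan-out is an
# arbitrary assumption for illustration.
def shown_positions(fps, refreshes=6, refresh_hz=60, offset_ms=10.0):
    frame_ms = 1000 / fps              # time per rendered frame
    refresh_ms = 1000 / refresh_hz     # time per monitor refresh
    shown = []
    for i in range(refreshes):
        t = i * refresh_ms + offset_ms
        last_done = (t // frame_ms) * frame_ms   # newest completed frame
        shown.append(round(last_done / 8.33, 2)) # 1 px per 8.33 ms
    return shown

print(shown_positions(60))   # positions shown when rendering at 60 fps
print(shown_positions(120))  # at 120 fps every shown position is fresher
```

With this offset the 120 fps run shows the object one pixel ahead at every single refresh, i.e. closer to where it actually is, even though the monitor never refreshes more than 60 times a second.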
Now you also need to keep in mind that frame times aren't consistent, which can cause all sorts of trouble, like showing or rendering the same frame twice (vsync), which is why minimum fps is important - but only if those kinds of frame time spikes happen often! This means you should aim for a system that can deliver a consistent frame time shorter than your screen's refresh interval (so on a 60 Hz monitor you want a system that consistently outputs frames faster than 16.67 ms). That is why charts showing min., avg. and max. fps are completely useless; what you really want is a graph showing the time it takes the system to render each frame!
The lower the frame times, the newer the frame on your screen is going to be! At 240 fps the difference between what is actually happening and what is drawn by your GPU will vary between 0 and 4.17 ms, while it can vary between 0 and 16.67 ms when you are running at 60 fps. Combine this with a 60 Hz monitor whose refresh rate isn't synced to your FPS and you'll run into all sorts of trouble, especially if your frame times vary constantly.
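Those two ranges fall straight out of the frame time. A quick sketch, assuming the simple model above where the framebuffer always holds the most recently completed frame:

```python
# Sketch: how stale can the frame in the framebuffer be at scan-out?
# Best case it just finished (age 0); worst case it is about to be
# replaced, i.e. one full frame time old.
def frame_age_range_ms(fps):
    return (0.0, 1000 / fps)

for fps in (60, 240):
    lo, hi = frame_age_range_ms(fps)
    print(f"{fps} fps: frame is between {lo:.2f} and {hi:.2f} ms old")
```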
So you can easily begin to see how a high frame rate is actually better even on a 60 Hz screen, especially when it comes to competitive games.
A higher refresh rate also reduces lag in the exact same way, even if you are running a frame rate that is lower than the refresh rate of the screen!
Again this doesn't matter if the minimum fps dips only happen for a few frames. Having a higher minimum FPS doesn't automatically lead to a smoother experience as I already explained - what you want is less variance!
I'll try to explain this as best as I can (each r is a rendered frame; the dashes are the time between frames):

```
r---r-r---r-r---r-r---r-r---r-r---r-r---r-r---r-r---r-r---r
r--r--r--r--r--r--r--r--r--r--r--r--r-r---r-r--r--r--r--r-r
```

Both lines have the same avg. (-- between r's), min. (-) and max. (---), yet the second would feel a lot smoother because the time between r's is far more consistent!
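The same point in numbers - a tiny sketch with two made-up frame-time traces that share identical min, max and average but differ wildly in variance:

```python
from statistics import mean, pstdev

# Two made-up frame-time traces (ms). Identical min, max and average,
# so a min/avg/max chart can't tell them apart - but the second would
# feel far smoother because its frame times barely vary.
spiky  = [10, 30, 10, 30, 10, 30, 10, 30, 10, 30]
steady = [10, 20, 20, 20, 20, 20, 20, 20, 20, 30]

for name, times in (("spiky", spiky), ("steady", steady)):
    print(f"{name}: min={min(times)} max={max(times)} "
          f"avg={mean(times)} stdev={pstdev(times):.2f}")
```

A min/avg/max bar chart would render both runs identically; only a frame-time plot (or the standard deviation printed above) exposes the difference you would actually feel.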
EDIT: Found a video that explains this better: