
(gamegpu) Battlefield 1 Benchmarks

Eventually, just as with all previous technological advances, one side will catch up with the other. In this case Nvidia is behind with DX12; previously AMD was behind with variable refresh and tessellation, and Nvidia was behind with async compute, and so on. When both sides catch up, that tech moves forward, and it will be the same with DX12 just as it has been with other technologies.

Roll on next year and the new GPUs it will bring from both camps. :)
 
This has nothing to do with displaying more frames. A 60Hz display can only show 60fps, one frame every 16.67ms. What you do get, however, is more responsive input, because the frame rate isn't being capped.

Re-read this part: yes, that is exactly what I said, but it has nothing to do with "seeing more frames", as you put it.

Framerate latency and input latency are two different things. People getting a better playing experience from an uncapped frame rate on a 60Hz display comes down to the responsive controls making them believe they are seeing more frames.

Seeing more frames? No, it has to do with showing a newer frame, as you can see from the illustration in my previous post: at 60fps the display is going to use a frame that is 16.67ms old, while at 120fps the display is going to draw a frame that is only 8.33ms old. I wish I had a high-speed camera and an LED light for my mouse so I could show you.
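The arithmetic above can be sketched in a few lines. This is a rough best-case model, not a measurement: it assumes the frame shown at each refresh of a fixed 60Hz panel is at best one render interval old, so rendering faster than the refresh rate still means each refresh picks up a newer frame.

```python
def frame_age_ms(render_fps: float) -> float:
    """Best-case age of the newest completed frame, in milliseconds.

    Rough model for illustration: the newest finished frame is
    one render interval old when the display refreshes.
    """
    return 1000.0 / render_fps

print(f"{frame_age_ms(60):.2f}")   # 16.67: rendering capped at the refresh rate
print(f"{frame_age_ms(120):.2f}")  # 8.33: uncapped, the displayed frame is newer
```

Real pipelines add buffering and sampling delays on top of this, but the relative difference between capped and uncapped rendering is the point being argued.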

Also, a higher minimum frame rate makes games play much more smoothly, plus the added benefit of DX12's frame latency being more consistent = smoother gameplay. I do agree that the frame rate also needs to avoid big jumps up and down for a game to play smoothly.
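The consistency point can be put in numbers: two runs with the same average fps can feel very different if one has more frame-time variance. A minimal illustration with made-up frame times (purely hypothetical values, chosen so both traces average the same):

```python
import statistics

# Hypothetical frame-time traces in ms: both average 16.7ms (~60fps),
# but the second swings between fast and slow frames.
steady = [16.7] * 8
jumpy = [10.0, 23.4] * 4

for name, trace in (("steady", steady), ("jumpy", jumpy)):
    avg = statistics.mean(trace)
    dev = statistics.pstdev(trace)  # population standard deviation
    print(f"{name}: avg {avg:.1f}ms, stdev {dev:.1f}ms")
# Same average frame time, but the jumpy trace's high deviation is what
# reads as stutter, even though the reported average fps is identical.
```

This is why frame-time plots like the ones in the linked reviews are more informative than a single average-fps number.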

You don't want 100fps dropping down to 60fps and back up again; this will result in judder. Your best bet is to cap at 60fps here.

What? That's just not true at all. Capping framerate will increase the input latency, as you will be seeing an older frame as per the illustration from before. This isn't rocket science.

As for the frame latency in DX12: http://www.guru3d.com/articles-pages/battlefield-1-pc-graphics-benchmark-review,9.html

There's no difference between DX12 and DX11, even though the reported minimum fps was higher while using DX12.
 
Eventually, just as with all previous technological advances, one side will catch up with the other. In this case Nvidia is behind with DX12; previously AMD was behind with variable refresh and tessellation, and Nvidia was behind with async compute, and so on. When both sides catch up, that tech moves forward, and it will be the same with DX12 just as it has been with other technologies.

Roll on next year and the new GPUs it will bring from both camps. :)

True that!
 
Switching to DX12 with my 1070 gives me a stuttering mess.

Which is a shame as my 4690k is at 100% and struggling.
 
I find it a lot of fun that the laughing stock of 2013 (Hawaii XT) is the one proving to be the real tough old ******. It's heroic, like a 60-year-old Rocky Balboa still able to hold his own in a fight with a champ two generations younger.

After all the hate heaped on it three years ago, I'd like to award it the Victoria Cross; it's a legend.

I love watching it bosh its younger rivals round the ears.

Wasn't the main complaint/joke regarding the 290/290X the noise of the reference cooler and the temps they ran at, not that it was ever a bad card?
It still runs just as loud and as hot today as it did then! (I know, I'm running one.)

Frame times seem better on AMD than Nvidia, and the post above you shows Nvidia gaining nothing in frame times from DX12.
Maybe that is why people on here running DX12 on Nvidia don't always agree with me that DX12 offers extra smoothness, while AMD gains.

This could have something to do with it.
Looking at the ComputerBase review, it looks like AMD's DX11 frametimes were worse than Nvidia's DX11 frametimes, which actually look more like AMD's DX12 frametimes.
So maybe AMD users think it's smoother because, to them, it is: their DX11 frametimes are not good. Nvidia's DX11 frametimes are already good, so Nvidia users don't notice an improvement (in fact, it seems quite the opposite).
 
Wasn't the main complaint/joke regarding the 290/290X the noise of the reference cooler and the temps they ran at, not that it was ever a bad card?
It still runs just as loud and as hot today as it did then! (I know, I'm running one.)

Intelligent people and non-brand-warriors could see that its only problem was its pathetic reference cooler.

Sadly, for a lot of hardware enthusiasts, vendors are a religion, and nothing rational has ever come out of any religion or cult.
 
Have you tried setting the maximum pre-rendered frames to 1 in the Nvidia control panel? It should reduce the CPU load.

Tried most things I can think of (including that) and it's not made much difference.

I'm running with everything on low at the moment to try and reduce the CPU load but it's just causing my 1070 to run around 40-50% usage whilst my CPU is still at 100%.

With a 144Hz monitor, I've always sacrificed looks for frames as I prefer the smoother gameplay, but BF1 seems a lot more CPU-intensive than BF4 did.

Either way, I've upgraded to a 5820k which should arrive today which I'll overclock as per usual. Should give me more grunt in the CPU department and last me for a few years.
 
Intelligent people and non-brand-warriors could see that its only problem was its pathetic reference cooler.

Sadly, for a lot of hardware enthusiasts, vendors are a religion, and nothing rational has ever come out of any religion or cult.

Yep. Which is why I had my 290Xs under water. When CrossFire worked, they were awesome and matched/beat an overclocked GTX 1080.
 
When CrossFire worked, they were awesome

They were that; a pity it went from incredible to lousy in such a short space of time.

IQ techniques and now APIs have made mGPU much harder. Nvidia dropping 3/4-way SLI and removing it altogether on any GPUs under £300 is a good indication too; at this rate mGPU will only be good for benchmarks soon.
 