
*** Stunning COD Advanced Warfare Benchmarks ***

Well Greg, just watched your MP video and all I can say is... oh dear.

The game seems to run fairly smoothly, but the visuals are a mixed bag: some parts seem to be OK, but the bits when you were in the water, I am very surprised that they have been allowed to get away with it looking that bad in a modern title.
The textures of the stuff scattered around the floor look like a recent episode of Dr Who, with everything being 2D. Sorry to say, it doesn't look good.

Performance looks OK, but gameplay? Just one word.

Boing
[animated gif: bouncer_animated.gif]
 
Oh dear shanks, re-read that article and you'll see it is telling you the exact same thing that I am: you can convert a frametime to fps and back very easily. A frametime is just an fps measurement expressed in milliseconds per frame instead of frames per second.

You can't have a "MINIMUM FPS" of 88 and a maximum frametime of 28.7ms; if you use the calculation from the link you posted (but in reverse), 28.7ms works out to 35fps.

You can't have both: by definition, a frametime of 28.7ms would be recorded as an instantaneous fps of 35, so it should be 35 on both charts. If the minimum really is 88fps, then the maximum frametime can only be 11.36ms.
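
To make the arithmetic concrete, here is a quick sanity check in Python (a minimal sketch; the 88fps and 28.7ms figures are just the ones from the charts under discussion):

    # Frametime (ms) and instantaneous fps are two views of the same measurement.
    def frametime_to_fps(ms):
        return 1000.0 / ms

    def fps_to_frametime(fps):
        return 1000.0 / fps

    print(frametime_to_fps(28.7))  # ~34.8 -> a 28.7ms frame is an instantaneous ~35fps
    print(fps_to_frametime(88))    # ~11.36 -> a true minimum of 88fps caps every frame at 11.36ms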

What that chart means is that
99 percent of the frames rendered in 1 second had 11ms between them,
1 percent had 18.9ms between them,
and 0.1 percent had 28.7ms.

It is a way to measure the variance in time between frames in 1 second, because a high variance or inconsistent delivery of the frames can seem to stutter even though the frames per second figure is high.

I could be wrong, but I think you are mixing up what the ms represents in this case. Sorry about the bad attempt to explain.
 

That doesn't seem to be how the Vortez link explains it.
That suggests that the 99th percentile means that 99% of frames will render faster than that value (e.g. 11ms).

Either way, it seems the 99th percentile is the important thing, and lower is better. So at 1080p it's pretty even, with Nvidia giving slightly better FPS (especially minimums, which are apparently very important); at 1440p Nvidia have better frametimes (smoothness?) and better FPS (except max, least important?); and at 4K AMD have better frametimes and better FPS across the board.
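
If the Vortez reading is right, those chart values are just percentiles of the recorded frametimes. A minimal sketch of that calculation in Python (the trace here is invented purely for illustration):

    import math
    import random

    random.seed(0)
    # Invented frametime trace (ms): mostly ~10ms frames plus a few slow ones.
    frametimes = [random.uniform(9, 11) for _ in range(990)] + \
                 [random.uniform(15, 30) for _ in range(10)]

    def percentile(values, pct):
        """Nearest-rank percentile: smallest value with at least pct% of samples at or below it."""
        ordered = sorted(values)
        # round() guards against float error, e.g. 999.0000000001 ceiling up to 1000
        rank = math.ceil(round(pct / 100 * len(ordered), 9))
        return ordered[rank - 1]

    print(percentile(frametimes, 99))    # 99th percentile frametime (ms): 99% of frames were faster
    print(percentile(frametimes, 99.9))  # 99.9th percentile frametime (ms)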
 

I'm not saying I'm right, and probably have something explained arseways.

Even though Nvidia's 99th was lower, the AMD card is still smoother at 1440 because the Nvidia card has frames which took longer to render. The frametimes all need to be as close together as possible for it to feel smooth (see the sketch after this post).

EDIT
At 4K the variance in frametimes is very similar, so both are equally smooth. The AMD card has a higher frame rate though.
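
That "close together" point is easy to show with two invented traces that average the same fps but differ in consistency (the numbers are made up for illustration):

    from statistics import mean, stdev

    # Two invented traces (ms per frame), both averaging 20ms (~50fps).
    steady  = [20, 20, 20, 20, 20, 20]
    jittery = [10, 30, 10, 30, 10, 30]

    for name, trace in (("steady", steady), ("jittery", jittery)):
        print(f"{name}: {1000 / mean(trace):.0f}fps average, frametime stdev {stdev(trace):.1f}ms")
    # Same average fps, but the 30ms spikes in the jittery trace are what you feel as stutter.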
 
What that chart means is that
99 percent of the frames rendered in 1 second had 11ms between them,
1 percent had 18.9ms between them,
and 0.1 percent had 28.7ms.

It is a way to measure the variance in time between frames in 1 second, because a high variance or inconsistent delivery of the frames can seem to stutter even though the frames per second figure is high.

I could be wrong, but I think you are mixing up what the ms represents in this case. Sorry about the bad attempt to explain.

You are right, except for "in 1 second" and "between them".
The frametime graph shows what percentage of frames took how many ms to render.

The thing is that any frametime can be converted to an fps value by dividing 1000 by it, so 1000 / 28.7ms = 35fps.
And you can convert an instantaneous (per-frame) fps to a frametime by dividing 1000 by it, so
1000 / 88fps = 11.36ms.

You can't say that you travelled at a top speed of 120mph but did 200 miles in under an hour; it doesn't make sense, because covering 200 miles inside an hour means you averaged over 200mph, above your supposed top speed.

When you look at the two sets of graphs they don't match up: once you convert them to the same units, the minimum fps from one has to correspond to the maximum frametime from the other. If on the same run a frametime of 28.7ms was measured, then the minimum fps was 35fps, and not 88.
That alone throws all of the graphs and measurements into question.

There are 1000ms in a second, so you can express any measurement in terms of how many ms it took, and convert that into how many of those would fit in a second.

Where articles describe frametime as being different to fps is that frametime recordings capture an instantaneous value for every frame, whereas an average fps only tells you the average frametime, not the min or the max. But in this case we DO have the min and max fps, and the min fps has to match the maximum frametime shown on the other graph (min fps = 1000 / max frametime). It has to, or it isn't the MINIMUM, by definition.
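
Put another way, once you have the per-frame trace, the average, minimum and maximum are all locked together. A small sketch with an invented trace:

    # Invented per-frame trace in ms.
    frametimes = [11.0, 11.4, 10.9, 28.7, 11.2, 11.1]

    avg_fps = 1000 / (sum(frametimes) / len(frametimes))
    min_fps = 1000 / max(frametimes)  # the slowest frame IS the minimum fps
    max_fps = 1000 / min(frametimes)  # the fastest frame IS the maximum fps

    print(f"avg {avg_fps:.1f}fps, min {min_fps:.1f}fps, max {max_fps:.1f}fps")
    # If 28.7ms appears anywhere in the trace, the minimum fps is 1000/28.7 ~ 35fps,
    # so a chart pairing that same run with a minimum of 88fps can't be right.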

I may as well say that my minimum fps was "aardvark"; it makes as much sense.
 
Last edited:
Sorry for OT, using BF4 here..

At 120fps a display spends 8.3ms per frame, and my game is showing 120fps,
yet my rendering time is showing 114fps.

They're not the same thing; rendering time and frames per second are different.
[screenshot: 120fps.png]
 
I'm not saying I'm right, and probably have something explained arseways.

Even though Nvidia's 99th was lower, the AMD card is still smoother at 1440 because the Nvidia card has frames which took longer to render. The frametimes all need to be as close together as possible for it to feel smooth.

EDIT
At 4K the variance in frametimes is very similar, so both are equally smooth. The AMD card has a higher frame rate though.

OK, so 99% of the time Nvidia is smoother, as there's less variance.
It may hitch worse at times, but 99% of the time it's smoother.

Although the Vortez article also says that you're unlikely to notice anything below a 30ms frametime anyway.

EDIT:
Doesn't Helios1234 'work' for Vortez? Pity he hasn't seen this, as he might be able to explain it.
 
Sorry for OT, using BF4 here..

At 120fps a display spends 8.3ms per frame, and my game is showing 120fps,
yet my rendering time is showing 102fps.

They're not the same thing; rendering time and frames per second are different.
[screenshot: 120fps.png]

You're using two separate display methods, neither of which you really know to be instantaneous or averaged.
The display in the bottom left even says min, max and average; none of those are current/instant.

You can convert a frametime to an instantaneous fps reading; you know this, as you know from above that 120fps is equivalent to a constant 8.3ms.

By definition, the minimum instantaneous fps recorded anywhere in a run HAS to match the equivalent of the largest frametime.
 
Right shanks, a quick Google tells me that you are using the wrong perf overlay: the CPU/GPU one is there to show the relationship between CPU and GPU utilisation, so that you can see if (and which) is a bottleneck. There's another one that shows ACTUAL frametimes; that one isn't it.
 

There is another graph? If you know the command, I'll take some more screenshots tomorrow.
 