So ~2ms when under the monitor's refresh rate and ~7ms when over it. I can live with that.
http://www.blurbusters.com/gsync/preview2/
Good article there on latency and input lag.

Total Input Lag of Battlefield 4
The game, Battlefield 4, is known to be extremely laggy, even on fast systems. Its low 10Hz tick rate adds a huge amount of input lag, and the game rarely reaches a monitor's full refresh rate. Battlefield 4 typically runs at frame rates that benefit immensely from G-SYNC's elimination of erratic stutters and tearing.
Total Input Lag of Crysis 3
We were also unable to detect any input lag degradation from using G-SYNC instead of VSYNC OFF. There were many situations where G-SYNC's ability to smooth a low 45fps frame rate actually felt better than a stuttery 75fps; this is a case where G-SYNC's currently high price tag is justifiable, as Crysis 3 benefited immensely from G-SYNC.
Total Input Lag of Counter Strike: Global Offensive
The older game, CS:GO, easily runs at 300 frames per second on a GeForce Titan, so it presents an excellent test case for maxing out the frame rate of a G-SYNC monitor. We were curious whether G-SYNC monitors start to show input lag when frame rates hit the monitor's maximum. We got some rather unusual results: some very bad news immediately followed by amazingly good news!
At first, it was pretty clear that G-SYNC had significantly more input lag than VSYNC OFF. VSYNC OFF at 300fps versus 143fps showed fairly insignificant differences in input lag (22ms/26ms at 300fps, versus 24ms/26ms at 143fps). When I began testing G-SYNC, it immediately became apparent that input lag spiked (40ms/39ms with a 300fps cap, 38ms/35ms with a 143fps cap). With fps_max=300, G-SYNC ran at only 144 frames per second, since that is the monitor's frame rate limit. The behavior felt as if VSYNC ON had suddenly been turned on.
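One plausible way to account for that spike (my own back-of-the-envelope sketch, not from the article) is frame queueing: once the game renders faster than the display can scan out, frames wait behind the sync point, and each queued frame adds one refresh interval of lag. The queue depth below is an assumed driver default, not a measured value:

```python
# Illustrative sketch: extra lag when the render rate is capped only by
# the display's refresh rate and frames queue behind the sync point.
refresh_hz = 144
max_prerendered = 2  # assumed driver queue depth, not from the article

frame_slot_ms = 1000.0 / refresh_hz              # one scanout slot (~6.94 ms)
queued_lag_ms = max_prerendered * frame_slot_ms  # lag from a full queue

print(f"{frame_slot_ms:.2f} ms per refresh slot")
print(f"~{queued_lag_ms:.1f} ms extra lag from a {max_prerendered}-frame queue")
```

Two queued frames at 144Hz would add roughly 14ms, which is in the same ballpark as the measured jump from ~26ms to ~40ms; capping fps_max below the refresh rate keeps the queue empty, which is consistent with the fix described next.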
Now the good news: as a last-ditch effort, I lowered fps_max further, to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counter-Strike: GO! Except there was no tearing and no stutter anymore: the full benefits of G-SYNC without the lag of VSYNC ON.
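For reference, fps_max is a standard Source engine console variable, and the value below is the one found effective in the test above; putting it in autoexec.cfg (standard Source engine convention) applies it on every launch:

```
// autoexec.cfg — cap the frame rate below the G-SYNC ceiling (144Hz here)
fps_max 120
```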
This is still low-latency territory
Even when capped out, a total-chain input lag of 40ms is still extremely low for button-to-pixels latency. This includes the game engine, drivers, CPU, GPU, and cable lag, not just the display itself. Consider this: some old displays had more input lag than this in the display alone! (Especially HDTV displays and some older 60Hz VA monitors.)
In an extreme-case scenario, photodiode oscilloscope tests show a 2ms to 4ms latency between a Direct3D Present() call on a blank buffer (alternating white/black) and the first LCD pixels illuminating at the top edge of the screen. This covers mostly cable transmission latency and pixel transition latency. All current models of ASUS/BenQ 120Hz and 144Hz monitors are capable of zero-buffered real-time scanout, resulting in sub-frame latencies (including in G-SYNC mode).
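The "sub-frame" claim follows from how scanout works: with zero buffering, pixels light up as the signal arrives, so display latency varies with screen position rather than waiting out a whole buffered frame. A rough sketch with illustrative numbers of my own (line count ignores blanking intervals):

```python
# Illustrative sketch: latency spread across the screen under
# zero-buffered real-time scanout at 144Hz.
refresh_hz = 144
visible_lines = 1080  # assumption; blanking intervals ignored

frame_time_ms = 1000.0 / refresh_hz        # ~6.94 ms to scan one full frame
ms_per_line = frame_time_ms / visible_lines

top_edge_ms = 0 * ms_per_line                  # top lines: cable + pixel lag only
bottom_edge_ms = visible_lines * ms_per_line   # bottom lines: one scanout later

print(f"top edge +{top_edge_ms:.2f} ms, bottom edge +{bottom_edge_ms:.2f} ms")
```

So even the worst-case position on screen adds under 7ms at 144Hz, on top of the 2ms to 4ms cable and pixel-transition latency measured at the top edge.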
The whole article is worth a read in truth, and Blur Busters have done a lot of investigating. I doff my cap to them for their hard work in bringing us this info. They end with this news as well, which sounds extremely promising.
Conclusion
As even the input lag in CS:GO was solvable, I found no perceptible input lag disadvantage to G-SYNC relative to VSYNC OFF, even in older Source engine games, provided the games were configured correctly (NVIDIA Control Panel set to use G-SYNC, and the game configuration updated accordingly). G-SYNC gives the player license to use higher graphics settings while keeping the gameplay smooth.
We are very glad that manufacturers are paying serious attention to strobe backlights now; this has been Blur Busters' raison d'être ever since our domain name was www.scanningbacklight.com in 2012, during the Arduino Scanning Backlight Project.