This is why I like to use modern tools that measure not just frame latency but game-engine latency, along with GPU Busy. For that, nothing beats Intel's PresentMon currently (it's open source, too), though even it has some trouble with FG. NVIDIA seems to have its own tool based on PresentMon, but they modified it and did not make their changes public, so it's a bit of a black box; hence I avoid it.
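As a rough illustration, PresentMon logs per-frame data to CSV, which is easy to summarize yourself. A minimal sketch below, assuming the PresentMon 1.x column names (msBetweenPresents, msUntilDisplayed); the sample rows are made-up stand-ins for a real capture, and the newer 2.x schema uses different column names:

```python
# Hedged sketch: summarizing a PresentMon capture CSV with the Python stdlib.
# Column names follow the PresentMon 1.x CSV schema; the sample data below
# is illustrative only, not a real capture.
import csv, io, statistics

sample = """Application,msBetweenPresents,msUntilDisplayed
cp2077.exe,13.9,26.1
cp2077.exe,14.2,25.8
cp2077.exe,13.6,26.4
"""  # stand-in for a real capture file

rows = list(csv.DictReader(io.StringIO(sample)))
frame_times = [float(r["msBetweenPresents"]) for r in rows]
display_lat = [float(r["msUntilDisplayed"]) for r in rows]

avg_ft = statistics.mean(frame_times)
print(f"avg frame time: {avg_ft:.1f} ms -> {1000/avg_ft:.0f} FPS")
print(f"avg present-to-display: {statistics.mean(display_lat):.1f} ms")
```

With a real capture you would read the file with `open()` instead of the inline string, and probably filter out dropped frames first.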
Example of what I see in CP2077 right now on my 4090 - exact settings, scene etc. are irrelevant (everything maxed, including PT, with DLSS Quality); this is just a pure comparison of Reflex on/off and FG:
Reflex off, FG off: 72FPS, 50ms game latency
Reflex on, FG off: 72FPS, 26ms game latency
Reflex on, FG on: 125FPS, 36ms game latency (and frame time itself is about 16ms)
Reflex by itself cuts game latency roughly in half, then FG adds about 10ms on top of that (still with Reflex, which matters a lot with FG). Frame pacing isn't ideal with FG, but it's playable with a 72FPS base, even though I can already feel the game lagging a bit while using my mouse. Add up game latency, mouse and keyboard latency, and monitor latency, and we're looking (even with Reflex) at 60ms+ of overall input lag. Humans need on average over 250ms to react to things changing on screen, which puts the total at over 300ms of input lag including the human. Younger gamers are apparently well below 200ms reaction time, though.
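The budget above can be sketched as simple arithmetic. Only the 36ms game latency comes from my measurements; the mouse/keyboard and monitor figures here are assumed round numbers for illustration, as is the 250ms average reaction time:

```python
# Hedged sketch: rough end-to-end input-lag budget.
# Only the game-latency figure is measured (Reflex on, FG on, from above);
# the peripheral and monitor values are assumptions for illustration.
components_ms = {
    "game latency (Reflex on, FG on)": 36,          # measured
    "mouse + keyboard (assumed)": 10,
    "monitor processing + response (assumed)": 15,
}
system_lag = sum(components_ms.values())
print(f"system input lag: {system_lag} ms")          # 60ms+ territory

human_reaction_ms = 250  # rough average visual reaction time
print(f"including the human: {system_lag + human_reaction_ms} ms")
```

Swap in your own peripheral and display numbers if you've measured them; the point is just that the system side alone lands past 60ms even with Reflex.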

But even in my 40s I can instantly feel the difference between FG on and off, and between playing on a modern PC versus older consoles and computers (8- and 16-bit machines), where input lag was close to zero and it came down purely to human reaction.