
V-Sync and power draw

I also cap games to a reasonable FPS to avoid unnecessary load on the GPU. It has also helped in the past to quieten down a particularly loud GPU, e.g. a 3070 Ti.

Enabling V-Sync can increase input lag, making some games feel sluggish.
 
I thought G-sync was a sort of V-sync anyway; I thought the whole idea of G-sync was to stop the tearing. Games without V-sync enabled look bad to me.

G-Sync will act like V-Sync, but without the latency and V-Sync frame rate multiplier penalties, up to the top of the G-Sync range, which is usually the same as the monitor's max refresh rate - G-Sync itself does not limit frame rates like V-Sync does. If you render faster than the G-Sync range you will get tearing unless V-Sync is enabled. In some cases it is better to cap the frame rate slightly below the top of the G-Sync range to minimise latency rather than use V-Sync with G-Sync.

Nvidia tweaked how G-Sync + V-Sync work together more recently, which removes some of the potential latency issues, so capping the frame rate might not always be necessary.

EDIT: The YouTube channel Battlenonsense covers a lot of it in detail with advanced testing - some of the mainstream tech media have got the wrong end of the stick or published inaccurate results on G-Sync.
 
I didn't know this. So for a slow game like The Long Dark, I can set it to 60fps and bask in the money saving, but for Rocket League I can crank up to 144.

Yes.
All my games are set to different rates.
My main gripe with games is when the frame rate swings about like crazy. I also usually cap it, same as you, to save the poor GPU. I would prefer it to stay reasonably constant, so for each game I cap it at about the lowest it will hit during play, which is usually about 90fps.
 
 
Thx for that post and youtube channel info, cheers
 
Thx for link.
 
Aren't you meant to set it on anyway so G-sync syncs your frames without tearing? Or are you setting your fps and refresh so low it doesn't tear?

Yes, I am assuming that people now have G-Sync or FreeSync, neither of which limits your framerate. It's a good idea to set G-Sync (or Adaptive Sync) "on" and also set the frame rate limit in the Nvidia Control Panel to a sensible value.
 
Yep I can confirm. On my 120Hz G-sync monitor I get the best results by choosing a framerate limit in the Nvidia Control Panel at 100fps (110 or 115 would also work but I like round numbers), G-sync on, V-sync off. No tearing, no extra latency, reduced coil whine.


Only need to turn the framerate limit off when setting benchmark high scores :cry:
 
Actually, you need to activate both G-sync AND V-sync, and then set a framerate cap 3 or 4 fps below your monitor's refresh rate.

The best method is to do all of the above within the game's settings because it adds less input latency, but not all games have a frame cap option, so in that case do it from the NVCP. So: G-sync on, V-sync on, fps cap at 116 for 120Hz monitors.

Activating only G-sync, even with a frame cap, still causes tearing because of frametime variance. You need V-sync on to solve that, and since you frame-capped the game you don't get any added latency from V-sync.
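The cap rule of thumb above can be sketched in a few lines of Python (a hypothetical helper of my own, just to make the arithmetic explicit; the 3-4 fps margin is the one suggested in this post):

```python
# Sketch of the frame-cap rule of thumb: cap a few fps below the
# monitor's refresh rate so frames stay inside the G-sync range.
def suggested_fps_cap(refresh_hz: int, margin: int = 4) -> int:
    """Return a frame-rate cap a few fps under the refresh rate."""
    return refresh_hz - margin

print(suggested_fps_cap(120))            # 116, as suggested for a 120Hz monitor
print(suggested_fps_cap(144, margin=3))  # 141
```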
 
I have literally never had screen tearing with V-sync disabled, G-sync on and 100fps cap under my 120Hz monitor refresh rate. So what is the advantage of turning V-sync on?
 
The whole G-sync thing is so damn confusing. Turn V-sync on or off in the in-game settings?

If my monitor is set to 120Hz but the GPU can only render, say, 70fps, what happens to the missing 50 frames?
 
Nothing. G-sync changes your monitor's refresh rate to 70Hz when you have 70 fps, and 53 Hz when you have 53 fps.

In game, just choose your resolution and monitor refresh rate (e.g. 120Hz for me) and leave V-sync off. In the Nvidia Control Panel make sure G-sync is enabled and the frame rate is limited to a number under your monitor's max refresh rate (e.g. 100 or 116 fps for me). Leave the V-sync option there set to Application Controlled. I never have to change this for any game; it just works.
 
Well, for a start, games look so much better graphics-wise with V-sync on; with it off the graphics don't look as crisp. It's very obvious to me.
 
The definition of adaptive sync is this:

Instead of the monitor drawing 60 frames per second, it only draws a new frame when the card/game says "Hey I have a new frame ready, draw it now".

So if the card is only able to deliver 50 frames per second, the monitor draws those 50. Without adaptive sync, it would have to draw the same frame twice in a row while it waited for the next frame from the graphics card. It would have to do this "double draw" 10 times per second to fit the 50 GPU frames into 60 Hertz.
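As a rough sketch of that arithmetic (a hypothetical Python helper of mine, assuming a fixed refresh rate and a steady GPU output):

```python
# Without adaptive sync, a fixed-refresh monitor still draws
# refresh_hz frames per second, so any shortfall from the GPU
# is made up by re-drawing ("double drawing") earlier frames.
def repeated_frames_per_second(refresh_hz: int, gpu_fps: int) -> int:
    """Frames the monitor must repeat each second to fill the refresh rate."""
    return max(refresh_hz - gpu_fps, 0)

print(repeated_frames_per_second(60, 50))   # 10 double-draws per second
print(repeated_frames_per_second(120, 70))  # 50
```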
 

Check out the link I posted; don't do what Icewolf said. The link actually has the science behind it, not anecdotal evidence.

 

Having read the 5-year-old article you linked, I still cannot find a reason to complicate things by turning V-sync on with G-sync. Maybe on LCD monitors it makes things clearer. I haven't used one for a while since I found that they made everything look washed out and blurry. On OLEDs the image is perfectly sharp and smooth with the settings I described. I have never heard anywhere else that V-sync makes an image look sharper: that sounds more like the monitor was having issues or another setting was being disabled when you turned V-sync off.

If you find that it helps you then by all means go ahead and enable V-sync as well, but as several people have said, what matters most is the frame cap under your max refresh rate and having G-sync enabled, both via the Nvidia Control Panel.
 

Did you read the updated section from 2019? (Hint: the technology doesn't have an expiry date.)
I'm also not sure how clicking yes on an option is "complicating things", unlike, say, giving incorrect advice that contradicts a site literally dedicated to the subject.

However, if it works for you, great, carry on.

Wait, why should I enable V-SYNC with G-SYNC again? And why am I still seeing tearing with G-SYNC enabled and V-SYNC disabled? Isn't G-SYNC supposed to fix that?
(LAST UPDATED: 05/02/2019)

The answer is frametime variances.

“Frametime” denotes how long a single frame takes to render. “Framerate” is the totaled average of each frame’s render time within a one second period.

At 144Hz, a single frame takes 6.9ms to display (the number of which depends on the max refresh rate of the display, see here), so if the framerate is 144 per second, then the average frametime of 144 FPS is 6.9ms per frame.

In reality, however, frametime from frame to frame varies, so just because an average framerate of 144 per second has an average frametime of 6.9ms per frame, doesn’t mean all 144 of those frames in each second amount to an exact 6.9ms per; one frame could render in 10ms, the next could render in 6ms, but at the end of each second, enough will hit the 6.9ms render target to average 144 FPS per second.

So what happens when just one of those 144 frames renders in, say, 6.8ms (146 FPS average) instead of 6.9ms (144 FPS average) at 144Hz? The affected frame becomes ready too early, and begins to scan itself into the current “scanout” cycle (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen) before the previous frame has a chance to fully display (a.k.a. tearing).

G-SYNC + V-SYNC “Off” allows these instances to occur, even within the G-SYNC range, whereas G-SYNC + V-SYNC “On” (what I call “frametime compensation” in this article) allows the module (with average framerates within the G-SYNC range) to time delivery of the affected frames to the start of the next scanout cycle, which lets the previous frame finish in the existing cycle, and thus prevents tearing in all instances.

And since G-SYNC + V-SYNC “On” only holds onto the affected frames for whatever time it takes the previous frame to complete its display, virtually no input lag is added; the only input lag advantage G-SYNC + V-SYNC “Off” has over G-SYNC + V-SYNC “On” is literally the tearing seen, nothing more.

For further explanations on this subject see part 1 “Control Panel,” part 4 “Range,” and part 6 “G-SYNC vs. V-SYNC OFF w/FPS Limit” of this article, or read the excerpts below…
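The frametime arithmetic in the quoted passage boils down to 1000ms divided by the frame rate; a quick Python sketch (mine, not from the article):

```python
# Frametime (ms) is the reciprocal of the frame rate: at 144 fps
# the average frame takes 1000 / 144 ≈ 6.9ms to render, matching
# the 6.9ms scanout cycle of a 144Hz display.
def frametime_ms(fps: float) -> float:
    """Average time one frame takes at the given frame rate."""
    return 1000.0 / fps

print(round(frametime_ms(144), 1))  # 6.9
print(round(frametime_ms(146), 2))  # 6.85, shorter than the 6.9ms cycle:
                                    # the frame arrives "too early" and can tear
```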
 