Nvidia preparing new Geforce with GK110

Thanks. More helpful than the posts above. I wonder what effect the monitor's refresh rate dropping down to match the fps has, at say 30fps? I've really no idea.

Well, screen tearing is the result of frames being out of sync, because refresh rates are static but FPS is dynamic. If you have a game hovering between 30-35 FPS, every now and then a frame is going to be out of sync due to varying frame times.

If you make the refresh rate dynamic and match the FPS though you eliminate the problem.

I suppose you could say in some sense, the GPU is telling the monitor when to refresh rather than the monitor just happily refreshing 60 times a second.
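
To make that concrete, here's a rough toy model in Python (purely my own illustration of the idea above, not anything from NVIDIA): with a fixed 60Hz scanout and a game wobbling between 30 and 35fps, nearly every buffer flip lands partway through a refresh, and that's the tear; if the panel instead refreshes whenever a frame is ready, a mid-scanout flip can't happen in the first place.

    # Toy model (my own sketch): fixed 60Hz refresh vs refresh-on-demand,
    # with frame times varying as they would for a game running at 30-35fps.
    import random

    SCAN_MS = 1000.0 / 60.0                           # one 60Hz scanout lasts ~16.7ms
    random.seed(0)
    frame_ms = [random.uniform(1000 / 35, 1000 / 30) for _ in range(10)]

    # Fixed refresh, no v-sync: the buffer flip happens the instant a frame is
    # finished, so unless that instant is exactly a vblank the new frame starts
    # partway through the scanout and you see a tear line at that height.
    t = 0.0
    for ms in frame_ms:
        t += ms
        tear_pos = (t % SCAN_MS) / SCAN_MS            # 0.0 would be a clean vblank flip
        print(f"flip at {t:6.1f}ms -> tear roughly {tear_pos:.0%} down the screen")

    # Refresh-on-demand (the G-Sync idea): the monitor scans out a frame only
    # once it is complete, so there is never a flip mid-scanout and never a tear.
    print("variable refresh: every frame is shown whole, no tear lines")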
 

How will the image on screen be affected by the refresh rate matching the fps, say if the fps drops to 30fps or below?
 

This is what I'm curious about as well. People say they won't buy these current 4K screens that are only 30Hz because that's rubbish for games. If the refresh rate is going to be that low for long periods or constantly up and down, psychovisual perception of the image will be impacted harshly.
 

Agreed, that's the point I've been trying to raise, rather poorly it would seem. :p
 
Don't kill me, but what's the difference between this and adaptive v-sync?

Adaptive v-sync turns VSync off when FPS is below the monitor's refresh rate, so that VSync doesn't force you down to 30FPS (which is horrible!).
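
As a one-line rule, that's roughly the following (just my own sketch of the behaviour described above, not NVIDIA's actual driver code):

    # Adaptive v-sync, sketched as a decision rule (my own illustration):
    # wait for vblank only while the game can hold the monitor's refresh rate,
    # otherwise flip immediately so the frame rate isn't forced down to 30fps.
    def swap_interval(current_fps: float, refresh_hz: float = 60.0) -> int:
        """1 = wait for vblank (v-sync on), 0 = flip immediately (v-sync off)."""
        return 1 if current_fps >= refresh_hz else 0

    print(swap_interval(75.0))   # 1 -> synced, no tearing
    print(swap_interval(48.0))   # 0 -> tearing possible, but no lock down to 30fps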

How will the image on screen be affected by the refresh rate matching the fps, say if the fps drops to 30fps or below?

If the refresh rate = FPS, there's no need for synchronisation to ensure no tearing. So the impact is no tearing, no jittering and no input lag.
 
How will the image on screen be affected by the refresh rate matching the fps, say if the fps drops to 30fps or below?

This is what I'm curious about as well. People say they won't buy these current 4K screens that are only 30Hz because that's rubbish for games. If the refresh rate is going to be that low for long periods or constantly up and down, psychovisual perception of the image will be impacted harshly.

Well, G-Sync kicks in at 30fps or above, so if it drops under that, you will see stutter/lag. I guess you already know this though Matt, as you used 30fps or below as a reference ;)
 
This is what I'm curious about as well. People say they won't buy these current 4K screens that are only 30Hz because that's rubbish for games. If the refresh rate is going to be that low for long periods or constantly up and down, psychovisual perception of the image will be impacted harshly.

Well yes, that's crap because you're limited to 30 FPS :p That's not the same thing as a screen's refresh rate lowering to 30Hz because that's all your game is running at.
 
Okay. Here is what I think I understand from it, and from having a play with Crysis 3 with sync on and off and an fps limit:
Basically, any time you are not exactly matched to the refresh rate of the monitor, e.g. with v-sync, you will have moments where the GPU is out of sync with the monitor, meaning that something weird will happen on screen, either a tear or a frame not updating or whatever.

So right now 31fps looks awful because, on top of the monitor only getting an update every other frame, you are also getting out-of-sync frames, further reducing your effective frame rate.

If you don't get tearing, you'll get hitching every other second or so.

With G-Sync, the monitor is told to update as soon as the GPU is ready, so all of the problem updates go away and you get the full benefit of every frame the GPU is capable of delivering.

V-sync has to jump from 30 to 60 and back because it's locked to the monitor, which is jarring, and low fps looks bad without sync because of these issues. With G-Sync the transitions are smoother, so the whole experience is smoother.

I would imagine 30fps would still be noticeable, but if drops to, say, 40 become imperceptible, then it really opens up a lot of options in terms of what settings people can get away with using, without worrying about keeping 60fps as a minimum.
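
Here's that 31fps case as a quick toy calculation (my own back-of-the-envelope model, which assumes the GPU keeps rendering rather than stalling on the flip; not measured data): with v-sync on a 60Hz panel each finished frame has to wait for the next vblank, so on-screen frame times snap to multiples of 16.7ms, mostly 33.3ms with the occasional 16.7ms one, which is the hitching. With the refresh following the GPU, every frame would simply last ~32.3ms.

    # Toy model (my own sketch) of 31fps on a 60Hz panel with v-sync on:
    # a finished frame is held back until the next vblank, so displayed frame
    # times are quantised to multiples of the 16.7ms refresh period.
    import math

    REFRESH_MS = 1000.0 / 60.0    # ~16.7ms between vblanks
    RENDER_MS = 1000.0 / 31.0     # ~32.3ms per frame at 31fps

    shown = []
    ready = 0.0
    for _ in range(20):
        ready += RENDER_MS                                        # frame finishes...
        shown.append(math.ceil(ready / REFRESH_MS) * REFRESH_MS)  # ...then waits for vblank

    deltas = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
    print("v-sync on-screen frame times (ms):", deltas)           # mostly 33.3, odd 16.7
    print("variable refresh frame time (ms):", round(RENDER_MS, 1), "every single frame")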
 

Even that one small aspect of it has got a lot of potential - if you were capable of rendering at 40fps but using 60Hz, you have the options of:

Dropping to 30fps with v-sync on, and noticeable jumps up and down to and from the next multiplier.
40fps without v-sync, and tearing.
40fps with adaptive v-sync for a less pronounced effect as the fps jumps up and down, but tearing when not at a framerate that can be locked to the Hz.

With G-Sync you now get 40fps, no tearing and none of the big jumps to and from the next multiplier of the Hz.

EDIT: Kind of repeating what you're saying a bit, but trying to pull that 40fps example out a bit more; rough numbers below.
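
And the rough numbers behind that 40fps example (my own back-of-the-envelope sketch, assuming plain double-buffered v-sync on a 60Hz panel):

    # 40fps on a 60Hz panel: a frame takes 25ms, but new frames can only be
    # shown on the 16.7ms vblank grid (my own sketch of the comparison above).
    REFRESH_MS = 1000 / 60        # ~16.7ms vblank grid
    FRAME_MS = 1000 / 40          # 25ms render time at 40fps

    print(f"v-sync on : frame held until {2 * REFRESH_MS:.1f}ms -> locked down to 30fps")
    print(f"v-sync off: frame flips at {FRAME_MS:.1f}ms, mid-scanout -> tearing")
    print(f"g-sync    : panel refreshes every {FRAME_MS:.1f}ms -> steady 40fps, no tearing")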
 
Nvidia just need to drop prices tbh, in simple layman's terms. They charge too much. I own a 670 SLI setup, great cards. Was at a mate's house tonight. He has the same CPU and RAM as me, but a Maximus VI Hero board. Same OC on the CPU, playing BF3 at the same res of 1920x1200, and the game felt a bit smoother than my own setup. Can't explain why, it just felt better. VRAM wasn't the issue on either rig, because I see a max of 1500MB of VRAM use in BF3 on ultra with my 670s. Both systems using Windows 7.
 
Enter NVIDIA G-SYNC, which eliminates screen tearing, VSync input lag, and stutter. To achieve this feat, we've built a G-SYNC module into monitors, allowing G-SYNC to synchronize the monitor to the output of the GPU, instead of the GPU to the monitor, resulting in a tear-free, faster, smoother experience that redefines gaming.

Awesome! I've been anticipating this tech coming out for many years and now it's actually happening. Not sure why it took so long, but I always thought vsync was a silly way of doing things. It made much more sense for the monitor to sync with the graphics card. I guess there was some technical limitation preventing monitors from refreshing at variable speed.

If this does work the way they say it will, then you can be assured it will be a significant milestone in video gaming technology. Screen tearing, vsync input lag and stutter, all banished for good.
 

Interesting observation, what GPU does he have, setter?
 
You're making it sound like you don't know either, otherwise you'd just present some facts to say I'm wrong. :D

How can something be not smooth at 30fps with vsync off, but smooth at 30fps with G-Sync on? Waiting for the sync, regardless of whether it's handled by the monitor or the GPU, is going to make performance worse than if it was not handled by anything, no? I can understand it might offer some benefit vs traditional vsync, but at 30fps it's not going to matter what is used vs what is not used. 30fps on most games is not going to represent smooth and stutter-free gaming.

Really surprised, Matt, that you're not grabbing the concept of how two devices are never fully in sync, e.g. the GPU and monitor. Not sure what you mean by 'not handled by anything'? An LCD, much like an old tube CRT, refreshes vertically frame by frame. What part of removing this buffer and having the GPU and monitor in complete harmony isn't a good thing?
 

I guess it's something that's going to have to be seen to fully understand the difference it will make, at least for me. My fps is never that low, so I guess it's something that will generally benefit a slower GPU more than the more powerful ones. I have to admit though, I don't really have a problem with lag or stutter when using vsync. Some input lag granted, but I can remove that by either taking off vsync or using an fps limit. Obviously if fps drops then it's noticeable, but personally I don't start to notice it unless fps drops below say 50. Tearing does not always occur with it off either; that seems to vary on a game-by-game basis for me. So it sounds like a nice feature, but not one I'd personally describe as game-changing, because there's not really an issue there in the first place, except in a few circumstances. For those situations I'm sure it will help. I can see the benefit of it a bit more now, but I don't think it would change too much for me personally.
 


You are probably acclimatised to the artefacts, so you tune them out... I typically run with vsync on and put up with a bit of input lag, but testing last night and ramping the settings up to cause dips to 40-50, I could definitely see them happening.

Now if this tech, as the people at the demo said, makes dips to 40 imperceptible, it means you can get away with settings whose minimums sit around 40 instead of 60, which is effectively up to 50% better min frame rates.

A 50% perceived performance difference, Matt, that is huge. I can see why the guys in the room were so excited.

Add to that no game support needed, and it makes Mantle pale by comparison.

Bear in mind Carmack is CTO of Oculus now; given how great he seems to think this is, I am now expecting an Oculus Rift G-Sync edition :D
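
For what it's worth, the 50% figure is just frame-time arithmetic (my own quick sketch, not a benchmark): if a 40fps minimum feels as smooth as a 60fps one, you get 25ms per frame instead of 16.7ms, which is about 50% more rendering time to spend on settings.

    # Where the "up to 50%" comes from (my own arithmetic, not a benchmark):
    budget_60 = 1000 / 60     # 16.7ms per frame if you must hold 60fps minimums
    budget_40 = 1000 / 40     # 25.0ms per frame if 40fps minimums feel fine
    print(f"extra frame-time headroom: {budget_40 / budget_60 - 1:.0%}")   # ~50%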
 
Mantle is a low-level API. G-Sync is monitor synchronisation. If anything, the two technologies combined would be brilliant, although it is unlikely to happen. Not sure how this in any way, shape or form makes Mantle seem any less of a good thing.
 