Nvidia G-Sync - eliminates stutter and screen tearing

What I'm wondering about G-Sync is this: OK, it will feel smooth at 60 or 50 or 40, but does the speed of the game slow down even though it stays smooth, or does the 60fps speediness of the game stay the same as the frame rate drops? Kind of a weird question, but I think we all know what fps we would like to play at and have it feel smooth. If G-Sync somehow makes 45 feel the same as 60, then that really would be something. But how far down can G-Sync drop before the speed becomes very noticeable and stops feeling like 60 speed?
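For what it's worth, most modern games scale movement by the measured frame time, so the game's speed shouldn't change as fps drops; G-Sync only changes how the finished frames get shown. A minimal sketch of that idea in Python, assuming a plain delta-time update loop (the `update`/`game_loop` names and the 5 units-per-second speed are made up for illustration):

```python
import time

def update(state, dt):
    # Move at 5 units per second regardless of how long the frame took.
    state["x"] += 5.0 * dt
    return state

def game_loop(frame_time):
    # frame_time stands in for how long render + present takes (e.g. 1/60 or 1/45 s).
    state = {"x": 0.0}
    start = time.perf_counter()
    last = start
    while time.perf_counter() - start < 1.0:   # run for one second of wall-clock time
        now = time.perf_counter()
        dt = now - last                        # real time since the previous frame
        last = now
        state = update(state, dt)
        time.sleep(frame_time)                 # pretend rendering took this long
    return state["x"]

# Both runs travel roughly 5 units in one second; only the smoothness differs.
print(round(game_loop(1 / 60), 2), round(game_loop(1 / 45), 2))
```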
 
Definitely looks interesting! :)

Personally I suffer through tearing, as I find the input lag due to vsync unbearable in most titles. (Even on my 120Hz, though, the effect is a little less pronounced.)
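A very rough back-of-the-envelope for why it's less pronounced at 120Hz: with vsync a finished frame sits and waits for the next scan-out, and drivers usually keep a small queue of pre-rendered frames on top of that (the two-frame queue below is an assumption, not a measured figure):

```python
# Rough model only: added latency ~= one refresh interval to wait for scan-out,
# plus one refresh interval per queued pre-rendered frame.
def worst_case_vsync_lag_ms(refresh_hz, queued_frames=2):   # queue depth assumed
    refresh_ms = 1000 / refresh_hz
    return refresh_ms * (1 + queued_frames)

for hz in (60, 120):
    print(f"{hz} Hz: up to ~{worst_case_vsync_lag_ms(hz):.0f} ms of extra latency with vsync")
```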
 
I've always wondered why, with digital displays like LCDs, this technique wasn't used in the first place. If you think about it, there is no need to refresh the screen unless something has changed. With CRTs the screen had to refresh constantly to keep an image on it; LCDs never had to do this, but still did.
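The "only refresh when something changed" idea is essentially a dirty check. A toy sketch of it (the `ToyPanel` class and the frame strings are invented purely for illustration; real panel electronics don't work like this code):

```python
class ToyPanel:
    """Toy model of a display that only scans out when the frame buffer changes."""

    def __init__(self):
        self.refreshes = 0
        self.last_frame = None

    def present(self, frame):
        if frame == self.last_frame:
            return                 # nothing changed -> no refresh needed
        self.last_frame = frame
        self.refreshes += 1        # something changed -> scan it out

panel = ToyPanel()
for frame in ["menu", "menu", "menu", "loading", "gameplay"]:
    panel.present(frame)
print(panel.refreshes)   # 3, not 5: the static frames never needed a refresh
```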

Nate
 
Maybe I'm missing something, but as I said before, is this not something that the screen/monitor manufacturers could have implemented themselves as a kind of adaptive vsync, rather than a proprietary Nvidia module?

Is there some kind of communication going on between the monitor and graphics card that is proprietary rather than using standard monitor signaling?
 
Well, since you need a Kepler card, it would seem to indicate the tech needs both something monitor-side and something PC-side. But strangely they did seem to want this tech on everything in the press conference, i.e. TVs, mobile devices, etc. So I'm not really sure how that would work, as a mobile device wouldn't have a Kepler card and neither would a standard TV.
 
Well, since you need a Kepler card, it would seem to indicate the tech needs both something monitor-side and something PC-side. But strangely they did seem to want this tech on everything in the press conference, i.e. TVs, mobile devices, etc. So I'm not really sure how that would work, as a mobile device wouldn't have a Kepler card and neither would a standard TV.

Exactly. That's what makes me think it's 'just' (using that word loosely) an adaptive vsync card, because how else would it ever work on anything other than a PC monitor being driven by an Nvidia graphics card?
 
It's not an adaptive vsync card at all; adaptive vsync won't make 30fps look smooth. Here's a video showing the effect in slow motion, though you still aren't seeing all the benefits you would get in real time sitting in front of that screen.
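To put rough numbers on the difference, here's a small frame-pacing calculation with assumed figures (a steady 45fps feed into a 60Hz panel); it isn't tied to the video, just the general mechanism:

```python
from fractions import Fraction
from math import ceil

REFRESH = Fraction(1, 60)   # fixed 60 Hz scan-out interval
RENDER = Fraction(1, 45)    # assume the GPU takes 1/45 s per frame (steady 45 fps)

def fixed_vsync_frame_times(n):
    # With vsync on a fixed-refresh panel, a frame appears at the first
    # scan-out at or after the moment the GPU finishes it.
    shown = [ceil((i + 1) * RENDER / REFRESH) * REFRESH for i in range(n)]
    return [round(float((b - a) * 1000), 1) for a, b in zip(shown, shown[1:])]

def variable_refresh_frame_times(n):
    # With G-Sync-style variable refresh, the panel waits for the GPU,
    # so every frame stays on screen for exactly one render interval.
    return [round(float(RENDER * 1000), 1)] * (n - 1)

print(fixed_vsync_frame_times(7))      # mix of ~16.7 ms and ~33.3 ms -> visible judder
print(variable_refresh_frame_times(7)) # steady ~22.2 ms -> even pacing at the same fps
```

Same frame rate in both cases; the fixed-refresh panel just can't show the frames at the cadence they were produced.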

 
No point showing a video of it being demoed, because it's nigh on impossible to show G-Sync in action. Linus explained a few times that it's very different from what he was streaming.

Though saying that, I did actually see the tearing on the columns in the bit where they were rotating the camera around them. But that's the only instance where I saw anything. Couldn't see the G-Sync monitor because it was on the other side of the table.

Edit - never mind, I should have watched your vid, lol. Yeah, I guess if you show it that way you can see the tearing. Still, in a real live environment I would like to see it for myself rather than through a video camera, as experiences will differ greatly. Still, that vid shows some of what G-Sync could be like.

By the way, what I find interesting is that G-Sync is basically an automatic monitor refresh rate adjuster, so the refresh rate matches up with the GPU's rendering. Why can't they do this purely in software right now? If you go into your monitor settings and switch from 60Hz to 50Hz, isn't that basically adjusting the monitor refresh rate via software? So couldn't they come up with a pure software version of G-Sync which controls the monitor refresh rate based on the GPU's output rate?
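A quick sketch of why a one-off refresh-rate change isn't the same thing, using made-up per-frame render times: whichever single fixed rate you pick in the settings, the frames still snap to that rate's grid, whereas per-frame re-timing tracks the GPU exactly:

```python
from math import ceil

# Hypothetical per-frame GPU render times in milliseconds (they vary in a real game).
render_ms = [18, 22, 19, 25, 17]

def on_screen_intervals(refresh_hz):
    """Frame-to-frame intervals you actually see with vsync at one fixed refresh rate."""
    period = 1000 / refresh_hz
    t, shown = 0, []
    for r in render_ms:
        t += r                                   # GPU finishes the frame here...
        shown.append(ceil(t / period) * period)  # ...but it appears at the next scan-out
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

print(60, on_screen_intervals(60))          # [16.7, 16.7, 33.3, 16.7]
print(50, on_screen_intervals(50))          # [20.0, 20.0, 40.0, 20.0]
print("per-frame (G-Sync-style):", render_ms[1:])   # [22, 19, 25, 17]
```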
 
It's to do with the way the monitor takes in the frame; it can't be done via software until the monitor has the hardware to do it, which is where the G-Sync chip comes into play.

In the future I'd imagine we'll see something similar in all monitors.
 
It's to do with the way the monitor takes in the frame; it can't be done via software until the monitor has the hardware to do it, which is where the G-Sync chip comes into play.

In the future I'd imagine we'll see something similar in all monitors.

I understand that, but how is it going to work with mobile devices and TVs etc.? It sounds to me more like it's all down to the G-Sync chip in the monitor rather than needing an Nvidia card to be the source.

(P.S. It appears we have three separate threads on this topic now in different sections: Graphics Cards, Monitors and PC Games.)
 
I understand that, but how is it going to work with mobile devices and TVs etc.? It sounds to me more like it's all down to the G-Sync chip in the monitor rather than needing an Nvidia card to be the source.

Who said it was going to work with those? It isn't really needed for mobile devices and TVs, as the consoles are AMD-based, and hardly anyone uses a TV for PC gaming since most of them have input lag issues anyway.
 
I'd have to see it in person and in action, but I honestly can't imagine a benefit or improvement worth the extra cost. I can't remember the last time I had tearing, and I don't use vsync or adaptive.
 
sounds to me more like it's all down to the G-Sync chip in the monitor rather than needing an Nvidia card to be the source.

Way to state the obvious.
However, it has been designed by Nvidia to work with Nvidia GPUs.

I imagine people will hack AMD drivers to get it working on AMD cards too, unless there is something in the Kepler hardware that is also required; they did say they've been working on it for years. Time will tell.
 