Frame skipping and gsync. How does it work?

Associate
Joined
20 Nov 2014
Posts
246
I was wondering about this simple fact: the Acer Predator XR341CKA, the gsync version, should mount the same panel as its sibling the XR341CK (the freesync one) and as the Dell U3415W, and I reckon it's the same as the LG's too.

Now, these panels have been tested for overclocking before, and they failed every single time. They can handle higher frequencies, but they start skipping frames just past 60Hz, so that's basically a no-go. You don't overclock something beyond its specs only to lose the very benefits you overclocked it for in the first place, right?

Wrong, apparently, since that's exactly what Acer (and presumably Asus too, with the upcoming ROG) did with the aforementioned Predator, in both versions. One works at 75Hz; the other is rumoured to support up to 100Hz.

Can someone please explain to me how that's possible? I mean technically: how do a different PCB and the g/freesync chip alter the panel's behaviour so that it works so far beyond its specs? And how can one be sure that, if the gpu is generating 100 frames per second, all of them are shown, and none is skipped yet goes undetected because it's "synced"? Moreover, isn't it a bit risky, in the sense that we should expect these screens to burn out much faster than they should? I don't know about you, but when I pay 1k quid for a monitor I want it to last forever, at least, and not to be running outside its actual specs either :o
 
Let me try to reword my doubts about this "100Hz with gsync enabled" thing.

Since these panels don't take overclocking very easily, couldn't gsync be a way to trick the eye, and the monitor's specs too? Like (this is pure fantasy, ok?) recognising when there's a skipped frame and showing the previous one twice (or just for longer), so that inspector tools see 100 frames shown every second, while the real ones, those generated by the gpu, are maybe still 60 or 75, just synced so that nobody sees the gap?
 
the gsync module is AFTER the GPU, so if it were the gsync module duplicating frames, in-game or PC-based fps counters would only see the number being generated on the PC end; it wouldn't ever claim there were 100 frames, it would show the 60 or 75
That's a pretty valid point against my "conspiracy theory" of gsync altering the results of the fps count :D

Still...
Baddass said:
it remains to be seen what the X34 G-sync model can handle and how the 100Hz works in practice
...it has to be seen at work. How it acts when frames are being dropped will remain a mystery until someone like Baddass gets his hands on one of these things, and I reckon it's not even that easy to find out; it probably needs a test like FCAT to measure the frame pacing. Anyway, great answers so far, thanks to all. There's nothing else to do but wait now :)
 
Well, look at what I found.

Video: https://www.youtube.com/watch?v=VkrJU5d2RfA
Article: http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

Seems like I wasn't that wrong with my suspicions.

Basically: both gsync and freesync work within a certain range of frequencies that depends on the monitor, but that's no news. It's when the frame rate dips below that range that two different behaviours kick in. Gsync will double (or triple, quadruple...) the refresh rate and insert the same frame two (or three, four...) times, keeping everything in pace with the refresh, so neither the eye nor the inspector tools notice a frame drop, because in fact there is no frame drop. Freesync instead holds the minimum refresh rate steady and keeps eating frames as the gpu sends them to the monitor, which results in stuttering and tearing at the same time.

So yeah, gsync tricks us, but that's exactly its purpose.
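
If I've read the article right, the whole trick boils down to picking the smallest repeat count that brings the panel back inside its range. A little Python sketch of my understanding, nothing more; the 40-100Hz range and all the names are made up for illustration, this obviously isn't nvidia's actual firmware:

Code:
# Sketch of the frame-repeat logic described in the PCPer article.
# The 40-100Hz range and every name here are my own illustration,
# NOT actual G-Sync firmware.

def gsync_scanout(fps, vrr_min=40, vrr_max=100):
    """Return (repeats, effective_refresh_hz) for a given GPU frame rate."""
    if fps >= vrr_min:
        return 1, fps  # inside the range: the refresh simply tracks the gpu
    repeats = 1
    # keep doubling/tripling/... until the scanout rate is back in range
    while fps * repeats < vrr_min and fps * (repeats + 1) <= vrr_max:
        repeats += 1
    return repeats, fps * repeats

for fps in (75, 30, 20, 15):
    n, hz = gsync_scanout(fps)
    print(f"{fps:>3}fps -> each frame drawn {n}x, panel running at {hz:g}Hz")

# freesync instead (per the article) just holds its minimum refresh below
# the range, so frames land out of step -> stutter and tearing.

Point being: the panel itself never runs below its 40Hz floor, no matter how badly the gpu struggles.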
 
The thing is a bit different. FPS meters are not wrong: if one says 20fps, it's because the gpu really is pushing out 20 frames. But a gsync monitor will display 40 at 40Hz, doubling each frame and keeping them all perfectly paced with the "main clock", or whatever you want to call the sync between gpu and monitor. There's a buffer on the module for that, I guess, so it's easily done.

For an even more extreme example, let's say that:
- your gsync monitor has a range of 40-100Hz
- your gpu is in deep trouble and can push only 15 fps

What will happen is:
- fraps will measure 15fps
- your gsync module will notice that and double the refresh, inserting the same frame twice; but 30fps/Hz is still below the working range, so it triples it instead, showing the same frame three times. Now you have 15fps, but with every frame shown three times on a 45Hz monitor, everything looks smooth and clean (quick sketch of that timeline below)
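
The sketch, same made-up 40-100Hz range as before, numbers purely illustrative:

Code:
# Scanout timeline for the 15fps case: a new gpu frame lands every
# ~66.7ms, each one is drawn three times, so the panel refreshes every
# ~22.2ms (45Hz) and never drops below its 40Hz floor. All made up.

FPS, REPEATS = 15, 3
frame_interval = 1000 / FPS               # ms between new gpu frames
scan_interval = frame_interval / REPEATS  # ms between panel refreshes

t = 0.0
for frame in range(3):                    # first three gpu frames
    for draw in range(REPEATS):
        tag = "new frame" if draw == 0 else "repeat"
        print(f"t={t:6.1f}ms  scan out frame {frame} ({tag})")
        t += scan_interval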

TL;DR: gsync owns :cool:
 
it's even better than that: it doesn't hold the monitor at the low frame rate, it keeps inserting the image but at the monitor's MAXIMUM frame rate [...]
If that makes sense?

And yeah, it's pretty clever stuff

Geez, I missed that part. Makes a lot of sense; in fact, I finally understand why they ask £150 extra for that damn logo :D
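
So if that quote is right, my earlier sketch was too conservative: rather than the smallest multiple that gets back in range, the module just redraws the held frame as fast as the panel allows, so a fresh frame never waits long behind a repeat that's still scanning out. Quick back-of-the-envelope comparison, reusing my 15fps example (all assumption on my part):

Code:
# Why redrawing at the panel's max rate would be "even better": a fresh
# gpu frame arriving mid-repeat has to wait for that repeat to finish,
# and shorter redraws mean a shorter worst-case wait. My numbers, not
# anything official.

for label, refresh_hz in (("multiple of fps (45Hz)", 45),
                          ("panel maximum (100Hz)", 100)):
    worst_wait_ms = 1000 / refresh_hz
    print(f"{label:>23}: new frame waits at most ~{worst_wait_ms:.1f}ms")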
 