Frame skipping and G-Sync: how does it work?

Associate
Joined
20 Nov 2014
Posts
246
I was wondering about this simple fact: the Acer Predator XR341CKA (G-Sync version) should mount the same panel as its sibling, the XR341CK (FreeSync), and as the Dell U3415W, and I reckon it's the same as the LG's too.

Now, these panels have been tested for overclocking before, and they failed in every single case. They can handle higher frequencies, but they start skipping frames just above 60Hz, so that's basically a no-go. You don't overclock something beyond its specs only to lose the very benefit you overclocked it for in the first place, right?

Wrong, apparently, since that's exactly what Acer (and presumably Asus too, with the coming ROG) did with the aforementioned Predator, both versions. One works at 75Hz; the other is rumoured to support up to 100Hz.

Can someone please explain how that's possible? I mean technically: how do a different PCB and the G-Sync/FreeSync chip alter the panel's behaviour so that it works so far beyond its specs? How can one be sure that, if the GPU is generating 100 frames per second, all of them are shown and none is skipped yet goes undetected because it's "synced"? Moreover, isn't it a bit risky, in the sense that we should expect these screens to burn out much faster than they otherwise would? I don't know about you, but when I pay 1k quid for a monitor I want it to last forever, at the very least, and not to be working outside its actual specs either :o
 
Soldato
Joined
31 Dec 2006
Posts
7,224
I don't think it's unreasonable for a 60Hz panel to reach 75Hz, IF the panels are cherry-picked and tweaked by experts in a factory. I would say that's a big difference from Joe Bloggs trying to do it at home with an average panel and a bit of software. The kit and tech Acer will have access to shouldn't make this prohibitive.

As for the 100Hz G-Sync version, I am certainly more sceptical of this, so I eagerly await reviews. It does seem a stretch for this particular panel, but the key seems to reside in the G-Sync module/scaler. It was used to similar effect in the ROG Swift to achieve 144Hz, and as I understand it, it's being employed in the same fashion here to achieve 100Hz. It remains to be seen how effective that actually is, however. We really won't know until it's in the hands of the likes of TFT Central... they will no doubt be very comprehensive and detailed in their analysis, as they always are.
 
Associate
OP
Joined
20 Nov 2014
Posts
246
Let me try to reword my doubts about this "100Hz with G-Sync enabled" thing.

Since these panels don't take overclocking very easily, couldn't G-Sync be a way to trick the eye, and the monitor specs too? Like (this is pure fantasy, OK?) recognising when there's a skipped frame and showing the previous one twice (or just for longer), so that inspector tools see 100 frames shown every second, but the real ones, those generated by the GPU, are maybe still 60 or 75, just synced so that nobody sees the gap?
 
Soldato
Joined
30 Nov 2011
Posts
11,376
The G-Sync module has direct control of the panel and can alter the voltage being supplied; that is why most panels that have both a FreeSync and a G-Sync version have a wider range on the G-Sync version. Normal non-sync monitors have an overdrive feature to reduce ghosting, but it is tuned for the fixed refresh rates the monitor supports. The G-Sync module's overdrive is tuned for the full range they can get the panel to support, whereas FreeSync is stuck with overdrive tuned for a fixed refresh, which ends up giving those monitors a narrower range: drift too far from where the overdrive is tuned and you would get bad ghosting.

The 75Hz monitors/panels are still running off the same controller as the 60Hz ones, so that controller is limited in how it can drive the panel. The G-Sync module is programmable, and this sort of thing is exactly the benefit of that.

Also, normal panel controllers are ASICs, so developing a new one just for a 3440x1440 @ 100Hz monitor would be cost prohibitive. As high-res/high-refresh monitors become more common they will make one eventually, but as of right now I doubt there is a controller that can do 3440x1440 @ 100Hz, hence why the programmable G-Sync module is needed (and has obviously had extra development time spent on it) to make this work.

The OP said:
Let me try to reword my doubts about this "100Hz with G-Sync enabled" thing.

Since these panels don't take overclocking very easily, couldn't G-Sync be a way to trick the eye, and the monitor specs too? Like (this is pure fantasy, OK?) recognising when there's a skipped frame and showing the previous one twice (or just for longer), so that inspector tools see 100 frames shown every second, but the real ones, those generated by the GPU, are maybe still 60 or 75, just synced so that nobody sees the gap?

The G-Sync module is AFTER the GPU, so if it was the G-Sync module duplicating frames, in-game or PC-based fps counters would only see the number being generated on the PC end. It would never claim there were 100 frames; it would show the 60 or 75.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,341
One way to check is to render a different colour or symbol each frame, capture the output externally with a high-speed camera, and see if they are all displayed.
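Something along these lines would do the job (a rough sketch; I'm assuming Python with pygame here, and the window size, colours and 100fps target are arbitrary). Each frame gets a unique number and colour, so high-speed footage of the screen can confirm whether every value actually appears:

[code]
import pygame

# Draw a different frame number and colour swatch every frame; film the
# screen with a high-speed camera and check that no number is skipped.
pygame.init()
screen = pygame.display.set_mode((800, 200))
clock = pygame.time.Clock()
font = pygame.font.SysFont(None, 120)
colours = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]

frame = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, colours[frame % len(colours)], (0, 0, 200, 200))
    screen.blit(font.render(str(frame), True, (255, 255, 255)), (250, 60))
    pygame.display.flip()
    clock.tick(100)  # ask for 100fps; the camera footage shows what survives
    frame += 1
pygame.quit()
[/code]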
 
Man of Honour
Joined
12 Jan 2003
Posts
20,574
Location
UK
andybird123, I'd say that's a pretty good explanation.

Keep in mind we are talking in theory at the moment, as so far no one has actually released a 100Hz-capable screen of this type. The panel is almost certainly the same as that used in the XR341CK and Dell U3415W models, as the OP has said, but it is the G-Sync module added afterwards which seems to be the key here.

Most panels will have a recommended refresh rate of 60Hz and a maximum of 75Hz, so it's not that unusual for a screen to support up to 75Hz. It just means you need to push it to the maximum possible and have a reliable controller which can handle that properly and not drop frames. You will see a fair few screens support up to 75Hz if pushed. It does seem that on the Acer XR341CK this worked reliably from an AMD card without frames being dropped, but from an NVIDIA card it struggled and dropped frames, so it's still not perfect.

It remains to be seen what the X34 G-Sync model can handle and how the 100Hz works in practice.
 
Associate
OP
Joined
20 Nov 2014
Posts
246
andybird123 said:
The G-Sync module is AFTER the GPU, so if it was the G-Sync module duplicating frames, in-game or PC-based fps counters would only see the number being generated on the PC end. It would never claim there were 100 frames; it would show the 60 or 75.
That's a pretty valid point against my "conspiracy theory" of G-Sync altering the results of the fps count :D

Still...
Baddass said:
It remains to be seen what the X34 G-Sync model can handle and how the 100Hz works in practice.
...it has to be seen at work. How it behaves when frames are dropped will remain a mystery until someone like Baddass gets his hands on one of these things, and I reckon it's not even that easy to find out; probably it needs a test like FCAT, measuring the frame pacing. Anyway, great answers so far, thanks to all. There's nothing to do now but wait :)
 
Man of Honour
Joined
12 Jan 2003
Posts
20,574
Location
UK
So the BLB will most likely be equally as bad as on the Dell? That's a bummer.

The panel doesn't dictate backlight bleed; that's down to other factors like chassis design, build quality, factory quality control, transport, storage etc. The IPS glow (where dark content viewed from an angle becomes pale) will be exactly the same, as that's a "feature" of the modern IPS panel used.
 
Associate
OP
Joined
20 Nov 2014
Posts
246
Well, look at what I found.

Video: https://www.youtube.com/watch?v=VkrJU5d2RfA
Article: http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

Seems like I wasn't that wrong with my suspicions.

Basically: both G-Sync and FreeSync work within a certain range of frequencies, which depends on the monitor, but that's no news. If the frame rate dips below that range, two different behaviours take place. G-Sync will double (or triple, quadruple...) the refresh rate and insert the same frame two or more times, keeping everything in pace with the refresh rate, so neither the eye nor the inspector tools will notice any frame drop, because in fact there is no frame drop. FreeSync instead will hold the monitor at its minimum refresh rate and keep eating frames as the GPU sends them, which results in stuttering and tearing at the same time.

So yeah, G-Sync tricks us, but that's exactly its purpose.
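For what it's worth, the frame-multiplication logic the article describes boils down to something like this (a toy Python sketch, not NVIDIA's actual firmware; the 40Hz lower bound is a made-up example):

[code]
def gsync_repeat(fps, min_hz=40):
    """Return the smallest repeat count that lifts the effective
    refresh rate back above the panel's minimum, plus that rate."""
    n = 1
    while fps * n < min_hz:
        n += 1  # show each frame one more time per GPU frame
    return n, fps * n

for fps in (75, 45, 30, 20, 15):
    n, hz = gsync_repeat(fps)
    print(f"{fps:>2} fps -> each frame scanned {n}x, panel refreshes at {hz} Hz")
[/code]

Running it: 75fps stays at 1x, 30fps doubles to 60Hz, and 15fps triples to 45Hz, exactly the behaviour described above.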
 
Soldato
Joined
30 Nov 2011
Posts
11,376
It won't trick an fps meter though, as fps meters measure what is happening on the GPU. The G-Sync controller is after the GPU, so it repeating a frame won't be measured by an fps meter running on the GPU.

Also, 20fps is going to look bad regardless. On a "normal" monitor with a fixed 60Hz refresh and vsync on, re-reading the same frame from the buffer two or more times is exactly what happens anyway, so the fact that G-Sync handles this intelligently, instead of doing what FreeSync does, is still a benefit: it reverts the monitor to its maximum refresh, meaning that as soon as the next GPU-generated frame is ready it gets displayed on the next cycle. On a 100Hz monitor the maximum added lag is therefore just under 10ms. That said, if you have a 100Hz monitor, to really get the benefit you probably want to target 50fps as a minimum, so you never get into one of these low-refresh double-read situations anyway.
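Quick numbers behind that maximum-lag claim (trivial arithmetic, but it makes the point that a higher top refresh shrinks the worst-case wait):

[code]
# Worst case, a freshly rendered frame just misses a refresh and has to
# wait one full refresh period before it can be scanned out.
for hz in (60, 75, 100, 144):
    print(f"{hz:>3} Hz -> refresh period {1000 / hz:.2f} ms")
[/code]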

I'm not quite sure why you're getting a bit tinfoil about it; NVIDIA aren't trying to trick anyone with G-Sync, it is set up to give the best possible image even in bad situations. Your main worry seems to be that they claim a 100Hz monitor while it's really running at 60 or 75 with interpolated extra frames. That won't happen! If your GPU says it is rendering 100 frames then it really is rendering 100 frames; the G-Sync controller is AFTER the GPU, so anything that queries the GPU for frame rate will show you the real frame rate, not whatever is happening with frame pacing on the monitor.
 
Associate
OP
Joined
20 Nov 2014
Posts
246
The thing is slightly different. FPS meters are not wrong: if one says 20fps, it's because the GPU is pushing out 20 frames per second, but a G-Sync monitor will display 40 @ 40Hz, doubling the frames and keeping them perfectly paced with the "main clock", or whatever you want to call the sync between GPU and monitor. There's a buffer for that, I guess, so it's easily done.

For an even more extreme example, let's say that:
- your G-Sync monitor has a range of 40-100Hz
- your GPU is in deep trouble and can push only 15fps

What will happen is:
- Fraps will measure 15fps
- your G-Sync module will notice that and double the frequency, inserting the same frame twice; but 30fps/Hz is still below the working range, so it will triple the frequency and insert the same frame three times. Now you have 15fps, but with each frame shown three times on a monitor running at 45Hz, everything looks smooth and clean (quick sanity check after the TL;DR below)

TL;DR: G-Sync owns :cool:
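A quick check of that arithmetic, using the same made-up 40-100Hz range (Python, ceiling division doing the work):

[code]
fps, min_hz = 15, 40
repeats = -(-min_hz // fps)     # ceiling of 40/15 -> 3
print(repeats, fps * repeats)   # prints "3 45": each frame shown 3x at 45Hz
[/code]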
 
Soldato
Joined
30 Nov 2011
Posts
11,376
It's even better than that: it doesn't hold the monitor at the low frame rate, it keeps inserting the image at the monitor's MAXIMUM refresh rate. If the GPU dips to 15fps, that's a single frame taking 66.6ms to render. If G-Sync held the monitor at 45Hz, the next "ready" frame could wait up to 22ms to be displayed; but because it keeps running the monitor at its maximum refresh, after the 66.6ms (15fps) frame the next one only has to wait 3.4ms for the next refresh, minimising lag.

So it actually inserts the same frame 6 times @ 100Hz, not three times @ 45Hz.
If that makes sense?

And yeah, it's pretty clever stuff
 
Associate
OP
Joined
20 Nov 2014
Posts
246
andybird123 said:
It's even better than that: it doesn't hold the monitor at the low frame rate, it keeps inserting the image at the monitor's MAXIMUM refresh rate [...]
If that makes sense?

And yeah, it's pretty clever stuff

Geez, I missed that part. It makes a lot of sense; in fact, I finally understand why they ask £150 extra for that damn logo :D
 
Soldato
Joined
30 Nov 2011
Posts
11,376
Actually, I might even have that a bit wrong: it puts a frame up and then tries to hold it as long as it can; if there's no new frame by then, it repeats the old one and holds that as long as it can, and so on. So in the 15fps example, the frame would be displayed for exactly 66.6ms and then the new frame would be displayed without any delay.
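If I've got it right this time, a toy timeline looks like this (Python; the 30Hz minimum, i.e. a 33.3ms maximum hold time, is purely my assumption for the sketch, not a published spec):

[code]
# GPU delivers a frame every 66.6ms (15fps). The module shows each new
# frame immediately, then re-scans the same image whenever it has been
# on screen for the assumed maximum hold time and no new frame arrived.
frame_interval = 1000 / 15  # ms between GPU frames
max_hold = 1000 / 30        # assumed longest a frame may stay on screen

for frame in range(3):
    shown = frame * frame_interval
    print(f"frame {frame}: shown at {shown:6.1f} ms (no added delay)")
    t = shown
    while t + max_hold < shown + frame_interval:
        t += max_hold
        print(f"         re-scanned at {t:6.1f} ms")
[/code]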
 