
Richard Huddy - Gaming Scientist Interviewed

I did watch this bit earlier in the week. Interesting point about the 9-240 range, though I think he did say that would naturally stick a premium on the monitor, as it's outside the usual operating ranges of 24/30/59/60/75/120. Be interested to see what the first lot of units do range at. He seemed quite careful not to mention any specific monitor manufacturers, which makes you wonder how many are actually on board with pushing this?

As for the G-Sync latency thing, idk tbh; no one else has mentioned latency, other than to claim the complete opposite of what this guy is saying, i.e. that it removes all additional frame lag, leaving that limit down to the panel. Now this fella claims it adds lag?

I do have to say though, the fact it's a driver-driven solution worries me slightly. **** ups are easily made by any driver vendor: add a new feature/fix an existing bug, break 3 other things, then wait 3 months for a driver that does everything :p

Edit: apologies for any inaccuracies, it was days ago that I watched the video, or skipped through it at least.

Apparently because G-Sync uses a frame buffer between the GPU and screen it adds 1 frame to the latency.

I guess it's reducing latency when you compare it to running V-Sync, because that buffers 2 frames on the GPU, or 3 if you use triple buffering.

Free-Sync does not use any buffering of any kind.
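For a rough sense of scale (my own back-of-the-envelope numbers, not from the video): if a sync mode buffers N extra frames, the penalty is roughly N frame times, so it depends entirely on the frame rate.

```python
# Back-of-the-envelope arithmetic for the claim above: N buffered frames
# cost roughly N frame times, i.e. N * 1000/fps milliseconds.
def added_latency_ms(fps, buffered_frames):
    return buffered_frames * (1000.0 / fps)

for fps in (60, 120, 144):
    print(f"{fps:>3} fps: 1 frame ~ {added_latency_ms(fps, 1):4.1f} ms, "
          f"2 frames (V-Sync, per the post above) ~ {added_latency_ms(fps, 2):4.1f} ms, "
          f"3 frames (triple buffering) ~ {added_latency_ms(fps, 3):4.1f} ms")
```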
 
I see, so it's added latency vs FreeSync, not vs the normal operation of the monitor. I think Blur Busters did some in-depth latency testing with G-Sync, so hopefully they'll be able to expose any differences in that department.
 
I see, so it's added latency vs FreeSync, not vs the normal operation of the monitor. I think Blur Busters did some in-depth latency testing with G-Sync, so hopefully they'll be able to expose any differences in that department.

I think you have to think of the G-Sync module as a middle man.

Without G-Sync:

The GPU renders the frame and (with software V-Sync off) sends that frame out as soon as it's rendered, and the screen displays it as soon as it has it.

With G-Sync the GPU behaves in the same way, sending the frame out as soon as it's rendered. The G-Sync module sitting between the GPU and the screen intercepts that frame and holds onto it until it gets the next frame from the GPU. Once it has that frame it will send the first frame to the screen and hold onto frame 2 until it gets frame 3, and so on... that way it can regulate the frame consistency.

FreeSync works differently: the screen communicates directly with the GPU, telling the GPU "my Hz range is 40 to 120". The GPU acknowledges that and sends frames out at no more than 120 FPS as it renders them; the GPU and screen are in constant communication to regulate each other's output, and the screen will adjust its Hz on the fly as needed.

That's how I understand it from how Huddy described it.
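A purely illustrative sketch of that "middle man" behaviour (my reading of the description above, not Nvidia's actual implementation; the function name and frame labels are made up):

```python
# Illustrative only: a middle man that holds each frame until the next one
# arrives, per the description above. Not Nvidia's actual implementation.
def hold_until_next(frames):
    held = None
    for arrival, frame in enumerate(frames, start=1):
        if held is None:
            print(f"arrival {arrival}: nothing to send to the screen yet")
        else:
            print(f"arrival {arrival}: {held} sent to the screen")
        held = frame  # hold the newest frame until its successor arrives

hold_until_next(["frame1", "frame2", "frame3", "frame4"])
# frame1 only reaches the screen once frame2 has arrived, frame2 once frame3
# has arrived, and so on -- each frame is displayed one arrival later.
```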
 
G-Sync does NOT add 1 frame of latency; this has been tested and confirmed ages ago by Blur Busters.

[image: lag-bf4.png – Blur Busters input lag chart, Battlefield 4]

[image: lag-crysis3.png – Blur Busters input lag chart, Crysis 3]


It's just another lie by Huddy / AMD targeted at people who don't know better.

The memory is not used for buffering 1 frame. How they even thought that would happen is beyond me. Oh wait, they saw an opportunity to spread yet another lie.
 
Not yet, as excluding the problematic Eizo the highest refresh rate on the market is 144Hz. The only way you can exceed the G-Sync cap rate is to disable it; sure, you can go under it (30-33fps I think it was), but I wouldn't imagine any gamer will want frames dropping that low (let alone the 9Hz Huddy talks about).

Two similar techs with very different approaches, I like it :)
 
G-Sync does NOT add 1 frame of latency; this has been tested and confirmed ages ago by Blur Busters.

[image: lag-bf4.png – Blur Busters input lag chart, Battlefield 4]

[image: lag-crysis3.png – Blur Busters input lag chart, Crysis 3]


It's just another lie by Huddy / AMD targeted at people who don't know better.

The memory is not used for buffering 1 frame. How they even thought that would happen is beyond me. Oh wait, they saw an opportunity to spread yet another lie.

Three things with this.

1. How do those slides prove that? Not a single run is the same, so it's impossible to compare them to each other. All they do show is that no run is the same as another, which is why it's impossible to measure the latency.

2. Did they use FCAT? If not, you can't measure the FPS going into the screen, because the software calculates the FPS directly from the API, not from what's actually going into the screen.

The only way to do it is to record each frame that comes out of the G-Sync module as it's happening. Even then there is no accuracy in comparing that to anything, as in run 1 the game may render 52 FPS and in run 2 it may render 56 FPS; therefore you do not know the difference in latency, as variables have been added to the results.

3. The whole thing looks flawed. You can't measure the time taken from the GPU rendering a frame to that same frame being displayed on the screen without knowing whether the frame on screen is the same one rendered on the GPU.

To measure that you would need to identify and tag the frames once the GPU has rendered them (frame 1, 2 and 3) and then identify them on the screen. Once you identify frame 1 on screen, the frame on the GPU should be frame 2; if it's frame 3, then you have to look for frame 2, and I suspect you will find it sitting in the G-Sync buffer. :)
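A toy version of that tagging check, just to make the idea concrete (entirely hypothetical numbers and names):

```python
# Hypothetical frame-tagging check as proposed above: compare the tag the
# GPU has just finished with the tag currently on screen. Any tags strictly
# between the two must be sitting somewhere in transit (e.g. a module buffer).
def frames_held_in_transit(last_rendered_tag, on_screen_tag):
    return last_rendered_tag - on_screen_tag - 1

print(frames_held_in_transit(last_rendered_tag=2, on_screen_tag=1))  # 0: nothing buffered
print(frames_held_in_transit(last_rendered_tag=3, on_screen_tag=1))  # 1: frame 2 held back
```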
 
Perhaps he's talking about latency when it exceeds the G-Sync cap rate? It doesn't have a range up to 240Hz.

You can easily solve this by capping the frames to something like 140.

Or just wait for Nvidia to implement the polling fixes that they say they're going to implement. Not that they're in much of a hurry, considering that A-Sync monitors aren't even available and won't be for a while.
 
Edit: sorry mod for double posting, I thought I was editing... :(

Three things with this.

1. How do those slides prove that? Not a single run is the same, so it's impossible to compare them to each other. All they do show is that no run is the same as another, which is why it's impossible to measure the latency.

2. Did they use FCAT? If not, you can't measure the FPS going into the screen, because the software calculates the FPS directly from the API, not from what's actually going into the screen.

The only way to do it is to record each frame that comes out of the G-Sync module as it's happening. Even then there is no accuracy in comparing that to anything, as in run 1 the game may render 52 FPS and in run 2 it may render 56 FPS; therefore you do not know the difference in latency, as variables have been added to the results.

3. The whole thing looks flawed. You can't measure the time taken from the GPU rendering a frame to that same frame being displayed on the screen without knowing whether the frame on screen is the same one rendered on the GPU.

To measure that you would need to identify and tag the frames once the GPU has rendered them (frame 1, 2 and 3) and then identify them on the screen. Once you identify frame 1 on screen, the frame on the GPU should be frame 2; if it's frame 3, then you have to look for frame 2, and I suspect you will find it sitting in the G-Sync buffer. :)

They measured it perfectly, and those are completely legit results showing that there's no extra frame of latency, which would show up as a big increase in the millisecond count.

http://www.blurbusters.com/gsync/preview2/

It's blurbusters. They know more about this than any other site.

And it's not using something silly like FCAT for latency measuring.

Wires from a modified Logitech G9x are attached to a LED, which illuminates instantly upon pressing the mouse button (<1ms). Using a consumer 1000fps camera, one can measure the time between the light coming from the LED, and the light coming from the on-screen gunshot, such as a gun movement, crosshairs flash, or muzzle flash:

^Perfect repeatable input lag / latency test. The fact that the games themselves make more of a difference than g-sync/v-sync off ought to tell you that there's no lag added.
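A simplified sketch of how that measurement works as I read it (not Blur Busters' actual tooling; the frame numbers below are made up): at 1000fps each video frame is about 1ms apart, so the lag is just the count of video frames between the LED lighting and the on-screen reaction.

```python
# Simplified version of the high-speed-camera method described above.
# Assumes a 1000 fps camera, so consecutive video frames are ~1 ms apart.
CAMERA_FPS = 1000

def input_lag_ms(led_on_frame, reaction_frame, camera_fps=CAMERA_FPS):
    """Milliseconds between the LED lighting up and the on-screen gunshot."""
    return (reaction_frame - led_on_frame) * 1000.0 / camera_fps

# Hypothetical numbers: LED visible in video frame 120, muzzle flash in frame 158.
print(input_lag_ms(led_on_frame=120, reaction_frame=158))  # -> 38.0 ms
```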

Humbug perhaps you should first try reading about and studying these issues before making up your mind and making comments.
 
They measured it perfectly, and those are completely legit results showing that there's no extra frame of latency, which would show up as a big increase in the millisecond count.

http://www.blurbusters.com/gsync/preview2/

It's blurbusters. They know more about this than any other site.

And it's not using something silly like FCAT for latency measuring.



^Perfect repeatable input lag / latency test. The fact that the games themselves make more of a difference than g-sync/v-sync off ought to tell you that there's no lag added.

Humbug perhaps you should first try reading about and studying these issues before making up your mind and making comments.

No need for that hostility, I'm enjoying a good debate here; you would do your blood pressure a world of good if you did the same.

Back on topic: so they are using cameras to record a mouse click and the time delay until that action takes place on screen. It's crude, but I can see how they think it might work; the problem for me is that there are still too many variables.

You click a mouse, the mouse has to process that, then the computer has to process that before the GPU processes it. And then you have to guess which frame in your recording is the result of all of that; the fact that they are using averages of latency suggests to me that they have not identified the exact frame.

So to close, I'm not saying their results are wrong, or that they prove anything either way; I'm saying there are too many variables and too much guesswork going on for anything conclusive to be taken from them either way.

The only way to know for sure is to track frames as they come off the GPU, to the G-Sync module and finally to the screen; that way you take all the guesswork and variables out.
 
No need for that hostility, I'm enjoying a good debate here; you would do your blood pressure a world of good if you did the same.

Back on topic: so they are using cameras to record a mouse click and the time delay until that action takes place on screen. It's crude, but I can see how they think it might work; the problem for me is that there are still too many variables.

You click a mouse, the mouse has to process that, then the computer has to process that before the GPU processes it. And then you have to guess which frame in your recording is the result of all of that; the fact that they are using averages of latency suggests to me that they have not identified the exact frame.

So to close, I'm not saying their results are wrong, or that they prove anything either way; I'm saying there are too many variables and too much guesswork going on for anything conclusive to be taken from them either way.

The only way to know for sure is to track frames as they come off the GPU, to the G-Sync module and finally to the screen; that way you take all the guesswork and variables out.

That makes sense.
 
You can easily solve this by capping the frames to something like 140.

Or just wait for Nvidia to implement the polling fixes that they say they're going to implement. Not that they're in much of a hurry, considering that A-Sync monitors aren't even available and won't be for a while.

Blurbuster said:

I believe the fact that latency occurred at fps_max 143, to be a strange quirk, possibly caused by the G-SYNC polling algorithm used. I’m hoping future drivers will solve this, so that I can use fps_max 144 without lag.

= Huddy/AMD didn't lie; they have also identified the lag, as did Blur Busters.

Someone here has G-Sync and said that a title or so is not perfect and does lag/stutter (not a slur on G-Sync at all, it is a positive tech imo), but he is very, very happy with what G-Sync brings.

I'm also confident it will be addressed to the best of the tech's ability, as it's in its infancy.
 
I think this lag would also be more noticeable at lower fps, would it not? I expect you'd feel it at the lower ranges more than you would when fps are high.
 
No need for that hostility, I'm enjoying a good debate here; you would do your blood pressure a world of good if you did the same.

Back on topic: so they are using cameras to record a mouse click and the time delay until that action takes place on screen. It's crude, but I can see how they think it might work; the problem for me is that there are still too many variables.

You click a mouse, the mouse has to process that, then the computer has to process that before the GPU processes it. And then you have to guess which frame in your recording is the result of all of that; the fact that they are using averages of latency suggests to me that they have not identified the exact frame.

So to close, I'm not saying their results are wrong, or that they prove anything either way; I'm saying there are too many variables and too much guesswork going on for anything conclusive to be taken from them either way.

The only way to know for sure is to track frames as they come off the GPU, to the G-Sync module and finally to the screen; that way you take all the guesswork and variables out.

You do not need to "identify a frame" to measure input lag. I don't even understand what that means.

All you literally need to do is measure the time between the input (mouse click) and the effect of that input appearing on the screen.

To do this they used a camera that takes 1 picture every millisecond.

That is taking into account everything that happens between your input and you seeing the image. And that's all that matters to a gamer.

Crysis 3:

If Huddy was right and didn't lie, you would see G-Sync getting ~20ms higher input lag than V-Sync off. But it doesn't.

BF4:

If Huddy was right and didn't lie, you would see G-Sync getting ~13ms higher input lag than V-Sync off. But it doesn't.
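For reference, the arithmetic behind numbers like those (mine, assuming one whole extra buffered frame): one extra frame costs 1000/fps milliseconds, so ~20ms corresponds to roughly 50fps and ~13ms to roughly 77fps, and gaps of that size would be obvious on the charts.

```python
# My arithmetic, not from the article: one whole extra buffered frame would
# add 1000/fps milliseconds on top of the V-Sync-off result.
def one_extra_frame_ms(fps):
    return 1000.0 / fps

for fps in (50, 77, 144):
    print(f"At {fps} fps, one extra buffered frame = {one_extra_frame_ms(fps):.1f} ms")
# ~20 ms at ~50 fps and ~13 ms at ~77 fps: the size of gap the charts would
# need to show between G-Sync and V-Sync off for the claim to hold.
```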

There's no way to make this more simple. You would see big differences if there was a whole extra frame of latency added. But those differences just are not there.

It's a really black and white issue. You cannot call this even and say that there's no way to know for sure.

Huddy lied. Again. Simple as that.

Anyone who reads the input lag tests for g-sync knows that. There is no extra frame being kept in the buffer. Nvidia hasn't said there's a frame in the buffer, review sites haven't said there's a frame in the buffer, tests haven't shown that there's a frame in the buffer. The only one saying that is Huddy.

So either he was misinformed or lied on purpose.

Blurbuster said:



= Huddy/AMD didn't lie; they have also identified the lag, as did Blur Busters.

Someone here has G-Sync and said that a title or so is not perfect and does lag/stutter (not a slur on G-Sync at all, it is a positive tech imo), but he is very, very happy with what G-Sync brings.

I'm also confident it will be addressed to the best of the tech's ability, as it's in its infancy.

Huddy wasn't talking about a rare scenario or a game or two that still have quirks. He flat out said that g-sync causes 1 frame of extra latency due to buffering. Which is clearly false.

He didn't even imply that he was talking about a few rare cases.
He didn't imply that it could be easily fixed even in those cases.
And he didn't imply that it would be fixed in those few rare cases like it will be according to nvidia.
 
No, they're presenting an issue with g-sync in a certain rare scenario. Blurbusters do not say that g-sync works by buffering 1 frame that causes latency. They're also saying how this can be fixed.

Huddy on the other hand says that it's just how the technology works. He never mentions that it can be fixed by users, he never mentions it's only in rare cases and he never mentions that there's going to be an official fix as well. No, he presents the issue as if a 1 frame buffer is how g-sync works all the time.

It's comparable to an Nvidia rep just flat out saying that AMD's frame pacing doesn't work. But in reality it just doesn't work with DX9.

What's the G-Sync 768MB memory module for then, if it's not for storing a frame?

Monitors also often have on-board memory. It's not for storing complete frames.

Here's the Blur Busters guy on the issue on their forums:

Considering that the GSYNC upgrade replaces the entire monitors' motherboard (and thus these chips), the GSYNC memory doesn't sound far fetched.

All of these displays do use framebuffer histories (past refreshes) in order to compute individual LCD pixel voltages for the next refresh cycle (taking into account of everything: Overdrive, FRC, inversion, etc). This is a bandwidth of roughly 1 gigabyte per second per access cycle, for 1080p@144Hz. Assuming 10 accesses per pixel (one read or one write is one access), that would be a bandwidth of 10 gigabyte per second. I'd guesstimate the bandwidth needed is somewhere in between.

Bandwidth. They needed three chips in parallel for triple-channel bandwidth. I imagine that sufficiently-fast chips smaller than 256mb weren't cheaper, so they went with that.

The memory is needed for processing, from what I understand.
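For what it's worth, the quoted "~1 gigabyte per second per access cycle" figure checks out with simple arithmetic (my numbers, assuming 24-bit colour at 1920x1080 and 144Hz; the 10-accesses-per-pixel multiplier is the quote's own assumption):

```python
# Rough check of the bandwidth figure quoted above, assuming 24-bit (3-byte)
# colour at 1920x1080 and 144 Hz.
width, height = 1920, 1080
bytes_per_pixel = 3
refresh_hz = 144

per_access_gb_s = width * height * bytes_per_pixel * refresh_hz / 1e9
print(f"One access per pixel per refresh:  ~{per_access_gb_s:.2f} GB/s")
print(f"Ten accesses per pixel per refresh: ~{10 * per_access_gb_s:.2f} GB/s")
```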
 
MPC No BS Podcast #226: In-depth Interview with AMD Graphics Guru Richard Huddy

Not going to make a new thread; the OP can add it to the first page.
The better interview in my opinion.
 