So much for G-Sync?

It got very quiet in here all of a sudden ;)

This is a very interesting find from ubersonic.

It says that monitor manufacturers will have to add a scaler into current monitor designs, which is definitely going to cost. So I cannot see how there won't be a premium for Adaptive-Sync monitors.
What will be really funny is if monitor manufacturers decide to use the G-Sync scaler instead of designing a new one :p. Of course, as Peterson says, I cannot see Nvidia allowing their scaler to be used with non-G-Sync-certified monitors, assuming it can be made to work the way Adaptive-Sync works.

Certainly interesting times ahead.

Will AMD create a scaler for the monitor manufacturers to use, or will they let someone else do the work? In my opinion they do have a tendency to let others do the work for their additional features (TriDef for 3D springs to mind), but we will have to wait and see.

In this case the scaler is on the GPU, not the monitor. The reason Nvidia went down the G-Sync route is because their GPUs don't have built-in vBlank scalers.

As ubersonic said, current desktop monitors don't have an interface for vBlank signals; that's what the new VESA DisplayPort 'Adaptive-Sync' is for.

So now a screen with an Adaptive-Sync interface is compatible with the vBlank signals it's receiving from the GPU.

The difference is, basically, Nvidia do with add-on external hardware what AMD do locally on the GPU itself.
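Roughly, the flow is something like this (a minimal Python sketch of the idea; the queue and the render timings are made up for illustration, this is not anyone's actual driver code):

```python
import queue
import random
import threading
import time

frames = queue.Queue()

def gpu_loop(n=5):
    # The GPU renders at whatever rate it manages and signals the screen
    # when each frame is actually ready (the vBlank signalling that
    # Adaptive-Sync adds to the DisplayPort interface).
    for i in range(n):
        time.sleep(random.uniform(1 / 90, 1 / 45))  # variable render time
        frames.put(f"frame {i}")

def screen_loop(n=5):
    # An Adaptive-Sync screen has no fixed refresh clock: it refreshes
    # when, and only when, the GPU tells it a frame is ready.
    for _ in range(n):
        frame = frames.get()  # blocks until the GPU signals
        print(f"refresh: {frame}")

threading.Thread(target=gpu_loop).start()
screen_loop()
```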
 
Even if Nvidia's GPUs had those scalers, monitors supporting vBlank weren't available and still won't be for many months. Both technologies require a new monitor.

The difference is, Nvidia got the job done and onto the market quickly for their customers to enjoy, whilst AMD customers are still waiting. That's why many people are happy to pay a premium for Nvidia products: they know they will be looked after.
 
In this case the scaler is on the GPU, not the monitor. The reason Nvidia went down the G-Sync route is because their GPUs don't have built-in vBlank scalers.

As ubersonic said, current desktop monitors don't have an interface for vBlank signals; that's what the new VESA DisplayPort 'Adaptive-Sync' is for.

So now a screen with an Adaptive-Sync interface is compatible with the vBlank signals it's receiving from the GPU.

The difference is, basically, Nvidia do with add-on external hardware what AMD do locally on the GPU itself.
You got a credible source for that, old chap?
 
In this case the scaler is on the GPU, not the monitor. The reason Nvidia went down the G-Sync route is because their GPUs don't have built-in vBlank scalers.

As ubersonic said, current desktop monitors don't have an interface for vBlank signals; that's what the new VESA DisplayPort 'Adaptive-Sync' is for.

So now a screen with an Adaptive-Sync interface is compatible with the vBlank signals it's receiving from the GPU.

The difference is, basically, Nvidia do with add-on external hardware what AMD do locally on the GPU itself.

OK, this sort of makes sense, assuming it is correct; as Layte asked, it would be nice to see where this info comes from. It is just a case of how long it is going to take someone like Intel to incorporate the correct type of scaler into their GPUs. I'm sure they have the next design already mapped out, so it could be years.
 
OK, this sort of makes sense, assuming it is correct; as Layte asked, it would be nice to see where this info comes from. It is just a case of how long it is going to take someone like Intel to incorporate the correct type of scaler into their GPUs. I'm sure they have the next design already mapped out, so it could be years.

It's already been gone over in the FreeSync thread; Nvidia were asked about this and declined to comment.

AMD got what is now called Adaptive-Sync running on a laptop. It proves what AMD have said about it: it was originally just intended as a power-saving feature that can also do the same job as G-Sync. Anand reported on it, saying it looks like G-Sync in action.

It's part of all HD 6### and later, though Thracks' linked FAQ stated "R7 260, R7 260X, R9 290, R9 290X, Beema and Mullins".
Those are the so-called GCN 1.1 and GCN 1.2 architectures; it's possible, I guess, that they have something in the architecture that VLIW and GCN 1.0 don't.
 
Adaptive V-Sync has been around for ages. That's when v-sync gets turned off if fps drops below 60. Available for Nvidia, and for AMD via RadeonPro. I've never found it that useful personally, but I'm sure there's plenty who do.
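For reference, the logic of that feature is trivial (a one-function sketch; the 60 Hz threshold is just the usual default):

```python
def adaptive_vsync_on(fps, refresh_hz=60):
    # Adaptive V-Sync: keep v-sync on while the GPU can match the
    # refresh rate, switch it off below that, trading stutter for
    # tearing at low frame rates.
    return fps >= refresh_hz

print(adaptive_vsync_on(75))  # True: v-sync stays on
print(adaptive_vsync_on(48))  # False: v-sync turned off
```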
 
Adaptive V-Sync has been around for ages. That's when v-sync gets turned off if fps drops below 60. Available for Nvidia, and for AMD via RadeonPro. I've never found it that useful personally, but I'm sure there's plenty who do.

Ah...

Adaptive V-Sync

Adaptive-Sync
 
In this case the scaler is on the GPU, not the monitor. The reason Nvidia went down the G-Sync route is because their GPUs don't have built-in vBlank scalers.

As ubersonic said, current desktop monitors don't have an interface for vBlank signals; that's what the new VESA DisplayPort 'Adaptive-Sync' is for.

So now a screen with an Adaptive-Sync interface is compatible with the vBlank signals it's receiving from the GPU.

The difference is, basically, Nvidia do with add-on external hardware what AMD do locally on the GPU itself.

There's no such thing as a vBlank scaler on any GPU.
AMD's solution requires the GPU to guess how long the current frame is going to take to render, which it then sends with the last frame to tell the monitor how long to hold that frame for.
Nvidia's solution lets the monitor hold the frame by itself and update when the new frame is actually ready.
If AMD's "guess" is wrong then you either get unnecessary lag or a stutter. They are also saying it takes up GPU time to generate the guess, so it will also impact frame rate, which is probably why they aren't rolling it out to previous gens.
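Taking that claim at face value (it is questioned below, so treat this purely as an illustration of what is being described; every name here is invented):

```python
def amd_style_present(gpu, display, frame):
    # Claimed AMD flow: the driver guesses the next frame's render time
    # and sends it with the current frame as a hold interval. A wrong
    # guess means the panel holds too long (lag) or too briefly
    # (stutter), and the guessing itself costs GPU time.
    hold_ms = gpu.predict_next_frame_time()
    display.show(frame, hold_for=hold_ms)

def nvidia_style_present(gpu, display):
    # Claimed G-Sync flow: the module in the monitor holds the current
    # image on its own and refreshes when the next frame actually lands.
    display.show(gpu.wait_for_next_frame())
```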
 
There's no such thing as a vBlank scaler on any GPU.
AMD's solution requires the GPU to guess how long the current frame is going to take to render, which it then sends with the last frame to tell the monitor how long to hold that frame for.
Nvidia's solution lets the monitor hold the frame by itself and update when the new frame is actually ready.
If AMD's "guess" is wrong then you either get unnecessary lag or a stutter. They are also saying it takes up GPU time to generate the guess, so it will also impact frame rate, which is probably why they aren't rolling it out to previous gens.

That's interesting. Can either you or Humbug find sources for your conflicting info?
 
There's no such thing as a vBlank scaler on any GPU.
AMD's solution requires the GPU to guess how long the current frame is going to take to render, which it then sends with the last frame to tell the monitor how long to hold that frame for.
Nvidia's solution lets the monitor hold the frame by itself and update when the new frame is actually ready.
If AMD's "guess" is wrong then you either get unnecessary lag or a stutter. They are also saying it takes up GPU time to generate the guess, so it will also impact frame rate, which is probably why they aren't rolling it out to previous gens.

That doesn't make any sense; the screen waits for the GPU and matches the GPU's output.

As it stands the screen refreshes at 60 Hz constantly; with Adaptive-Sync the screen refreshes nothing until it gets the signal to do so from the GPU.

GPU to screen: here is a frame, display it; if no frame, do nothing.
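Put as a sketch (Python, with `redraw` and the frame callbacks invented purely for illustration):

```python
import time

def redraw(frame):
    print("refresh:", frame)

def fixed_refresh(get_frame):
    # Today: the screen redraws every 1/60 s regardless of whether the
    # GPU actually has a new frame ready, hence tearing and stutter.
    while True:
        time.sleep(1 / 60)
        redraw(get_frame())

def adaptive_sync(wait_for_frame):
    # Adaptive-Sync: here is a frame, display it; if no frame, do
    # nothing. The GPU's signal is the refresh clock.
    while True:
        redraw(wait_for_frame())
```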
 
Matt, you have a good rapport with Thracks. Can you ask him to explain how this actually works? Thanks :)
 
There's no such thing as a vBlank scaler on any GPU.
AMD's solution requires the GPU to guess how long the current frame is going to take to render, which it then sends with the last frame to tell the monitor how long to hold that frame for.
Nvidia's solution lets the monitor hold the frame by itself and update when the new frame is actually ready.
If AMD's "guess" is wrong then you either get unnecessary lag or a stutter. They are also saying it takes up GPU time to generate the guess, so it will also impact frame rate, which is probably why they aren't rolling it out to previous gens.

Who are 'they'?

Any link to this info?

Same to you, Humbug: can we have a link to your info please, or is it just your opinion of how it works?
 
Somebody is talking about something they have no clue about, or they're putting their opinion across as fact.

I know who I'd believe if push came to shove :D
 
Matt, you have a good rapport with Thracks. Can you ask him to explain how this actually works? Thanks :)

Stand by. I think there is more info coming, judging by this tweet I had from Thracks this morning.

[tweet screenshot: QFBenxf.jpg]
 
That doesn't make any sense; the screen waits for the GPU and matches the GPU's output.

As it stands the screen refreshes at 60 Hz constantly; with Adaptive-Sync the screen refreshes nothing until it gets the signal to do so from the GPU.

GPU to screen: here is a frame, display it; if no frame, do nothing.

http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html?cp=2

In AMD's implementation, VBLANK length (interval between two refresh cycles where the GPU isn't putting out "new" frames, a sort of placebo frames) is variable, and the driver has to speculate what VBLANK length to set for the next frame; whereas, in NVIDIA's implementation, the display holds onto a VBLANK until the next frame is received. In NVIDIA's implementation, the GPU sends out whatever frame-rate the hardware can manage, while the monitor handles the "sync" part. In AMD's the speculation involved in setting the right VBLANK length for the next frame could cause some software overhead for the host system. That overhead is transferred to the display in NVIDIA's implementation. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage when it comes to keeping costs down when implementing the technology. Display makers have to simply implement something that VESA is already deliberating over. The Toshiba laptops AMD used in its FreeSync demo at CES already do.

That is how it works with eDP as well (you can go look up the existing VESA standard for that): vBlank is variable, but it has to be sent before, not as an afterthought.
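So per that article the ordering is the important bit; roughly (hypothetical names, just the quoted paragraph restated as a sketch):

```python
def present_frame(driver, display, frame):
    # eDP-style variable VBLANK: the interval must be chosen BEFORE the
    # next frame exists, so the driver speculates a length from recent
    # frame times (the software overhead the article mentions) and
    # sends it up front with the current frame, not as an afterthought.
    vblank_len = driver.speculate_vblank_length()
    display.send(frame, vblank_length=vblank_len)
```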
 
Stand by. I think there is more info coming, judging by this tweet I had from Thracks this morning.

[tweet screenshot: QFBenxf.jpg]

Thanks :)

http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html?cp=2



That is how it works with eDP as well (you can go look up the existing VESA standard for that): vBlank is variable, but it has to be sent before, not as an afterthought.

I may go into more detail later when I have time, but a short version:

vBlank is what old CRTs used to run:

image > blank > image > blank > ... really fast.

That was removed from 'early' LCDs because they could not process a fresh image fast enough; they would flicker.
So LCDs don't have the blank bit in between; instead they overlay one image over the other. That's why the image can look blurry or ghost during movement: it's a series of images overlaying each other while not perfectly aligned, as they are in motion.
This also causes the tearing: two images, one half-rendered, overlaying the first and out of position.

None of that happens with a 'blank' between each image. By default the screen displays a blank image until it gets one to display; it then removes that image before displaying the next one waiting, cached on the GPU. Simple.
LCDs are more than fast enough now to run vBlank; it's simply being put back in.
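As a toy timing picture (the `Display` class is invented purely to show the two patterns):

```python
class Display:
    def show(self, img):
        print("show:", img)

    def blank(self):
        print("blank")  # the vBlank interval between images

    def overlay(self, img):
        print("overlay:", img)  # drawn over the old image, no blank

d = Display()
frames = ["A", "B", "C"]

# CRT / vBlank pattern: image > blank > image > blank > ... really fast.
for f in frames:
    d.show(f)
    d.blank()

# Early-LCD pattern: each image overlays the last with no blank between,
# which is where the ghosting and tearing during motion come from.
for f in frames:
    d.overlay(f)
```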
 
In this case the scaler is on the GPU, not the monitor. The reason Nvidia went down the G-Sync route is because their GPUs don't have built-in vBlank scalers.

Is this where you're getting your info from, Humbug?

It doesn't say that Nvidia don't have built-in vBlank scalers; it just says that the GPU display controllers on Nvidia's cards work differently to AMD's.

According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware.
 