Nvidia to support Freesync?

Associate
Joined
6 Dec 2013
Posts
1,877
Location
Nottingham
I'm not gonna commit either way at the moment, but I'm certainly going to be following it with interest when/if people like Gamers Nexus do some deep dives.
I mean, I can confirm that it's not tied to gaming profiles, for example, as I've removed all of mine and FreeSync still works on a low-range LG superwide the same as on a broad-range AOC. But other than that, we'll have to see if anyone can dig deeper.
And I would assume the low-range ones are the ones you might think don't adhere, i.e. no LG monitors passed Nvidia's certification.
P.S. Me and a few others on here found an AOC FreeSync firmware bug, which was later relayed by the AOC rep and resolved by AOC a few years back :)
 
Soldato
Joined
19 Dec 2010
Posts
12,019
Let me try and explain a little about DisplayPort to you.

DisplayPort is an interface/standard, and on that interface VESA added Adaptive-Sync as a standard.

From that DisplayPort interface you get external DisplayPort (that's what you use for your monitors) and you get eDP (which is the internal connection for laptop screens). I'm sure there are other branches of DisplayPort, but these are the two we're talking about.

Adaptive-Sync is available on either of those branches (DP and eDP) of the DisplayPort interface.

AMD used DP and called it FreeSync; Nvidia has been using it on eDP for years.

If you can't or won't get that then you are just trolling me. If you want to save face, just stop responding, because I have something to tell you. Once you are dead, you are dead. You don't feel anything; it's everyone else around you that feels the pain. Same thing for stupid people.

LOL, how can you post up all this and still believe that you are right?

Adaptive Sync was added to the DisplayPort standard in May 2014. It was an optional part of the standard, and it still is.

It was never added to the eDP specification. eDP is for internal connections, like you said, and has a different set of requirements than the DisplayPort standard.

We are on DisplayPort 1.4a, but embedded DisplayPort is at 1.4b, which was released back in 2015; DisplayPort 1.4 only came in 2016. Just because something exists in one standard doesn't mean it exists in the other; they are different standards with different features.

But hey, knock yourself out. Go through all the specs of eDP and I guarantee you won't find any reference to "adaptive sync" being added or being used as part of the specification. But if you find something that shows the term "adaptive sync" being added as part of the eDP standard, I will apologise and admit that I was wrong.

eDP and DP are different; they might have the same governing body (VESA) but, as you said yourself, one applies to internal connections and one to external, and as such they have different requirements.
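For anyone who wants to poke at the DP plumbing both posters are arguing over: on external DisplayPort, a sink signals that it can ignore the MSA timing parameters (the capability Adaptive-Sync rides on) via a bit in its DPCD. A minimal sketch, assuming a Linux box; the aux device path is an example that varies per GPU/connector, and reading it usually needs root:

```python
# Minimal sketch: read the Adaptive-Sync capability bit from a DisplayPort
# sink's DPCD over the drm_dp_aux character device (Linux, usually as root).
import os

DPCD_DOWN_STREAM_PORT_COUNT = 0x0007  # DPCD register holding the bit
MSA_TIMING_PAR_IGNORED = 1 << 6       # set -> sink can ignore MSA timing,
                                      # i.e. it can do Adaptive-Sync

def sink_supports_adaptive_sync(aux_dev: str = "/dev/drm_dp_aux0") -> bool:
    fd = os.open(aux_dev, os.O_RDONLY)
    try:
        os.lseek(fd, DPCD_DOWN_STREAM_PORT_COUNT, os.SEEK_SET)
        (reg,) = os.read(fd, 1)       # one byte; iterating bytes yields ints
        return bool(reg & MSA_TIMING_PAR_IGNORED)
    finally:
        os.close(fd)

if __name__ == "__main__":
    print(sink_supports_adaptive_sync())
```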
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,182
Location
Greater London
LOL, how can you post up all this and still believe that you are right?

Adaptive Sync was added to the DisplayPort standard in May 2014. It was an optional part of the standard, and it still is.

It was never added to the eDP specification. eDP is for internal connections, like you said, and has a different set of requirements than the DisplayPort standard.

We are on DisplayPort 1.4a, but embedded DisplayPort is at 1.4b, which was released back in 2015; DisplayPort 1.4 only came in 2016. Just because something exists in one standard doesn't mean it exists in the other; they are different standards with different features.

But hey, knock yourself out. Go through all the specs of eDP and I guarantee you won't find any reference to "adaptive sync" being added or being used as part of the specification. But if you find something that shows the term "adaptive sync" being added as part of the eDP standard, I will apologise and admit that I was wrong.

eDP and DP are different; they might have the same governing body (VESA) but, as you said yourself, one applies to internal connections and one to external, and as such they have different requirements.

I am right, see? :D

He would rather argue to death than apologise :p

My stance on this is simple: Nvidia gave up on their G-Sync module, threw in the towel, and now support FreeSync monitors. It is that simple. Nothing wrong with it either; it is a good move for both business and consumers. They have already milked G-Sync, and they can now use the module to make it a more premium thing, stick a fan on it and milk it some more :D
 
Mobster
Soldato
Joined
4 Apr 2011
Posts
3,501
LOL how can you post up all this and still believe that you are right?

Adaptive sync was added to the display port standard in May 2014. An optional part of the standard and still is an optional part of the standard.

It was never added to the eDP specification. eDP is for internal connections like you said and has a different set of requirements than the display port standard.

We are on display port 1.4a but embedded display port 1.4b and that was released back in 2015. Display port 1.4 only came in 2016. Just because something exists on one standard doesn't mean it exists on the other. There are different features to both and different standards.

But, hey, knock yourself out, Go through all the specs of the eDP and I guarantee you that you won't find any reference to "adaptive sync" been added or been used as part of the specification. But if you find something that shows the term "adaptive sync" been added as part of the eDP standard, I will apologise and admit that I was wrong.

eDP and DP are different, they might have the same governing body (VESA) but, you said it yourself, one applies to internal connections and one to external and as such have different requirements.

Tom's Hardware
https://www.tomshardware.com/news/vesa-displayport-freesync-amd,28524.html

VESA
https://vesa.org/featured-articles/...rd-1-4-for-mobile-personal-computing-devices/

AnandTech (2nd paragraph)
https://www.anandtech.com/show/1379...-adaptive-sync-with-gsync-compatible-branding

Ars Technica
https://arstechnica.com/gadgets/201...standard-supporting-devices-with-8k-displays/

I had to stop, I got bored.

You don't need to admit you were wrong, I've known it from the beginning.
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
I'm pretty certain the flickering monitor is a regular 21:9 ultrawide. It's certainly an LG model, and I don't think LG sell an ultrawide that is not 21:9.

I think you're right and I'm wrong. Sorry.


Here's what I think I'm seeing now:
We have four monitors in a row, and the order we see them in goes from right to left. The first two are G-Sync Compatible; both have stickers stating that the one on the right is a 27" 4K model and the other is a 27" QHD model. The next two monitors are both unvalidated. It looks like the third monitor from the right is a 32", not an ultrawide as I thought, and the fourth monitor is a regular ultrawide, as you said.

At 1:01 we see the G-Sync Compatible monitors alongside the first non-validated monitor, and the non-validated one looks a lot wider, which is why I presumed it was an ultrawide. But now that I'm looking closer I can see that the panel itself is also taller, so it's not just a width difference but a height difference as well, so it must use a 32" panel.
Now I'm wondering what resolution the 32" monitor is. There's no sticker on it, and Nvidia's complaint is that the picture looks blurred. Could that be partly because it uses a 32" panel that doesn't look as sharp as the picture on the two 27" models, which were stickered as 4K and QHD?

Stop the video at 1 minute and 1 second and tell me what you think.
I think Nvidia are trying to obfuscate the truth about how adaptive sync compares to G-Sync by making it all about monitor quality, which we've all known is an issue from the start; an issue that's down to the monitor makers, not how well the tech works.

 
Associate
Joined
25 Apr 2017
Posts
1,095
I am right, see? :D

He would rather argue to death than apologise :p

My stance on this is simple: Nvidia gave up on their G-Sync module, threw in the towel, and now support FreeSync monitors. It is that simple. Nothing wrong with it either; it is a good move for both business and consumers. They have already milked G-Sync, and they can now use the module to make it a more premium thing, stick a fan on it and milk it some more :D

The G-Sync module still has its advantages, even in the midrange. You get the full variable refresh rate range, ULMB, guaranteed compensation at lower frame rates and superior overdrive performance. Not all monitors are rip-offs. The S2716DG is sub-400 and I have a hard time finding a single FreeSync monitor supporting all these features in one package at its price point.
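The "guaranteed compensation at lower frame rates" bit is low framerate compensation (LFC): when the game's fps drops below the panel's minimum VRR rate, frames are repeated so the effective refresh lands back inside the supported range, which is why it needs a max/min ratio of at least 2. A rough sketch of the idea (illustrative only; real drivers are smarter about choosing multiples):

```python
# Toy model of low framerate compensation (LFC) over a VRR range [vmin, vmax].

def lfc_possible(vmin: float, vmax: float) -> bool:
    # Rule of thumb: frame repetition covers every fps below vmin
    # without gaps only if the range spans at least 2:1.
    return vmax >= 2 * vmin

def refresh_for_fps(fps: float, vmin: float, vmax: float):
    """Refresh rate the panel would run at for a given fps, or None if stuck."""
    if vmin <= fps <= vmax:
        return fps                     # native VRR, one refresh per frame
    if fps > vmax:
        return vmax                    # capped at the panel maximum
    m = 2
    while fps * m < vmin:              # repeat each frame 2x, 3x, ... into range
        m += 1
    return fps * m if fps * m <= vmax else None  # None -> fps falls in a gap

print(lfc_possible(40, 144))         # True  -> e.g. 30 fps shown as 60 Hz:
print(refresh_for_fps(30, 40, 144))  # 60
print(lfc_possible(48, 75))          # False -> 40 fps has no usable multiple:
print(refresh_for_fps(40, 48, 75))   # None (x1 = 40 < 48, x2 = 80 > 75)
```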
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,182
Location
Greater London
The G-Sync module still has its advantages, even in the midrange. You get the full variable refresh rate range, ULMB, guaranteed compensation at lower frame rates and superior overdrive performance. Not all monitors are rip-offs. The S2716DG is sub-400 and I have a hard time finding a single FreeSync monitor supporting all these features in one package at its price point.

I know mate. I am still very happy with my monitor; there are no FreeSync monitors that can match its range. My previous one was an LG 4K monitor, a FreeSync one with a range of 40-60Hz only. Not that that is bad, but having a much bigger range is nice; I have never seen a tear since. Very happy with my purchase. It also had less BLB and no dead pixels. Will be keeping it until I upgrade to a 120Hz 4K VRR OLED TV next year :D
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
I know mate. I am still very happy with my monitor; there are no FreeSync monitors that can match its range. My previous one was an LG 4K monitor, a FreeSync one with a range of 40-60Hz only. Not that that is bad, but having a much bigger range is nice; I have never seen a tear since. Very happy with my purchase. It also had less BLB and no dead pixels. Will be keeping it until I upgrade to a 120Hz 4K VRR OLED TV next year :D

That's terrible. LG have been one of the biggest abusers when it comes to supporting FreeSync: the first ultrawides they marketed as gaming monitors with FreeSync support had a 48-60Hz range. If you look at AMD's list of compatible monitors to see what features were available, you'll see that the majority of LG's early monitors were a joke.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,182
Location
Greater London
That's terrible. LG have been one of the biggest abusers when it comes to supporting FreeSync: the first ultrawides they marketed as gaming monitors with FreeSync support had a 48-60Hz range. If you look at AMD's list of compatible monitors to see what features were available, you'll see that the majority of LG's early monitors were a joke.
A 40-60Hz range is better than nothing though :)
 
Associate
Joined
6 Dec 2013
Posts
1,877
Location
Nottingham
That's terrible. LG have been one of the biggest abusers when it comes to supporting FreeSync: the first ultrawides they marketed as gaming monitors with FreeSync support had a 48-60Hz range. If you look at AMD's list of compatible monitors to see what features were available, you'll see that the majority of LG's early monitors were a joke.
The new ones are too, to be fair. My newish LG UW is 48 to 75 from memory? Although it's fine as I game at around 70, it would be better if the range were broader.
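If you'd rather not go from memory, the range a monitor actually advertises sits in its EDID, in the Display Range Limits descriptor. A minimal sketch, assuming Linux (the sysfs path is an example and varies per connector) and ignoring the descriptor's rate-offset flags in byte 4, which only matter for limits above 255Hz:

```python
# Minimal sketch: read the advertised vertical refresh range out of an EDID.

def vrr_range_from_edid(edid: bytes):
    """Return (min_hz, max_hz) from the Display Range Limits descriptor, or None."""
    if len(edid) < 128 or edid[:8] != bytes.fromhex("00ffffffffffff00"):
        raise ValueError("not a valid EDID base block")
    # The 128-byte base block holds four 18-byte descriptors at fixed offsets.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Display descriptors start 00 00 00; tag 0xFD = Display Range Limits.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return (d[5], d[6])  # bytes 5/6: min/max vertical rate in Hz
    return None

if __name__ == "__main__":
    # Example path; pick the connector your monitor is actually on.
    with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
        rng = vrr_range_from_edid(f.read())
    print(f"advertised range: {rng[0]}-{rng[1]} Hz" if rng else "no range descriptor")
```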
 

Stu

Soldato
Joined
19 Oct 2002
Posts
2,737
Location
Wirral
I think you're right and I'm wrong. Sorry.


Here's what I think I'm seeing now:
We have four monitors in a row, and the order we see them in goes from right to left. The first two are G-Sync Compatible; both have stickers stating that the one on the right is a 27" 4K model and the other is a 27" QHD model. The next two monitors are both unvalidated. It looks like the third monitor from the right is a 32", not an ultrawide as I thought, and the fourth monitor is a regular ultrawide, as you said.

At 1:01 we see the G-Sync Compatible monitors alongside the first non-validated monitor, and the non-validated one looks a lot wider, which is why I presumed it was an ultrawide. But now that I'm looking closer I can see that the panel itself is also taller, so it's not just a width difference but a height difference as well, so it must use a 32" panel.
Now I'm wondering what resolution the 32" monitor is. There's no sticker on it, and Nvidia's complaint is that the picture looks blurred. Could that be partly because it uses a 32" panel that doesn't look as sharp as the picture on the two 27" models, which were stickered as 4K and QHD?

Stop the video at 1 minute and 1 second and tell me what you think.
I think Nvidia are trying to obfuscate the truth about how adaptive sync compares to G-Sync by making it all about monitor quality, which we've all known is an issue from the start; an issue that's down to the monitor makers, not how well the tech works.


I generally agree with you. The first two are 27 inch. The third initially looks like a 32 inch, or a 27 that sits lower on its stand compared to the first two, but it is clearly a 32 inch when seen next to the UW... assuming the UW is a 34 inch, it would have the same panel height as a 27 inch, implying the third monitor is indeed a 32 inch panel.

There is no argument that the UW is flickering, which is bad... who knows the reason, and whether it can be fixed by an end user.

The 32 inch monitor... it certainly doesn't look good, and I agree with the reviewer that there is blurring... it could be an overdrive issue as he says, it may be a 60Hz monitor (though G-Sync should still look smooth, right?), or it might be significantly struggling with frame rate. I did consider whether it was a resolution thing, but the image looks good before he starts moving the view around. I dunno, and I'm just throwing stuff out, but my guess is a ghosting/smearing/overshoot issue, which may be inherent to the panel and not related to the G-Sync implementation, but I can see why Nvidia would not want to stick a "G-Sync certified" sticker on such a poor panel. Adding G-Sync will not make a bad panel a good one.
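The panel-height reasoning checks out with basic geometry, and the same arithmetic speaks to the blur question: a 32" panel at QHD has a noticeably lower pixel density than the stickered 27" models. A quick sketch (the 32" resolution below is a guess, since the video never shows a sticker for it):

```python
# Back-of-the-envelope panel geometry for the monitors being compared.
from math import hypot

def panel_height(diag_in: float, aspect_w: float, aspect_h: float) -> float:
    """Physical panel height in inches from diagonal and aspect ratio."""
    return diag_in * aspect_h / hypot(aspect_w, aspect_h)

def ppi(diag_in: float, px_w: int, px_h: int) -> float:
    """Pixels per inch from diagonal and resolution."""
    return hypot(px_w, px_h) / diag_in

print(panel_height(27, 16, 9))  # ~13.2" tall
print(panel_height(34, 21, 9))  # ~13.4" tall -> a 34" UW matches a 27" in height
print(panel_height(32, 16, 9))  # ~15.7" tall -> visibly taller, as in the video

print(ppi(27, 3840, 2160))      # ~163 ppi (27" 4K)
print(ppi(27, 2560, 1440))      # ~109 ppi (27" QHD)
print(ppi(32, 2560, 1440))      # ~92 ppi  (32" QHD, if that's what it is)
```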
 
Associate
Joined
14 Apr 2014
Posts
598
Well, I'm confused.

In the initial Nvidia press release they say "later this quarter ASUS will unleash their curved 35-inch 3440x1440 G-SYNC HDR display" (the only current G-SYNC HDR display is the PG27UQ, which is 27-inch 4K).

We are getting a new line-up of ROG Strix FreeSync monitors, with the final one being the XG32VQR, which is HDR but only 32-inch and not a G-SYNC-specific screen.

So are we getting a new G-SYNC Ultimate ROG Swift line-up soon as well, featuring G-SYNC HDR screens?
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
I generally agree with you. The first two are 27 inch. The third initially looks like a 32 inch, or a 27 that sits lower on its stand compared to the first two, but it is clearly a 32 inch when seen next to the UW... assuming the UW is a 34 inch, it would have the same panel height as a 27 inch, implying the third monitor is indeed a 32 inch panel.

There is no argument that the UW is flickering, which is bad... who knows the reason, and whether it can be fixed by an end user.

The 32 inch monitor... it certainly doesn't look good, and I agree with the reviewer that there is blurring... it could be an overdrive issue as he says, it may be a 60Hz monitor (though G-Sync should still look smooth, right?), or it might be significantly struggling with frame rate. I did consider whether it was a resolution thing, but the image looks good before he starts moving the view around. I dunno, and I'm just throwing stuff out, but my guess is a ghosting/smearing/overshoot issue, which may be inherent to the panel and not related to the G-Sync implementation, but I can see why Nvidia would not want to stick a "G-Sync certified" sticker on such a poor panel. Adding G-Sync will not make a bad panel a good one.

It'll likely come down to how much work Nvidia do supporting adaptive sync on the software side. I don't know what would be required, but AMD seem to do plenty of software work related to fine-tuning the various FreeSync monitors so that they'll work okay. I've seen plenty of people mentioning issues they've had with FreeSync enabled that weren't related to any particular game, and AMD have done a fair job of working on them up till now. I've reported a few myself in the past; I don't remember what, so they must have been fixed eventually.

Because of how long AMD have been using FreeSync, they've currently got a much better implementation of adaptive sync. I hope Nvidia are willing to do the work that'll be required to try and catch up with AMD in that respect, rather than just write off support for older FreeSync models and only focus on fixing up issues with current and new monitors going forward.

Well, I'm confused.

In the initial Nvidia press release they say "later this quarter ASUS will unleash their curved 35-inch 3440x1440 G-SYNC HDR display" (the only current G-SYNC HDR display is the PG27UQ, which is 27-inch 4K).

We are getting a new line-up of ROG Strix FreeSync monitors, with the final one being the XG32VQR, which is HDR but only 32-inch and not a G-SYNC-specific screen.

So are we getting a new G-SYNC Ultimate ROG Swift line-up soon as well, featuring G-SYNC HDR screens?

I can't imagine they'll just stop making G-Sync monitors, but you can pretty much guarantee that they'll be good models that come with a hefty price tag.
 
Associate
Joined
25 Apr 2017
Posts
1,095
It'll likely come down to how much work Nvidia do supporting adaptive sync on the software side. I don't know what would be required, but AMD seem to do plenty of software work related to fine-tuning the various FreeSync monitors so that they'll work okay. I've seen plenty of people mentioning issues they've had with FreeSync enabled that weren't related to any particular game, and AMD have done a fair job of working on them up till now. I've reported a few myself in the past; I don't remember what, so they must have been fixed eventually.

Because of how long AMD have been using FreeSync, they've currently got a much better implementation of adaptive sync. I hope Nvidia are willing to do the work that'll be required to try and catch up with AMD in that respect, rather than just write off support for older FreeSync models and only focus on fixing up issues with current and new monitors going forward.



I can't imagine they'll just stop making G-Sync monitors, but you can pretty much guarantee that they'll be good models that come with a hefty price tag.
I think they will keep making £800+ monitors, but they still need to keep G-Sync somewhat affordable. There are hardly any buyers at the ultra high end; 1440p 144Hz is where the highest demand will be, not these crazy-sized monitors.
 
Soldato
Joined
15 Jun 2005
Posts
2,750
Location
Edinburgh
Did you notice the title of that article?

NVIDIA To Officially Support VESA Adaptive Sync (FreeSync) Under “G-Sync Compatible” Branding

They understand that, for the general public, FreeSync has become a catch-all shorthand term for this tech. It will be a struggle to change that now it has taken hold, regardless of the history, standards and marketing.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,182
Location
Greater London
Don't care what Nvidia or anyone says. These are all FreeSync monitors. Nvidia cards now work on FreeSync monitors and that's the end of it. Nvidia can try to call it what they want, but it won't change :D
 