• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Nvidia to support Freesync?

Yes, I agree, but adaptive sync has been used on desktop for about four years now. Why has it taken them this long is what I am saying, when clearly Pascal cards are capable of using it.

Also, IIRC adaptive sync is a derivative of the eDP spec that laptops use, so it's actually been around a good while longer.

There could be any number of reasons why they are only supporting it now. Maybe they feel their G-Sync module monitors have reached a point where they can survive on their own, even with cheaper G-Sync Compatible adaptive sync displays on the market. Or maybe they feel adaptive sync monitors aren't going away, people want a cheaper option, and now is the right time to offer one to Nvidia users, after all the teething problems of bringing a new standard to market have been ironed out. Who knows, maybe some of the tech journalists will ask the question.

Yes, VRR was part of the eDP spec on laptops before that, but not as adaptive sync, only as a power-saving feature. It didn't exist as a standard on desktops until the middle of 2014, and then only as an optional part of DisplayPort 1.2a. Adaptive sync made its way to laptops in 2015 as part of the eDP 1.4a standard.
 
No problem, I thought it might aid in testing the low end as, from memory, you can specify the target FPS :)

I never thought to download it until you said, so cheers for that. But I wouldn't be using it to test the low end. In fact, I don't understand all the fighting that goes on between FreeSync and G-Sync users about the low end and which goes lower. Once you go below 40fps you are into crap gaming, and if you are under 30fps then you aren't really gaming at all. If I get too many dips below 40fps I will change my settings. :)
 
I think you misunderstood what I was saying. I am saying that Nvidia will want you to call your new monitor a G-Sync panel, and from now on when a new monitor comes out, instead of it primarily being advertised as FreeSync, it will be advertised as G-Sync, or at the very least G-Sync Compatible. My point was that all those monitors are still FreeSync, and I won't call any monitor that does not have a G-Sync module in it G-Sync, as there is no G-Sync module for it to be G-Sync. It is obviously FreeSync, which Nvidia have now caved in and support, no matter what their marketing would have us believe.

Nice upgrade by the way, bet you are loving the bump in image quality? :)

No, I got you, I guess I did not explain myself well. I get what Nvidia are doing, but like I said, I am fine with whatever people call it; it sucks for people who do not know, or will not know, the difference in the future.
Actually, the picture quality on the IPS compared to the TN G-Sync is night and day, and I may be crazy, but FreeSync is working better on my computer at low FPS than my G-Sync ever did. I am blown away.

Edit: Added adaptive/G-Sync/FreeSync demo windmill/pendulum

 
I never thought to download it until you said, so cheers for that. But I wouldn't be using it to test the low end. In fact, I don't understand all the fighting that goes on between FreeSync and G-Sync users about the low end and which goes lower. Once you go below 40fps you are into crap gaming, and if you are under 30fps then you aren't really gaming at all. If I get too many dips below 40fps I will change my settings. :)
Completely agree, I meant the frame doubling below the lower end, which should help anyone wanting to test it, but yeah, it's been a long day :D
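For anyone unsure what the frame doubling below the lower end actually does, here is a rough sketch of the idea in Python. It is illustrative only, not AMD's or Nvidia's actual algorithm, and the function name effective_refresh and the 48-144Hz range are just assumptions for the example: once the frame rate drops below the panel's minimum VRR rate, the driver can show each frame two or more times so the panel keeps refreshing inside its supported range.

Code:
# Rough sketch of the frame doubling (low framerate compensation) idea.
# Illustrative only - not any vendor's actual implementation.

def effective_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0):
    """Return (repeat_count, panel_refresh_hz) that stays inside the VRR range."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)      # in range: refresh simply tracks the frame rate
    repeats = 2
    while fps * repeats < vrr_min:       # double, triple, ... until back inside the range
        repeats += 1
    return repeats, min(fps * repeats, vrr_max)

for fps in (100, 40, 25, 15):
    n, hz = effective_refresh(fps)
    print(f"{fps:>3} fps -> each frame shown {n}x, panel refreshes at {hz:.0f} Hz")

So a screen with a 48Hz minimum running a 25fps game can, in principle, still refresh at 50Hz inside its range.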
 
FreeSync vs G-Sync video they completely SCREWED UP: https://www.youtube.com/watch?time_continue=11&v=AvhKFi4O8yg

Seems both the G-Sync and FreeSync screens tear for him; when he slows the camera you can clearly see it down the bottom. Also, why does G-Sync tear so much more at 300fps than the FreeSync monitor does at 300fps? That is confusing.

Edit
How times have changed when limiting FPS. On Nvidia it's still a headache: more software to install, etc.
On AMD it's just ALT+R to set FRTC, or better, use AMD Chill, something I would be banging on for Nvidia to add if I were an Nvidia user.

Chill is the best frame limiter on the market!
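For what it's worth, stripped of the driver integration, the core of any frame limiter is tiny. Here is a bare-bones sketch in Python, purely illustrative (the names run_capped and render_frame and the 60fps target are made up for the example, and this is not how FRTC or Chill is actually implemented): after each frame is rendered, it simply waits out the rest of the target frame time.

Code:
# Minimal sketch of what a frame limiter does at its core.
# Illustrative only - not FRTC's or Chill's actual implementation.

import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS          # seconds per frame at the cap

def render_frame() -> None:
    time.sleep(0.005)                  # stand-in for real render work (~5 ms)

def run_capped(frames: int = 120) -> None:
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)   # idle out the remainder instead of rendering ahead

run_capped()

Chill additionally varies the cap based on in-game activity, which a plain limiter like this doesn't do.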

Interesting

Joker Productions
27 minutes ago
V-Sync NEEDS to be on for G-Sync to work properly. Those people commenting otherwise are idiots and don't understand how the tech works. One of the main aspects of G-Sync besides variable refresh rate is the reduction of latency it provides versus traditional V-Sync.
 

Depends what you want to achieve. G-Sync on + V-Sync on and a framerate limiter usually produces the best results, but if you get the framerate cap right you can avoid almost any overshoot of the max refresh without V-Sync, and potentially reduce latency a little bit more again. Or, if the framerate in the games you play never gets near the max refresh, having V-Sync off won't make much odds.
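To put rough numbers on why the cap has to sit a little under the max refresh, here is a quick back-of-the-envelope in Python (the headroom_ms helper and the 144Hz/141fps figures are just illustrative, not a recommendation from any vendor): the cap only keeps you away from the max refresh if the capped frame time is longer than the panel's refresh interval.

Code:
# Back-of-the-envelope check on a frame cap versus the panel's max refresh.
# Figures are illustrative only.

def headroom_ms(cap_fps: float, max_refresh_hz: float) -> float:
    """How much longer each capped frame is than one refresh interval, in milliseconds."""
    return 1000.0 / cap_fps - 1000.0 / max_refresh_hz

for cap in (144, 141, 138):
    print(f"cap at {cap} fps on a 144 Hz panel: {headroom_ms(cap, 144):+.2f} ms headroom per frame")

At a 141fps cap each frame is about 0.15ms longer than a 144Hz refresh, so the framerate never quite reaches the max refresh; cap at exactly 144 and any timing jitter can push frames over the top.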
 
I have a question, since AMD had to go through more testing for HDR support to work along with FreeSync: does Nvidia's Free-Gsync work with HDR monitors?
Yep, it is working on mine. The reddit spreadsheet, linked a few pages back, details other models which are also working.

I wasn’t expecting this as HDR supposedly requires FreeSync 2.
 
Yes, I think it is... I'm chuffed to bits. This is not an expensive screen; it's a nice screen, but 31.5" 1440p IPS, £250 RRP, and I paid £200. You can still get them.

Later I'm going to load up Star Citizen; I know some places in that which can bring performance below the 48Hz range, so I'll see how it behaves then.

Anyway, I realised you can't actually tell G-Sync from a recording, but here it is anyway... G-Sync logo on the side. There is some visible hitching in it, but I know what that is: Shadowplay is recording to the same mechanical drive the game is installed on, and I think it's starting to fail from old age. When not recording it is absolutely butter smooth.

1440p when it's done processing.


I really am happy. Well done Nvidia, this is a mark in your favour :)

PS: my excuse is I wasn't exactly concentrating on the game :o

So this was on a FreeSync screen with a 48Hz minimum; you're testing what I was unsure about, thanks.
 
FreeSync vs G-Sync video they completely SCREWED UP: https://www.youtube.com/watch?time_continue=11&v=AvhKFi4O8yg

Shankly beat me to it; it's still tearing at the bottom.

I think his issue was

(a) switching G-Sync mode whilst in game, probably not the most stable way to do it.
(b) running two different adaptive sync monitors at once; I am not sure that's how the tech can work. The GPU adapts its frame timing to match the monitor, but surely it cannot do this for two monitors at once without issues.

In short, I think his test was flawed.
 
"UFD Tech6 hours ago
For everyone pointing out the tearing at the bottom at around 10:31, I can assure you that’s actually not in the original footage on my side. Not sure what’s going on there, but it’s only visible on the YT upload, not the original file."
 
That video was pretty lousy when it came to testing methodology.

Hot plugging the cabling can sometimes cause the monitor to not function properly.

Tearing can be caused by not using V-sync with G-sync. Without V-sync enabled in the Nvidia control panel, the tearing can sometimes be seen in high-FPS gaming, usually at the bottom of the monitor.

You should really avoid using the in-game V-sync options, as they can vary in operation from game to game.

Using V-sync in the control panel will not increase input lag with G-sync enabled provided that you don't hit the maximum frame rate your monitor can handle.
 
I read somewhere in this thread that the driver only enables FreeSync for 10-series cards and above. What I haven't seen (I haven't read the entire thread) is whether Nvidia will ever allow it on older cards, or whether it can be hacked to work by other means. I have a 980 Ti, and though I don't have a G-Sync or a FreeSync monitor, I might consider buying a new one if the tech would work with my card.
 