
Nvidia to support Freesync?

Thanks. So Nvidia putting 1fps as the floor in that document is misleading; it's actually 30fps for G-Sync.

For us 30-60fps gamers it sounds like VRR monitors have limited use. For me the usefulness is when I am playing a game with vsync on at 60fps but I don't want the stutter that would happen if 60fps cannot be maintained. For that to be smooth, a floor of 30fps or lower would be needed.

I guess I'm used to arguing with enthusiasts, so I'm sorry if what I wrote confused you (I'm not being sarcastic, I actually mean it). Let me straighten this out for you. G-Sync itself will work down to 1fps, I suppose, but the panel itself, the physical monitor, cannot show 1Hz. What happens is that the frame is doubled over and over until the end result is above the lowest refresh rate the panel can show, which for G-Sync panels is 30-ish Hz. The idea is similar to what 240Hz TVs are doing: the same frame shown multiple times (NOT interpolation, as that is different, but frame doubling). Now, you don't want to do this frame doubling unless you have to, as input lag goes up.
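If it helps to see that frame-repeat idea written out, here is a minimal sketch assuming a hypothetical 30Hz panel floor. It only illustrates the behaviour described above; it is not Nvidia's actual module logic.

```python
# Minimal sketch of the frame-repeat idea described above.
# The 30 Hz panel floor is an assumed, illustrative value.

PANEL_MIN_HZ = 30  # lowest refresh rate the panel can physically hold a frame at

def effective_refresh(fps: float) -> float:
    """Repeat (double) each frame until the resulting scan-out rate
    is at or above the panel's minimum refresh rate."""
    hz = fps
    while hz < PANEL_MIN_HZ:
        hz *= 2  # show the same frame again, so the scan-out rate doubles
    return hz

print(effective_refresh(45))  # 45 Hz, no doubling needed
print(effective_refresh(14))  # 56 Hz after two doublings
```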

No, G-Sync works all the way down to 1fps and I tested it at 14fps. Not playable, but it continued smoothly.

https://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

Worth a read to help straighten out misleading things, and they show it under test in a video.

Greg, you don't understand what it is I said. Watch that video you linked again at the 11:03 mark. The panel itself does not go below 30Hz, as it physically cannot hold a frame for longer than roughly 33ms, so instead it starts doubling frames until it is above the floor, which is 30Hz for these G-Sync panels (due to the panels' limitations). So you may have working G-Sync at 14fps, but in reality the panel is refreshed at 56Hz (14*2=28, not above the floor, so double again; 28*2=56, above the floor, success).

So while this is due to a panel limitation, G-Sync is technically not going lower than 30Hz even if the tech itself supports lower. FreeSync, if I recall correctly, supports down to 8Hz on paper, but you don't see any panels taking advantage of that because they can't, and even if they could I personally really wouldn't want to.

You can perhaps fault me for saying G-Sync's floor is 30Hz, which it isn't on paper (then again it kind of is, as long as panels can't go lower, since a G-Sync panel will never show fewer than 30 refreshes a second), and I'll take that, but any enthusiast worth their salt knew what I meant if they actually took everything I wrote in context instead of hand-picking sentences.
 
So far, dropping AA and motion blur (which I hate) has generally been enough to keep things running around 60. However, I have the new Tomb Raider and Assassin’s Creed games on my to-play list and so I am hoping these will be helped by leveraging the 40-60 Freesync range.
Yep, same here. Motion blur and depth of field get turned off immediately. The way I see it, you get more performance and better image quality. It is not just those, either; there are other settings that get turned off too (they are not in every game like the above), which make image quality worse, like film grain, chromatic aberration etc.

Playing XCOM 2 right now, I also have AA turned off; image quality looks much better and again there is more free performance. The difference between having all those options on (max settings, yo :o) and off can sometimes be the difference between a 1080 Ti/Titan Xp and a 2080 Ti. To me, max settings is with all that crap turned off, as it gives better image quality, to me at least.
 
Some more info here: https://www.pcper.com/reviews/Graph...on-Improves-FreeSync-and-Frame-Pacing-Support I don't know where the updated results are off the top of my head, but they later sat down with an oscilloscope to examine what both were doing. As AMD is frame doubling in software, it increases the latency before you can recover from a low-framerate situation, amongst other issues, and makes it harder to handle situations where you need multiples of the refresh rate rather than just doubling.
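As a rough illustration of the "multiples of the refresh rate rather than just doubling" point: a compensation scheme has to pick an integer repeat count that lands the scan-out rate inside the panel's VRR window. The sketch below is only my own illustration with made-up window values, not AMD's or Nvidia's actual algorithm.

```python
# Illustrative only: pick the smallest integer repeat count n so that
# fps * n lands inside an assumed VRR window of 40-144 Hz.

VRR_MIN, VRR_MAX = 40.0, 144.0  # assumed panel range, not from the article

def repeat_count(fps: float) -> int:
    """Smallest n with fps * n >= VRR_MIN, or 1 if fps is already in range."""
    if fps >= VRR_MIN:
        return 1
    n = 2
    while fps * n < VRR_MIN:
        n += 1
    # If fps * n also overshot VRR_MAX, the window would be too narrow to
    # compensate cleanly - roughly why LFC wants the max well above 2x the min.
    return n

print(repeat_count(35))  # 2 -> frames scanned out at 70 Hz
print(repeat_count(14))  # 3 -> frames scanned out at 42 Hz
```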

That is an over-three-year-old test; there may have been improvements in latency since then, and the article even states as much in the body:

I am interested to get more detail from AMD and NVIDIA to see how their algorithms compare on when and how to insert a new frame - my guess is that NVIDIA has had more time to perfect its module logic than AMD's driver team has and any fringe cases that crop up aren't dealt with differently.

In addition, as I said, my laptop G-Sync is to my eyes identical to my ROG Swift when running the same image, despite not having the hardware module, so if there is still a latency difference it is likely imperceptible to most people. Don't get me wrong, as a person invested in G-Sync tech I would have no problem if it is a superior solution, but it does feel like there is some posturing going on to position FreeSync as some seriously inferior tech when it isn't.
 
That is an over-three-year-old test; there may have been improvements in latency since then, and the article even states as much in the body:

There is a limit to what you can do when having to do it in software from the GPU versus having additional hardware and buffers on the monitor itself. Laptops using eDP for adaptive sync are still different, albeit not fundamentally so, from a desktop GPU driving a desktop monitor (FreeSync panels still have a standard scaler, just modified a bit), let alone from having a G-Sync module on the display itself.
 
Greg, you don't understand what it is I said. Watch that video you linked again at the 11:03 mark. The panel itself does not go below 30Hz, as it physically cannot hold a frame for longer than roughly 33ms, so instead it starts doubling frames until it is above the floor, which is 30Hz for these G-Sync panels (due to the panels' limitations). So you may have working G-Sync at 14fps, but in reality the panel is refreshed at 56Hz (14*2=28, not above the floor, so double again; 28*2=56, above the floor, success).

So while this is due to a panel limitation, G-Sync is technically not going lower than 30Hz even if the tech itself supports lower. FreeSync, if I recall correctly, supports down to 8Hz on paper, but you don't see any panels taking advantage of that because they can't, and even if they could I personally really wouldn't want to.

You can perhaps fault me for saying G-Sync's floor is 30Hz, which it isn't on paper (then again it kind of is, as long as panels can't go lower, since a G-Sync panel will never show fewer than 30 refreshes a second), and I'll take that, but any enthusiast worth their salt knew what I meant if they actually took everything I wrote in context instead of hand-picking sentences.
It is all about the G-Sync module, Bud, and how it works when frames drop lower than 30Hz. As an example, at 26fps it will double the refresh rate and insert duplicate frames. This is what keeps it smooth, and from my own testing I can agree that it works, and works well. I upscaled Batman: Arkham Origins to the point of sitting at roughly 14fps and, with G-Sync, it still looked nice and smooth, but the downside was input lag, which is certainly not ideal. It is great for those mini dips we get, though, and keeps everything running nicely.

This was recorded from my phone years ago (hence the poor quality), but you can see it is smooth, and this was at 14fps-ish.

 
There is a limit to what you can do when having to do it in software from the GPU versus having additional hardware and buffers on the monitor itself. Laptops using eDP for adaptive sync are still different, albeit not fundamentally so, from a desktop GPU driving a desktop monitor (FreeSync panels still have a standard scaler, just modified a bit), let alone from having a G-Sync module on the display itself.

I'm sorry, how is eDP different from a desktop GPU connected via DisplayPort? Embedded DisplayPort is based on the VESA DisplayPort standard. It's not some magical standard with improved link rates and scaler functionality; it was designed as a lower-power-consumption version of DisplayPort for connecting a laptop GPU to its display. The point stands that my laptop G-Sync does not operate with the fabled G-Sync module, which you are championing as some superior solution, yet as I've now repeatedly said there is no perceptible difference between my laptop G-Sync and my ROG Swift. Though there may be slightly more lag with FreeSync, there are no up-to-date, accurate studies from what I've seen, so anything else is purely conjecture or anecdotal in nature.

It is all about the G-Sync module, Bud, and how it works when frames drop lower than 30Hz. As an example, at 26fps it will double the refresh rate and insert duplicate frames. This is what keeps it smooth, and from my own testing I can agree that it works, and works well. I upscaled Batman: Arkham Origins to the point of sitting at roughly 14fps and, with G-Sync, it still looked nice and smooth, but the downside was input lag, which is certainly not ideal. It is great for those mini dips we get, though, and keeps everything running nicely.

This was recorded from my phone years ago (hence the poor quality), but you can see it is smooth, and this was at 14fps-ish.


Which is exactly what FreeSync does via the GPU, and which @Phixsator explained in his post as low framerate compensation (LFC).


I'm out now, but agreed! I don't regularly visit this section as it's usually a cesspit of fanboyism, and it clearly hasn't changed. I've been told by a few that post here regularly that I'm wasting my time :p
 
Sorry, this is a little off topic, but I came across this video I want to use in response to an earlier post.

The 1070 Ti consistently outperforms the Vega 56, and even the 64 in some benches. It also runs cooler, quieter and more efficiently.

Hi. First off, Radeon are not good with consistency; they never have been in my experience. There will always be certain games where Radeon performance is miles behind GeForce performance. Games like Project Cars and GTA V are good examples: in those games you find Radeon GPUs struggling to keep up with GeForce cards that are several rungs lower on the totem pole. However, they're not indicative of overall performance.

This video includes a decent selection of games old and new (it's a June 2018 video) that shows how the 1070 Ti and Vega 64 stack up against each other, with 30 games tested. They used an Asus 1070 Ti Strix and a reference Vega 64 that had the power limit increased in an attempt to simulate non-reference performance. I've had a reference Vega and doing that does not help; the Vega is being held back, so it's not really a like-for-like test, but it is what we've got.
The best way to compare Vega with anything would be non-reference versus non-reference, with both manually overclocked and, in the Vega's case, manually undervolted. I've found that there's a big difference between an out-of-the-box non-reference Vega and that card undervolted and overclocked. Vega wasn't ready when it released, and unfortunately first impressions leave a mark, or in Vega's case a stain, which is a shame, as a properly balanced Vega outshines everything up to and including a GTX 1080.


As for the Pascal cards, they've been exceptional class leaders, but unfortunately their day has been and gone, and Nvidia's focus is now on Turing, so Pascal will gradually fall behind in new releases, in the same way Maxwell has and Kepler did before that. They're good cards, but they're not good long-term propositions when bought at end of life. That's why there are so many bargains on the second-hand circuit. But again, this is only one person's opinion, based on what I've seen happen again and again over the last five years.
 
It is all about the G-Sync module, Bud, and how it works when frames drop lower than 30Hz. As an example, at 26fps it will double the refresh rate and insert duplicate frames. This is what keeps it smooth, and from my own testing I can agree that it works, and works well. I upscaled Batman: Arkham Origins to the point of sitting at roughly 14fps and, with G-Sync, it still looked nice and smooth, but the downside was input lag, which is certainly not ideal. It is great for those mini dips we get, though, and keeps everything running nicely.

This was recorded from my phone years ago (hence the poor quality), but you can see it is smooth, and this was at 14fps-ish.


Why are you repeating back to me what I just said, rephrased, after disagreeing with me? Seems odd, Bud :).

From my own personal testing I can say that it is NOT a good experience, and my experience is that image persistence becomes a HUGE problem when fps starts dipping lower than 60, which is why I have said multiple times in the past that I don't care about a VRR floor lower than 40, because I won't be playing games at lower than 55-ish. It doesn't matter if it's on a G-Sync panel or a FreeSync one (although LFC is plagued with flickering on a lot of FreeSync panels, but that's a non-issue for me... can you guess why?); in my very honest, from-the-heart opinion, it just *****. I just don't enjoy playing PowerPoint slides.

Well, if that video shows me anything, it is that, compared to my own standards, your requirements for using the word smooth are rather low, and if a mouse were used in this same scenario the word smooth wouldn't even be possible to utter. Again, that's my personal opinion, not an attack on your person, but a disagreement with your statements about the word smooth and how/when to use it. In the end it's personal preference.
 
I'm out now, but agreed! I don't regularly visit this section as it's usually a cesspit of fanboyism, and it clearly hasn't changed. I've been told by a few that post here regularly that I'm wasting my time :p

:D, it's good practice in patience, and I need all the practice I can get with a three-year-old boy monster roaming around the house and a partner that seems to misinterpret everything I say atm :P ("Did you just say I am fat?!" Me: "Nooo, I said those pants are ugly"). God bless them both.
 
Though there may be slightly more lag with FreeSync, there are no up-to-date, accurate studies from what I've seen, so anything else is purely conjecture or anecdotal in nature.

Battlenonsense on YouTube has done a bunch of studies on it.

I'm sorry, how is eDP different from a desktop GPU connected via DisplayPort? Embedded DisplayPort is based on the VESA DisplayPort standard. It's not some magical standard with improved link rates and scaler functionality

It has nothing to do with improved link rates or scaler functionality; it is about how the scaler is implemented in the chain versus a desktop monitor.
 
Why are you repeating back to me what I just said, rephrased, after disagreeing with me? Seems odd, Bud :).

From my own personal testing I can say that it is NOT a good experience, and my experience is that image persistence becomes a HUGE problem when fps starts dipping lower than 60, which is why I have said multiple times in the past that I don't care about a VRR floor lower than 40, because I won't be playing games at lower than 55-ish. It doesn't matter if it's on a G-Sync panel or a FreeSync one (although LFC is plagued with flickering on a lot of FreeSync panels, but that's a non-issue for me... can you guess why?); in my very honest, from-the-heart opinion, it just *****. I just don't enjoy playing PowerPoint slides.

Well, if that video shows me anything, it is that, compared to my own standards, your requirements for using the word smooth are rather low, and if a mouse were used in this same scenario the word smooth wouldn't even be possible to utter. Again, that's my personal opinion, not an attack on your person, but a disagreement with your statements about the word smooth and how/when to use it. In the end it's personal preference.
I explained how it works in layman's terms for those who are unsure of what is what. And horses for courses, I guess. My reaction time is a bit slower now, but my perception of smooth is decent (which is why I will not go back to SLI until the microstutter is gone). At 48 years old I still enjoy my gaming, and whilst getting lots of frames is nice, for games like TR and Batman ~40 is good enough for me, so long as it is smooth.
 
Battlenonsense on YouTube has done a bunch of studies on it.



It has nothing to do with improved link rates or scaler functionality; it is about how the scaler is implemented in the chain versus a desktop monitor.

And the evidence to support your claim that eDP in laptops is superior to DisplayPort from the GPU is where, exactly? Or is this purely based on your opinion?

I have watched the Battlenonsense video before. You mean where he shows the averages (blue bar) with FreeSync and G-Sync enabled versus the baseline, and neither created any perceptible increase in input lag on the displays compared to when FreeSync/G-Sync were disabled?

He quite clearly states that he is not comparing the ms numbers between the two displays, as they are different panels/monitors, and that he wants us to look at the change from the baseline with FreeSync/G-Sync off, which shows no real difference in either tech.
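In other words, the fair comparison is each monitor against its own baseline. A tiny sketch of that calculation, with invented numbers rather than the video's actual measurements:

```python
# Compare each display's VRR-on latency to its own VRR-off baseline,
# rather than comparing raw ms between two different panels.
# The numbers below are made up for illustration.

measurements_ms = {
    "freesync_monitor": {"vrr_off": 34.0, "vrr_on": 34.5},
    "gsync_monitor":    {"vrr_off": 28.0, "vrr_on": 28.6},
}

for name, m in measurements_ms.items():
    delta = m["vrr_on"] - m["vrr_off"]
    print(f"{name}: {delta:+.1f} ms relative to its own baseline")
```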

[Image: input lag comparison chart from the Battlenonsense video]
 
Let's face it, it doesn't matter how NV swing it, or anyone else for that matter: their adaptive sync implementation is sub-par compared to AMD's.
 
It is all about the G-Sync module, Bud, and how it works when frames drop lower than 30Hz. As an example, at 26fps it will double the refresh rate and insert duplicate frames. This is what keeps it smooth, and from my own testing I can agree that it works, and works well. I upscaled Batman: Arkham Origins to the point of sitting at roughly 14fps and, with G-Sync, it still looked nice and smooth, but the downside was input lag, which is certainly not ideal. It is great for those mini dips we get, though, and keeps everything running nicely.

This was recorded from my phone years ago (hence the poor quality), but you can see it is smooth, and this was at 14fps-ish.


A recorded video won't work for showing what you are seeing. For one, the video isn't G-Sync, and two, gameplay recorded via ShadowPlay is 30fps.
Adding frames into a video makes it run smoother.

Movies and TV have been using motion interpolation for years for this effect.

Also, viewing a video and interacting with a game are very different things.

A 24fps film will be perceived as smooth; now connect a mouse and move around at 24fps and it will not be a nice experience, as you have frame lag to deal with on top of input lag.

At 60fps a frame is 16ms; at 30fps it is 33ms. Go lower and it's unplayable.
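Those frame times are just 1000 ms divided by the frame rate; a quick check of the numbers quoted above:

```python
# Frame time in milliseconds is simply 1000 / fps.
for fps in (60, 30, 24, 14):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms, 30 fps -> 33.3 ms, 24 fps -> 41.7 ms, 14 fps -> 71.4 ms
```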
 
And the evidence to support your claim that eDP in laptops is superior to DisplayPort from the GPU is where, exactly? Or is this purely based on your opinion?

I have watched the Battlenonsense video before. You mean where he shows the averages (blue bar) with FreeSync and G-Sync enabled versus the baseline, and neither created any perceptible increase in input lag on the displays compared to when FreeSync/G-Sync were disabled?

He quite clearly states that he is not comparing the ms numbers between the two displays, as they are different panels/monitors, and that he wants us to look at the change from the baseline with FreeSync/G-Sync off, which shows no real difference in either tech.

[Image: input lag comparison chart from the Battlenonsense video]

He has a number of videos on the subject, covering the actual experience while playing as well as the numbers.

It is nothing to do with "superior" in the sense you are implying. Laptop displays being single-input allows the scaler to be implemented differently; it isn't a fundamental change, but it allows the GPU to take over the scaler functionality in some respects, which isn't possible when it has to interface with a traditional scaler in the display itself.
 
He has a number of videos on the subject, covering the actual experience while playing as well as the numbers.

It is nothing to do with "superior" in the sense you are implying. Laptop displays being single-input allows the scaler to be implemented differently; it isn't a fundamental change, but it allows the GPU to take over the scaler functionality in some respects, which isn't possible when it has to interface with a traditional scaler in the display itself.

So what we've established is that Battlenonsense has shown there is limited measurable difference, and now we are back to anecdotal per-person experiences, which can differ wildly from person to person (as illustrated above, Phixsator vs Gregster).

Anyway, that's me out. I look forward to the 15th of Jan.
 