Is "Freesync" dead?

Soldato
Joined
19 Dec 2010
Posts
12,026
The problem is that even though AMD was very clever with the whole FreeSync naming for adaptive sync monitors, they pushed the idea that it was free to use with no ties to any one manufacturer, which left the door open for others to muscle in on it.

Not sure what your point is? AMD went through VESA and it became an open standard. They always said that other manufacturers would be able to use it. Once it became an open standard they had no control over who used it.
 
Soldato
Joined
19 Dec 2010
Posts
12,026
Can't easily find the information any more :s - the conference notes have long since disappeared from the internet, or at least beyond my ability to find them, as I can't remember the specific dates now.

I can tell you straight up that this never happened. AMD's proposal to VESA went through in November 2013. If Nvidia had decided they were going to use Adaptive Sync, why were none of their GPUs ready until Pascal? AMD had their GPUs ready before they submitted their proposal.
 
Caporegime
Joined
17 Mar 2012
Posts
47,543
Location
ARC-L1, Stanton System
The problem is that even though AMD was very clever with the whole FreeSync naming for adaptive sync monitors, they pushed the idea that it was free to use with no ties to any one manufacturer, which left the door open for others to muscle in on it.
Doesn't mean that what NVidia is doing is right of course, but I do get the feeling that if NVidia hadn't done it, Intel would have with their discrete cards when they arrive next year.
AMD come across as a little too easy going, which sounds great and all that, but it does allow other companies to walk all over you half the time.

Yes, exactly.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,028
I can tell you straight up that this never happened. AMD's proposal to VESA went through in November 2013. If Nvidia had decided they were going to use Adaptive Sync, why were none of their GPUs ready until Pascal? AMD had their GPUs ready before they submitted their proposal.

I read the ***** notes - sure, I can't prove it now - but I definitely saw the notes of the representations, including counterpoints from those who use eDP variable refresh technology in things like air traffic control applications and were opposed.

nVidia were at a pre-emptive stage before going ahead with a hardware implementation, while AMD were reacting to G-Sync (it really doesn't take that long to adapt an off-the-shelf scaler to support FreeSync the way it is currently done, misusing PSR, etc.).
 
Caporegime
Joined
17 Mar 2012
Posts
47,543
Location
ARC-L1, Stanton System
I read the ***** notes - sure, I can't prove it now - but I definitely saw the notes of the representations, including counterpoints from those who use eDP variable refresh technology in things like air traffic control applications and were opposed.

nVidia were at a pre-emptive stage before going ahead with a hardware implementation, while AMD were reacting to G-Sync (it really doesn't take that long to adapt an off-the-shelf scaler to support FreeSync the way it is currently done, misusing PSR, etc.).

Why does this even matter?
 
Soldato
Joined
8 Jun 2018
Posts
2,827
LOL, that's all that comes to mind.
You really haven't made much of a point and it's a regurgitation of a nonsensical and whimsical viewpoint. I guess you want to be "seen".
Just in case you finally decide to look at the topic of this thread, you might gain some insight into the conversation. Perhaps not, though - I don't have any vote of confidence in you at this point.

But to make it clear (for others who want to avoid all the noise you make): I, as well as others, no longer use free/g sync because there is no benefit to having it enabled.
We don't experience tearing, and using it does impose a latency penalty. We (in my circle) have already referred to those who "absolutely need it, even at the desktop" as having crappy monitors. That won't change no matter how sensitive you are to me saying it.
:p

Therefore, Nvidia trying to rebrand FreeSync to downplay AMD's impact on the monitor industry is a moronic move. Not only is it old tech, we don't need it. Also, FreeSync 2 with HDR is something AMD is pushing developers to use their API for (making it exclusive to how AMD does HDR in games), which has the potential to hamper Nvidia GPUs when HDR is used.
:D
 
Man of Honour
Joined
13 Oct 2006
Posts
91,028
You really haven't made much of a point and it's a regurgitation of a nonsensical and whimsical viewpoint. I guess you want to be "seen".

Sorry, but nothing in his post in reply to you is nonsensical or whimsical. You have a big gap in your understanding of adaptive sync and, rather than take that on board, you are persisting with pushing nonsense.

AMD had their GPUs ready before they submitted their proposal.

On this note, AMD has historically almost always been at least a half-generation ahead of nVidia when it comes to supporting video display standards, so it makes sense that an earlier generation of their GPUs can support it (FreeSync "1" currently doesn't have a specific [desktop] adaptive sync implementation - it uses an extension of features that already existed, in ways they weren't originally intended, to make it work). The timeline for support with the 7000 series doesn't really fit the narrative otherwise, as they'd have had to spend months (more like 1-2 years) sitting on it without engaging VESA first, which makes no sense.

https://www.anandtech.com/show/8008...andard-variable-refresh-monitors-move-forward

EDIT: The other thing that goes against AMD in this respect is that G-Sync has a ground-up implementation of things like adaptive variable overdrive and low framerate handling - things you'd sit down and work on as fundamental steps in designing a desktop, gaming class of adaptive sync - while in FreeSync these features are either missing or hastily cobbled together by extending existing features such as PSR, which tends to indicate FreeSync is a reactive technology.
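
For anyone unfamiliar with what low framerate handling actually does: when the game's framerate drops below the panel's minimum variable refresh rate, each frame gets scanned out multiple times so the effective refresh stays inside the supported window. A minimal sketch of the idea - my own toy illustration with a made-up VRR window, not actual driver or module code:

```python
# Rough sketch of low framerate compensation - a toy model of the
# general idea only, not actual driver/module code. The 48-144 Hz
# VRR window is a made-up example.

def lfc_refresh_hz(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Pick a panel refresh rate for a given game framerate."""
    if fps >= vrr_min:
        # Inside the VRR window the panel simply follows the framerate
        return min(fps, vrr_max)
    # Below the window: scan each frame out enough times that the
    # effective refresh rate lands back inside the window
    repeats = 2
    while fps * repeats < vrr_min:
        repeats += 1
    return min(fps * repeats, vrr_max)

for fps in (144, 90, 47, 30, 12):
    print(f"{fps:>3} fps -> panel scans out at {lfc_refresh_hz(fps):.0f} Hz")
```

Without that handling, anything below the window either tears or judders against the panel's minimum refresh.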
 
Caporegime
Joined
17 Mar 2012
Posts
47,543
Location
ARC-L1, Stanton System
LOL, that's all that comes to mind.
You really haven't made much of a point and it's a regurgitation of a nonsensical and whimsical viewpoint. I guess you want to be "seen".
Just in case you finally decide to look at the topic of this thread, you might gain some insight into the conversation. Perhaps not, though - I don't have any vote of confidence in you at this point.

But to make it clear (for others who want to avoid all the noise you make): I, as well as others, no longer use free/g sync because there is no benefit to having it enabled.
We don't experience tearing, and using it does impose a latency penalty. We (in my circle) have already referred to those who "absolutely need it, even at the desktop" as having crappy monitors. That won't change no matter how sensitive you are to me saying it.
:p

Like Roff, you're allowing yourself to get dragged into a pointless argument. Who gives a crap who had the idea first? That's something no one, not even AMD or Nvidia, knows, because they don't know how long one or the other has been looking at this.

Anyway, I do use G-Sync, or is it FreeSync given that's what's written on the screen's box? I haven't noticed any input lag, and I don't think there is any on a FreeSync screen, unless someone has proven there is? I can understand how G-Sync going through an extra layer of hardware might have some.

And I hate screen tearing, which I do get without it.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Sorry, but nothing in his post in reply to you is nonsensical or whimsical. You have a big gap in your understanding of adaptive sync and, rather than take that on board, you are persisting with pushing nonsense.
We disagree; he's using a logical fallacy to draw a point different to my conversation. Strawman comes to mind. And the fact that I already explained the point of view to you makes your assertion invalid. This isn't a discussion about how FreeSync works, nor did I debate with you that it worked differently to some strawman you initialized, but about why Nvidia's marketing move to remove the branding doesn't matter.

Saying that I, as well as others, don't use it to game is not the equivalent of defining how "sync" works.
Like Roff, you're allowing yourself to get dragged into a pointless argument
Yeah, I see it. It's amazing how emotionally involved some are about this. The amount of anger induced in some here is laughable.
"You're wrong" about sums up the rebuttal, and it's a weak one.
 
Associate
Joined
3 Apr 2007
Posts
1,719
Location
London
But to make it clear (for others who want to avoid all the noise you make): I, as well as others, no longer use free/g sync because there is no benefit to having it enabled.
We don't experience tearing, and using it does impose a latency penalty. We (in my circle) have already referred to those who "absolutely need it, even at the desktop" as having crappy monitors. That won't change no matter how sensitive you are to me saying it.

There most definitely is a benefit to having sync tech enabled. Just because you're not seeing tearing doesn't mean that it's not going to happen, on any monitor, no matter the quality.

Again, you seem to have a completely baffling view of one of the most basic facts about monitors and refresh rates vs frame rates, and I find it impossible to believe there are many people like you getting this so utterly wrong, let alone enough of you to create a circle.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,028
We disagree; he's using a logical fallacy to draw a point different to my conversation. Strawman comes to mind. And the fact that I already explained the point of view to you makes your assertion invalid. This isn't a discussion about how FreeSync works, nor did I debate with you that it worked differently to some strawman you initialized, but about why Nvidia's marketing move to remove the branding doesn't matter.

My comments are in respect to what you posted here:

Let's be honest here: Free/G-Sync is only intended for monitors with very poor HW scalers (among other HW). When you have decent, well-ventilated HW in your monitor you will hardly, if ever, notice tearing.

Which is missing an understanding of a key part of the whole issue. There are and were no higher-quality monitors/scalers pre adaptive sync/variable refresh where you'd hardly notice tearing with V-Sync off - and V-Sync on introduces another undesirable compromise in terms of latency. Variable refresh allows for a happier compromise: no tearing (or minimal tearing if you allow the framerate to go over the refresh window) with levels of latency much closer to V-Sync off.

On older displays you could significantly minimise the perception of tearing by rendering very fast (typically around 3x the display refresh) with V-Sync off, but that introduced an issue of its own: having a GPU capable of it.
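
To put rough numbers on that compromise, here's a toy model - entirely my own simplification, real pipelines have more buffering in play - of how long a finished frame sits waiting before it reaches the screen under each mode:

```python
import math

# Toy model of the extra display latency per sync mode on a fixed
# 60 Hz panel - my own simplification, not a measurement of any
# real monitor.

REFRESH_MS = 1000 / 60  # ~16.7 ms per scanout at 60 Hz

def wait_vsync_on(done_ms: float) -> float:
    """V-Sync on: the finished frame waits for the next refresh tick."""
    return math.ceil(done_ms / REFRESH_MS) * REFRESH_MS - done_ms

def wait_variable_refresh(done_ms: float) -> float:
    """Variable refresh: the panel starts a scanout when the frame is
    ready (within its refresh window), so the extra wait is ~0, no tear."""
    return 0.0

def wait_vsync_off(done_ms: float) -> float:
    """V-Sync off: the buffer swaps immediately mid-scanout - also ~0
    wait, but at the cost of a visible tear line."""
    return 0.0

for done in (1.0, 9.0, 15.0):  # frame finishes this far into a refresh
    print(f"frame ready at {done:4.1f} ms: v-sync on waits "
          f"{wait_vsync_on(done):4.1f} ms, VRR {wait_variable_refresh(done):.1f} ms, "
          f"v-sync off {wait_vsync_off(done):.1f} ms (tears)")
```

On average V-Sync on costs about half a refresh of extra latency - roughly 8 ms at 60 Hz - which is the penalty variable refresh avoids while still preventing tearing.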
 
Man of Honour
Joined
13 Oct 2006
Posts
91,028
Yes, I said that, and from my point of view, having used many others, that's my conclusion. Have you tested and used over 10 monitors in the last 30 days, or grouped together at one time?

I've tested and used probably more than 300 monitors in the last 20 years... and was an early adopter of the likes of the Samsung 2233RZ, trying to get the best compromise between latency and tearing for gaming.

I've gone through some of the best (at the time) gaming monitors out there on release like the BenQ XL2420T and Asus ROG Swift PG278Q, etc. as well as trying some of the worst :s
 
Associate
Joined
3 Apr 2007
Posts
1,719
Location
London
Be my guest and share those benefits? Don't just state them.

It can be easily inferred from my post as a response to yours, from my earlier reply to you regarding sync tech, and from the multiple other posts people have made pointing out just how wrong your understanding is.

All sync tech is there to eliminate tearing by making sure the monitor refreshes in sync with frame rate so that one completed frame buffer is drawn to the screen per monitor refresh. If these two things are out of sync the frame buffer changes mid monitor refresh and hence you get tearing due to two different frames being drawn to screen.

None of this has anything to do with the quality of the monitor or the HW scaler used within.
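
If it helps to visualise it, here's a tiny sketch of where that tear line ends up when the buffer swap lands partway through a scanout - my own toy numbers, nothing specific to any monitor:

```python
# Toy illustration of why a mid-scanout buffer swap produces a tear
# line - my own sketch with a made-up panel (144 Hz, 1440 rows).

SCANOUT_MS = 1000 / 144  # one top-to-bottom refresh at 144 Hz
LINES = 1440             # vertical resolution of a hypothetical panel

def tear_row(swap_ms: float) -> int | None:
    """Row where the tear appears if the buffer swaps swap_ms into the
    scanout; None if the swap aligns with a refresh boundary - which
    is exactly what sync tech enforces."""
    offset = swap_ms % SCANOUT_MS
    if offset == 0:
        return None  # one completed frame per refresh: no tear
    # Rows already scanned show the old frame; rows below show the new
    return int(offset / SCANOUT_MS * LINES)

print(tear_row(SCANOUT_MS))  # None - swap on the boundary, no tear
print(tear_row(3.5))         # tear roughly halfway down the panel
```

The quality of the scaler never enters into it - the tear is purely a timing mismatch between the swap and the scanout.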
 
Soldato
Joined
8 Jun 2018
Posts
2,827
I've tested and used probably more than 300 monitors in the last 20 years... and was an early adopter of the likes of the Samsung 2233RZ, trying to get the best compromise between latency and tearing for gaming.
I started back when CRTs were a thing, and I've seen plenty of differences over that time. I seriously doubt your 300 number, as I lost count myself. However, from my experience that is my conclusion.
"There are and were no higher-quality monitors/scalers pre adaptive sync/variable refresh where you'd hardly notice tearing"
This particular point shows me that you haven't seen the number you claim. It implies a similarity that is statistically not possible for a 300-monitor sample size.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
It can be easily inferred from my post as a response to yours, from my earlier reply to you regarding sync tech, and from the multiple other posts people have made pointing out just how wrong your understanding is.

All sync tech is there to eliminate tearing by making sure the monitor refreshes in sync with frame rate so that one completed frame buffer is drawn to the screen per monitor refresh. If these two things are out of sync the frame buffer changes mid monitor refresh and hence you get tearing due to two different frames being drawn to screen.

None of this has anything to do with the quality of the monitor or the HW scaler used within.
What benefits are you claiming that I don't know about? This has already been discussed. Like I told you before, I don't need it enabled and the games run fine.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,028
I started back when CRTs were a thing, and I've seen plenty of differences over that time. I seriously doubt your 300 number, as I lost count myself. However, from my experience that is my conclusion.
"There are and were no higher-quality monitors/scalers pre adaptive sync/variable refresh where you'd hardly notice tearing"
This particular point shows me that you haven't.

You can see it in the monitors sub-section if you search - one year alone I spent over 6 grand trying different monitors to get the perfect setup for multi-boxing Eve Online :s

I also worked for a company selling PC kit for a while, so I got to try a lot of stuff while working there.
 
Associate
Joined
3 Apr 2007
Posts
1,719
Location
London
What benefits are you claiming that I don't know about? This has already been discussed. Like I told you before, I don't need it enabled and the games run fine.

Fine, you win. I have better things to do than argue with someone who is so wrong and so unwilling to learn, or who is just a bored troll.

Enjoy being so oblivious. I just hope enough of us have called you out so that anyone who reads your drivel in this thread doesn't take what you say as fact.
 