
Is "Freesync" dead?

Associate
Joined
22 Dec 2011
Posts
2,055
Location
UK
Many people have been reporting problems with the 5700/XT.

I for one had a 5700 that had many crashes; I RMA'd it and got my money back, and stuck with my trusty 1070. It's put me off buying AMD again.

Bravo to Ocuk for their superb customer service!

Would definitely buy from Ocuk again.
 

bru

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
Let's be honest here, Free/G Sync is only intended for monitors with very poor HW scalers (among other HW). When you have decent hardware in your monitor you will hardly, if ever, notice tearing.

I bet some of you could turn off Free/G Sync right now and play your favorite games and not notice any tearing. Plus, you get a slight reduction in input latency to boot.

It's always been said that if you see tearing in the games you play (even way back when 60Hz was the best you could get) you had a crappy monitor. Turn on vsync and live with it (even back then a decent monitor wasn't cheap).

What AMD and Nvidia have done was find a way to compensate for crappy monitors. And believe me, I've seen some of the same make/model that are much worse at tearing than others. Yet for others you have to go frame by frame looking for it in other monitors.

Have you ever wondered why AMD never made their windmill app publicly available (and well known) directly from them? Hint: it's not that serious.

:eek::eek::eek: say what :eek::eek::eek:


Sorry but if you really think that's the case then you have a serious misunderstanding of the freesync/Gsync technologies and the problem that prompted their development.

Tearing happens on any monitor. The quality of the monitor has no impact on this. It can happen at any framerate, but the closer the framerate is to a multiple of the monitor's refresh rate the less likely you are to see it. You are more likely to notice it at higher framerates because it happens more often. It depends on the game too. It's more easily noticed in some games than others.
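For anyone curious about the mechanics, here is a rough back-of-envelope sketch (my own illustration, nothing to do with any vendor tool): with sync off, where the tear line lands depends only on when the new frame is presented relative to the scanout, so a frame rate matching the refresh rate keeps the tear in one place (or hides it), while a mismatched rate makes it crawl across the screen every frame:

```python
# Back-of-envelope sketch: where a tear line lands on a 60 Hz, 1080-line panel
# when frames are presented with V-Sync off. Purely illustrative numbers.
from fractions import Fraction

REFRESH_HZ = 60
SCANLINES = 1080

def tear_lines(fps, frames=6):
    """Scanline at which each presented frame splits the scanout (0 = top of screen)."""
    step = Fraction(REFRESH_HZ, fps)  # refresh periods elapsed per rendered frame
    return [int((n * step % 1) * SCANLINES) for n in range(1, frames + 1)]

print(tear_lines(60))  # [0, 0, 0, 0, 0, 0]
print(tear_lines(73))  # [887, 695, 503, 310, 118, 1006]
```

At 60 fps on a 60 Hz panel the phase never changes, which is why a matched frame rate can hide the tear; at 73 fps the tear line lands somewhere different on every frame, which is what makes it visible, regardless of the monitor's quality.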

And like all visual things, some people are a lot more sensitive to tearing than others. You must be one of the people who don't notice it much. For others the difference between sync on and sync off is night and day, even with minimal screen tearing.

Bang on. :)



Correct, what he states is 100% wrong.

:)

Yeah, I was a bit confused when I read his post. Was like, did I miss something? Lol.

Quoted just because what EastCoastHandle posted is so very very incorrect.

So going through the post.
"Free/G Sync is only intended for monitors with very poor HW scalers."
Well, with FreeSync there is hardly any quality control over the scaler used, and with G-Sync they all use the same controller module, so that is blatantly wrong.

As for the rest of it, well I really cannot be bothered.


Yet another case of one person's opinion that would soon become fact if not corrected.
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
Many people have been reporting problems with the 5700/XT.

I for one had a 5700 that had many crashes; I RMA'd it and got my money back, and stuck with my trusty 1070. It's put me off buying AMD again.

Bravo to Ocuk for their superb customer service!

Would definitely buy from Ocuk again.
I can see how that would be off-putting. I would personally never say never though. Sounds like they dropped the ball with Navi. I only started hearing about this very recently. But I am always happy to go back if reviews are good and price to performance is good. Cannot let one failed gen put me off a company forever, as there is only one other company.

To be honest you are better off waiting for Nvidia's 3000 series anyways. That is exactly what I am doing. A 3070 will be a very juicy upgrade :D
 
Associate
Joined
21 Sep 2018
Posts
895
With the news that Navi is selling quite well, it would be a mistake for manufacturers to drop FreeSync. Especially now the rest of the line-up is coming soon.

I have two Navis and have zero issues.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
:eek::eek::eek: say what :eek::eek::eek:


Quoted just because what EastCoastHandle posted is so very very incorrect.

So going through the post.
"Free/G Sync is only intended for monitors with very poor HW scalers."
Well, with FreeSync there is hardly any quality control over the scaler used, and with G-Sync they all use the same controller module, so that is blatantly wrong.

As for the rest of it, well I really cannot be bothered.


Yet another case of one person's opinion that would soon become fact if not corrected.
Ah, I see, a bit of denial by you and others.
However, let's look at the forest rather than the bark of the tree.
In the beginning, we all know Nvidia found it very important to use their own HW in order for a monitor to be certified as G-Sync, right? I'll leave your misconception(s) there.

BTW, actually disable free/g sync and game for a few. "Some" of you might actually be surprised not to find tearing. :D
 
Last edited:
Associate
Joined
6 Oct 2009
Posts
667
Location
London
No, I do not concur.
The difference is like night and day between FreeSync on and off for me, and not just regarding tearing but also in how everything feels "smooth" compared to when it's off.
 
Associate
Joined
3 Apr 2007
Posts
1,719
Location
London
Ah, I see, a bit of denial by you and others.
However, let's look at the forest rather than the bark of the tree.
In the beginning, we all know Nvidia found it very important to use their own HW in order for a monitor to be certified as G-Sync, right? I'll leave your misconception(s) there.

BTW, actually disable free/g sync and game for a few. "Some" of you might actually be surprised not to find tearing. :D


The clue is in the name... 'sync'.

Vsync, Vertical Sync, the 'sync'/'de-sync' between rate of frames being output (fps) and rate of frames (refresh) being displayed on the monitor. No more, no less, it is that simple. How you can have such a gross misunderstanding of it is baffling.
 
Associate
Joined
3 Apr 2007
Posts
1,719
Location
London
No, I do not concur.
The difference is like night and day between FreeSync on and off for me, and not just regarding tearing but also in how everything feels "smooth" compared to when it's off.

Agree with this.

I've had FreeSync turn off for me when, say, installing new drivers etc. and it was instantly noticeable. Similar to when a high-refresh monitor defaults back to 60Hz: you instantly notice it is worse.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,147
Ah, I see, a bit of denial by you and others.
However, let's look at the forest rather than the bark of the tree.
In the beginning, we all know Nvidia found it very important to use their own HW in order for a monitor to be certified as G-Sync, right? I'll leave your misconception(s) there.

BTW, actually disable free/g sync and game for a few. "Some" of you might actually be surprised not to find tearing. :D

I think you are missing what the problem was - V-Sync at 60Hz has very noticeable input latency for many people especially as in many cases a fairly small drop in framerate could see you bouncing off the 30 or 45 FPS V-Sync multis. The alternative of V-Sync off needs very high framerates to minimise the perception of tearing and even then at 60Hz hard to avoid nasty tearing (which is true across all monitors no matter the quality). V-Sync on at 120Hz is reasonably tolerable but you need to be able to sustain very high framerates to keep it hitting the 120Hz V-Sync threshold. Adaptive sync removes those compromises from the equation.
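To put rough numbers on those V-Sync "multis" (a back-of-envelope sketch of double-buffered V-Sync, not anything official): a frame that misses the vertical blank has to wait for the next one, so the displayed rate snaps to refresh_hz divided by a whole number, and a tiny slowdown at 60Hz halves your frame rate while the same slip at 120Hz costs far less:

```python
import math

def vsync_fps(render_ms, refresh_hz=60):
    """Effective displayed FPS under double-buffered V-Sync:
    every frame waits for the next vertical blank after it finishes rendering."""
    refresh_ms = 1000.0 / refresh_hz
    refreshes_waited = math.ceil(render_ms / refresh_ms)  # whole refreshes per frame
    return refresh_hz / refreshes_waited

print(vsync_fps(15))                  # 60.0 - renders inside the 16.7 ms budget
print(vsync_fps(17))                  # 30.0 - just misses, frame rate halves
print(vsync_fps(17, refresh_hz=120))  # 40.0 - the same miss at 120 Hz costs far less
```

The 45 FPS figure mentioned in the post comes from frames alternating between one and two refreshes; this sketch assumes a steady render time, so it only shows the fixed 60/30/20 buckets.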

Correct what he states is 100% wrong.

On the tearing topic, I have never witnessed tearing on any of my CRTs at any FPS well above the Hz with V-Sync off (I cannot stand V-Sync), as it can happen, not will happen.

I have, however, seen it on LCDs, and IMO we only have these new techs (some of which work, but others are gimmicks to sell more monitors) because displays have stagnated. QA is far worse than a decade or so ago; LCDs are a poor tech IMO.

F/G-Sync is more beneficial at lower FPS and it's smoother; look at the Nvidia Pendulum Demo.

If you are rendering well above the monitor Hz with V-Sync off the perception of tearing is often reduced significantly - one of the main reasons I used to go SLI before I got a G-Sync monitor but find an area with lots of parallel lines in a scene like radiators or window blinds and it will be obvious - I would also notice the ripple when entering a new area of a game and trying to pan the scene left<>right to look for things like enemy players, etc.
 
Last edited:
Soldato
Joined
19 Dec 2010
Posts
12,031
Ah, I see, a bit of denial by you and others.

It's not denial, It's you. As @chaosophy has said already, your gross misunderstanding of sync tech is baffling. All your posts in this thread about Freesync/Gsync have shown that you don't know what you are talking about.

In the beginning, we all know Nvidia found it very important to use their own HW in order for a monitor to be certified as G-Sync, right? I'll leave your misconception(s) there.

This statement is kind of ironic, since you are the one with the misconceptions. Nvidia had to come up with their own solution because no desktop monitor or desktop GPU had the hardware needed to do VRR. So Nvidia came up with the Gsync module; it contains the scaler, timing controller and frame buffer all in one, which is why any Nvidia GPU with a DisplayPort released since Fermi can use Gsync.

Adaptive Sync requires hardware on the GPU and on the monitor, which is why Nvidia's Maxwell GPUs can't connect to adaptive sync monitors. Only Pascal and later have the hardware. It's also the reason not all GCN GPUs from AMD are fully Freesync compatible.

BTW, actually disable free/g sync and game for a few. "Some" of you might actually be surprised not to find tearing. :D

Did you not read my last post? I suggest you read it. Tearing is not dependent on the quality of the monitor.

If screen tearing wasn't an issue, why did they come up with Vsync?
 
Soldato
Joined
31 Dec 2007
Posts
13,616
Location
The TARDIS, Wakefield, UK
True melmac, you're fighting a losing battle with EastCoastHandle. The monitor was not "certified"; it was a G-Sync monitor because it had the G-Sync module. The same module was also going to be sold as a DIY module to fit to compatible monitors, before Nvidia changed their mind over the fitting being too technical for the average user (and a couple of other reasons...). The term "certified" has only appeared since Nvidia gave the moniker to monitors they have tested as compatible.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Very interesting reading.

If you need to "sync" in order to get a smooth desktop, I'm going to assume there is something else going on with your setup. Perhaps look for a firmware update for your monitor. Or simply check that you aren't running at 60Hz when it looks like you might be.

It's quite common for (for example) FreeSync not to work without installing AMD drivers. Also, for some, this can cause your monitor to default to 60Hz until the AMD GPU drivers are installed. If you ever notice an issue where it looks like the refresh rate is 60Hz (after uninstalling drivers) and you have the MS basic display driver (forget the real name though) for the GPU device in Device Manager (as a result of using DDU), then check what refresh rate you are using.

I don't think the issue is the term Sync but what Screen Tearing is and how it occurs.

"...Yet for others you have to go frame by frame looking for it (tearing) in other monitors."
This statement did not suggest, imply nor state that monitors have no tearing. That's a strawman argument :). There are no absolutes.

"Free/G Sync is only intended for monitors with very poor HW scalers."
Perhaps I could further clarify. I was thinking of how Nvidia uses its own hardware instead of relying on the manufacturer's. And how AMD advised better HW, which is part of their certification process.

Furthermore, how vsync worked before free/g sync is not the subject here. AMD (from my own research) had monitor manufacturers update their HW in order to be certified for Freesync, while Nvidia provided their own hardware. The point was that, at the time, the HW used for "gaming monitors" was adequate for "desktop use" but not necessarily for "gaming use".

Since then most, if not all, manufacturers are on board, providing better HW, firmware updates, higher refresh rates, etc. for the monitors we see today.

At the end of the day, if your monitor is actually worth its own weight, you shouldn't need to have free/g sync on. Especially for desktop use (come on now). I know I've stopped using it for some time now. I have not noticed a single difference since then either.

And yes, I still agree: "What AMD and Nvidia have done was find a way to compensate for crappy monitors." That won't change.
How the industry responded with freesync/gsync has changed this though, as they continue to improve upon it.

I've gamed on and seen quite a few monitors, and with it turned off I've seen very few that need it on. For those that didn't, neither I nor the clients saw any noticeable tearing/smoothness issues while gaming. To imply that all monitors tear in the same exact way and manner is simply false.

And yes, disabling it does improve input latency. Sheesh, are we actually having the same discussion here?

So again, if the HW in the monitors was "fine" back then, why did AMD and Nvidia push for better HW? It's a very simple question, guys, come on.

However, to that end, is free/g sync dead? IMO yes; as monitor manufacturers continue to improve their own HW it's going to become redundant. Heck, even Win10 has a variable refresh rate option now. Who saw that coming? From my own experience, the frequency and the amount of tearing isn't the same issue I saw a few years ago. I'm sorry if that has any sentimental value to you.

But Freesync 2, on the other hand... not so much. I think we need AMD to keep pushing the standards up. No different than how standards are pushed for HDMI/DisplayPort, for example.

:)
 
Last edited:
Soldato
Joined
19 Dec 2010
Posts
12,031
If you need to "sync" in order to get a smooth desktop, I'm going to assume there is something else going on with your setup. Perhaps look for a firmware update for your monitor. Or simply check that you aren't running at 60Hz when it looks like you might be.

Nobody said anything about the desktop? What was mentioned was desktop GPUs as opposed to Laptop GPUs or APUs.

I don't think the issue is the term Sync but what Screen Tearing is and how it occurs.

And you obviously don't know what screen tearing is or how it occurs, because if you did, you wouldn't have written the rest of your post.

"...Yet for others you have to go frame by frame looking for it (tearing) in other monitors."
This statement did not suggest, imply nor state that monitors have no tearing. That's a strawman argument :). There are no absolutes.

There are absolutes in this case, tearing occurs in all monitors without sync tech. More on this below.

"Free/G Sync is only intended for monitors with very poor HW scalers."
Perhaps I could further clarify. I was thinking of how Nvidia uses its own hardware instead of relying on the manufacturer's. And how AMD advised better HW, which is part of their certification process.

Your clarification actually shows that you don't really understand it at all. Freesync/Gsync aren't intended only for monitors with very poor HW scalers; they are a solution to a problem called tearing. You are mixing up a lot of things and arriving at wrong conclusions. No manufacturer's scaler would work with the original Gsync, because Gsync doesn't have a hardware scaler; it has an FPGA chip that does the scaling as well as handling other monitor functions. On the other hand, monitors that want to be Freesync compatible need to have scalers that support variable refresh rates. Most monitors before the introduction of Adaptive Sync didn't have scalers that could support variable refresh rates.

The certification process that Nvidia are doing now is to find adaptive sync monitors that they think are worth connecting to and calling them Gsync compatible. They should have changed the name as Gsync compatible is causing confusion.

Furthermore, how vsync worked before free/g sync is not the subject here. AMD (from my own research) had monitor manufacturers update their HW in order to be certified for Freesync, while Nvidia provided their own hardware. The point was that, at the time, the HW used for "gaming monitors" was adequate for "desktop use" but not necessarily for "gaming use".
Since then most, if not all, manufacturers are on board, providing better HW, firmware updates, higher refresh rates, etc. for the monitors we see today.

Vsync is entirely relevant to the discussion. It is proof that screen tearing is and always has been an issue that gamers wanted a solution to.

The rest of your paragraph makes no sense whatsoever.

At the end of the day, if your monitor is actually worth its own weight, you shouldn't need to have free/g sync on. Especially for desktop use (come on now). I know I've stopped using it for some time now. I have not noticed a single difference since then either.

Oh come on now!! Seriously?? Your desktop runs at your monitor's refresh rate, which means no tearing ever. Surely you knew this?

And yes, I still agree: "What AMD and Nvidia have done was find a way to compensate for crappy monitors." That won't change.

WRONG, and will always be wrong. Tearing will occur on all monitors without some kind of sync tech. It has nothing to do with the quality of the monitor.

I've gamed on and seen quite a few monitors, and with it turned off I've seen very few that need it on. For those that didn't, neither I nor the clients saw any noticeable tearing/smoothness issues while gaming. To imply that all monitors tear in the same exact way and manner is simply false.

Haven't you been reading any of the replies to your posts? Nobody said all monitors will tear the same way; even the same monitor might not tear the same way each time. There are two important variables: the refresh rate of the monitor and the frame rate of the game. How far these are out of sync at any time will determine how much tearing occurs. Higher refresh rates combined with higher frame rates make it harder for some people to notice the tearing. And that leads to the other variable: the user playing the game. Some people don't notice tearing as much as others do.

And yes, disabling it does improve input latency. Sheesh, are we actually having the same discussion here?
So again, if the HW in the monitors was "fine" back then, why did AMD and Nvidia push for better HW? It's a very simple question, guys, come on.

Sheesh yourself. Rroff was talking about Vsync. Vsync adds input lag, a lot of it; that's why it's not a good solution for FPS gamers. Freesync and Gsync also add input lag, but the difference is tiny.

Your last sentence, again, makes no sense in context of the conversation.

However, to that end, is free/g sync dead? IMO yes; as monitor manufacturers continue to improve their own HW it's going to become redundant. Heck, even Win10 has a variable refresh rate option now. Who saw that coming? From my own experience, the frequency and the amount of tearing isn't the same issue I saw a few years ago. I'm sorry if that has any sentimental value to you.
But Freesync 2, on the other hand... not so much. I think we need AMD to keep pushing the standards up. No different than how standards are pushed for HDMI/DisplayPort, for example.

You think Freesync and Gsync are dead but that Freesync 2 is good?????

So you think an improved solution to a problem that you don't think exists is a good idea?

Again quoting @chaosophy:
The clue is in the name... 'sync'.

Vsync, Vertical Sync, the 'sync'/'de-sync' between rate of frames being output (fps) and rate of frames (refresh) being displayed on the monitor. No more, no less, it is that simple. How you can have such a gross misunderstanding of it is baffling.
 
Soldato
Joined
8 Jun 2018
Posts
2,827

How did we get to a point where Nvidia required monitor manufacturers to use their HW to be G-Sync compatible over the years?
This is where critical thinking comes into play. AMD's FreeSync initiative helped foster monitors that no longer require Nvidia hardware.
Take a gander at their compatibility list:
https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
In the last few weeks they've added more compatible monitors than they ever had G-Sync monitors at one time. The manufacturing of these monitors has greatly improved from years past. That's the point :).

The point still remains: the HW used in them today is far better than before freesync/gsync.

So again, is free/g sync dead? My answer is still yes, it is. With AMD pushing Freesync 2 (I'm not seeing Nvidia doing anything on this front) they will push manufacturers even further to make those monitors more robust.

Your response really doesn't show any objection to this. Albeit, you object to the topic being discussed in a thread labelled "Is Freesync Dead". Furthermore, if I and others no longer use "sync" to game, then your premise to suggest otherwise lacks any real rebuttal.
 
Last edited:
Man of Honour
Joined
13 Oct 2006
Posts
91,147
The point still remains, the HW in them used today is far better then before freesync/gsync.

Most of these monitors use off-the-shelf scalers that pre-date G-Sync/FreeSync, with some minor modifications and a firmware update to support variable refresh by extending existing features beyond their original purpose (look up panel self refresh). Only the next round of adaptive sync compliant monitor hardware actually has such functionality built in from the ground up, and it is still missing some features of the G-Sync module. (EDIT: There are some exceptions to this but that is the main story.)

Adaptive sync still needs support on the software side be that G-Sync or FreeSync in some form and unfortunately some support on the OS side where Microsoft have several times complicated things due to their ineptness and lack of interest.

Sorry but you are talking nonsense and completely missing the big compromise that people had to make between tearing and input latency in regard to gaming before adaptive sync technologies.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Adaptive sync still needs support on the software side be that G-Sync or FreeSync in some form and unfortunately some support on the OS side where Microsoft have several times complicated things due to their ineptness and lack of interest.
No one suggested anything about the process of how sync works. That's not the discussion. The discussion is that going from Nvidia HW to monitors being G-Sync compatible without Nvidia hardware is the focal point. Have a look at their list:
https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
 
Man of Honour
Joined
13 Oct 2006
Posts
91,147
No one suggested anything about the process of how sync works. That's not the discussion. The discussion is that going from Nvidia HW to monitors being G-Sync compatible without Nvidia hardware is the focal point. Have a look at their list:
https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

I'm not sure of your point. G-Sync compatible monitors are monitors that have had the scaler slightly modified to support variable refresh rate, but they still lack some of the functionality of a G-Sync module monitor. Previous to the G-Sync module, it didn't matter what quality the monitor was: you still had to make a compromise between tearing and input latency if that was important to you in gaming.
 

bru

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
Earlier I was going to post that you should have gone to Specsavers, but I won't say that now.

Many thanks to @Rroff and @melmac who have both argued the point way past what I was willing to do.

I will just finish with this.

Opinions are like bottoms, we all have one and very often they are full of @#*&%$
 