
Is "Freesync" dead?

You can play with simulations of various scenarios here:

https://www.testufo.com/stutter

Though it doesn't show the effects of input latency on gaming, it's still useful.

As an aside, there are monitors/tech that can clean up tearing without having V-Sync on, but they introduce stutter as a side effect. That can be reduced by, again, rendering very fast, but that obviously requires a pretty brutal GPU setup.
 
Like Rroff, you're allowing yourself to get dragged into a pointless argument. Who gives a crap who had the idea first? That's something no one, not even AMD or Nvidia, knows, because they don't know how long one or the other has been looking at this.

Anyway, I do use G-Sync, or is it Free-Sync given that's what's written on the screen's box? I haven't noticed any input lag, and I don't think there is any, unless someone has proven there is, on a Free-Sync screen? I can understand how G-Sync, going through an extra layer of hardware, might have some.

And I hate screen tearing, which I do get without it.


Exactly this. It's not very often that humbug and I agree, but it's nice when it does happen. :)
 
I read the ***** notes. Sure, I can't prove it now, but I definitely saw the notes of the representations, including counterpoints from those who use eDP variable refresh technology in things like air traffic control applications, who were opposed.

nVidia were at a pre-emptive stage before going ahead with a hardware implementation, while AMD were reacting to G-Sync (it really doesn't take that long to adapt an off-the-shelf scaler to support FreeSync the way it is currently done, misusing PSR, etc.).

I was only asking because I am interested in learning this; I have never seen or heard anything anywhere about it. I would like to see the proposal. I am surprised you can't find it, as these proposals are released to the public.


Why does this even matter?

I am just finding out about this now and am interested in all aspects of adaptive sync tech and Gsync.
 
Like Rroff, you're allowing yourself to get dragged into a pointless argument. Who gives a crap who had the idea first? That's something no one, not even AMD or Nvidia, knows, because they don't know how long one or the other has been looking at this.

See, you are completely mistaken. I am not in an argument with Rroff. It isn't a debate about who was first; I don't care who was first. I am just interested in learning about the tech, including the history of said tech.

My discussion with Eastcosthandle is a different story. The guy doesn't have a clue, yet believes he is making some kind of point.

Do you really think that monitors without freesync/adaptive sync/gsync don't have tearing? Or that you can turn off Freesync and not notice tearing?
 
See, you are completely mistaken. I am not in an argument with Rroff. It isn't a debate about who was first; I don't care who was first. I am just interested in learning about the tech, including the history of said tech.

My discussion with Eastcosthandle is a different story. The guy doesn't have a clue, yet believes he is making some kind of point.

Do you really think that monitors without freesync/adaptive sync/gsync don't have tearing? Or that you can turn off Freesync and not notice tearing?

I honestly don't know; it's not something that I see with it on, but I do with it off.
 
I was only asking because I am interested in learning this; I have never seen or heard anything anywhere about it. I would like to see the proposal. I am surprised you can't find it, as these proposals are released to the public.




I am just finding out about this now and am interested in all aspects of adaptive sync tech and Gsync.

Ah, just frustrating, as I had the information once, but a lot of years have gone by since and I can't find it again, though I know what I read LOL.

EDIT: Not that I'm so bothered about who was first as such, but FreeSync has none of the hallmarks of something that was already in the works that nVidia rushed to beat to market, as many of the things you'd design in from the start are/were omitted or rushed rather than built in from the ground up.
 
OK, I just ran the G-Sync Pendulum Test.

What I can tell you is that with V-Sync on and V-Sync off I don't get any screen tearing. What I do get is micro juddering on the lateral motion, or ghosting?? V-Sync off or on looks identical: motion judder... it's not micro stutter, it looks like the image isn't refreshing smoothly, and it's VERY noticeable.

When I click G-Sync on, that judder disappears completely; it is butter smooth and the image is crystal clear during that lateral motion. It's working on my 'non-G-Sync-approved Free-Sync screen', and working very well.

If you don't see the same on your screen, I would suggest G-Sync isn't working.
 
The only time I have seen a difference between G-Sync and Freesync is when FPS goes below 50. While G-Sync still "feels" smooth under 50, Freesync felt a little stuttery even with LFC kicking in.
 
LOL, that's all that comes to mind.
You really haven't made much of a point, and it's a regurgitation of a nonsensical and whimsical viewpoint. I guess you want to be "seen".
Just in case you finally decide to look at the topic of this thread, you might gain some insight as to the conversation. Perhaps not, though; I don't have any vote of confidence in you at this point.

This is the stupidest thing that you have written in this thread. And you have come out with some drivel already. But let's keep it very simple.

But to make it clear (for others who want to avoid all the noise you make), I, as well as others, no longer use free/g sync because there is no benefit to having it enabled. We don't experience tearing, and using it does impose a latency penalty. We (in my circle) have already referred to those that "mandatorily need it even at desktop" as having crappy monitors. That won't change no matter how sensitive you are to me saying it.

What you wrote here is opinion, and anecdotal nonsense, not fact.

Now read this very carefully.

Monitors without Freesync/Gsync will have tearing unless you have a software solution like Vsync enabled. That's a fact.

The improved hardware in Gsync and Freesync monitors will have no effect on tearing if you turn off Freesync and Gsync. That's also a fact.

You are also completely ignoring the other major benefit of Adaptive sync and Gsync: frame drops. You don't notice frame drops because the monitor's refresh rate is synced to the game's frame rate, which means games are really smooth to play.
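To put that in rough terms, here's a toy Python sketch (made-up timings and panel numbers, not how any actual driver or scaler is implemented): with a fixed refresh the panel scans out on its own clock, so a frame that finishes mid-scan shows up as a tear line, while a VRR panel simply waits for the frame and then refreshes.

```python
# Toy model of a fixed-refresh panel vs a VRR panel. Timings are made up;
# this only illustrates why tearing happens when the flip isn't synced.

def fixed_refresh_tears(frame_done_ms, refresh_ms=16.7, total_ms=1000):
    """Panel refreshes on its own clock; any frame that completes while a
    scanout is in flight splits that scanout between two frames (a tear)."""
    tears = 0
    frames = sorted(t for t in frame_done_ms if t < total_ms)
    scan_start, i = 0.0, 0
    while scan_start < total_ms:
        scan_end = scan_start + refresh_ms
        while i < len(frames) and scan_start < frames[i] < scan_end:
            tears += 1          # buffer flipped mid-scan -> visible tear line
            i += 1
        scan_start = scan_end
    return tears

def vrr_refresh_times(frame_done_ms, max_hz=144):
    """A VRR panel starts each scanout when the frame is ready (it can't go
    faster than its max rate), so every scanout shows one whole frame."""
    fastest = 1000.0 / max_hz
    refreshes, last = [], -fastest
    for t in sorted(frame_done_ms):
        start = max(t, last + fastest)  # wait for the frame, then scan out
        refreshes.append(start)
        last = start
    return refreshes

# ~70 fps game on a 60 Hz panel: a new frame lands mid-scan almost every cycle.
frames = [i * 14.3 for i in range(1, 70)]
print("fixed 60 Hz, no sync:", fixed_refresh_tears(frames), "tears per second")
print("VRR: 0 tears, refreshes track frame delivery:", vrr_refresh_times(frames)[:3])
```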

Now, if you and your circle of friends don't notice either of those benefits, then I wonder: do you just play Microsoft Solitaire?

Therefore, nvidia trying to rebrand freesync, which downplays AMD's impact on the monitor industry, is a moronic move. Not only is it old tech, we don't need it. Also, Freesync 2 with HDR is something AMD is pushing developers to use their API for (making it exclusive to how AMD does HDR in games), which has the potential to hamper Nvidia GPUs when HDR is used.

Nvidia isn't trying to rebrand Freesync. Nvidia is removing the Freesync brand name from monitors, so monitors advertise that they are Gsync Compatible with no mention of Freesync, and customers think they can only use that monitor if they have an Nvidia card. I am really curious: what do you think Freesync 2 is? It uses the exact same technology as Freesync. The only difference on the adaptive sync side is that the standard forces monitor manufacturers to use a scaler whose maximum refresh rate is at least 2 times the lower refresh rate. That's so LFC (Low Framerate Compensation) can work. The other requirements have nothing to do with tearing or frame rate drops; one is to do with input lag and the other is a minimum standard for HDR.
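Rough illustration of why that 2:1 range matters, as a hypothetical Python sketch (not AMD's actual LFC logic; the 48-144 Hz panel range is made up):

```python
# Hypothetical sketch of LFC (Low Framerate Compensation). When the game's
# frame rate falls below the panel's minimum VRR rate, each frame is shown
# multiple times so the panel stays inside its supported range.

def lfc_refresh(frame_fps, panel_min_hz=48, panel_max_hz=144):
    """Return (effective refresh rate, times each frame is repeated)."""
    if frame_fps >= panel_min_hz:
        return frame_fps, 1                   # in range: one refresh per frame
    multiplier = 2
    while frame_fps * multiplier < panel_min_hz:
        multiplier += 1
    refresh_hz = frame_fps * multiplier
    # Worst case is a frame rate just under panel_min_hz, which doubles to
    # just under 2 * panel_min_hz -- hence the requirement that max >= 2 x min.
    assert refresh_hz <= panel_max_hz, "VRR range too narrow for LFC"
    return refresh_hz, multiplier

print(lfc_refresh(30))   # (60, 2): 30 fps shown at 60 Hz, each frame twice
print(lfc_refresh(20))   # (60, 3): 20 fps shown at 60 Hz, each frame 3 times
print(lfc_refresh(75))   # (75, 1): already inside the 48-144 Hz window
```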

Not sure why you brought up a chroma sampling bug when using HDR with Nvidia Pascal cards; it's got nothing to do with Freesync or AMD. Also, you don't seem to realise it, but Nvidia cards will be able to connect to Freesync 2 monitors just like Freesync monitors. Same technology, you see.
 
Ah, just frustrating, as I had the information once, but a lot of years have gone by since and I can't find it again, though I know what I read LOL.

EDIT: Not that I'm so bothered about who was first as such, but FreeSync has none of the hallmarks of something that was already in the works that nVidia rushed to beat to market, as many of the things you'd design in from the start are/were omitted or rushed rather than built in from the ground up.

Understandable. I wrote my reply to you after writing my post to Eastcoast, so my post might have come across a little sharper than I intended. I wasn't arguing with you. I am not really bothered which is first either, but I would like to know the history.
 
If you don't see the same on your screen, I would suggest G-Sync isn't working.

That's our discussion with EastCoast. He thinks Gsync and Freesync don't actually do anything because monitor tech has improved so much: that you can turn Freesync/Gsync off and there is no difference, and if you do see a difference it's because you have a crappy monitor.

So you must have a crappy monitor Humbug :p
 
I am not really bothered which is first either, but I would like to know the history.


Would it be nice to know who thought of it first? Well, of course it would, but in reality we will never know.

Bottom line is the technology works very well, both GSync and Freesync/AdaptiveSync/GSync-Compatible. Wow, that's a mouthful; maybe we should just call it "VRR".
 
That's our discussion with EastCoast. He thinks Gsync and Freesync don't actually do anything because monitor tech has improved so much: that you can turn Freesync/Gsync off and there is no difference, and if you do see a difference it's because you have a crappy monitor.

So you must have a crappy monitor Humbug :p

I see. Well, my monitor is not an Asus ROG or something along those lines; it's a fairly standard 32" 1440P IPS Free-Sync panel, not expensive, not cheap. G-Sync / Free-Sync definitely makes a difference. I don't know if a £700 screen would display an image as smooth as mine does without G-Sync / Free-Sync, but even if it did, I didn't pay £700 for mine.
 
Monitors without Freesync/Gsync will have tearing unless you have a software solution like Vsync enabled. That's a fact.

The improved hardware in Gsync and Freesync monitors will have no effect on tearing if you turn off Freesync and Gsync. That's also a fact.
As an aside, there are monitors/tech that can clean up tearing without having V-Sync...

OK, I just ran the G-Sync Pendulum Test.

What I can tell you is that with V-Sync on and V-Sync off I don't get any screen tearing...
Melmac, you are woefully lacking any personal experience or knowledge on the subject. People in this thread have proven your assertion incorrect; if you had just read their posts before replying, or just edited your post to make a correction, I would have understood. Tragic really.



Nvidia isn't trying to rebrand Freesync. Nvidia is removing the Freesync brand name from monitors.
That is what rebranding does. You are, again, incorrect, and haven't written in a way that shows you understand the subject you're so vehemently against.


As to the last part of your post: FS2 is about the implementation of HDR (yes, there are other changes, but this is the biggest difference between them). AMD will have developers use their method; I've said that in my prior post.
FS monitors already have low latency and already offer LFC. Did you really believe that monitors that offer FS didn't have those 2 things until FS2? LOL, wow.
----
OK, I found an article discussing in a bit more detail how HDR will work in games using FS2 (which I would assume applies to next-gen consoles as well).
The idea was the games themselves would tone map directly to what the display was capable of presenting, with the FreeSync 2 transport passing the data straight to the monitor without the need for further processing on the monitor itself. This was in contrast to standard HDR tone mapping pipelines that see games tone map to an intermediary format before the display then figures out how to tone map it to its capabilities. Having the games do the bulk of the HDR tone mapping work was supposed to reduce latency, which is an issue with HDR gaming.

That’s how AMD detailed FreeSync 2’s HDR implementation back at CES 2017. While it sounded nice in theory, one of the key issues raised at the time was that the games themselves had to tone map specifically to FreeSync 2 displays. This meant games would need to integrate a FreeSync 2 API if this HDR implementation was ever to succeed, and we all know how difficult it is to convince a game developer to integrate a niche technology.
https://www.techspot.com/article/1630-freesync-2-explained/
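To make the difference between the two pipelines concrete, here's a loose Python sketch of what the article describes; the function names and the clamp standing in for a real tone-mapping curve are mine for illustration, not AMD's actual API.

```python
# Loose sketch of the two HDR paths described above. tone_map() is a stand-in
# (a plain clamp) for a real tone-mapping curve; names are illustrative only.

def tone_map(nits, target_peak):
    """Placeholder: clamp scene brightness to the target's peak."""
    return min(nits, target_peak)

def standard_hdr_pipeline(scene_nits, display_peak_nits):
    """Game maps to a generic intermediate target (e.g. a 10,000-nit signal);
    the monitor then has to tone map again to its own panel, which is the
    extra in-display processing step the article says adds latency."""
    intermediate = tone_map(scene_nits, target_peak=10000)
    return tone_map(intermediate, target_peak=display_peak_nits)

def freesync2_hdr_pipeline(scene_nits, display_peak_nits):
    """Game is told the panel's real capabilities up front and tone maps
    straight to them; the transport passes the result through untouched."""
    return tone_map(scene_nits, target_peak=display_peak_nits)

# A 4,000-nit highlight on a 1,000-nit panel ends up the same either way here,
# but the FreeSync 2 path skips the second pass inside the monitor.
print(standard_hdr_pipeline(4000, 1000))   # 1000
print(freesync2_hdr_pipeline(4000, 1000))  # 1000
```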

I do hope you gain some insight. :)

This brings us back to one of my original statements: I, as well as others, don't need it because we don't see tearing in our games. Therefore, as far as I'm concerned, Free/G sync is dead, but not because of what nvidia is trying to do.
:cool:
 