Nvidia to support Freesync?

Discussion in 'Graphics Cards' started by Wrinkly, Jan 7, 2019.

  1. melmac

    Soldato

    Joined: Dec 19, 2010

    Posts: 6,255

    LOL how can you post up all this and still believe that you are right?

    Adaptive Sync was added to the DisplayPort standard in May 2014. It was an optional part of the standard then, and it still is.

    It was never added to the eDP specification. eDP is for internal connections, as you said, and has a different set of requirements from the DisplayPort standard.

    We are on DisplayPort 1.4a, but Embedded DisplayPort is on 1.4b, and that was released back in 2015; DisplayPort 1.4 only came out in 2016. Just because something exists in one standard doesn't mean it exists in the other. The two have different features and are different standards.

    But, hey, knock yourself out. Go through all the specs of eDP and I guarantee you won't find any reference to "adaptive sync" being added or being used as part of the specification. But if you find something that shows the term "adaptive sync" being added as part of the eDP standard, I will apologise and admit that I was wrong.

    eDP and DP are different. They might have the same governing body (VESA), but, as you said yourself, one applies to internal connections and one to external, and as such they have different requirements.
     
  2. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 10,440

    Location: London

    I am right, see? :D

    He would rather argue to death than apologise :p

    My stance on this is simple: Nvidia gave up on their G-Sync module, threw in the towel, and now support Freesync monitors. It is that simple. Nothing wrong with it either; it is a good move for both business and consumers. They have already milked G-Sync, and they can now use the module to make it a more premium thing, stick a fan on it and milk it some more :D
     
  3. LambChop

    Mobster

    Joined: Apr 4, 2011

    Posts: 3,336

    toms
    https://www.tomshardware.com/news/vesa-displayport-freesync-amd,28524.html

    vesa
    https://vesa.org/featured-articles/...rd-1-4-for-mobile-personal-computing-devices/

    anan (2nd paragraph)
    https://www.anandtech.com/show/1379...-adaptive-sync-with-gsync-compatible-branding

    arst
    https://arstechnica.com/gadgets/201...standard-supporting-devices-with-8k-displays/

    I had to stop, I got bored.

    You don't need to admit you were wrong, I've known it from the beginning.
     
  4. nashathedog

    Sgarrista

    Joined: Sep 12, 2013

    Posts: 8,414

    Location: Knowhere

    I think you're right & I'm wrong. Sorry.


    Here's what I think I see now:
    We have 4 monitors in a row, and the order we see them in goes from right to left. The first two are G-Sync compatible; both have stickers stating the one on the right is a 27" 4K model and the other is a 27" QHD model. The next two monitors are both unvalidated. It looks like the third monitor from the right is a 32", not an ultrawide as I thought, and the fourth monitor is a regular ultrawide as you said.

    At 1:01 we see the G-Sync compatible monitors alongside the first non-validated monitor, and the non-validated one looks a lot wider, which is why I presumed it was an ultrawide. But now that I'm looking closer I can see that the panel itself is also taller, so it's not just a width difference, it's a height difference as well, so it must use a 32" panel.
    Now I'm wondering what resolution the 32" monitor is. There's no sticker on it, and Nvidia's complaint is that the picture looks blurred. Could that be partly because it uses a 32" panel that doesn't look as sharp as the picture on the two 27" models, which were stickered as 4K and QHD?

    Stop the video at 1 minute & 1 second & tell me what you think.
    I think Nvidia are trying to obfuscate the truth regarding how adaptive sync compares to G-Sync by making it all about monitor quality, which we've all known is an issue from the start, an issue that's down to the monitor makers, not how well the tech works.

     
    Last edited: Jan 10, 2019
  5. Shaz12

    Gangster

    Joined: Apr 25, 2017

    Posts: 280

    The G-Sync module still has its advantages, even in the midrange. You get the full variable refresh rate range, ULMB, guaranteed compensation at lower frame rates and superior overdrive performance. Not all monitors are ripoffs. The S2716DG is sub-400 and I have a hard time finding a single FreeSync monitor supporting all these features in one package at its price point.
     
  6. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 10,440

    Location: London

    I know mate. I am still very happy with my monitor; there are no Freesync monitors that can match its range. My previous one was an LG 4K monitor, a Freesync one with a range of only 40-60Hz. Not that it is bad, but having a much bigger range is nice. I have never seen a tear since. Very happy with my purchase; it also had less backlight bleed and no dead pixels. Will be keeping it until I upgrade to a 120Hz 4K VRR OLED TV next year :D
     
  7. nashathedog

    Sgarrista

    Joined: Sep 12, 2013

    Posts: 8,414

    Location: Knowhere

    That's terrible. LG have been one of the biggest abusers when it comes to supporting Freesync. The first ultrawides they marketed as gaming monitors with Freesync support had a 48-60Hz range. If you look at AMD's list of compatible monitors to see what features were available, you'll see that the majority of LG's early monitors were a joke.
     
  8. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 10,440

    Location: London

    A 40-60Hz range is better than nothing though :)
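[Editor's aside: the reason narrow ranges like these draw criticism is low framerate compensation (LFC), the frame-duplication trick mentioned earlier in the thread. When FPS drops below the VRR floor, the driver can repeat each frame to keep the panel inside its variable refresh window, but that only works if the panel's maximum refresh is at least twice its minimum. A minimal sketch of that ratio check, my own illustration rather than anything from the thread:]

```python
# Frame-duplication LFC: showing a frame twice at rate 2*fps must still
# fit inside the panel's VRR window, so we need vrr_max >= 2 * vrr_min.

def supports_lfc(vrr_min_hz: float, vrr_max_hz: float) -> bool:
    """True if the VRR range is wide enough for 2x frame duplication."""
    return vrr_max_hz >= 2 * vrr_min_hz

# The narrow ranges discussed above fail the test:
print(supports_lfc(40, 60))    # 40-60Hz LG: False
print(supports_lfc(48, 75))    # 48-75Hz LG ultrawide: False
print(supports_lfc(30, 144))   # typical G-Sync module range: True
```

[So a 40-60Hz panel tears or stutters again below 40 FPS, while a wide-range monitor can keep compensating all the way down.]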
     
  9. henson0115

    Hitman

    Joined: Dec 6, 2013

    Posts: 974

    Location: Nottingham

    The new ones are too, to be fair. My newish LG UW is 48 to 75Hz from memory? Although it's fine as I game at around 70, it would be better if the range were broader.
     
  10. Stu

    Wise Guy

    Joined: Oct 19, 2002

    Posts: 1,926

    Location: Wirral

    I generally agree with you. The first two are 27 inch. The third initially looks like a 32 inch, or a 27 that sits lower on its stand compared to the first two, but it is clearly 32 inch when seen next to the UW... assuming the UW is a 34 inch, it would have the same panel height as a 27 inch, implying the third monitor is indeed a 32 inch panel.

    There is no argument that the UW is flickering, which is bad... who knows the reason and if it can be fixed by an end user.

    The 32 inch monitor... it certainly doesn't look good, and I agree with the reviewer that there is blurring... it could be an overdrive issue as he says, it may be a 60Hz monitor (though G-Sync should still look smooth, right?), or it might be significantly struggling with frame rate. I did consider whether it was a resolution thing, but the image looks good before he starts moving the view around. I dunno, and I'm just throwing stuff out, but my guess is a ghosting/smearing/overshoot issue, which may be inherent to the panel and not related to the G-Sync implementation, but I can see why Nvidia would not want to stick a "G-Sync certified" sticker on such a poor panel. Adding G-Sync will not make a bad panel become a good one.
     
  11. kazuya1337

    Gangster

    Joined: Apr 14, 2014

    Posts: 434

    Well I'm confused.

    In the initial Nvidia press release they say "later this quarter ASUS will unleash their curved 35-inch 3440x1440 G-SYNC HDR display". (The only current G-SYNC HDR monitor is the PG27UQ, which is 27-inch 4K.)

    We are getting a new line-up of ROG Strix FreeSync monitors, the final one being the XG32VQR, which is HDR but only 32-inch and not a G-SYNC specific screen.

    So are we getting a new G-SYNC Ultimate ROG Swift line up soon as well, featuring G-SYNC HDR screens?
     
  12. willhub

    Capodecina

    Joined: Jan 3, 2006

    Posts: 20,891

    Location: MediaCityUK

    When will we all be able to test freesync monitors and see if we can enable gsync?
     
  13. doody

    Gangster

    Joined: Dec 19, 2012

    Posts: 135

  14. nashathedog

    Sgarrista

    Joined: Sep 12, 2013

    Posts: 8,414

    Location: Knowhere

    It'll likely come down to how much work Nvidia do supporting adaptive sync on the software side. I don't know what would be required, but AMD seem to do plenty of software work fine-tuning the various Freesync monitors so that they'll work okay. I've seen plenty of people mentioning issues they've had with Freesync enabled that weren't related to any particular game, and AMD have done a fair job of working on them up till now. I've reported a few myself in the past; I don't remember what, so they must have been fixed eventually.

    Because of how long AMD have been using Freesync, they've currently got a much better implementation of adaptive sync. I hope Nvidia are willing to do the work that'll be required to try and catch up with AMD in that respect, rather than just write off support for older Freesync models and only focus on fixing up issues with current & new monitors going forward.

    I can't imagine they'll just stop making G-Sync monitors, but you can pretty much guarantee that they'll be good models that come with a hefty price tag.
     
  15. nashathedog

    Sgarrista

    Joined: Sep 12, 2013

    Posts: 8,414

    Location: Knowhere

    Drivers release within a week; I read around the 15th.
     
  16. doody

    Gangster

    Joined: Dec 19, 2012

    Posts: 135

    G-SYNC Ultimate = FreeSync 2 :p
     
  17. Shaz12

    Gangster

    Joined: Apr 25, 2017

    Posts: 280

    I think they will keep making £800+ monitors, but they still need to keep G-Sync somewhat affordable. There are hardly any buyers at the ultra high end. 1440p 144Hz is where demand will be highest, not these crazy-size monitors.
     
  18. IT Troll

    Wise Guy

    Joined: Jun 15, 2005

    Posts: 2,139

    Location: Edinburgh

    Did you notice the title of that article?

    NVIDIA To Officially Support VESA Adaptive Sync (FreeSync) Under “G-Sync Compatible” Branding

    They understand that, for the general public, FreeSync has become a shorthand, catch-all term for this tech. It will be a struggle to change this now it has taken hold, regardless of the history, standards and marketing.
     
    Last edited: Jan 10, 2019
  19. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 10,440

    Location: London

    Don’t care what Nvidia or anyone says. These are all Freesync monitors. Nvidia cards now work on Freesync monitors and that’s the end of it. Nvidia can try and call it what they want, but it won’t change :D
     
  20. Rup

    Gangster

    Joined: Aug 23, 2017

    Posts: 182

    Great news, finally Nvidia surrenders to reason!
    My 1440p UW Acer will perform great with my RTX 2080.