From VESA's site:
The framework has been around for ages. It just hadn't been utilised by external displays or their connectors.
Yes, that's in laptops, not in desktops. It was only made part of the desktop specification in May/June 2014.
That's what I said
The tech itself has been around for ages.
Is that flickering ultrawide an LG 29UM69G?
Looks like it, with the pointy red bit on the stand. Interesting if it is; the 29UM69G is a fairly budget-end monitor, at least for an ultrawide, with only a 40-75 Hz FreeSync range.
I wonder if it still only runs at 60 Hz via the Adaptive-Sync spec with an nVidia card, and whether the higher refresh via FreeSync can't be enabled in the OSD without an AMD card, which is part of the problem with it flickering. nVidia might be enforcing a lower minimum refresh as well, which might be kicking out the panels that can't adjust to it.

Yeah, my guess is that it's something like that. I have had a Samsung monitor that is FreeSync 40-75 Hz but locked to 60 Hz without FreeSync, and when I tried overclocking it, it went all to poo over 65 Hz.
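As an aside on that 40-75 Hz range: low-framerate compensation (LFC) only works when the panel's maximum refresh is high enough above its minimum that frames can be doubled into the supported window. The commonly cited rule of thumb is max >= 2x min (AMD's exact threshold may differ); a minimal sketch of that check, assuming the 2x rule:

```python
def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.0) -> bool:
    """LFC needs headroom to double frames below the floor: max >= ratio * min.

    The 2.0 ratio is the commonly quoted rule of thumb, not an official constant.
    """
    return max_hz >= ratio * min_hz

# A 40-75 Hz range (like the 29UM69G's) falls short of the 2x rule,
# so frames below 40 fps can't be doubled back into the VRR window.
print(supports_lfc(40, 75))   # False
# A 48-144 Hz range has plenty of headroom.
print(supports_lfc(48, 144))  # True
```

This is one reason narrow-range FreeSync monitors behave worse near the bottom of their range than wide-range ones.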
Wrong, sorry. There is a hardware requirement on the GPU side for supporting Adaptive-Sync, hardware that's not normally needed on a desktop GPU, which is why AMD's first demonstration used laptops. This is why Nvidia needed a module. AMD had been working on adaptive sync and built support for it into their second-generation GCN GPUs; it's why older GCN cards aren't fully compatible with Adaptive-Sync despite having DisplayPort 1.2, and the laptop standard is why first-generation APUs from AMD have support.
I remember AMD also demonstrating VRR on a desktop panel, along with the suggestion that many existing desktop panels could work with a firmware update.
I also remember reading that Nvidia GPUs lacked the required scaler tech to control VRR panels. I'm not sure if the 9 series does lack it, but I have read that the 10 series and up will offer support. Why, then, would they hold back such support until now?
Then to finish it all off, Nvidia won't name monitors or provide a list of the monitors it tested and classed as non-validated, leaving people with FreeSync monitors to either risk it or play it safe and use what works, i.e. Radeon GPUs.
So does this all sound plausible to you?
On an added note, isn't the blinking monitor one of the really wide ultrawides rather than a 21:9 model?
I've had that blinking on my Freesync screen with my Vega 64.
It's intermittent and is okay depending on driver/game.
If they genuinely intend to support Adaptive-Sync, they'll have to do some work on the driver side, not simply refuse to validate that model and tell people they can buy our card and try their luck.
They don't want to name and shame their partners. The list of G-Sync-compatible monitors is easy to work with: if it's not one of them, you are on your own. I think it's a good thing nVidia is trying to bring the wild west of FreeSync monitors under control. There is a lot of crap out there and very few good monitors.
I see the BenQ/Zowie XL2740 is on the supported list. I have the XL2730Z paired with a 1080 Ti, so I wonder how that will behave with these new drivers/G-Sync.
If it's not supported properly, I think I'll just move forward with my plan to move to a 34" G-Sync ultrawide and give the BenQ to my son. Will be interesting to see; roll on next week.
Gonna be an interesting one to see what comes out in the wash. It could be that the panels are "faulty" in how they operate and AMD is simply building tolerance for it into the drivers at the expense of latency or image quality, etc., or it could be nVidia sticking to some version of the standard just to be awkward or lazy, and so on.
Err, come again? G-Sync's floor is 30 Hz, and below that it does the same as what LFC for FreeSync is meant to do, which is doubling frames until the refresh goes over the floor (which again is 30 Hz in the case of G-Sync). VRR gaming at sub-55ish fps is trash IMHO, so I don't understand the outcry about a 40-48 Hz FreeSync floor vs G-Sync's 30 Hz, as long as the monitor supports LFC properly. Now, if we were talking about 60 Hz monitors with a floor of 48 Hz, then I would certainly agree that it's not that useful, but then we are talking about a poor implementation of FreeSync instead of FreeSync itself being bad. There is a distinction to be made there.
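The frame-multiplication described above can be sketched roughly like this (a simplification; the actual driver logic, timing, and the exact G-Sync floor behaviour are not public, and the function name here is made up for illustration):

```python
def lfc_refresh(fps: float, min_hz: float, max_hz: float):
    """Return the refresh rate the panel would be driven at, or None.

    Below the VRR floor, frames are repeated: pick the smallest integer
    multiplier that lifts the effective refresh into [min_hz, max_hz].
    """
    if fps >= min_hz:
        return min(fps, max_hz)  # already inside the window (capped at max)
    n = 2
    while n * fps <= max_hz:
        if n * fps >= min_hz:
            return n * fps
        n += 1
    return None  # range too narrow: no multiple of fps fits the window

# 14 fps on a 30-144 Hz panel: frames shown three times each, 42 Hz refresh.
print(lfc_refresh(14, 30, 144))  # 42
# 38 fps on a 40-75 Hz panel: doubling would need 76 Hz, over the 75 Hz max.
print(lfc_refresh(38, 40, 75))   # None
```

That second case is why a narrow 40-75 Hz range can't compensate properly just below its floor, while a 30 Hz floor with a high maximum keeps working down to very low framerates.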
Well, here is the thing: ULMB is an Nvidia feature and is NOT supported on every G-Sync monitor, so even if you stick to a pure Nvidia ecosystem, you still have to do the research. There is no getting around it.
No, G-Sync works all the way down to 1 fps, and I tested it at 14 fps. Not playable, but it continued smoothly.