Intel plans to support FreeSync.

The Fiji series is shocking with Mantle in BF4, and I don't believe AMD are revisiting it either. I was a little disappointed to find that out; a 290X was as quick as a Fury X when both are running Mantle.

I would have liked to try FreeSync, and if Greenland is faster than Pascal I will be trying it.

Yeah, it is a shame. But some reading around, and some tweets from Johan Andersson, point to the culprit being WDDM 1.1-1.3 getting in the way. They essentially had to write hacks around it for memory management on each GCN architecture. That could be why the problem occurred with the 285 as well, since they had not written the workarounds for it, and why BF4 has such bad memory management under Mantle.

He also went on to state that WDDM 2.0 has none of those problems, since it was made for DX12, and that Mantle could see a performance improvement from not having to hack around the memory management. But they would have to write support for a WDDM 2.0 pathway to get the full benefit.

But that then restricts low-abstraction APIs to Win 10 for the best possible performance.
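The difference can be sketched in a few lines. Under the older model the OS migrates video memory behind the application's back; the WDDM 2.0 / DX12 model instead lets the application decide what stays resident within a memory budget. This is a minimal, hypothetical Python sketch of that idea (the class, names and sizes are all illustrative, not any real driver or D3D API):

```python
# Hypothetical sketch of application-managed residency, in the spirit of
# the WDDM 2.0 / DX12 model. All names and numbers are illustrative.
from collections import OrderedDict

class ResidencyManager:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # resource name -> size, LRU order

    def used(self):
        return sum(self.resident.values())

    def make_resident(self, name, size):
        """Page a resource in, evicting least-recently-used ones to fit."""
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return []
        evicted = []
        while self.used() + size > self.budget and self.resident:
            old, _ = self.resident.popitem(last=False)  # evict LRU entry
            evicted.append(old)
        self.resident[name] = size
        return evicted
```

The point is that the application, which knows which resources the next frame needs, makes the eviction decision, rather than the OS second-guessing it per architecture.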
 
afsync is just Better. Better for consumers, Better for the industry, just... Better.

Whatever minuscule differences there are now will not matter going forward. We don't need to argue the case because we're on the right side of history, feels good man.
 
NV should be using async, the open standard. End of.

They do, see G-Sync on laptops. ;)

afsync is just Better. Better for consumers, Better for the industry, just... Better.

Assuming you are talking about FreeSync, to say it is just better is just plain wrong. Technically, with the current implementations, it is clearly inferior to the current G-Sync implementation.

Better for consumers? Well, that one is a bit up in the air at the moment. With the widely varying range of good and seriously not-so-good FreeSync-capable monitors, it is a bit of a minefield for consumers.

Better for the industry? Well, if it wasn't for G-Sync there would be no FreeSync, but long term I would agree with you.

Whatever minuscule differences there are now will not matter going forward. We don't need to argue the case because we're on the right side of history, feels good man.

Yup, long term Adaptive-Sync and its various vendor implementations will win out (assuming the Intel rumour is correct). Technically it will eventually overtake the advantage that G-Sync currently has. That said, G-Sync will of course advance as well, right up until the point when Nvidia stop going the dedicated-module route.

It makes me wonder if Nvidia could put all the gubbins in the G-Sync module into the GPU itself and just have the monitors do the display side of things, in a similar way to how AMD has the FreeSync hardware built into some of their GPUs, thereby preserving the technical advantage that G-Sync currently has.
 
It makes me wonder if Nvidia could put all the gubbins in the G-Sync module into the GPU itself and just have the monitors do the display side of things, in a similar way to how AMD has the FreeSync hardware built into some of their GPUs, thereby preserving the technical advantage that G-Sync currently has.

The G-Sync ASIC takes over the functions of the scaler. That's how they manage to avoid the overshoot and limited working-range issues that plague FreeSync displays. FreeSync displays will need the panel and scaler manufacturers to work together if they want to match the G-Sync experience.
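The limited working-range problem is easy to sketch. A variable-refresh panel can only track the game's frame rate inside its supported window; below that window, the frame can be repeated so the panel stays in range, which is the idea AMD later shipped as Low Framerate Compensation. This is an illustrative Python sketch with made-up panel numbers, not any vendor's actual algorithm:

```python
def refresh_for_fps(fps, panel_min_hz, panel_max_hz):
    """Map a game frame rate onto a panel's variable-refresh window.

    Above the window the display runs at its maximum rate; inside it,
    refresh tracks the frame rate 1:1; below it, each frame is shown
    multiple times so the effective refresh stays in range (the
    LFC-style trick). All numbers are illustrative.
    """
    if fps >= panel_max_hz:
        return panel_max_hz          # cap at the panel's maximum
    if fps >= panel_min_hz:
        return fps                   # true 1:1 variable refresh
    # Below the window: repeat each frame n times until back in range.
    n = 2
    while fps * n < panel_min_hz:
        n += 1
    return min(fps * n, panel_max_hz)
```

A panel with a narrow window (say 48-75 Hz) simply has less room for this to work well, which is one reason the range a scaler supports matters so much.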
 
It makes me wonder if Nvidia could put all the gubbins in the G-Sync module into the GPU itself and just have the monitors do the display side of things, in a similar way to how AMD has the FreeSync hardware built into some of their GPUs, thereby preserving the technical advantage that G-Sync currently has.

They couldn't; that's why they have the G-Sync controller in the monitors: they couldn't find a way to do it with the hardware already in monitors. They could do it now with DP 1.2a, but that would mean giving up on G-Sync in its current form and being closer to FreeSync, which I don't think they'll do.

I'd say they'll be keeping an eye on Adaptive-Sync now Intel will be using it as well. If they think they're losing money by not supporting it, then they'll support both G-Sync and Nvidia-flavoured ASync, leaving the G-Sync monitors out there AS WELL as the ASync ones. I can't see them dropping G-Sync and going purely ASync.
 
Of course, and that was something they put to a public vote, and what AMD wanted. Effectively that is the way it is meant to be played from AMD's perspective, as that is a characteristic part of the drivers.

I have been a massive proponent of these techs since inception and urge anyone Freesync/G-Sync capable to try it. A game changer for me, just like the SSD was.

Have you tried it Humbug?

I have used both G-Sync and FreeSync side by side. There is absolutely no difference between them when they are within their ranges. Under 30 fps, yes, G-Sync is smoother. But I still didn't like playing at sub-30 fps.

Most people will lower their settings until they can get above 30 at a minimum.
 
I'd say they'll be keeping an eye on Adaptive-Sync now Intel will be using it as well. If they think they're losing money by not supporting it, then they'll support both G-Sync and Nvidia-flavoured ASync, leaving the G-Sync monitors out there AS WELL as the ASync ones. I can't see them dropping G-Sync and going purely ASync.

This is how I see it playing out. The top-tier displays will retain the G-Sync module and branding, whilst the cheaper displays will have a generic ASync scaler.
 
I have used both G-Sync and FreeSync side by side. There is absolutely no difference between them when they are within their ranges. Under 30 fps, yes, G-Sync is smoother. But I still didn't like playing at sub-30 fps.

Most people will lower their settings until they can get above 30 at a minimum.

Yeah, 30 fps or below is quite painful really. Although Batman ran well at 5K and around 20 fps, the input lag made it tough to play.
 
I have used both G-Sync and FreeSync side by side. There is absolutely no difference between them when they are within their ranges. Under 30 fps, yes, G-Sync is smoother. But I still didn't like playing at sub-30 fps.

Most people will lower their settings until they can get above 30 at a minimum.

But that is kind of the problem: the G-Sync module has better control over the panel than the current generic ASICs, so G-Sync screens tend to have a better range. Or, in the case of the new super-wide screens, G-Sync enables the panel to do 100 Hz whereas the generic scaler can only do 75 Hz.

Having the only super-wide that can do 100 Hz is yet another "first" and will be unique to G-Sync until a scaler manufacturer gets around to doing it, but with the lower volumes it's not really worth producing a scaler just for one monitor type.
 
Nvidia already use a form of FreeSync in their G-Sync-enabled gaming laptops. There isn't an actual G-Sync module inside :p

That is because adaptive sync was a required part of the laptop eDP spec. That didn't exist for desktops, and it has its own limitations, hence Nvidia developed their own solution for desktops.
 
But that is kind of the problem: the G-Sync module has better control over the panel than the current generic ASICs, so G-Sync screens tend to have a better range. Or, in the case of the new super-wide screens, G-Sync enables the panel to do 100 Hz whereas the generic scaler can only do 75 Hz.

Having the only super-wide that can do 100 Hz is yet another "first" and will be unique to G-Sync until a scaler manufacturer gets around to doing it, but with the lower volumes it's not really worth producing a scaler just for one monitor type.

I can't remember where I saw it, but there was *something* about it not working right at 100 Hz. Will see if I can source it.
 
FreeSync ranges from 9 to 240 Hz; it not working below 35 fps is down to the screen. If you game at below 35 fps, avoid the Asus screen. Simple.
 
It should still work as a normal 75 Hz screen though; it's advertised as such and allows you to select 75 Hz as a fixed refresh rate.

The 100 Hz G-Sync screens, they have actually said, will only do 100 Hz when using an Nvidia GPU.

I can understand that, having had numerous display issues with my Fury X, 290X and 7950, as the GPUs' DisplayPort output is weak; that's probably why it will only do 100 Hz on Nvidia.
 