If Nvidia suddenly adopted adaptive sync (Freesync)

It would be a good time for Nvidia to adopt FreeSync support, seeing as they've been super competitive in the high-end GPU department for a while. It would mean all those sitting on Adaptive Sync / "FreeSync" monitors have something to upgrade to. Yeah, Vega is around the corner now, but you've still got to wait, lol.
 
The reason people want Nvidia to adopt Adaptive Sync is the same reason Nvidia doesn't want to: customers would pay less. And in this relationship Nvidia has the power to decide, and since people won't stop buying their GPUs, nothing will change (unless AMD can achieve parity). Maybe they'd sell more at the below-1070 level, but I'm not sure that would offset the loss on the modules themselves and on locking you into their ecosystem.
 
I get that, Minstadave. However, do you think it could have potentially sunk AMD's graphics card division if it had been adopted 6-9 months ago? It's all very well not liking these lock-ins, but they may be providing AMD with a lifeline?

To understand Nvidia's business plan: the hardware needed to support VESA Adaptive Sync exists on all their Pascal cards (the DP 1.4 / HDMI 2.0b port hardware). However, they don't activate it in the drivers, so you have to buy a G-Sync monitor :)
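
For anyone who wants to see this from the application side: on Windows, the usual way to ask whether the OS and driver will let you present to a variable-refresh display is the DXGI 1.5 tearing-support check. Below is a minimal sketch of that check (my own illustration, nothing Nvidia-specific; it reports what the driver exposes, not what the silicon could do).

#include <windows.h>
#include <dxgi1_5.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

int main()
{
    // Create a DXGI 1.4 factory, then query up to the 1.5 interface,
    // which carries the tearing/VRR feature check.
    IDXGIFactory4* factory4 = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory4)))) {
        std::printf("Could not create a DXGI factory\n");
        return 1;
    }

    BOOL allowTearing = FALSE;
    IDXGIFactory5* factory5 = nullptr;
    if (SUCCEEDED(factory4->QueryInterface(IID_PPV_ARGS(&factory5)))) {
        // Reports whether the OS and driver let a windowed swap chain present
        // with tearing, the documented prerequisite for driving a
        // variable-refresh (G-Sync / FreeSync) display from a window.
        if (FAILED(factory5->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                                 &allowTearing, sizeof(allowTearing)))) {
            allowTearing = FALSE;
        }
        factory5->Release();
    }

    std::printf("Variable-refresh (tearing) presentation: %s\n",
                allowTearing ? "exposed by OS/driver" : "not exposed");
    factory4->Release();
    return 0;
}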

But the day HDMI 2.1 (VRR) TVs start flooding the market (late this year or early next), Nvidia will magically be forced to provide adaptive sync over HDMI 2.1, or at least VRR over HDMI 2.0b, on all their products. Otherwise they'd be leaving AMD to run rampant.

Consider that the Xbox One X already supports FreeSync 2 and VRR, even if it isn't clear whether the latter is over HDMI 2.0b or an actual HDMI 2.1 port. So it makes sense for anyone buying an Xbox One X to get a FreeSync monitor instead of a generic one or one with G-Sync, and possibly to upgrade their PC to an AMD graphics card to make use of it as well.
And the Xbox One X is going to sell a few million units, opening up a new potential market for FreeSync monitors.
 
Nvidia might as well just do it. They won't be losing customers, because those going with FreeSync most likely wouldn't have gone with G-Sync to begin with due to the premium.

There is a clear difference between FreeSync and G-Sync, not in the tech itself but in the implementation. FreeSync monitors are very hit and miss simply because there is no proper oversight. A lot have flickering issues and some have ghosting issues simply because the implementation isn't done properly. Just take the monitor I'm running, an Asus MX34VQ: pretty expensive all things considered, but it has severe flickering issues when FreeSync is in use below 60 Hz. Look at the CF791 from Samsung, which also has flickering issues, and then you have the lower end like the Asus VH245, which suffers from annoying ghosting, and the list goes on. This is what makes G-Sync better overall compared to FreeSync: not the tech itself, but the implementation overall (I know there's at least one spotty G-Sync monitor, but the number is a lot lower).

Now if only Dell could get off their behinds and release their S2417DG and S2716DG in FreeSync versions, everyone would be much happier.
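
As an aside on the low-frame-rate behaviour: the usual trick (AMD call it LFC) is for the driver to repeat frames so the panel stays inside its supported refresh range, which only works when the panel's maximum refresh is comfortably more than double its minimum. Here is a rough sketch of the idea with made-up ranges; it illustrates the concept only, not any vendor's actual algorithm, and it isn't necessarily the cause of the MX34VQ's flicker.

#include <cstdio>

// How many times would a frame need to be repeated to keep the panel inside
// its variable-refresh range? Returns 0 if no whole-number repeat fits,
// i.e. the range is too narrow for frame repetition at this frame rate.
int repeatsNeeded(double fps, double minHz, double maxHz)
{
    int repeats = 1;
    while (fps * repeats < minHz) {
        ++repeats;
    }
    return (fps * repeats <= maxHz) ? repeats : 0;
}

int main()
{
    // Wide range: 40 fps can be shown as 3 x 40 = 120 Hz refreshes.
    std::printf("40 fps on a 100-144 Hz panel: x%d\n", repeatsNeeded(40.0, 100.0, 144.0));
    // Narrow range: no multiple of 40 fps lands between 48 and 60 Hz,
    // so the driver has to fall back to something else at low fps.
    std::printf("40 fps on a 48-60 Hz panel:   x%d\n", repeatsNeeded(40.0, 48.0, 60.0));
    return 0;
}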
 
It is if you've ever had a high-refresh CRT.
If we're going back that far, then I didn't use v-sync; competitive TF and TFC were played at the maximum possible fps, in the hundreds or higher. Only in my old age has 60 fps v-sync been the go-to, and TFC died when TF2 came out ;(

Edit: I think it didn't like over 999 fps; didn't the HL engine go weird? I'd have to fire it back up to check :X
 
I'm very old and I've never seen FreeSync or G-Sync in real life, but 60 fps v-sync is hardly a horror show, is it? :p

Yes it is. Input lag in games like World of Tanks is horrendous, to the point that you cannot aim and shoot properly (I tried it playing on my mate's gear at Easter).
I had to sell my GTX 1080 because I was missing FreeSync, and went back to a Fury X. Yes, fewer FPS, but a much smoother experience.

Until you try FreeSync/G-Sync on a 144 Hz monitor, you don't know what you are missing.
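
To put a rough number on where the extra lag comes from, here's a crude back-of-envelope model (illustrative only, not a measurement of World of Tanks or any real pipeline, which typically buffers more frames than this): with v-sync at a fixed 60 Hz a finished frame has to wait for the next refresh boundary before scan-out, so its delivery time is rounded up to a whole number of 16.7 ms intervals, whereas with adaptive sync the panel refreshes when the frame is ready.

#include <cmath>
#include <cstdio>

int main()
{
    const double refreshMs = 1000.0 / 60.0;          // one 60 Hz refresh interval, ~16.7 ms
    const double renderTimesMs[] = { 10.0, 17.0, 25.0 };

    for (double renderMs : renderTimesMs) {
        // v-sync: wait for the next vblank, so round up to whole refresh intervals
        double vsyncMs = std::ceil(renderMs / refreshMs) * refreshMs;
        // adaptive sync: scan-out starts as soon as the frame is finished
        double adaptiveMs = renderMs;
        std::printf("render %5.1f ms -> v-sync %5.1f ms, adaptive %5.1f ms\n",
                    renderMs, vsyncMs, adaptiveMs);
    }
    return 0;
}

The 17 ms row is the nasty case: just missing a refresh costs a whole extra interval in this simple model, before any driver frame queuing stacks more delay on top.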
 
A standard adaptive sync technology would be fantastic for the consumer.

Locking people into one brand with a proprietary sync tech when an open standard exists is unnecessary and anti-competitive.

Yes, it may take longer than 9 months to a year, but it would hit RTG hard, possibly leaving them without much of a GPU consumer base.
 
I've not played much World of Tanks, more World of Warships, but I have played it at 60 fps and it was certainly playable. I mean, I play everything these days at a v-synced 60 fps. I've got P3D v4 and X-Plane 11, which won't run that fast, and I've been waiting to play them until I upgrade my 970. I want to try a 144 Hz screen and see the difference, but it's quite a wedge for a 1080 Ti and a G-Sync monitor, though :P
 
I've never noticed input lag with V-sync on in a few games, unlike others. Unless the mouse feels like I'm moving the cursor through jelly (where you stop to aim and the mouse keeps moving/wobbling at the end), it doesn't seem that bad. Maybe it's one of those things where if you've never used FreeSync/G-Sync you don't know what you're missing out on?

I've got an Nvidia card but would switch to AMD if I got a FreeSync monitor. No way am I paying way more for a proprietary monitor. The majority say the G-Sync experience is better, but it's still not worth the amount they charge for it (for me, anyway).
 
It would be cool; however, I would be worried it might mean they were thinking of giving up on G-Sync.

AMD had the option of using G-Sync, just like they did with PhysX.

When was that? I don't remember ever seeing any mention of G-Sync being available for AMD. It was Nvidia's ace in the hole, and they just played it at the wrong time. If they'd waited until after the DisplayPort standard update, AMD wouldn't have been able to get Adaptive Sync included in version 1.2a or whatever it is.
 
I'm one of the few users here to have used both FreeSync and G-Sync, although both were 'proper' 144 Hz gaming panels, and they perform the same in regards to adaptive sync performance.

Initially, when the press asked about AMD getting access to G-Sync, Tom Petersen said something along the lines of "don't know, it's not been discussed"; then a few weeks later, when queried again, it was a categorical "No" (AMD won't get access to G-Sync).
 