AMD FreeSync vs Nvidia G-Sync - the adaptive sync battle

Nice, now please hold on to that same logic and convince those who insist AMD is dividing the market with Mantle, rather than thanking it for inspiring MS to think about and implement low-level access with DX12.

No wait... DX12 was going to bring low-level access anyway, AMD was just being a jerk by deciding to jump ahead and bring the feature to its user base early for free... they totally should have done what Nvidia does and be kind enough to give their user base the chance to happily throw money their way to have the new feature early :rolleyes:

Yep lots of double standards going on here.
 
Yep true.

I'd imagine it's only a matter of time till Intel are supporting it too. Honestly I still have no idea why nVidia made GSync given they are part of VESA and knew about the impending arrival of adaptive sync - they could easily have had their drivers all ready to go

Exactly this. It is a simple case of Nvidia pushing a non-standard proprietary solution when a perfectly usable and open solution is available. One single monitor would potentially work with all GPU vendors if they adopted the open standard; that will never be an option with G-Sync. Some people refuse to look at what is good for ALL GPU consumers; all they see is their god Nvidia, and they immediately drop in worship. :rolleyes:

Well to be fair to Nvidia, there was no open solution available when they launched Gsync.

Why did they go with G-Sync? My guess is that they knew adaptive sync was coming. None of their Kepler or Maxwell cards (so far) can connect to an adaptive sync monitor. They also knew that it would be a while before any of this tech hit the market. So they came up with their own solution, and because it was their own, they got it to market quicker.

They are a business after all, they spotted an opportunity and went for it.
 
Works both ways; so refreshing to see Nvidia bring these monitors to their customers and to the widest audience.

Whichever way you look at it, you're either tied into AMD and save £100 (you're still spending money on a FreeSync-compatible monitor), and if you own any other GPU you have to buy an AMD one.

Or you're tied into Nvidia and save on the cost of a comparable AMD card (and again you're still buying a new monitor), and if you don't own an Nvidia GPU you have to buy one.

The cost difference right now between a 1440p FreeSync monitor plus a 290 and a G-Sync monitor plus a 970 is £200, and that's all down to the ROG branding. When other 1440p G-Sync monitors are released I am sure it will be less than a £50 difference.


Either way I would not change my card just for a monitor (early adopters excused), and I am sure no one else would right now.

Until Nvidia changes how the G-Sync module is made, a G-Sync monitor will always be at least £100 to £150 more expensive than an adaptive-sync monitor.

That's just fact - you won't see a G-Sync monitor within £50 of an adaptive-sync one, because of what Nvidia charges.

The way it's looking, there will be far more adaptive-sync monitors than G-Sync ones; simple economics - adaptive-sync is cheaper to produce, so it's easier to make a wider range of models.

Right now the number of monitors looks to be about even, but as the economics play out we will see more adaptive-sync ones.
 
Battle?
Lol... let's fight.

Player #1 - expensive, but already on the market and available for testing.
Player #2 - said to be cheaper, not on the market (yet), said to work the same way or better.

Let's fight. Facts against speculation ;-)
I'm surprised that thread hasn't been removed :)
 
I'm surprised that thread hasn't been removed :)

If they deleted all threads comprising rumour, speculation and discussion based thereon, this would be a very quiet board.

When nothing much is happening, there will of course be disagreements based on guesswork.
 
Excellent and credible review and a very interesting article.

AFAIK there is already an Acer FreeSync monitor available to pre-order on OCUK, and while I haven't been paying an awful lot of attention to it, Asus appear to have an adaptive-sync-compatible display in the works, announced at CES... so an article posted after these two things are known, claiming that Acer currently has no plans to release a DP 1.2a monitor and that Asus has also not announced one, is... rubbish, inaccurate and entirely useless.

http://www.overclockers.co.uk/showproduct.php?prodid=MO-078-AC&groupid=17&catid=948 - the Acer FreeSync monitor... apparently nothing announced.

http://www.pcper.com/news/Displays/...2560x1440-IPS-120-Hz-Variable-Refresh-Monitor

Both monitors have categorically been announced, yet the article claims neither company has announced such a monitor or shown any interest in supporting the standard.

G-Sync had one 144Hz 27" 1440p screen available at £700 or so at launch; FreeSync/adaptive sync already has three of that specific class of monitor announced, from what you might call the three main gaming brands in Asus, BenQ and Acer... to write an article that misrepresents this as only one of the three main gaming monitor makers showing interest is beyond ridiculous, it's beyond inaccurate. It's either inept (most likely) or purposefully misleading (quite possible).

Excellent and credible? Not even remotely close. Pathetic and embarrassing at best, lying and biased as hell at worst; neither is good.

EDIT: just to add, the article specifically mentions that
Some monitor manufacturers are sticking with Nvidia for now too.

Which actually pushes me towards believing they are being intentionally misleading, trying to imply that some monitor makers are staying loyal to Nvidia or sticking with a better technology. There was really no need to mention that at all. Even if it were true, a line like "as yet Acer and Asus haven't announced any plans for a FreeSync monitor", alongside what is paraphrased as "two of the main gaming monitor players are sticking with Nvidia", would be bad to begin with, as they could be right around the corner with monitor announcements; but when both have actually announced monitors already... it's beyond awful to suggest they are sticking with Nvidia.
 
Yep true.

I'd imagine it's only a matter of time till Intel are supporting it too. Honestly I still have no idea why nVidia made GSync given they are part of VESA and knew about the impending arrival of adaptive sync - they could easily have had their drivers all ready to go and avoided the bad PR of 'you can get what AMD offer from us too, but have to pay extra for it'.

Or, rather, as an interim solution I can see that it gave them a time advantage, but I've no idea why they don't also support adaptive sync - that would have been a win/win, I feel. The added cost of G-Sync wasn't so large that it's any different to any other early adopter tax, so existing customers don't lose out. It would impose a maintenance cost, though: making sure that G-Sync keeps working as intended for a reasonable period while also supporting the standard.

Edit: To be clear, adaptive sync predates G-Sync significantly - on laptops. On desktops it may be that G-Sync was conceived before adaptive sync came to desktop, hard to say (and if so, and if it was this that triggered adaptive sync to come to desktop, then great! Well done nVidia, awesome idea), but certainly VESA were talking about it well before G-Sync became a reality. Perhaps nVidia had invested enough in G-Sync by that stage that they wanted to follow through. However, they have known about adaptive sync since the inception of the mobile version, and as they have mobile GPUs they should have been working on driver-level support years ago. No idea if they have released a compatible driver on mobile or not yet; someone else can answer that if they know. I've got an nVidia m-series GPU but have no clue if it can support it, and my panel can't, so it's of no use to me :'(

I think you will find that it was the eDP variable VBlank power-saving feature that was available on laptops before G-Sync, not Adaptive-Sync as it exists now.

So it seems rather unfair to say that Nvidia knew Adaptive-Sync was coming before they introduced G-Sync (bear in mind that G-Sync was probably in development for ages before Nvidia introduced it).

On the Mantle front, yes, it could be deemed a similar situation, but in my opinion it isn't. Why? Well, there was never really any doubt that there would be a DirectX 12, whether it would be called that or not, whereas there wasn't really anything from the opposition that predated G-Sync.
 
So it seems rather unfair to say that Nvidia knew Adaptive-Sync was coming before they introduced G-Sync (bear in mind that G-Sync was probably in development for ages before Nvidia introduced it).

They undoubtedly knew it was coming because it's one of those things that was going to be adopted by VESA eventually, the process was just speeded up by AMD shouting about it. The reason they pressed ahead with G-Sync was because they were unhappy with the performance/quality offered by other methods (including adaptive sync).
 
They undoubtedly knew it was coming because it's one of those things that was going to be adopted by VESA eventually, the process was just speeded up by AMD shouting about it. The reason they pressed ahead with G-Sync was because they were unhappy with the performance/quality offered by other methods (including adaptive sync).

Was it?
To my understanding, AMD asked for it to be included after G-Sync was announced, and then it took a good while for it to be decided on, and even then it was only made an optional part of the standard.
 
Well to be fair to Nvidia, there was no open solution available when they launched Gsync.

Why did they go with G-Sync? My guess is that they knew adaptive sync was coming. None of their Kepler or Maxwell cards (so far) can connect to an adaptive sync monitor. They also knew that it would be a while before any of this tech hit the market. So they came up with their own solution, and because it was their own, they got it to market quicker.

They are a business after all, they spotted an opportunity and went for it.

Yep, I went into that a little below but your comment is entirely fair.

I think you will find that it was the eDP variable VBlank power-saving feature that was available on laptops before G-Sync, not Adaptive-Sync as it exists now.

So it seems rather unfair to say that Nvidia knew Adaptive-Sync was coming before they introduced G-Sync (bear in mind that G-Sync was probably in development for ages before Nvidia introduced it).

On the Mantle front, yes, it could be deemed a similar situation, but in my opinion it isn't. Why? Well, there was never really any doubt that there would be a DirectX 12, whether it would be called that or not, whereas there wasn't really anything from the opposition that predated G-Sync.

The standard used in laptops was indeed aimed at power saving - it's still the same VBlank mechanism as is used now. Also, if you read what you quoted, I did say nVidia may have been well into development (that's what I was talking about when I said they'd already 'invested' in G-Sync), and that there may have been no plans to bring it to desktop, in which case good job! Though, by 2012, AMD were talking in a vague way about it (though it was not yet named adaptive sync, or even necessarily using that mechanism).
 
Well to be fair to Nvidia, there was no open solution available when they launched Gsync.

Why did they go with G-Sync? My guess is that they knew adaptive sync was coming. None of their Kepler or Maxwell cards (so far) can connect to an adaptive sync monitor. They also knew that it would be a while before any of this tech hit the market. So they came up with their own solution, and because it was their own, they got it to market quicker.

They are a business after all, they spotted an opportunity and went for it.

None of those reasons are remotely valid for us as consumers. Every one of them reads as Nvidia wanting our money and not caring that, in the long run, it would fragment the GPU community even further. Instead of a scenario where almost every monitor coming out would have adaptive sync and work on ALL modern GPUs, we have fragmentation.

The effort and cost Nvidia spent on getting G-Sync to market would have served all of us better if they had worked towards making their newer 9x0 range Adaptive-Sync compatible.

Why support such business decisions that are not designed with consumers in mind?
 
None of those reasons are remotely valid for us as consumers. Every one of them reads as Nvidia wanting our money and not caring that, in the long run, it would fragment the GPU community even further. Instead of a scenario where almost every monitor coming out would have adaptive sync and work on ALL modern GPUs, we have fragmentation.

The effort and cost Nvidia spent on getting G-Sync to market would have served all of us better if they had worked towards making their newer 9x0 range Adaptive-Sync compatible.

Why support such business decisions that are not designed with consumers in mind?

The fact remains that nVidia have had this tech out for well over a year now, and people who wanted it and could afford it didn't care that it was proprietary or cost more than a bog-standard monitor. Those who "couldn't justify the price" but wanted it will soon be able to buy a FreeSync monitor.

No big deal really, and you will have AMD for FreeSync or nVidia for G-Sync.
 
None of those reasons are remotely valid for us as consumers. Every one of them reads as Nvidia wanting our money and not caring that, in the long run, it would fragment the GPU community even further. Instead of a scenario where almost every monitor coming out would have adaptive sync and work on ALL modern GPUs, we have fragmentation.

The effort and cost Nvidia spent on getting G-Sync to market would have served all of us better if they had worked towards making their newer 9x0 range Adaptive-Sync compatible.

Why support such business decisions that are not designed with consumers in mind?

They are a business; why give your competitor the upper hand?

Would you want your company to be using something that promotes your competitor?

FreeSync has become the name associated with adaptive sync. AMD is the one that pushed for adaptive sync, and by association more people would have started buying AMD cards.

So in a smart move, they got their own solution to market first, which keeps Nvidia customers with Nvidia.

And they would argue that it is for consumers - they have had G-Sync out for the last year for consumers to enjoy.

I am really struggling with these arguments that Nvidia is out to fragment the GPU market. Of course they are - they want all the business, and so do AMD. You can be guaranteed that if AMD had the clout and resources Nvidia has, they would have released their own version of adaptive sync and not waited around for an open standard.
 
They are a business; why give your competitor the upper hand?

Would you want your company to be using something that promotes your competitor?

FreeSync has become the name associated with adaptive sync. AMD is the one that pushed for adaptive sync, and by association more people would have started buying AMD cards.

So in a smart move, they got their own solution to market first, which keeps Nvidia customers with Nvidia.

And they would argue that it is for consumers - they have had G-Sync out for the last year for consumers to enjoy.

I am really struggling with these arguments that Nvidia is out to fragment the GPU market. Of course they are - they want all the business, and so do AMD. You can be guaranteed that if AMD had the clout and resources Nvidia has, they would have released their own version of adaptive sync and not waited around for an open standard.

That will blow some minds on here - too much common sense.
 
I am really struggling with these arguments that Nvidia is out to fragment the GPU market. Of course they are - they want all the business, and so do AMD. You can be guaranteed that if AMD had the clout and resources Nvidia has, they would have released their own version of adaptive sync and not waited around for an open standard.
Food for thought: AMD brought out low-level access in the form of Mantle before DX12 was even announced (and DX12 is still not here yet), at no extra cost to their own user base (in contrast to Nvidia charging their user base for G-Sync), and they were getting stick for "fragmenting the market" :p

If I'm not mistaken, a "FreeSync" monitor is basically just a generic adaptive-sync monitor in terms of implementation, which means Nvidia cards should be able to support it as well, if Nvidia chose to make their graphics cards support it. If that is the case, I really think Nvidia would be better off supporting adaptive sync alongside G-Sync, as it would at least give existing AMD users with a "FreeSync" (adaptive-sync) monitor a reason to consider switching to the green team without the penalty of losing the sync feature. By not supporting adaptive sync, Nvidia would be rejecting future custom from anyone who owns a FreeSync monitor, so they'd just be shooting themselves in the foot.

But then again, they would probably weigh the potential profit from G-Sync (by denying Nvidia cards the adaptive-sync option) against the potential profit from AMD users and FreeSync monitor owners who might switch camp in the future.
 
The best thing about FreeSync is the massive selection of monitors that will be on offer...
20 this year alone - there is something for everyone. Samsung seems to have the strongest selection.
 