FreeSync monitors hit mass production, coming in Jan-Feb

Whilst you are correct, guys like me who paid the premium added to G-Sync didn't really care, and some have been enjoying tear-free, stutter-free, reduced-lag gaming for over a year, whereas AMD have happily waited for others to do the work. That isn't a dig at AMD, but for people like me who want to try new things out, it would have been cool if AMD had released at the same time as nVidia and at least given users a choice of FreeSync or G-Sync. In tech terms, a year is a very long time to allow the competition to get ahead, and it isn't the first time either that AMD have sat back and let nVidia sell, sell, sell whilst AMD have nothing to offer.

Put quite simply, AMD doesn't have, or doesn't choose to put, the same level of financial resources into R&D to develop a product like G-Sync. AMD developing FreeSync with the help of other parties is born out of necessity, although IMO AMD's approach is the right way, i.e. I don't want to be locked into a particular brand of graphics just because of my monitor.
 
Put quite simply, AMD doesn't have, or doesn't choose to put, the same level of financial resources into R&D to develop a product like G-Sync. AMD developing FreeSync with the help of other parties is born out of necessity, although IMO AMD's approach is the right way, i.e. I don't want to be locked into a particular brand of graphics just because of my monitor.

Fair, but I think we make too much of this locked-in business, in truth. If nVidia don't back A-Sync and you have an A-Sync monitor, you will be locked into AMD, and vice versa. The only difference is that Intel gamers could use A-Sync monitors as well, but I don't care about those kinds of people :D
 
Whilst you are correct, guys like me who paid the premium added to G-Sync didn't really care, and some have been enjoying tear-free, stutter-free, reduced-lag gaming for over a year, whereas AMD have happily waited for others to do the work. That isn't a dig at AMD, but for people like me who want to try new things out, it would have been cool if AMD had released at the same time as nVidia and at least given users a choice of FreeSync or G-Sync. In tech terms, a year is a very long time to allow the competition to get ahead, and it isn't the first time either that AMD have sat back and let nVidia sell, sell, sell whilst AMD have nothing to offer.

Sure, AMD probably didn't have the drive to push for this technology like Nvidia did, and it's worked against them. And yes, at the moment, saying adaptive sync is open and anyone can use it makes no odds, as only AMD have support for it (however, I do think this will change).

But I also hope that adaptive sync gets into DP 1.4 fully (as it was too late for 1.3); that way Nvidia would be forced to support it on their cards if they claim DP 1.4 support. That would also open both G-Sync and adaptive sync monitors to Nvidia users.
 
While it's true Nvidia pushed for this on the desktop scene, it's wrong to think they were the first to come up with the technology.

This has been around for a long time in laptops.

Not sure if it's only AMD-based laptops or all of them?
 
While it's true Nvidia pushed for this on the desktop scene, it's wrong to think they were the first to come up with the technology.

This has been around for a long time in laptops.

Not sure if it's only AMD-based laptops or all of them?

It was/is used in laptops as a power saving feature. The benefit for gamers was not seen until much later.
 
So the reason FreeSync auto-disables itself when the frame rate drops below 30fps is the monitor manufacturer's choice.

G-Sync introduces a flicker effect when the frame rate drops below the range, something I posted on here a couple of weeks ago.
To stop this from happening, the best way would be to disable FreeSync. Not a bad choice, and if you still hate screen tear you have the option to enable v-sync alongside it.
AMD Freesync Hands-on with BenQ, Samsung & LG Mon…:

For me though, I'll never hit the bottom of the BenQ's 30-144 range; I'll only ever go beyond its max.
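The behaviour described above can be sketched as a simple decision function: the panel advertises a variable-refresh window, and when the frame rate falls below the floor, adaptive sync is disabled in favour of fixed refresh, with v-sync as an optional fallback against tearing. This is purely illustrative pseudologic, not real driver code; the 30-144Hz window is just the BenQ figure quoted above.

```python
# Illustrative sketch only: how a driver might pick a refresh mode
# given a monitor's variable-refresh window. Assumed 30-144Hz range
# (the BenQ example from the post); not actual AMD/Nvidia behaviour.

VRR_MIN_HZ = 30    # assumed panel floor
VRR_MAX_HZ = 144   # assumed panel ceiling

def refresh_mode(fps: float, vsync_fallback: bool) -> tuple[str, float]:
    """Return (mode, refresh_hz) for a given instantaneous frame rate."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        # Inside the window: the panel refreshes in step with the game.
        return ("adaptive", fps)
    if fps > VRR_MAX_HZ:
        # Above the ceiling: the panel can't go faster, so output is
        # capped at the maximum refresh.
        return ("adaptive-capped", VRR_MAX_HZ)
    # Below the floor: adaptive sync switches off, and the user's
    # v-sync setting decides between stutter (no tear) and tearing.
    return ("vsync" if vsync_fallback else "unsynced", VRR_MAX_HZ)

print(refresh_mode(60, True))   # ('adaptive', 60)
print(refresh_mode(200, True))  # ('adaptive-capped', 144)
print(refresh_mode(24, True))   # ('vsync', 144)
```

The interesting design point is the last branch: rather than flickering below the floor (the G-Sync complaint above), this model simply drops back to fixed refresh.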
 
So it's either flicker or screen tear; each technology seems to have its downside.

I'm happy with screen tear if I had to choose, which as a user you still have the option to never see.
 
Put quite simply, AMD doesn't have, or doesn't choose to put, the same level of financial resources into R&D to develop a product like G-Sync. AMD developing FreeSync with the help of other parties is born out of necessity, although IMO AMD's approach is the right way, i.e. I don't want to be locked into a particular brand of graphics just because of my monitor.

This isn't going to go down well but ho hum.


While I agree in part with what you are saying, I will just add one little point.

Unfortunately, it doesn't matter whether we like what has happened: the way AMD do things, the way Nvidia do things, which company is right or wrong, which technology gives better results, which one has better prospects or a brighter future, which one is cheaper to produce or cheaper for the consumer to buy, which one is a hack job and which one isn't, which one has a good Hz range and which one doesn't, screen size, bezel size, viewing angles, yada yada yada (yes, I've probably missed a few things, but you get the idea).

Most of us don't want to be locked into a brand of graphics due to our monitor, or locked into our monitor choices due to our graphics card.

But the bottom line is, if you want the sync technology that G-Sync and FreeSync bring us, you are locked into a GPU and monitor combination, regardless of how much we don't want it.

And before anyone starts yelling 'but with the VESA standard adaptive sync you're not locked into anything; anyone can make the other component to make it work':
Yes, you're right, they can, but it's not going to happen overnight, and to be honest I cannot see Nvidia doing it at all. If Intel do it then great, but it will take a good while, as they will need to add the required hardware to their APUs. And of course, let's not forget the massive number of us gamers that use Intel APUs for the graphics side of things... oh wait, that would be next to none of us.
 
This isn't going to go down well but ho hum.


While I agree in part with what you are saying, I will just add one little point.

Unfortunately, it doesn't matter whether we like what has happened: the way AMD do things, the way Nvidia do things, which company is right or wrong, which technology gives better results, which one has better prospects or a brighter future, which one is cheaper to produce or cheaper for the consumer to buy, which one is a hack job and which one isn't, which one has a good Hz range and which one doesn't, screen size, bezel size, viewing angles, yada yada yada (yes, I've probably missed a few things, but you get the idea).

Most of us don't want to be locked into a brand of graphics due to our monitor, or locked into our monitor choices due to our graphics card.

But the bottom line is, if you want the sync technology that G-Sync and FreeSync bring us, you are locked into a GPU and monitor combination, regardless of how much we don't want it.

And before anyone starts yelling 'but with the VESA standard adaptive sync you're not locked into anything; anyone can make the other component to make it work':
Yes, you're right, they can, but it's not going to happen overnight, and to be honest I cannot see Nvidia doing it at all. If Intel do it then great, but it will take a good while, as they will need to add the required hardware to their APUs. And of course, let's not forget the massive number of us gamers that use Intel APUs for the graphics side of things... oh wait, that would be next to none of us.

Well, Intel iGPUs are able to handle 720p reasonably well, and their future iGPUs will do even better. Why shouldn't those users, who would also benefit from it, be able to take advantage of this tech?

It's also not just about not wanting to be locked into a vendor but also about not wanting to have to 'buy in' to another ecosystem.

Besides, it seems reasonable that Nvidia will eventually support adaptive sync, and I can also see them continuing to develop G-Sync, as it gives them the flexibility to make changes and add features. The added cost clearly hasn't stopped people buying G-Sync monitors. I suspect they will hold off GPU support while G-Sync is at feature parity with adaptive sync.
 
Well, Intel iGPUs are able to handle 720p reasonably well, and their future iGPUs will do even better. Why shouldn't those users, who would also benefit from it, be able to take advantage of this tech?

It's also not just about not wanting to be locked into a vendor but also about not wanting to have to 'buy in' to another ecosystem.

Besides, it seems reasonable that Nvidia will eventually support adaptive sync, and I can also see them continuing to develop G-Sync, as it gives them the flexibility to make changes and add features. The added cost clearly hasn't stopped people buying G-Sync monitors. I suspect they will hold off GPU support while G-Sync is at feature parity with adaptive sync.

Well, thank you for completely missing the point.

Yes, in the future who knows what will happen, but at the moment it's a lock-in to one company or the other.

You want to give a shout-out to Intel GPU users, go right ahead. But I'd love to meet a serious gamer who would consider spending £300+ on a new variable refresh rate monitor who hasn't already got a dedicated gaming card.
 
So it's either flicker or screen tear seems to be both downsides to each technology..

Am happy with screen tear if I had to, witch as a user you still have the option to never see.

witch.gif
 
Whilst you are correct, the guys like me, who paid the premium that was added to G-Sync didn't really care and some have been enjoying tear free, stutter free, reduced lag gaming for over a year, whereas AMD have happily waited for others to do the work. That isn't a dig at AMD but for people like me who want to try new things out, it would have been cool if AMD had released at the same time as nVidia and at least given the users a choice of Freesync or G-Sync. In tech terms, a year is a very long time to allow the competition to get ahead and it isn't the first time either that AMD have sat back and let nVidia sell sell sell whilst AMD have nothing to offer.

Folk had the Swift over a year ago? I thought they were only demo'd at last year's CES and weren't released until Sept/Oct?
 
Well, thank you for completely missing the point.
Was there a point other than "We are where we are"? :p

Yes, in the future who knows what will happen, but at the moment it's a lock-in to one company or the other.

You want to give a shout-out to Intel GPU users, go right ahead. But I'd love to meet a serious gamer who would consider spending £300+ on a new variable refresh rate monitor who hasn't already got a dedicated gaming card.

Except, since one is part of the VESA standard, it will be coming to a good number of monitors and eventually most new monitors. That is not an 'if'; it will happen, and we can make some sensible extrapolations. We should always be forward-looking, even when discussing/mulling the current situation and its benefits/negatives.

But you are correct, Johnny Smith has not paid the membership fee to join the club, so he does not have the right to game tear/stutter-free. :p :D
edit:
And will someone please think of the children!! Why can't Johnny's kids play tear free :(
 
Well, thank you for completely missing the point.

Yes, in the future who knows what will happen, but at the moment it's a lock-in to one company or the other.

You want to give a shout-out to Intel GPU users, go right ahead. But I'd love to meet a serious gamer who would consider spending £300+ on a new variable refresh rate monitor who hasn't already got a dedicated gaming card.

I guess if you buy an adaptive sync monitor you can still go Nvidia later, knowing you haven't wasted hundreds on a feature you now won't be able to use; the same can't be said the other way round, and at least there's an outside chance Nvidia will support it.
 