Will the ROG Swift get a FreeSync update?

I'm not sure what that has to do with this particular avenue of the discussion.

Unless the Syncs are free, the only market I can see for G-Sync is the fanboys (I mean no offence by this) and the ill-informed.

G-Sync is not free, "FreeSync" will not be either.

FreeSync is an unknown at this stage. Even if it performs 'better' than G-Sync for less cost, there is still cost and complexity associated with it.

You STILL require a compatible AMD card so it's not simply a matter of FreeSync launching and being a universal solution.

G-SYNC and FreeSync will be direct competitors. But I would make a relatively well-founded presumption now, that being:

- Current AMD owners will opt for FreeSync monitors
- Current Nvidia owners will opt for G-SYNC monitors
- Neither G-SYNC nor FreeSync will offer any real tangible benefit over the other to warrant a GPU brand swap
- FreeSync monitors will not offer a price point that will entice an Nvidia GPU owner to opt for a FreeSync monitor with a GPU swap.
 
Since you seem to want this stupid argument about your precious G-Sync "MODULE", I am going to give it to you.

What component of the monitor provides those features?

Name me one monitor that lists features by the name of its internal PCB. If you can, I will accept your argument.

And your argument is so silly: if you took the electronics out of any monitor, it wouldn't work.

You are just arguing to make G-Sync look better. I bet if you said "G-Sync module" in normal conversation, most people would assume that you were talking about the part of the monitor that makes G-Sync work, not the whole internal electronics of the monitor.

So again, 3D and ULMB are features of a monitor, just like G-Sync is a feature of a monitor. They aren't features of each other. They will never be called features of the G-Sync module by anyone apart from you, because they have nothing to do with G-Sync.
 
G-Sync is not free, "FreeSync" will not be either.

FreeSync is an unknown at this stage. Even if it performs 'better' than G-Sync for less cost, there is still cost and complexity associated with it.

You STILL require a compatible AMD card so it's not simply a matter of FreeSync launching and being a universal solution.

G-SYNC and FreeSync will be direct competitors. But I would make a relatively well-founded presumption now, that being:

- Current AMD owners will opt for FreeSync monitors
- Current Nvidia owners will opt for G-SYNC monitors
- Neither G-SYNC nor FreeSync will offer any real tangible benefit over the other to warrant a GPU brand swap
- FreeSync monitors will not offer a price point that will entice an Nvidia GPU owner to opt for a FreeSync monitor with a GPU swap.

I think you are spot on with most of that. Even if Adaptive sync monitors are £100 cheaper most Nvidia owners will not switch over.
 
Since you seem to want this stupid argument about your precious G-Sync "MODULE", I am going to give it to you.



Name me one monitor that lists features by the name of its internal PCB. If you can, I will accept your argument.

And your argument is so silly: if you took the electronics out of any monitor, it wouldn't work.

You are just arguing to make G-Sync look better. I bet if you said "G-Sync module" in normal conversation, most people would assume that you were talking about the part of the monitor that makes G-Sync work, not the whole internal electronics of the monitor.

So again, 3D and ULMB are features of a monitor, just like G-Sync is a feature of a monitor. They aren't features of each other. They will never be called features of the G-Sync module by anyone apart from you, because they have nothing to do with G-Sync.

The "bit of the monitor that makes G-Sync work" is the G-Sync scaler - the FPGA.
The bit of a monitor that makes backlight strobing work in a non-G-Sync monitor is the scaler; the bit of a monitor that makes 3D/120Hz work in a normal monitor is the scaler.

I'm not talking about "the entire internal electronics of the monitor"; I am talking exactly about "the bit that makes G-Sync work". It is the same bit.

The G-Sync scaler being an FPGA means features can be added to it with what is basically a software update, which on a "normal" scaler would require a complete re-spin of the silicon - significantly more difficult and expensive.

Scaler makers sell their scalers based on what refresh rates and resolutions they support; those refresh rates and resolutions are the features of that scaler.
The things the G-Sync scaler supports are its features as well.

Here is a Realtek DisplayPort scaler:
http://www.realtek.com.tw/products/...d=1&PNid=19&PFid=33&Level=4&Conn=3&ProdID=202

"Features: Max resolution 1920x1080"

So despite 1920x1080 being a ubiquitous thing that all modern scalers support, Realtek think it is a "feature" worth listing on the "features" list.

When I talk about the G-Sync module, I am not talking about the entire PCB; I am talking about the FPGA assembly, the core of the monitor, the scaler itself. That is the bit that determines the resolution, the refresh rate, and whether the monitor supports 3D or backlight strobing - absolutely everything the monitor supports is controlled by the core chip of the monitor.
You can't say "the monitor supports it but the scaler doesn't", because if the scaler doesn't support it then neither does the monitor.

That is why I'm asking you which specific component of the monitor provides 120Hz/3D/backlight strobing support, because "the monitor" is not an answer.

Some monitors do have multiple scalers, which is why they support some features only via certain ports.
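To put that argument another way, here is a toy sketch (all names are illustrative, not any real monitor API): a monitor's feature set is exactly whatever its scaler supports, and an FPGA-based scaler can gain features via a firmware update rather than a silicon re-spin.

```python
# Toy model of the point above: every signal-processing feature lives in the
# scaler chip, so the monitor supports a feature iff its scaler does.
# All names and feature strings here are illustrative.

class Scaler:
    def __init__(self, features):
        self.features = set(features)

class Monitor:
    def __init__(self, scaler):
        self.scaler = scaler

    def supports(self, feature):
        # The monitor only supports what its core chip supports.
        return feature in self.scaler.features

# A hypothetical G-Sync-style FPGA scaler in a hypothetical monitor.
fpga = Scaler({"1920x1080", "144Hz", "G-Sync", "ULMB"})
swift = Monitor(fpga)

print(swift.supports("ULMB"))      # True
print(swift.supports("FreeSync"))  # False
fpga.features.add("3D")            # a "software update" adds a feature
print(swift.supports("3D"))        # True
```

The point of the sketch is that there is no way to make `swift.supports(...)` true for something `fpga` lacks, which is the "if the scaler doesn't support it, neither does the monitor" claim.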
 
G-Sync is not free, "FreeSync" will not be either.

FreeSync is an unknown at this stage. Even if it performs 'better' than G-Sync for less cost, there is still cost and complexity associated with it.

You STILL require a compatible AMD card so it's not simply a matter of FreeSync launching and being a universal solution.

G-SYNC and FreeSync will be direct competitors. But I would make a relatively well-founded presumption now, that being:

- Current AMD owners will opt for FreeSync monitors
- Current Nvidia owners will opt for G-SYNC monitors
- Neither G-SYNC nor FreeSync will offer any real tangible benefit over the other to warrant a GPU brand swap
- FreeSync monitors will not offer a price point that will entice an Nvidia GPU owner to opt for a FreeSync monitor with a GPU swap.

I've already been over this with Andy. You do realise the market will set the price based on demand? That's the way of the world.
 
I've already been over this with Andy. You do realise the market will set the price based on demand? That's the way of the world.

Not really. Gamer technology does not follow ordinary market trends.

FreeSync is nothing but a gamer technology, nobody will buy FreeSync for any other reason.

Demand will outstrip supply, as is demonstrated by the ASUS ROG Swift. You will pay a premium for it, it's a niche product for a niche market.


Hardware partners will be able to implement FreeSync for very, very little, unless AMD are charging excessive licensing fees - but even then, it's going to be minimal. Hardware partners will implement FreeSync in comparatively low-volume products (compared to run-of-the-mill desktop monitors) and charge a premium for it. They can afford to do this as the product is so very niche. They are not looking to sell HUGE volumes with low profit margins; the market (i.e. gamers) does not support such a model.

The bottom line is this - FreeSync will be a (comparatively) low-volume product for a niche market, and its pricing will reflect that.

If you believe anything else is the case you are sorely misguided.
 
If Adaptive-Sync-capable monitors become far more mainstream than G-Sync, then Nvidia will have to support them. It would still be called G-Sync, but it would use the VESA-standard variable-vblank method.

The idea that anyone who has an Nvidia card will never remotely contemplate FreeSync is pure BS. I have a GTX 980 and frequently swap between AMD and Nvidia. I would not even entertain the idea of getting a G-Sync monitor:

- Far too pricey over an equivalent standard monitor
- Locked in to Nvidia GPUs
- If you do go AMD, your monitor's G-Sync feature becomes an expensive, unused extra. Obviously, if you get a FreeSync monitor and then an Nvidia GPU, the same applies. The only mitigating factor would be if a FreeSync monitor were only marginally more expensive, because G-Sync is an expensive option right now.

I genuinely think that because the VESA standard includes an optional variable-vblank capability, it will eventually become the more mainstream option.
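For what it's worth, the variable-vblank idea both G-Sync and Adaptive Sync rely on is simple to sketch: the display extends the vertical blanking interval until the GPU delivers the next frame, clamped to the panel's supported refresh range. A minimal sketch in Python (the 30-144Hz limits are assumed for illustration, not taken from any spec):

```python
# Sketch of variable-refresh timing: the panel refreshes when a new frame
# is ready, but can neither refresh faster than its maximum rate nor wait
# longer than its minimum rate allows.

def scanout_interval(frame_time_ms, min_hz=30, max_hz=144):
    """Return the interval (ms) at which the panel actually refreshes.

    frame_time_ms: how long the GPU took to render the frame. The result
    is clamped to [1000/max_hz, 1000/min_hz].
    """
    shortest = 1000.0 / max_hz  # ~6.94 ms at 144 Hz
    longest = 1000.0 / min_hz   # ~33.3 ms at 30 Hz
    return min(max(frame_time_ms, shortest), longest)

# A 16.7 ms frame (~60 fps) is scanned out as soon as it is ready:
print(scanout_interval(16.7))  # 16.7
# A very fast frame is held to the panel's maximum refresh rate:
print(scanout_interval(4.0))   # ~6.94 (clamped to 144 Hz)
# A very slow frame forces a refresh within the panel's minimum rate:
print(scanout_interval(50.0))  # ~33.3 (clamped to 30 Hz)
```

The clamping at the bottom of the range is why real implementations need tricks like frame re-sending below the panel's minimum refresh; this sketch ignores that.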
 
Not really. Gamer technology does not follow ordinary market trends.

FreeSync is nothing but a gamer technology, nobody will buy FreeSync for any other reason.

Demand will outstrip supply, as is demonstrated by the ASUS ROG Swift. You will pay a premium for it, it's a niche product for a niche market.


Hardware partners will be able to implement FreeSync for very, very little, unless AMD are charging excessive licensing fees - but even then, it's going to be minimal. Hardware partners will implement FreeSync in comparatively low-volume products (compared to run-of-the-mill desktop monitors) and charge a premium for it. They can afford to do this as the product is so very niche. They are not looking to sell HUGE volumes with low profit margins; the market (i.e. gamers) does not support such a model.

The bottom line is this - FreeSync will be a (comparatively) low-volume product for a niche market, and its pricing will reflect that.

If you believe anything else is the case you are sorely misguided.

That's the thing, though. The way FreeSync has been implemented means it will be mass market. Sure, the early rush will have an effect, but after that a FreeSync monitor will be just like any other monitor, as it's a standard VESA feature.
 
That's the thing, though. The way FreeSync has been implemented means it will be mass market. Sure, the early rush will have an effect, but after that a FreeSync monitor will be just like any other monitor, as it's a standard VESA feature.

It's not FreeSync, it's Adaptive Sync - there's a big difference. Plus, there's a very limited number of GPUs that currently support Adaptive Sync (which for AMD is through FreeSync, which is the limitation; the GPUs that actually support FreeSync are in the minority - we're talking the 290/290X/285/260X/295X).

And it isn't a *standard* VESA feature. It's *optional*.

You seriously sound like you've followed hype without context.

I think the future is bright personally, and Intel/Nvidia can freely adopt Adaptive Sync through their own proprietary-named method (a FreeSync equivalent), which would encourage more and more monitors.
But then it's all about price point too. A lot of people just buy the first £100 1080p display they see. Adaptive Sync would literally need to become a mandatory standard to be fully mainstream (which, personally, I hope is the future).
 
It's not FreeSync, it's Adaptive Sync - there's a big difference. Plus, there's a very limited number of GPUs that currently support Adaptive Sync (which for AMD is through FreeSync, which is the limitation; the GPUs that actually support FreeSync are in the minority - we're talking the 290/290X/285/260X/295X).

And it isn't a *standard* VESA feature. It's *optional*.

You seriously sound like you've followed hype without context.

I think the future is bright personally, and Intel/Nvidia can freely adopt Adaptive Sync through their own proprietary-named method (a FreeSync equivalent), which would encourage more and more monitors.
But then it's all about price point too. A lot of people just buy the first £100 1080p display they see. Adaptive Sync would literally need to become a mandatory standard to be fully mainstream (which, personally, I hope is the future).

We've been over this already. How can I put this in a way a fanboy can get his head around it... The industry has *opted in*, so FreeSync = an insidious takeover of all the monitors. You can't call off the dogs of war?
 
We've been over this already. How can I put this in a way a fanboy can get his head around it... The industry has *opted in*, so FreeSync = an insidious takeover of all the monitors. You can't call off the dogs of war?

Are you really calling me a fanboy? I own an AMD R9 290X. You have no comprehension of the subject material, however.

Again, FreeSync is AMD's proprietary technology; the industry haven't opted in to FreeSync, they've opted in to Adaptive Sync. And even then, it's not like the *industry* as a whole has opted in - we've got a few monitors, which is early and very positive success so far, but it's not the *industry*. It's still an optional part of the VESA specification. For the industry to have fully backed Adaptive Sync, it would need to be mandatory - but that's beside the point, and I'm not posting to be negative.

Again, you don't know the subject matter, so you can stick the "We've been over this", because what you're typing is factually incorrect, and the only person struggling to get his head around anything is you, seemingly because you lack the comprehension.
 
You must be. You seem smart enough to understand the situation, yet you argue against common sense and cast insults. Fanboy would seem the most likely case.
 
What you're typing is factually wrong, and in the context shows a lack of understanding about the situation.
Why does correcting you make me a fanboy?

Also, what type of fanboy? I'm not saying anything negative about Adaptive Sync/FreeSync/AMD, or anything pro G-Sync/Nvidia. I'm simply describing the actual situation, rather than posting from a standpoint of not understanding it or the technical side.
 
What you're typing is factually wrong, and in the context shows a lack of understanding about the situation.
Why does correcting you make me a fanboy?

I'm not sure what makes people become irrational over PC hardware. It's not something I've put that much thought into TBH.
 
It's frustration more than anything.
It's like you're reading one thing and managing to take another thing away from it.

And again, what type of fanboy am I :p?
 