Will the ROG Swift get a FreeSync update?

I'm sensing a little anger mixed with veiled fanboyism, and I'll assume by "us" you actually mean you, and not some posse. I'll go against my gut feelings and state the obvious.

Mantle and FreeSync. TrueAudio has impressed so far as well.
 
True Audio is another AMD tech that goes nowhere.
It's in like one or two games. It's typical AMD.

True Audio has gone the way of most things AMD do. And that is nowhere. Just like TressFX.

And I say this while exclusively using AMD.
But some of the fanboy BS is ridiculous.

Freesync is nowhere yet, so you can't really give it a positive. It has potential, and I can't wait to use it. But that's it.

And Mantle is so-so. I like it, but I see it as a short-term answer. DX12 will lead the way as the standard; it's very probable Mantle will keep the performance edge over DX12 for GCN users, but Mantle will never be sustained.
 
So the next generation of Gsync could run with a broader hertz range than the current one, maybe even broader than the already very wide range that Freesync has listed. Or it could be usable over HDMI/DVI (Nvidia have already noted there is no reason why it couldn't be).

So neither of those would be killer new features?

And those are just two I can think of off the top of my head. Who knows what the boffins at Nvidia are working on. :rolleyes:

Well, for the first one, the range depends on the monitor. For the HDMI version, I'm not sure what the point would be: if you are buying a monitor for Gsync, you already have the right connector on your card. Who or what would benefit? I wouldn't call it a killer new feature, that's for sure.

I have to agree with Spoffle on this. I don't think freesync and gsync are things that are going to have killer new features.
 
True Audio is another AMD tech that goes nowhere.
It's in like one or two games. It's typical AMD.

True Audio has gone the way of most things AMD do. And that is nowhere. Just like TressFX.

And I say this while exclusively using AMD.
But some of the fanboy BS is ridiculous.

Freesync is nowhere yet, so you can't really give it a positive. It has potential, and I can't wait to use it. But that's it.

And Mantle is so-so. I like it, but I see it as a short-term answer. DX12 will lead the way as the standard; it's very probable Mantle will keep the performance edge over DX12 for GCN users, but Mantle will never be sustained.

But DirectX 12 isn't even finalised, so I can't see how you can give that a positive, omm nom nom :p

But seriously, I'm so thankful DirectX 11 has had a line drawn under it, and I'm also thankful AMD are giving MS a boot up the tail. However, D3D is still in the hands of MS. Plus with AMD I can use any API and should have a choice of a few operating systems.

IMO anyone that knocks Mantle is not a true PC enthusiast. Right now it's the fastest API by a long way, and it will be some time before anything is making use of DirectX 12. If DirectX 12 turns out to be faster I will simply switch API, and AMD have me covered for that too.

I've got to say, the more I think about this, the more I believe in what AMD are doing for the PC, and I hope this ethos continues for the foreseeable future.
 
Well, for the first one, the range depends on the monitor. For the HDMI version, I'm not sure what the point would be: if you are buying a monitor for Gsync, you already have the right connector on your card. Who or what would benefit? I wouldn't call it a killer new feature, that's for sure.

I have to agree with Spoffle on this. I don't think freesync and gsync are things that are going to have killer new features.

The gsync module already has 3D and ULMB on board; extra gsync module features don't necessarily have to be sync related.

ULMB running at the same time as gsync would be a good start, though tricky.

The advantage of the gsync module being an FPGA is that they can keep adding features.
 
The gsync module already has 3D and ULMB on board; extra gsync module features don't necessarily have to be sync related.

ULMB running at the same time as gsync would be a good start, though tricky.

The advantage of the gsync module being an FPGA is that they can keep adding features.

3D and ULMB have nothing to do with the Gsync module. I had Nvidia 3D on my Samsung 120Hz monitor just by using the EDID of an officially supported 3D monitor. And ULMB is just an upgraded strobe backlight.
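
For anyone wondering what that actually involves: an EDID is just a 128-byte blob the monitor reports to the driver (vendor, product code, supported timings), and an "override" simply points the driver at a blob copied from a supported monitor instead. A rough Python sketch of pulling the basics out of a dumped EDID, purely as an illustration (the file name is just an example):

```python
# Minimal sketch: decode vendor/product info from a 128-byte EDID dump.
# An EDID override just feeds the driver a blob like this taken from a
# different (officially supported) monitor. The path below is only an example.

def parse_edid(blob: bytes):
    assert len(blob) >= 128, "base EDID block is 128 bytes"
    assert blob[0:8] == bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00]), "bad EDID header"
    assert sum(blob[0:128]) % 256 == 0, "bad EDID checksum"

    # Manufacturer ID: three 5-bit letters packed into bytes 8-9 (big-endian).
    word = (blob[8] << 8) | blob[9]
    vendor = "".join(chr(ord('A') - 1 + ((word >> shift) & 0x1F)) for shift in (10, 5, 0))

    product = blob[10] | (blob[11] << 8)   # product code, little-endian
    return vendor, product

with open("monitor_edid.bin", "rb") as f:  # e.g. dumped from the registry or sysfs
    vendor, product = parse_edid(f.read())
print(f"Vendor: {vendor}, product code: 0x{product:04X}")
```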

And ULMB working at the same time as Gsync would be more than just tricky, as ULMB only works at certain fixed refresh rates: 85, 100 and 120Hz.
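
To put rough numbers on why that's awkward: ULMB fires one backlight strobe per refresh at a fixed rate, while Gsync varies the refresh interval with the game's frame rate, so the strobe timing would have to chase a moving target. A quick back-of-the-envelope sketch in Python (the frame rates are purely illustrative):

```python
# Fixed strobe intervals for the ULMB rates vs. the variable frame times
# Gsync has to cope with (example figures only).

ULMB_RATES_HZ = [85, 100, 120]            # the fixed refresh rates ULMB supports
for hz in ULMB_RATES_HZ:
    print(f"ULMB @ {hz:3d}Hz  -> strobe every {1000 / hz:5.2f} ms")

# With Gsync the refresh interval follows the game's frame time instead:
for fps in [45, 60, 75, 110, 144]:        # example in-game frame rates
    print(f"Gsync @ {fps:3d}fps -> refresh every {1000 / fps:5.2f} ms")

# A strobed backlight would have to vary its pulse timing (and pulse width,
# to keep brightness constant) frame by frame -- hence "more than just tricky".
```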
 
Do you write drivers or something? NV are a huge company and seem to have plenty of cash and a much bigger driver team than AMD. I'm sure if AMD can get Mantle off the ground by themselves, NV could build a driver from the public SDK?

Yeah, I'm just going to assume you're trolling now.

Nvidia's cards are magically going to work with an API built around GCN just by writing a new driver for their architecture. Oh no wait, we don't live in fairy land.

To be fair to Jigger, most people aren't aware of what the drivers really are. They think it's all and any software related to the graphics card that comes from the manufacturer.

So with that, it's not a surprise that he's underestimating what it'd take to bring Mantle support.

He's right, though, about how people talk about Nvidia drivers. If you listened to the fans and drank the Kool-Aid, you'd think their drivers are infallible and never have any issues, and when they do have issues, they're just blanked (by some).
 
3D and ULMB have nothing to do with the Gsync module. I had Nvidia 3D on my Samsung 120Hz monitor just by using the EDID of an officially supported 3D monitor. And ULMB is just an upgraded strobe backlight.

And ULMB working at the same time as Gsync would be more than just tricky, as ULMB only works at certain fixed refresh rates: 85, 100 and 120Hz.

ULMB was something that could be hacked onto 3D Vision LightBoost monitors; it is now officially supported by gsync monitors, so I see it as a feature of having a gsync monitor/module.

If freesync monitors come out and have ULMB and 3D then fair enough, but otherwise I see it as part of the package... the point being that Nvidia could add additional features to the gsync module/monitors that aren't available on freesync monitors. I don't know what, but they have said they have extra plans.

2560x1440 @ 144Hz is also currently only supported on gsync, so that is a unique feature for now; there isn't a standard monitor scaler that supports it.
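
Rough numbers on what a scaler has to chew through at that mode (the blanking overhead figure is an assumption, so treat these as ballpark):

```python
# Ballpark pixel-rate maths for 2560x1440 @ 144Hz.
# The ~12% blanking overhead is an assumption (reduced-blanking-ish);
# exact figures depend on the monitor's actual timings.

h_active, v_active, refresh_hz = 2560, 1440, 144
blanking_overhead = 1.12                  # assumed extra for blanking intervals

active_pixel_rate = h_active * v_active * refresh_hz   # pixels/s actually displayed
pixel_clock = active_pixel_rate * blanking_overhead    # approx. clock the scaler must handle
bitrate_24bpp = pixel_clock * 24                        # bits/s at 8 bits per channel

print(f"Active pixel rate : {active_pixel_rate / 1e6:6.1f} Mpixel/s")
print(f"Approx pixel clock: {pixel_clock / 1e6:6.1f} MHz")
print(f"Approx bit rate   : {bitrate_24bpp / 1e9:6.2f} Gbit/s")

# DisplayPort 1.2 HBR2 carries roughly 17.28 Gbit/s of payload (4 lanes x 5.4 Gbit/s
# after 8b/10b coding), so the link itself copes -- the limitation at the time was
# off-the-shelf scaler silicon not being rated for that pixel clock, not the cable.
```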

Obviously people are free to have their own interpretation of what "gsync" entails, but if consumers want a feature and that feature is only available on a gsync monitor, then they are buying a gsync monitor; the end result is the same regardless of semantics. Likewise, if the only monitor available with x is a freesync one, then it adds to the perceived sales of freesync monitors, regardless of whether they want it or even use it.
 
ULMB was something that could be hacked onto 3D Vision LightBoost monitors; it is now officially supported by gsync monitors, so I see it as a feature of having a gsync monitor/module.

If freesync monitors come out and have ULMB and 3D then fair enough, but otherwise I see it as part of the package... the point being that Nvidia could add additional features to the gsync module/monitors that aren't available on freesync monitors. I don't know what, but they have said they have extra plans.

2560x1440 @ 144Hz is also currently only supported on gsync, so that is a unique feature for now; there isn't a standard monitor scaler that supports it.

Obviously people are free to have their own interpretation of what "gsync" entails, but if consumers want a feature and that feature is only available on a gsync monitor, then they are buying a gsync monitor; the end result is the same regardless of semantics. Likewise, if the only monitor available with x is a freesync one, then it adds to the perceived sales of freesync monitors, regardless of whether they want it or even use it.

Hold on, this is nothing to do with semantics; it's just you moving the goalposts to suit your argument. Do any of the things you have listed (apart from one) have anything to do with the Gsync module?

The only feature you listed that is specific to Gsync monitors is the resolution @ 144Hz. 3D is on other monitors, and ULMB is too; some of the latest BenQ gaming monitors have it. So again, these aren't unique to Gsync monitors, and since you can't use both at the same time, they have even less to do with it.

Also, looking back on the other posts in this thread, I think you have been overly negative about adaptive sync. Take the price: you mention that the price for adding Gsync is negligible, but it isn't. There is the cost of the FPGA, the cost of the scaler, and lastly the cost of the custom-made PCB they are mounted onto. This is then sold to the monitor manufacturer.

Whereas with adaptive sync there is only the cost of the scaler, and even that will be cheaper because they won't be buying it from a third party but direct from the manufacturer.

I don't see how gsync and adaptive sync monitors will ever be the same price unless Nvidia sell the module at a loss.

As for market share, well, while only a small number of discrete AMD cards support freesync, you have to remember that all GCN APUs do support it. So there is a big market out there.

Will Intel use adaptive sync? Their 4th-generation Intel Core and Core M GPUs all have the hardware controller (eDP 1.2) needed to connect to an adaptive sync monitor. All they would need to do is write a driver. So the question isn't "will they use it?" but "why wouldn't they?"
 
Is 3D on all monitors? No; in fact there aren't any non-3D Vision 3D monitors currently on sale, so when you're looking at a gsync monitor and it also has 3D, that is an additional feature. I'm not saying it is one that is "worth" paying extra for, but it is an example of an additional feature that you get when buying (most) gsync monitors.

There may be other features they can add to the gsync module that wouldn't be available on other monitors.

The cost of the PCB is roughly the same for gsync or freesync monitors. The additional cost of a freesync scaler is said to be $10-20, and the cost of an Altera FPGA with 768MB of RAM is around $25 in bulk, so yes, the raw materials cost is pretty similar. What Nvidia choose to do on costings is up to them. Also bear in mind that with freesync the setup and integration of adaptive sync is down to the monitor maker, whereas with gsync it's Nvidia. I'm not saying they will discount it, just that they could.
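
Putting those quoted figures side by side (the retail markup multiplier here is a made-up assumption, just to show how a small BOM delta can grow by the time it hits the shelf):

```python
# Rough bill-of-materials comparison using the figures quoted above.
# The retail markup multiplier is a made-up assumption for illustration only.

freesync_scaler_extra = (10, 20)   # $ extra for an adaptive-sync capable scaler (quoted range)
gsync_fpga_plus_ram   = 25         # $ for the Altera FPGA + 768MB RAM in bulk (quoted figure)
retail_markup         = 2.5        # assumed BOM-to-shelf multiplier

lo, hi = freesync_scaler_extra
print(f"Freesync BOM adder: ${lo}-{hi} -> roughly ${lo * retail_markup:.0f}-{hi * retail_markup:.0f} at retail")
print(f"Gsync BOM adder   : ${gsync_fpga_plus_ram}    -> roughly ${gsync_fpga_plus_ram * retail_markup:.0f} at retail")

# On raw parts the gap is small; the bigger variables are whether Nvidia charge a
# margin on the module and who does the integration work (monitor maker vs Nvidia).
```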

AMD say that freesync requires additional hardware not present on first-generation GCN cards or VLIW cards. Intel don't aim at gamers; I don't think people who buy Intel chips are going to be buying $300-500+ monitors for gaming, and I don't think Intel will bother until after adaptive sync is already established. And I'm not sure it will get established unless Intel support it.

Similarly, I'm not sure people with only an AMD APU are going to be bothered about buying a premium monitor just because it is cheaper than gsync.

I think we are talking about niche products, with costs associated with developing them, and I struggle to see where the payback is.

I'm not saying I hope it fails, or that it definitely will; I'm just pointing out the challenges it faces, as well as some things that would really help it gain traction IF they do happen.

Freesync on standard $150 monitors would be a massive coup (as would Intel support), but AMD themselves say it isn't going to happen any time soon.
 
FreeSync is an industry standard. The job is done.

It isn't. Adaptive sync is an as-yet-unadopted, optional open standard. Freesync is proprietary to AMD.

For something to be an industry standard it has to be in widespread use and supported by more than one GPU manufacturer.
 
AMD started work on a project called FreeSync.
AMD went to VESA and gave them the work.
VESA then added it to the DisplayPort standard and called it Adaptive Sync.
All the major scaler manufacturers have opted in.
People get FreeSync by default from now on.

Job done.
 
AMD started work on a project called FreeSync.
AMD went to VESA and gave them the work.
VESA then added it to the DisplayPort standard and called it Adaptive Sync.
All the major scaler manufacturers have opted in.
People get FreeSync by default from now on.

Job done.

I think you've been sipping on Spoffle's Kool-Aid... I mean, I hope that all comes to pass, but it's blind enthusiastic optimism at this point.
 
http://support.amd.com/en-us/search/faq/225

Will every monitor eventually support Project FreeSync?
AMD is presently advocating these benefits to display vendors and working with their respective design teams to expand the capabilities of high-performance/gaming-oriented monitor lineups to include Project FreeSync.

Additionally, it must be established that all dynamic refresh rate technologies require robust, high-performance LCD panels capable of utilizing a wide range of refresh rates without demonstrating visual artifacts. Such LCD panels naturally cost more to manufacture and validate than less capable panels, which may render dynamic refresh rate technologies economically unviable for especially cost-conscious monitors.


Even AMD's own marketing is less blindly optimistic, and that is saying something for AMD.
 
All that's saying is you pays your money, you takes your choice. If you want a 40-144Hz 3440x1440 FreeSync monitor, then it will cost a lot. So what?
 
All that's saying is you pays your money, you takes your choice. If you want a 40-144Hz 3440x1440 FreeSync monitor, then it will cost a lot. So what?

It is saying that it will not be economically viable to add freesync to every monitor by default, which is what you are claiming will happen - even AMD say it won't.
 