• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD freesync coming soon, no extra costs.... shocker

You're grasping at straws to be perfectly honest.

Freesync has less GPU compatibility than Gsync, so many will need a new GPU (far more than will need a new GPU from Nvidia for Gsync to work).

You can't just chuck a 1.3 connection into your 1.2 monitor, so it's a moot point - you're buying a new monitor, no different from buying a gsync monitor.

To be honest, your "point" makes little sense really.

But logic isn't your strong suit, so I'll let you quit while you're already behind.

Actually, most Intel GPUs should be compatible, as they have the hardware needed to utilise adaptive sync monitors. All they need to do is update their drivers. They already have the capability, as they use dynamic refresh rates etc. for power saving.

It was called Freesync because the license is free - any monitor manufacturer can use it. The Freesync name sort of stuck and AMD just ran with it, as any company with any sort of business sense would. It's not even AMD tech, it's just a display port standard!!

People - and to be honest, mainly those that support Nvidia - are making a big deal out of the fact that with freesync you still have to buy a new monitor etc. Just because it has free in the name? lol

AMD have nothing to do with the monitor side of things; the only thing they did was push for the specification from embedded display ports to be added to desktop display ports. Since it is optional on the display port 1.2a standard, they are also trying to persuade monitor manufacturers to use 1.2a connections early and not wait for version 1.3.

The added cost of $10-$20 is simply the new display port board, and some manufacturers won't even have this cost as they can update their monitors by firmware. I don't even know if this should be counted as a cost - I mean, they are going to have to spend $10-$20 putting in a display port anyway.

Also, I don't know how much Gsync adds to the cost of the monitor, and some people are working out the cost based on the Acer 4K Gsync monitor when compared to other monitors. Well, I don't think you can base it on this at all. I would say imagine how cheap the Acer 4K monitor would be if it didn't have Gsync?
 
My sammy does lightboost and it isn't 3Dvision compatible - it isn't Nvidia specific and can be enabled through the panel if it's capable.

The Gsync panel doesn't have an inbuilt emitter, so in theory there is no extra cost for 3Dvision 'ready', which makes it the most comparable panel in regards to possible Gsync cost. :)

Didn't know that (Sammy lightboost). Though it's the same end result as lightboost, it appears to fall under a different name - from the little reading I've done it looks to be something Samsung developed. That on the 700d? Gutted they went EOL without replacement, by far and away the best 120hz monitors outside of those 3dvision branded.

We know 3dvision comes at a cost, it's (imo) reflected in the level of support it has.

So if 3dvision has a premium and gsync has a premium, how do you work out the added cost of gsync by standing it next to a monitor that has neither of those things? If 3dvision and the features it comes with adds cost, as it does, my point is that surely you would compare a gsync variant with something of a similar feature level.

That's not to say doing the above paints the picture any differently - the average cost of a 24" 144hz 3dvision monitor is still over £100 less than the aoc, yet at 27" of the same spec the cost extras are next to nothing.

I guess we'll see who is price gouging and who is just aggressively pricing when the benq 24/27 units arrive, as they'll likely be the xl series with the different scaler. This will likely be the closest we'll get to a manufacturer having a gsync and non-gsync variant of the same monitor.
 
Interesting part of the initial variable refresh rate patent, which was actually filed in 2006 by none other than ATi Technologies - regarding the use of an integrated circuit.

Referring now to FIG. 6, the processing described by the present invention may be embodied in a hardware-based implementation, such as an integrated circuit. To this end, as known by those of skill in the art, a set of executable instructions 600 may be defined and stored within a library 602 that, in turn, is stored in memory 604. The instructions 600, which may comprise instructions represented in any suitable hardware description language (HDL) including, but not limited to, Verilog or another hardware representation such as GDSII, can be used by a circuit design module 606 that is executed on a processor 608 of an integrated circuit design system 610. Using the instructions 600, the system 610 may be employed to create a suitable integrated circuit (or other hardware embodiment) capable of performing the processing described herein. Such system 610 and circuit design module 606 may be any suitable system and integrated circuit design program as known to those skilled in the art.

As described above, the present invention provides a technique for dynamically adjusting the frame rate of a display to accommodate different types of content having different image frame rates or to provide power savings opportunities. This is achieved by determining the dynamic frame rate capabilities of the display as well as the image frame rate, and selecting an updated frame rate. Displays can accommodate dynamic frame rates through modification of horizontal or vertical timing parameters. For at least these reasons, the present invention represents an advancement over prior art techniques.
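The selection step that summary describes - work out the display's dynamic frame rate range, work out the content's frame rate, then pick an updated refresh rate - can be sketched in a few lines. This is my own toy illustration, not code from the patent; the function name and the numbers are made up:

```python
# Toy sketch of the patent's frame-rate selection step (illustrative only).
# Pick the lowest refresh rate the panel supports that is an exact multiple
# of the content rate: e.g. 24 fps film -> 48 Hz, which avoids judder and
# saves power versus running a fixed 60 Hz.

def select_refresh_rate(content_fps, panel_min_hz, panel_max_hz):
    rate = content_fps
    # Step up in multiples of the content rate until the panel can do it
    while rate < panel_min_hz:
        rate += content_fps
    if rate > panel_max_hz:
        return panel_max_hz  # no exact multiple fits: fall back to the panel max
    return rate

print(select_refresh_rate(24, 40, 144))  # 48
print(select_refresh_rate(60, 40, 144))  # 60
```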
 
Didn't know that (Sammy lightboost). Though it's the same end result as lightboost, it appears to fall under a different name - from the little reading I've done it looks to be something Samsung developed. That on the 700d? Gutted they went EOL without replacement, by far and away the best 120hz monitors outside of those 3dvision branded.

We know 3dvision comes at a cost, it's (imo) reflected in the level of support it has.

So if 3dvision has a premium and gsync has a premium, how do you work out the added cost of gsync by standing it next to a monitor that has neither of those things? If 3dvision and the features it comes with adds cost, as it does, my point is that surely you would compare a gsync variant with something of a similar feature level.

That's not to say doing the above paints the picture any differently - the average cost of a 24" 144hz 3dvision monitor is still over £100 less than the aoc, yet at 27" of the same spec the cost extras are next to nothing.

I guess we'll see who is price gouging and who is just aggressively pricing when the benq 24/27 units arrive, as they'll likely be the xl series with the different scaler. This will likely be the closest we'll get to a manufacturer having a gsync and non-gsync variant of the same monitor.

Yes it's a Sammy 700.

What I'm trying to point out is that the monitor doesn't have a 3D emitter built in; there is no 3D output until you add the emitter and glasses - that's the payment for Nvidia 3D, when you purchase the kit after the initial monitor outlay.

Nvidia 3D isn't paid for out of the box - that's my comparison with the Gsync panel.

You're comparing 3D features that aren't included out of the box on both panels. :)
 
Well, regardless of the extra cost of Freesync, I am sure it will do a grand job, and nobody is forced to buy it if they feel the cost of the first Freesync monitors is too much. $10 or $20 equates to £15 max, so not too much extra cost for the free version of G-Sync.

They also said they have more than one vendor on board, so that's good as well.

I think they would have this cost anyway. They still have to buy the display port. I bet this part costs roughly the same no matter what version of display port is on it.

But, how much will monitor manufacturers charge us for having a monitor with the new 1.2a adaptive sync spec? That is the important question.

And just like Gsync, every monitor manufacturer will be different, some will charge full whack and a lot extra, others like ACER with their 4K monitor will charge very little.
 
Interesting part of the initial variable refresh rate patent, which was actually filed in 2006 by none other than ATi Technologies - regarding the use of an integrated circuit.

It's been around for a long time, and Intel and AMD have made use of it. It's for this reason that I think both Nvidia and AMD have been working towards Gsync/Freesync in their own way for a while now.
 
Yes it's a Sammy 700.

What I'm trying to point out is that the monitor doesn't have a 3D emitter built in; there is no 3D output until you add the emitter and glasses - that's the payment for Nvidia 3D, when you purchase the kit after the initial monitor outlay.

Nvidia 3D isn't paid for out of the box - that's my comparison with the Gsync panel.

You're comparing 3D features that aren't included out of the box on both panels. :)

Ahhh I'm with you now!! :)

It's a bit of a minefield that one tbh; some monitors have the emitter built in (vg278?) while others don't, even in the same price range, and again there are alternate 'models', some with and some without the 3d stuff. My vg236h has a sister model... the vg236he, one with the kit, one without.
 
My point is, although it might not be clear to some, AMD have had this patent in their possession since roughly when it was filed in 2006, knowing full well it could be implemented in the form of an additional scaler. Yet they've done nothing about it.

Yet we've got Beebob and Rocksteady in here, praising AMD for suddenly leaping out of bed and telling us this is potentially VESA compatible.

G-Sync is here already. Of course AMD knew about it. I know I have to take the bins out tomorrow. Maybe I'll leave it 6 years though. **** it.
 
It's not just used for caching frames. So you think the VESA spec is going to account for frame loss at the same pace, considering it's implemented in every certified panel, at the same performance as a custom-designed ASIC with a built-in buffer?

That is not how it works; there is no caching or intercommunication - the GPU tells the screen what refresh rate to run at and that's it.
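To illustrate what "the GPU tells the screen what refresh rate to run at" means in practice: with adaptive sync the panel's vertical blanking interval is simply stretched until the next frame is ready, within the panel's limits. This is a toy model of my own, not AMD's driver code; the 40-144hz range is a made-up example:

```python
# Toy model of frame-by-frame variable refresh (illustrative only).
# The display waits in vertical blanking until the next frame arrives,
# clamped between the panel's fastest and slowest refresh intervals.

MIN_INTERVAL_MS = 1000 / 144  # fastest the panel can refresh (144hz)
MAX_INTERVAL_MS = 1000 / 40   # slowest before it must refresh anyway (40hz)

def frame_interval(render_time_ms):
    if render_time_ms <= MIN_INTERVAL_MS:
        return MIN_INTERVAL_MS   # frame ready early: capped at max refresh rate
    if render_time_ms >= MAX_INTERVAL_MS:
        return MAX_INTERVAL_MS   # frame late: panel refreshes at its minimum rate
    return render_time_ms        # otherwise scan out the moment the frame is ready
```

So a frame that takes 10ms to render is scanned out 10ms after the last one (effectively 100hz for that frame) - no buffering needed on the monitor side, just a stretched blanking interval.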

Interesting part of the initial variable refresh rate patent, which was actually filed in 2006 by none other than ATi Technologies - regarding the use of an integrated circuit.

That's the patent for the power saving feature, that is not Free-Sync.

Go to the Free-Sync FAQ, you don't understand any of this.
 
Free Sync IS the power saving feature. Don't you get that?


And that IS how it works - G-Sync uses a DDR3 buffer because the scaler is capable of falling back on itself. Which is why I doubt Free Sync can work as well on a frame by frame basis.


AMD have no control over what ASICs are used. Only VESA can verify requirements, which may or may not have similar capabilities to Nvidia's 'proprietary' - or, more to the point, custom - scaler.
 
I think they would have this cost anyway. They still have to buy the display port. I bet this part costs roughly the same no matter what version of display port is on it.

But, how much will monitor manufacturers charge us for having a monitor with the new 1.2a adaptive sync spec? That is the important question.

And just like Gsync, every monitor manufacturer will be different, some will charge full whack and a lot extra, others like ACER with their 4K monitor will charge very little.

Pretty much hit the nail on the head as I see it. The monitor manufacturers, and possibly the retailers, are the ones who decide what the cost will be, and like anything brand new, it normally comes with a premium. I have zero patience for new tech, and if my bank balance allows me to get it, I generally do, and I am sure there are many like me.

So G-sync monitors were priced at what the early adopters would suffer, and the very early DIY module had a premium for the cost of fitting. Some people bought that (the guys like me) and were happy to pay the premium, and I will bet money that any premium placed on the early Freesync monitors will also be paid.

I pre-ordered that Acer 4K G-Sync monitor as soon as I saw the price was £499 (the same as what the Sammy 4K was without the G-Sync module), so to me, that is a snip for this fairly new and game-changing tech.

I am waffling a bit, but I believe the very first Freesync monitors will come with a premium (and I fully understand why that is), and when they become readily available, the price will drop to a friendlier level.

The irony for me was the thread title ("AMD Freesync coming soon, no extra costs....Shocker") and the way it has been called freesync, which again hints at the tech being free (which could be argued, but for peace, I will say the implementation is free).
 
My point is, although it might not be clear to some, AMD have had this patent in their possession since roughly when it was filed in 2006, knowing full well it could be implemented in the form of an additional scaler. Yet they've done nothing about it.

Yet we've got Beebob and Rocksteady in here, praising AMD for suddenly leaping out of bed and telling us this is potentially VESA compatible.

G-Sync is here already. Of course AMD knew about it. I know I have to take the bins out tomorrow. Maybe I'll leave it 6 years though. **** it.

That's a bit unfair? Why didn't any company come out with something like this before now? Maybe they had been working on it; maybe monitor and graphics card technology is only now reaching the required level to use something like this.
 
Free Sync IS the power saving feature. Don't you get that?

It's not - the power saving feature only works for the generic desktop and watching videos.

Free-Sync has its own hardware built into the GPU, a bit like the G-Sync module but on the GPU itself. It's a different system - that's why it's only compatible with the 260X, 290 and 290X, because they are currently the only GPUs that have the hardware.

Go back here... http://forums.overclockers.co.uk/showthread.php?t=18614917
 
The irony for me was the thread title ("AMD Freesync coming soon, no extra costs....Shocker") and the way it has been called freesync, which again hints at the tech being free (which could be argued, but for peace, I will say the implementation is free).

I wonder if it was called AMDfsync instead of freesync would people have had a problem with the thread title.

It's like Freeview TV. When the switch came, you still had to buy a new TV or set top box, if you didn't have a digital connection already.
 
Oh dear......there seems to be a lot of confusion about all this as usual.

Free-Sync is not the power saving feature.
The VESA standard 1.2a is not Free-Sync.

They are two completely separate things.

The VESA standard is for Adaptive-Sync. This allows compatible hardware to utilise the feature that was previously used for power saving to adapt the refresh rate, enabling the display to dynamically match a GPU’s rendering rate on a frame-by-frame basis to produce a smoother, low-latency gaming experience.

Free-Sync is AMD's proprietary hardware and software that will allow their GPUs to communicate with Adaptive-Sync monitors.

So yes, anyone will be able to utilise Adaptive-Sync monitors, but they will need to use their own hardware and software to do so. I cannot see AMD just giving away their GPU scaler technology to the likes of Intel/Nvidia, and of course that is quite within their rights, just as I do not expect Nvidia to give away their G-Sync technology to anyone else either.
 
Oh dear......there seems to be a lot of confusion about all this as usual.

Free-Sync is not the power saving feature.
The VESA standard 1.2a is not Free-Sync.

They are two completely separate things.

The VESA standard is for Adaptive-Sync. This allows compatible hardware to utilise the feature that was previously used for power saving to adapt the refresh rate, enabling the display to dynamically match a GPU’s rendering rate on a frame-by-frame basis to produce a smoother, low-latency gaming experience.

Free-Sync is AMD's proprietary hardware and software that will allow their GPUs to communicate with Adaptive-Sync monitors.

So yes, anyone will be able to utilise Adaptive-Sync monitors, but they will need to use their own hardware and software to do so. I cannot see AMD just giving away their GPU scaler technology to the likes of Intel/Nvidia, and of course that is quite within their rights, just as I do not expect Nvidia to give away their G-Sync technology to anyone else either.

But it is, because it's counting on the VVBLANK scalers used in DP1.3 (or 1.2a) displays, so without it it's useless. But maybe not directly, as far as I'm aware it's controlled at driver level - but again, it would be useless without VVBLANK.
 