FreeSync monitors hit mass production, coming in Jan-Feb

It's funny, because the single reason for including hardware support for DP 1.2a, and adaptive sync support specifically, is to work with FreeSync. Nvidia may support it in the future, Intel might, but only AMD have stated they will support it, pushed the industry to adopt it, and pushed to get it included in the DP 1.2a standard. The sole reason to put adaptive sync on a screen today is to use FreeSync... but it's not a FreeSync screen.


This is an industry in which, for 20 years, people will put any old label on a box as a selling point. When your screen supports a feature, and that feature is free, unlicensed and costs nothing to put on your box, and you still don't? Well, let's just say you need a bigger reason to leave the label off the box than to add it.

G-Sync is different. Even if G-Sync starts to support the adaptive sync method in the next year or so (as well as the current method), Nvidia currently charge you to license G-Sync; you have to pay them to be able to put the label on your box. If you don't pay them, then even if you have hardware support, Nvidia will blacklist the screen so the drivers won't enable a feature that exists in the hardware on both sides.
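The gating described here, a driver refusing to enable a feature unless the attached panel is on a licensed list, can be illustrated with a toy sketch. Everything in it (names, panel IDs, the set itself) is hypothetical for illustration; it is not Nvidia's actual driver logic:

```python
# Toy illustration of licence-gated feature enablement in a display driver.
# All names and panel IDs below are made up for the example.
LICENSED_PANELS = {"ACME-GS27", "ACME-GS32"}  # panels whose maker paid for certification

def gsync_enabled(panel_id: str, has_vrr_hardware: bool) -> bool:
    """The feature lights up only if the hardware supports it AND the panel is licensed."""
    return has_vrr_hardware and panel_id in LICENSED_PANELS

# A panel with full hardware support but no licence still has the feature withheld:
# gsync_enabled("OTHER-27", True) is False even though the hardware is capable.
```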

Asus didn't put a FreeSync label on because Nvidia is a big partner of theirs, no other reason. It's a FreeSync screen as much as the BenQ, the LGs and all the others, in so far as it has adaptive sync and the only current hardware/software that uses it is FreeSync.

The other screens simply put a free label on their products and added a feature to get more sales; Asus purposefully went out of their way not to advertise a feature of their monitor.

So, AMD are driving the industry forward when G-Sync literally paved the way for this?

I want what you're smoking. Actually no I don't; I don't think any normal human could handle being so overwhelmingly consumed by consumer bias that even the laws of time and physics themselves invade factual evidence.


Jesus H.
 
'Ordered' is a little bit strong.

And let's not pretend Asus don't know what this new DisplayPort standard is and what it does, putting forward particularly specced models with particular appeal from the start.
And god knows why a company would extend an easy courtesy and gesture to a partner with specific competing tech that they have been working with closely.
 
If AMD want displays to be branded with their proprietary name for a standard technology then surely they should be paying for it?

Moaning that a vendor who has an existing deal with a competitor isn't using AMD's brand name for something with its own generic name just makes you look a bit petty.
 
So, AMD are driving the industry forward when G-Sync literally paved the way for this?

I want what you're smoking. Actually no I don't; I don't think any normal human could handle being so overwhelmingly consumed by consumer bias that even the laws of time and physics themselves invade factual evidence.


Jesus H.

G-Sync and Nvidia did not pave the way for this. VESA DP 1.2a has been ratified since early last year, and it takes quite a while for amendments to VESA standards to get ratified, so AMD had been pushing for adaptive sync to be standard in DP for a long time. Nvidia pushed their own proprietary tech and got it released first. Being first to release does not equal paving the way.
 
If AMD want displays to be branded with their proprietary name for a standard technology then surely they should be paying for it?

Moaning that a vendor who has an existing deal with a competitor isn't using AMD's brand name for something with its own generic name just makes you look a bit petty.

No, it's perfectly normal to enrol on a contractual basis with one company but then also promote their direct competitor's tech too.


...what?

G-Sync and Nvidia did not pave the way for this. VESA DP 1.2a has been ratified since early last year and it takes quite a while for amendments to VESA standards to get ratified.


If you look past that tree over there, you'll see an almighty forest.

Nice edit, but not the least bit true. You can check the submission dates if you care to.
 
As far as monitor manufacturers are concerned, FreeSync doesn't exist; they've been ordered to meet a spec by the DisplayPort group, that's all, at extra cost to them, and they've met it. If AMD want advertising for their own proprietary technology (FreeSync) then they will have to pay for it. DP 1.2a is completely independent of FreeSync.

Exactly. This is why adaptive sync on the DP standard is the best way forward for this tech. It is more likely to become widespread rather than niche if it does.
 
Exactly. This is why adaptive sync on the DP standard is the best way forward for this tech. It is more likely to become widespread rather than niche if it does.

I have to agree. Nvidia showed us all what the technology could do and has let people take advantage of it for the past year, but now it's time for a vendor-neutral approach that everyone can use (much like Mantle and DX12).
 
So, AMD are driving the industry forward when G-Sync literally paved the way for this?

I want what you're smoking. Actually no I don't; I don't think any normal human could handle being so overwhelmingly consumed by consumer bias that even the laws of time and physics themselves invade factual evidence.


Jesus H.

Okay, firstly, where in that post did I say AMD were driving the industry forward? Secondly, "Nvidia paved the way with G-Sync"? No, they didn't.

Do you know how expensive and inefficient it is to go the FPGA route? There is a single reason why Nvidia would go the route they did: time to market. If it was their idea and no one else was doing it, they could have taken 6 months longer, made a proper chip, worked more closely with a monitor maker, made more profits, AND avoided the downsides: none of this patching a module onto a monitor, which stops you using its other ports.

An original idea that no one else is doing gets done right, not patched in as quickly as possible in as ghetto a way as you can. Name another big monitor standard that, for time to market, required the end user to take apart their monitor and add a PCB and chip at huge expense?

I said at the time, and it is clear as day to anyone without the hilariously hypocritical consumer bias you have, that the industry in general was talking about adding what is now adaptive sync WAY before G-Sync came out. Instead of waiting for adaptive sync, or whatever DisplayPort and the industry agreed on, they went the FPGA (programmable and quick, yet expensive and ridiculous) route, with ghetto boards to add to screens, for time to market. They took an idea that was already coming, took a shortcut which saved 7-8 months, and added massive cost to ONLY their users for something that was already on the way.

This was an industry idea that Nvidia co-opted, and fanboys are now claiming it as original.

There is literally no reason to ever go the ridiculously ghetto, absurd-cost FPGA route unless they were trying to beat a deadline, and if it was their idea then the only deadline could be something already in the pipeline.

People like you are deciding that, oh, AMD submitted something to DisplayPort to add adaptive sync whenever it was, so they'd probably been working on it for a couple of weeks before that, so that must be when they came up with it. That isn't how the industry works at all. Probably 6-12 months at least before even submitting a proposal like that to DisplayPort, they'd be talking to panel manufacturers about whether they wanted and would support such a feature. These standards aren't a case of writing down a general idea and submitting it; this isn't putting the words "adaptive sync" on a piece of paper, handing it in and letting the other side do the work. You're talking about standards, protocols, ideas, testing, checking, R&D, time and money.

Standards take YEARS, not months. Adaptive sync was clearly coming to anyone with a brain, and Nvidia did their usual "how do we screw our customers out of more money": hey, if we make this stupid ghetto module, go the uber-expensive and utterly stupid FPGA route and get to market early, we can get an extra £200 out of our customers, woo us, and we can also pretend it was our idea, woo.
 
As far as monitor manufacturers are concerned, FreeSync doesn't exist; they've been ordered to meet a spec by the DisplayPort group, that's all, at extra cost to them, and they've met it. If AMD want advertising for their own proprietary technology (FreeSync) then they will have to pay for it. DP 1.2a is completely independent of FreeSync.

Also completely incorrect. Firstly, no one has to make a DP 1.2a screen at all; there is nothing forcing a manufacturer to adhere to the DP 1.2a spec UNLESS the screen maker wants to use the DP 1.2a spec and advertise as having it.

DP 1.2a, over 1.2, has ONLY adaptive sync added to it. There is no reason to make it 1.2a unless you are planning to support adaptive sync; one implies the other. They didn't randomly go DP 1.2a and thus happen to have adaptive sync; it is the ONLY reason to make it DP 1.2a. The entire reason Asus included DP 1.2a is so it could support FreeSync; the only reason they are not saying this is that their bigger partner doesn't want them to.

A bunch of other screens were announced at CES with only DP 1.2 and no adaptive sync.
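For what it's worth, the coupling between DP 1.2a and adaptive sync is visible in the sink's capability registers: to my knowledge (this is how e.g. the Linux DRM headers expose it, not something stated in this thread), an adaptive-sync-capable sink sets the MSA_TIMING_PAR_IGNORED bit (bit 6) in the DOWN_STREAM_PORT_COUNT DPCD register at address 0x0007, and a driver checks that bit before enabling variable refresh. A minimal sketch of that check, assuming the raw DPCD byte has already been read from the display:

```python
# Check the DPCD capability bit that advertises DisplayPort adaptive sync.
# Register layout as exposed in the Linux DRM DP helpers: the
# DOWN_STREAM_PORT_COUNT register lives at DPCD address 0x0007, and bit 6
# (MSA_TIMING_PAR_IGNORED) means the sink can ignore the Main Stream
# Attribute timing, i.e. it supports adaptive sync.
DP_DOWN_STREAM_PORT_COUNT = 0x0007   # DPCD address the byte is read from
DP_MSA_TIMING_PAR_IGNORED = 1 << 6   # capability bit within that byte

def sink_supports_adaptive_sync(dpcd_0007: int) -> bool:
    """dpcd_0007 is the byte read from DPCD address 0x0007 over the AUX channel."""
    return bool(dpcd_0007 & DP_MSA_TIMING_PAR_IGNORED)
```

A plain DP 1.2 sink leaves the bit clear, which is consistent with the point above: shipping 1.2a only makes sense if you intend the adaptive sync capability to be advertised.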
 
If AMD want displays to be branded with their proprietary name for a standard technology then surely they should be paying for it?

Moaning that a vendor who has an existing deal with a competitor isn't using AMD's brand name for something with its own generic name just makes you look a bit petty.

Asus PAY Nvidia to put G-sync branding on the screen, not the other way around.

Again, this is industry stuff that everyone knows until people want to play silly games. Dell ask Nvidia and AMD to release the same products with the same features just so they can advertise their boxes with a new GPU and pretend it's the latest and greatest. Laptops come with little stickers on them; PSUs come with SLI/xfire branding. EVERY manufacturer in the world will put every freaking feature, every sticker, every standard, everything they can think of onto their products to make them appeal to more people.

If one PSU will work fine with xfire but doesn't bother putting an xfire sticker on the box, then a customer may choose another PSU that is functionally identical but has an xfire logo on the box to indicate support. It's a product peeing match: the more features the better. It has always been that way and always will be.

Any screen that supports features will, 99.9999% of the time, put some kind of indication on the box that it does, because it might mean 3, 300k, or 300 million more sales.

You don't pay a company to advertise your feature; the industry has never been about that. It's always been the reverse: put everything free to add on your box and in your marketing, then decide which "pay to add" features you include.

Think about it this way: Asus pay for G-Sync branding and the ability to market the feature. If they didn't have to pay, would they then refuse to put the logo on unless Nvidia paid them? A £700 screen with one key new feature that might set it apart from every other screen, and they wouldn't advertise it at all unless Nvidia paid them? Who would buy this £700 screen with no killer feature unless they told everyone it supported G-Sync? In the fictional world where Nvidia didn't charge, they most certainly wouldn't have to pay Asus to advertise this feature; advertising what the screen supports gets Asus more sales.

AMD don't pay mobo makers to add xfire logos to boxes or include xfire support info in them; the manufacturers add the logo because someone who wants xfire might want to buy a motherboard and would exclude it if they didn't advertise the feature. You don't go to the trouble of supporting a feature like xfire or adaptive sync (which only AMD can currently use) and then not tell anyone about it... unless you have a specific reason not to, like a partner who makes you more money.
 
Indeed. I remember all the exciting talk the industry was awash with regarding adaptive sync. Oh wait, no, nobody at all was discussing it until Nvidia jump-started everything with G-Sync. https://www.google.co.uk/search?q=a...2013,cd_max:01/01/2011&q="adaptive+sync"+vesa

It's only after Nvidia came up with their own solution and shone a light on just what a game changer the technology could be that everyone jumped on the AS bandwagon.

Don't let that get in the way of yet another of your tedious and ludicrously biased anti-Nvidia trollfests though; it is amusing to see an apparently grown man get so angry with a technology company.
 
Speechless.

You'd never get away with that drivel on any other forum.

Excluding the below, this part is actual real information. Well played.

Also completely incorrect. Firstly, no one has to make a DP 1.2a screen at all; there is nothing forcing a manufacturer to adhere to the DP 1.2a spec UNLESS the screen maker wants to use the DP 1.2a spec and advertise as having it.

DP 1.2a, over 1.2, has ONLY adaptive sync added to it. There is no reason to make it 1.2a unless you are planning to support adaptive sync; one implies the other. They didn't randomly go DP 1.2a and thus happen to have adaptive sync; it is the ONLY reason to make it DP 1.2a. The entire reason Asus included DP 1.2a is so it could support FreeSync; the only reason they are not saying this is that their bigger partner doesn't want them to.

A bunch of other screens were announced at CES with only DP 1.2 and no adaptive sync.

Mostly.
 
Back to some recent info/developments: PCPer had this to say about their time at CES. At the very least it's noteworthy.

If you're rendering under 30... it raises some interesting questions. Are you going to see judder? Is there going to be a transition between that? If you're going to be transitioning between 35 and 45 a lot, what is the experience going to be like?

I think that's important because Nvidia has worked very hard to find a way to get around that judder. The first generation [G-Sync] panel actually had that issue, and they have since fixed it. We don't know for sure if FreeSync is going to have this problem, but we do know they didn't have an answer to it.

[talking about the 40fps minimum 4K FreeSync requirement on the Samsung panel]
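The question PCPer raises, what happens when the render rate drops below the panel's 40 Hz floor, is commonly answered with frame multiplication: repeat each frame enough times that the effective refresh lands back inside the supported range. A rough sketch of the idea; the 40-144 Hz window, the function name and the exact strategy are illustrative assumptions, not a documented G-Sync or FreeSync implementation:

```python
import math

# Toy sketch of variable-refresh behaviour around a panel's supported range.
def vrr_refresh_hz(fps, vrr_min_hz=40.0, vrr_max_hz=144.0):
    """Pick a panel refresh rate for a given render rate on a VRR display."""
    if fps >= vrr_max_hz:
        return vrr_max_hz   # cap at the panel's maximum refresh
    if fps >= vrr_min_hz:
        return fps          # inside the window: refresh tracks the frame rate 1:1
    # Below the floor: show each frame n times so the effective refresh
    # (n * fps) lands back inside the supported window instead of juddering.
    n = math.ceil(vrr_min_hz / fps)
    return min(n * fps, vrr_max_hz)
```

For example, at 30 fps on a 40 Hz floor the sketch doubles each frame to run the panel at 60 Hz. Without some strategy like this, the display would have to fall back to fixed refresh (and its tearing or stutter) whenever the game dips below the floor, which is exactly the concern in the quote.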
 
Indeed. I remember all the exciting talk the industry was awash with regarding adaptive sync. Oh wait, no, nobody at all was discussing it until Nvidia jump-started everything with G-Sync. https://www.google.co.uk/search?q=a...2013,cd_max:01/01/2011&q="adaptive+sync"+vesa

It's only after Nvidia came up with their own solution and shone a light on just what a game changer the technology could be that everyone jumped on the AS bandwagon.

Don't let that get in the way of yet another of your tedious and ludicrously biased anti-Nvidia trollfests though; it is amusing to see an apparently grown man get so angry with a technology company.

AMD were working on Mantle for 2-3 years and only announced it when they did; where was the talk about it before that? Where was the talk about DP 1.3 before it was announced? Where was the talk of G-Sync months before it was announced? Why aren't we awash with industry information about what GPUs Nvidia or AMD are releasing two years from now?

When a standard happens, someone doesn't go, on their own, "hey, industry, we ARE doing this, ratify it and make it happen". Anyone suggesting that has no idea what they are talking about. 90% of the discussion about standards and new technology happens behind closed doors; it is entirely not in the public domain. When FreeSync, or G-Sync, or Mantle, or just about any new tech is announced publicly, it's at the point where 90% of the R&D is done and it's close to, or actually, ready for public consumption. To even suggest that if something was being talked about behind closed doors we'd all know about it is ridiculous. Again, why didn't we know about G-Sync before it was announced? Do you think they made, programmed and tested an FPGA and the software on their side in a month? Do you think they didn't talk to Asus and other G-Sync partners before they announced? Honestly, does that actually sound even possible to you?

Why don't you do some research for a change? Why don't you e-mail someone at a tech website and ask them how industry standards come about?

HSA 1.0 was ratified about 3 years after AMD, Samsung, ARM and a crapload of the industry got together to work on HSA.

With something like DP 1.3, let's say, what happens is that pretty much everyone talks to each other (usually painfully slowly and with lots of ulterior motives), discusses what the industry needs, what their own needs are, and listens to others' needs. Meetings upon meetings, e-mails back and forth, just getting a list of things they all agree they should work towards. THEN they do the R&D bit and try to come up with, say, a cable, signalling, coding, encryption, power: everything that works together to achieve as many of these goals as possible. Then they'll all talk together again and see where they are at. On the individual side, when a relatively final spec is agreed, then on their own they'll start thinking about monitors to support it and features that can now be included.

Six months absolute minimum, maybe closer to 2-3 years for some standards: you get a final spec, it gets put forward for everyone to agree to, then people go away and start working on support for it, the chips required for monitors, scalers; cable guys start making new cables; monitor makers put new designs with these new features into action. Then maybe a year later DP 1.3 products actually start coming to market.

This is NOT something where Samsung goes "hey, DP 1.3, guys, have at it", it gets ratified in 3 months, and then 2 months later products are coming out. That simply isn't how it works. You can do that for a non-standard, for products that you and you alone can use along with any companies you choose to involve, but that skips the whole business of everyone agreeing to it, everyone coming up with something that works for everyone.

Standards are a pain because you need dozens of parties involved who all agree on the best set of features that everyone can use, but it is the ONLY way things go beyond a short-term feature that gets ignored.

AMD could not and would not have come up with adaptive sync in a few months on their own. It would take years of talking with many monitor companies about whether they would support it, how they'd do it, how AMD would do it, compromising, getting agreement, then making it work, then submitting it, then certification (meaning testing amongst various companies), THEN products going into production.
 
Righty then, chuckles, I'm gonna call you out and ask for proof. Real proof, not your usual tin-foil-hat or "whoops, I mis-spoke" AMD PR mouthpiece nonsense.

Basically, nope. Go on any forum and ask whether SLI mobo makers had to pay to support SLI, whether there are hacked drivers to get around Nvidia's blacklisting of products where SLI works but the Nvidia drivers lock out that specific mobo because they haven't paid. Same with 3D Vision, G-Sync and SLI certification on PSUs.

These are VERY widely known facts and have been for a decade plus. It's like telling me to prove Intel makes x86 CPUs. Your ignorance of them doesn't make it my burden to prove them to you.

But I must be the one in the tin-foil hat when 90% of the Nvidia guys on this forum insisted Nvidia had patented variable refresh rate, and I got so much abuse when I explained why not only is that not true but it would be entirely laughable.
 
All that text, completely devoid of any worthwhile content. That takes some effort. Pro tip, chuckles: Adaptive Sync had been around for a while and was available in physical products, yet absolutely nobody was talking about bringing it to standalone displays as something to improve gaming until Nvidia put cold hard cash and resources on the table. Even VESA's own documentation has no pre-G-Sync references to the tech beyond eDP for power saving.

So instead of spectacularly missing the point as usual and going off on yet another tedious and misinformed anti-Nvidia rant, you should probably do some research and keep the content-light walls of text at bay unless you have something valid to add. They are a pain to scroll past on my phone.
 