Upcoming FreeSync monitors

Soldato
Joined
30 Dec 2011
Posts
5,540
Location
Belfast
Oh yes, indeed, these are supposed to be the operating frequency ranges in general for these variable refresh rate solutions, not tied to Freesync specifically. Check here for more accurate wording:
http://forums.overclockers.co.uk/showthread.php?p=27491023#post27491023

But indeed, the 4K part is a little odd. Does that model permit 75Hz at lower resolutions, perhaps?

Yep, looks like those are the normal operating frequency ranges being listed. So, like most monitors, you will get a higher refresh rate at resolutions below the monitor's native resolution.
 
Soldato
Joined
19 Dec 2010
Posts
12,039
To Aatu. Yes, we will never agree, because I keep trying to explain that I am talking about Freesync and Adaptive Sync only - just what is happening between monitor manufacturers and AMD, not trademark rules in general.

Also, your arguments are a bit ridiculous for several reasons. You keep saying AMD could raise prices, AMD could charge royalties. COULD? What if I said AMD could release the 390x at £1500 tomorrow? You would say rubbish, and you could give reasons why based on AMD's previous pricing strategies, current market prices, etc.

And that's what's wrong with your arguments. You have no understanding of how Freesync or adaptive sync works, where the technology comes from, and despite everything I have said, you still believe Freesync is in the monitor!!

****You keep saying AMD have to recoup R&D costs, when adaptive sync isn't an AMD technology. It's from VESA. If you strongly disagree with that assessment you must have some idea of how it all works. So tell me please.****

It's already been shown on websites that it's the monitor manufacturers that have to spend the most money, and that adaptive sync monitors will be more expensive because they have to use an improved scaler.

Freesync is just a driver update on a compatible GPU and even this is based on previous tech.

****So, you say that Freesync, Eyefinity, etc. are products? That any feature used to sell something is a product in itself, even though that feature is not something that can be used by or in anything else? My Panasonic TV has a feature called "Intelligent Frame Creation" - are you saying that's a product, not a feature? I don't think you are right in what you are saying at all.

Can you put a price on what's basically a driver?

Mantle is a feature of AMD GPUs, but I would call it a product as well, as it can be used by other companies and can make money by itself. I am not sure Freesync comes into this category at all. I don't know much about marketing, so I would like you to explain how Freesync is a product and not a feature?****

I know you want to wrap this up. If you don't want to reply to this whole post, then just reply to the bits between the **** please. Thanks.

^^ @ Aatu.

Sorry for going off-topic, although I don't think it's been that bad. And there is another thread with this exact same info. Just don't read between the spoiler tags :p
 
Associate
OP
Joined
6 Apr 2011
Posts
710
Location
Finland
Trademarks and other agreements:
Like I said, from my point of view, sticking a "Freesync" sticker on the monitor is more beneficial than not doing so. You seem to disagree with that, and that is fine. In your case, the "other agreements" would come into play; in my case, trademark agreements. Well, actually your case would also need the trademark agreements, even if the fee were set at 0€. Even more so: if AMD wants to maximize adoption, then 0€ would be the ideal sum in my case as well. I also pointed this out earlier, by the way.

"COULD":
Well yeah, the initial phrase was "... AMD can still mess up ..." and "... COULD initiate royalties or other costs ..." (the latter capital emphasis was in the original). You're the one trying to make it sound like a definite future action... I'm just saying they have the option to do so.

390x for £1500:
Is there something stopping them from doing it? It's certainly not at all probable, but far from impossible. If you had said such a thing, then yes indeed, I would probably have pointed out that they "most likely won't", with reasons.

Freesync in monitor:
No. Just to make sure: I am still taking your word for it (combined with the fact that no such component is mentioned in the articles). But you didn't actually show any evidence for it. There's just the lack of evidence for my stance, which points to your explanation being more plausible. If AMD came out tomorrow and stated the opposite, I wouldn't say I'd be flabbergasted.

R&D:
Above I told you how I took your word for it. But when you said "as they have to pay these people anyway", I actually started to doubt the whole thing again.

I have no idea how AMD's R&D is structured, specifically. Only AMD knows. But I know how R&D works in general. That earlier statement implied you don't. And it wouldn't be such an easy task to explain.

There's http://en.wikipedia.org/wiki/Research_and_development for R&D in general, but that doesn't cover nearly enough. I don't think there was any one particular article which taught me the basics of R&D, let alone its cost structure. I think the R&D cost issue requires a wider understanding of the basics of economics as a whole. As such, I wouldn't waste time reading singular articles.

If you really want something to look at, then chapter 2.3.1.2 of the PDF linked below gives one short example overview (no theory reading or explanation as such, though) of a cost formula for R&D:
http://www.springer.com/cda/content...7233239-c2.pdf?SGWID=0-0-45-351708-p107940657

If you want more, then use search words like "product life cycle cost structure". Add "R&D" and "in computer industry", if you want more specific results.

Product:
Like I said, explaining the wider concept of "product" is something I'm not very fond of, either. But just for kicks I tried Wikipedia (I didn't expect to find anything useful in this case), and there's actually one fairly suitable page for it:
http://en.wikipedia.org/wiki/Product_(business)

Driver price:
Me personally? I can't. AMD can - they have the financial data. There are usually people who specifically calculate how much input (money, material, man hours, etc.) the company is using on different areas of operation. Likewise, there are people analyzing and estimating what the optimal input would be in the future, as well.

As for the topic at hand:
I'm a little disappointed that there weren't any "consumer-priced" IPS monitors. Something like 1920x1080 24"/27" for £250 would have been nice, even at 60Hz (30-60Hz operational range). £450 for 120Hz, I guess (30-120Hz).

But indeed, I'm beginning to think that the overall usefulness will be very dependent on the operational range, which seems to vary quite drastically. Additionally, now that I think about it: which is technically correct in this case, 30-60Hz or 30-60fps?

And just as a sidenote, the smaller 21:9 LG is rumored to go for 319€, the bigger one for 549€, it seems.
 
Soldato
Joined
19 Dec 2010
Posts
12,039
OK Aatu, I understand trademark laws and all of that. Can I ask you a question? If you put a logo on a product that reduces your market by 80%, would you say that is beneficial or not? And if that logo was actually helping the sales of the other product more than yours, would you expect them to pay you, or would you pay them?

COULD. It's still a silly argument because you are basing everything on knowing nothing about the technology. In fact, your initial phrase about AMD being able to mess up and initiate royalties was based on the AMD tech being in the monitor. It's not AMD tech; AMD is just connecting to that tech.

And that leads us to

R&D. You wrote a good few lines about R&D and even listed a link, but you must have missed the line about VESA in my post. I will ask you another question. Can you please tell me how AMD has to recoup R&D costs for a technology that they didn't invent or research? The DisplayPort is all VESA. I don't know how many times I can repeat this. It wasn't researched or developed by AMD.

I understand some of how R&D works, but can you tell me how this applies to something introduced by VESA in 2009?

Product. I looked through your links and even did the search on Google that you suggested. I read about tangible and intangible and a whole lot of other crap, but I didn't see one thing that makes Freesync, Eyefinity or even Intelligent Frame Creation a product. They are features of a product. I will go back to Mantle again: it is both a feature and a product.

Here are some links; I posted the first one before.

http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

http://www.anandtech.com/show/8008/...andard-variable-refresh-monitors-move-forward

http://www.eteknix.com/no-need-for-g-sync-now-that-display-port-has-adaptive-sync/

The adaptive sync technology is in the Monitor. Freesync is not.

About your question on ranges: at the moment, panel technology is holding it back a little. I don't know of any panel that refreshes below 30Hz. So I think the range is entirely dependent on whatever panel the monitor manufacturers use, and it probably decides the price too. Better panel, more range = more expensive!!

I am guessing that the ranges quoted by AMD in their FAQ are probably the capabilities of the scalers, but those ranges won't be reached until panel technology improves.
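
Just to make the "operating range" point concrete, here's a rough sketch - purely illustrative, not AMD's actual driver logic; the function name and the 30-60Hz window are only assumptions for the example - of how a variable refresh setup might map an incoming frame rate onto the panel's supported window:

```python
# Illustrative only: how a variable refresh range might constrain behaviour.
# The 30-60Hz window and the fallback choices are assumptions, not AMD's
# documented behaviour.

def refresh_for_frame_rate(fps, panel_min_hz=30, panel_max_hz=60):
    """Return the refresh rate the panel would be driven at for a given fps."""
    if panel_min_hz <= fps <= panel_max_hz:
        # Inside the window: refresh can track the frame rate directly.
        return fps
    if fps > panel_max_hz:
        # Above the window: capped at the panel's maximum refresh rate.
        return panel_max_hz
    # Below the window: the panel can't refresh that slowly, so it falls
    # back to its minimum rate (where judder/stutter becomes visible).
    return panel_min_hz

print(refresh_for_frame_rate(45))   # 45 -> tracked directly
print(refresh_for_frame_rate(110))  # 60 -> capped at the panel maximum
print(refresh_for_frame_rate(22))   # 30 -> clamped to the panel minimum
```

The wider the panel/scaler window, the more of those frame rates get tracked directly rather than clamped, which is why the range matters so much.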
 
Associate
OP
Joined
6 Apr 2011
Posts
710
Location
Finland
If using a logo reduces my market by 80% --> not beneficial
If using the same logo helps the owner more than me --> not gonna pay for it myself, so the only remaining option would indeed be the owner

Yes, I agree with those. I'm just disagreeing whether this is the case with Freesync.

Could:
Wouldn't matter if I thought Freesync was a dairy product or a space shuttle. The trademark holder can still, if he so chooses, charge for the trademark. That's his right as the trademark owner, and it's protected by law. Nobody is forcing anybody to take up that offer. But if you don't take the offer, you can't use the trademark, either.

R&D:
Nope, I didn't miss it. You do understand that integrating both outside and inside technologies/features/processes/whatever into your company's products doesn't happen automatically? They don't just magically appear ready-made overnight. It takes man hours, facilities, material costs (not talking about production material costs here), and so forth. Those are counted towards the overall R&D.
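
Just so that doesn't sound hand-wavy, here's a crude, purely hypothetical illustration - the line items come from my list above, but every number is made up - of how even "just integrating an outside spec" still rolls up into a non-zero R&D figure:

```python
# Hypothetical numbers only - the point is that integration work is never free.
line_items = {
    "engineering_hours": 2000 * 75.0,    # man hours x hourly cost (made up)
    "facilities_share": 15000.0,         # lab/equipment allocation (made up)
    "validation_materials": 8000.0,      # test hardware, prototypes (made up)
}

total = sum(line_items.values())
print(f"Illustrative integration R&D cost: {total:,.0f}€")  # 173,000€
```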

Look, just because you keep asking extra questions doesn't mean I'm gonna keep answering them anymore. I told you I ain't fond of explaining such a wide concept. It's commendable that you want to know more, but I can't do the research for you. I have other things to do as well, and they are higher up the priority list, I'm sorry.

So I will say this once again, and leave it at that, I won't repeat it anymore:
"If it is indeed that AMD's R&D for Freesync was 0€, then yes, you are correct, they have nothing to recoup. I strongly disagree with that assessment, though."

Product:
In that case, I can't help you. Same as above, I can't do the research for you.

Links:
Well, thanks for the links, but like I said, I already took your word for it. There's no need to reassure me. Nevertheless, I took a brief glance at them, but didn't notice the part that says there is nothing extra in the monitor. Feel free to quote that part, if you like.

Wrapping up:
This is starting to run in circles. I've repeated many things way too many times already. We're not moving forward. If there is nothing new, I'd suggest we wrap this up quickly. Like NOW.

Ranges:
Hmm, at least HDTVs allow computers to set 24Hz/25Hz. I'm not sure whether it's done with internal frame multiplication to 48Hz/50Hz, 72Hz/75Hz or 96Hz/100Hz (most probably it is, as I can also enable motion interpolation with visible differences).
 
Soldato
Joined
19 Dec 2010
Posts
12,039
LOL fair enough, you are still wrong in everything you say regarding Freesync. The monitor manufacturers are losing over 80% of their market by putting "AMD Freesync compatible" on their monitors, as it rules out Nvidia and Intel owners, and Nvidia and Intel have over 80% of the GPU market. So who benefits more, AMD or the monitor manufacturers?

You were using the R&D argument to say AMD could raise their prices for Freesync. I have been trying to show you that putting this technology into GPUs hasn't had any impact on prices. It has been used in laptops for years, from the cheapest to the most expensive. Laptops and APUs with this tech were no more expensive than laptops without it.

I mean seriously, can't you get that into your head? Rather than keep patronizing me about my lack of knowledge of R&D and products, go look at how it actually affected prices and come back to me. It's obvious that the costs associated with implementing Freesync are negligible.

It's all based on stuff that's been around for years - vsync, changing refresh rates, etc. There is nothing new about it. And the most important part is in the monitor and has nothing at all to do with AMD.

Well, I won't bother with more links. Rather than have you take my word alone, the best evidence I can show is that there are monitors out there already that can be made to support adaptive sync with a firmware upgrade for the DisplayPort. The Iiyama ProLite B2888UHSU-B1 and the Nixeus NX-VUE27D are two examples of monitors that are already adaptive sync capable. There is no AMD hardware or software in these monitors.

Is that enough evidence for you?

It's the same with HDTVs: they use some kind of pulldown detection to multiply the signal up to a supported refresh rate. For example, a 120Hz TV/monitor uses 5:5 pulldown, because 5 x 24 = 120.
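
Just to make the frame multiplication idea concrete, here's a rough sketch - purely illustrative, not any vendor's actual algorithm; the function name and the refresh rate lists are only examples - of picking an integer repeat factor so the source frame rate lands on a refresh rate the panel supports:

```python
# Illustrative only: find an integer repeat factor that maps a source frame
# rate onto one of the display's supported refresh rates.

def pulldown_factor(source_fps, supported_refresh_rates):
    """Return (factor, refresh_hz) if the source divides evenly into a
    supported refresh rate, otherwise None."""
    for refresh in sorted(supported_refresh_rates):
        if refresh % source_fps == 0:
            return refresh // source_fps, refresh
    return None

# 24fps film on a 120Hz panel -> 5:5 pulldown (each frame repeated 5 times)
print(pulldown_factor(24, [60, 120]))   # (5, 120)
# 25fps PAL content on a 50Hz/100Hz panel -> 2:2 pulldown
print(pulldown_factor(25, [50, 100]))   # (2, 50)
```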
 
Associate
OP
Joined
6 Apr 2011
Posts
710
Location
Finland
Market impact + Product + R&D:
Like I said, we'll just have to disagree on those. I didn't mean to patronize, but I have no intention of becoming your tutor, either.

As for the evidence:
If that is your best evidence, then you have a weak argument. It would indeed prove that there is no strict need for a proprietary hardware component. But there's also the possibility that, without a proprietary hardware component, the functionality will only be limited, though still compatible.

Furthermore, wouldn't that firmware upgrade in itself make it possible to include AMD's enhanced proprietary algorithms, as well, for example?

As such, I wouldn't call that the best evidence. Rather, I would point to the lack of any mention of such a component (hardware or software) as the best evidence.

Personally, I would very much like the same variable refresh rate functionality to come to HDTVs as well. Unfortunately, the usefulness for regular TV watching would be so limited that there's really no incentive for companies to cater to such a small market. Then again, if it were implemented in game consoles, there's a chance that the TV manufacturers (Sony, at least) would follow suit.
 
Soldato
Joined
19 Dec 2010
Posts
12,039
lol, more patronising. You are dealing with theory and I am showing you the reality. I have shown you how it hasn't affected prices, how the adaptive sync tech (i.e. the variable refresh tech) is all VESA and nothing to do with AMD, and how putting "AMD Freesync compatible" on a monitor will reduce its market to less than 20% of the total.

You disagree? Prove what you say, and I don't want to hear about general R&D, trademarks, etc. I just want you to be specific to Freesync and adaptive sync.

I knew you would come back with that argument: how do I know the firmware doesn't include AMD software? VESA is responsible for the DisplayPort, and I have told you this several times. The firmware upgrade is for the DisplayPort, to enable the optional adaptive sync specification.

But this conversation is really over. Because basically, despite what you say, you aren't taking my word for it, you still believe that Freesync is in the monitor, and all our differences stem from that.

so no worries, good discussion. Guess we will find out in a few months.

Well, I don't think it will come to HDTVs or consoles anytime soon; TVs and consoles would have to add compatible DisplayPorts. Just want to say that all the consoles have this tech already - they would just need a DisplayPort, as they have GCN APUs inside them.
 
Associate
OP
Joined
6 Apr 2011
Posts
710
Location
Finland
Mm-hmm. Not sure what you think we would find out in a few months that would change, prove or disprove anything that has been discussed here. Theory is theory; that's hard to change, unless I've misquoted some theory. And I'm not sure I've made even one definite claim - mostly possibilities and personal opinions (both of which I have clearly stated, btw). You, on the other hand, seem to wield more definitives than conditionals.

But ta-ta, in any case.
There's also the option of having the tech imported into next-gen HDMI, for example. Even the latest major HDMI update (1.4 > 2.0) took four years, so the next-gen consoles might just have a big enough time frame to adopt next-gen HDMI (assuming the usual 6-7 year interval between console generations). It's still a niche market, though, and currently only Sony would have the added benefit of complementary products.
 