Upcoming FreeSync monitors

AMD has released a list of upcoming FreeSync-compatible monitors (apparently also showcased at CES):

Manufacturer -- Model # -- Size -- Resolution -- Refresh Rate -- (panel type, pricing)

BenQ -- XL2730Z -- 27" -- 2560x1440 -- 144Hz -- (TN)
LG Electronics -- 29UM67 -- 29" -- 2560x1080 -- 75Hz -- (IPS, 319€)
LG Electronics -- 34UM67 -- 34" -- 2560x1080 -- 75Hz -- (IPS, 549€)
Nixeus -- NX-VUE24 -- 24" -- 1080p -- 144Hz -- (TN)
Samsung -- UE590 -- 23.6"/28" -- 4K -- 60Hz -- (TN, 28" $600)
Samsung -- UE850 -- 23.6"/28"/31.5" -- 4K -- 60Hz
Viewsonic -- VX2701mh -- 27" -- 1080p -- 144Hz -- (TN)

Sources:
http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=2002910
http://www.computerbase.de/2015-01/freesync-kostet-bei-lg-319-und-549-euro/

Another thread here at OcUK is updated more frequently, so you might want to bookmark that:
http://forums.overclockers.co.uk/showthread.php?t=18637089
 
While this might go against the general consensus, I hope someone makes 30"+ 1080p/1200p FreeSync monitors (preferably still 16:9/16:10, IPS and 120Hz+). I'm using a 42" 1080p HDTV atm, and I'm fine with its resolution (from a 1m viewing distance). Unfortunately, it seems 1440p and 4K are all the rage nowadays for anything above 27".
 
I'm with you on that. Philips did change that recently; I hope it sells well and more follow. It's nice to see a 40-inch display with DisplayPort, I thought it would never happen when you still see 4K TVs shipping with SCART.

With all the Windows scaling issues, I don't feel like squinting to see stuff.
OLED TVs are finally coming down in price, and there's interesting stuff like QD on the way.
I want to see DisplayPort on all of that, not just HDMI 2.0. ><

I want to get away from TN/VA; I want the next tech on PC as soon as possible.

[rant]
I'd like to comment on the Philips part: I don't know how they handle computer monitor support, but I've had VERY BAD experiences with their HDTV support. And from the looks of it, I'm one of the lucky ones, as my issue is mainly colour banding; the rest of the TV is still, in essence, a working unit. Some people are practically left without a working unit (they only get a blank screen), and fixing it themselves (by downgrading the firmware) would void the warranty. From the looks of it, this is precisely what Philips is trying to achieve by not fixing the widely known bugs. I was previously very happy with one of their old CRT monitors, but that was because I never had to deal with their support department. With their current support they're actually making things worse all the time, and for some reason they refuse to go back to the working firmware iterations... (hmm, I wrote a review of my unit at some point, maybe I should update it)
In summary: for the time being, I would recommend staying away from Philips entirely. The same goes for TP Vision, which is directly tied to their TV business.
[/rant]

As for scaling issues: yes, that's one of the reasons I refuse to go above 1080p. And why the scaling issues aren't being looked at by the manufacturers or OS providers, I have no idea.

As for people talking about V-sync:
One thing has started to trouble me with both G-Sync and FreeSync:
If V-sync is ON (within the specified range, 30-144Hz for example), does it still cause the usual input lag associated with V-sync, or does the variable timing mitigate this? Or is V-sync only a backup plan for out-of-range situations? The linked video (from around 2:20 onwards) gave a slightly mixed message, but I got the impression that enabling V-sync did affect the image/motion even inside the specified fps range. Did I misunderstand that part?
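
Just to illustrate what I'm asking, here's a rough sketch of the decision logic I imagine a driver might use; the range values and behaviour descriptions are purely my own assumptions, not anything documented by AMD or NVIDIA:

[code]
# Purely illustrative sketch (Python) of how I imagine a driver might combine
# variable refresh with the V-sync toggle. The range values and the described
# behaviour are my own assumptions, not documented AMD/NVIDIA behaviour.

VRR_MIN = 30   # assumed lower bound of the monitor's variable range (Hz)
VRR_MAX = 144  # assumed upper bound (Hz)

def present_frame(fps, vsync_enabled):
    if VRR_MIN <= fps <= VRR_MAX:
        # Inside the range: the monitor refreshes when the frame is ready,
        # so there should be no classic V-sync wait (as I understand it).
        return "variable refresh, frame shown immediately"
    if fps > VRR_MAX:
        # Above the range: V-sync on would make the GPU wait (input lag),
        # V-sync off would simply tear at the top of the range.
        return "wait for next refresh (lag)" if vsync_enabled else "tearing"
    # Below the range: the panel must refresh anyway, so either the previous
    # frame is repeated (V-sync on) or the new one tears in (V-sync off).
    return "repeat previous frame (judder)" if vsync_enabled else "tearing"

print(present_frame(90, vsync_enabled=True))   # variable refresh, frame shown immediately
print(present_frame(200, vsync_enabled=True))  # wait for next refresh (lag)
[/code]

That's roughly the behaviour I'm trying to confirm or rule out.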

He quite specifically points out in the video that this is NOT a FreeSync or AMD limitation, but down to those specific panels and what their controllers are capable of (or it could simply be down to a company's willingness to get it working).

The BenQ supports down to 30Hz, which is as low as G-Sync goes; the other panels support a smaller window for whatever reason, but they max out at 60Hz because those are 60Hz panels.

Hmm, I wonder whether manufacturers and retailers will start to advertise the functional range, or whether that's something customers will have to find out for themselves from reviews, as we currently do for input lag, PWM, etc.

As for the ranges in general:
I was hoping we could get stutter-free 24fps and 25fps videos, without the need to manually change the refresh rate.
(not sure if 24Hz and 25Hz are actually even available with computer monitors, but it's possible with HDTVs connected to computers, at least)
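
To put some numbers on it, here's a small illustrative sketch (the ranges are just example figures, not taken from any specific monitor or AMD documentation) of how a low video framerate could be kept inside a variable refresh window by showing each frame a whole number of times:

[code]
# Illustrative only: find the smallest whole-number frame multiplier that
# lifts a low source framerate into an assumed variable refresh window.

def multiplier_for(source_fps, vrr_min, vrr_max):
    m = 1
    while source_fps * m < vrr_min:
        m += 1
    # None means the multiplied rate overshoots the window, so it won't fit
    return m if source_fps * m <= vrr_max else None

print(multiplier_for(24, 30, 144))  # 2 -> 24fps film shown at 48Hz
print(multiplier_for(25, 55, 75))   # 3 -> 25fps video shown at 75Hz, just inside
print(multiplier_for(24, 55, 75))   # 3 -> 24fps film shown at 72Hz
[/code]

Whether the monitor/GPU actually does any frame multiplication like this is exactly the part I don't know.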
 
From past experience, new stuff coming to the EU usually costs pretty much the same in pounds as it does in dollars. So even after taxes and exchange rates, there's still a premium. After a while it balances out a little, but in the EU we do tend to have a longer and more expensive early-adopter phase.

But indeed, exchange rates have fluctuated somewhat over the past year, so it remains to be seen what the situation is after a few months.

edit: Whoa, there was only one reply on this page when I started typing and checking the rates...
 
I think they (nVidia) held the leash too tight. They wanted to limit the supply, keep the prices high and recoup their R&D costs that way. From a financial perspective, these are indeed things you should do, in moderation. But now it seems they went overboard: they created hype, but the pricing (and the available models) kept customers from actually buying it. Now there's an alternative solution that seems to achieve the same results. I can't say the momentum was lost; they never really gained it in the first place.

Then again, AMD can still mess up their own pricing, as well.
 
The pricing has nothing to do with AMD.

... What do you mean?

Ok, I'll first state how I've understood the current situation, so someone can correct me if I've misunderstood something:

1. Adaptive Sync is an optional VESA standard for DisplayPort
2. AMD was part of the group bringing it into existence
3. Adaptive Sync is a royalty-free standard
4. AMD has no say as to how it's priced
5. Freesync, OTOH, is AMD's proprietary implementation of Adaptive Sync, and AMD holds all rights to Freesync
6. To implement Freesync in a monitor, the manufacturer would first have to make arrangements with AMD for it
7. If AMD wanted, they COULD initiate royalties or other costs for Freesync

Is it currently known WHO makes the hardware and/or software required for Freesync compatibility? If AMD has a part in it, they can charge for labour, materials and/or royalties. As far as I know, we haven't been given that info yet?

Example:
nVidia could set the royalties for manufacturing G-Sync monitors as "free", but still control the scaler/IC manufacturing. And, instead of reasonable pricing, they could charge 10x the regular cost of the hardware implementation.

Furthermore, is Freesync limited to AMD's GPUs? If AMD wanted, they could alternatively recoup R&D costs by increasing the price of said GPUs.

This is no doubt what nVidia is currently doing with their GPUs.
 
@melmac:

I had several thoughts on the points you provided, but they all come down to the same inconsistency: you're suggesting that Freesync == Adaptive Sync. Yet every source I found consistently positions Adaptive Sync as the standard, and Freesync as an implementation of that very standard.

Even your own statement:
"Freesync isn't an implementation of adaptive sync. It's just what AMD are calling their method to connect to an adaptive sync monitor."

Doesn't the latter part practically mean that Freesync is indeed AMD's implementation of the Adaptive Sync standard?

Considering the overwhelming weight of sources contradicting your statement, I will need to ask for sources for your claim.

Here are a few examples of mine:
http://support.amd.com/en-us/kb-articles/Pages/freesync-faq.aspx
http://www.sweclockers.com/artikel/...ngsfrekvenser-med-project-freesync/2#pagehead
http://www.forbes.com/sites/jasonev...adaptive-sync-gets-added-to-displayport-spec/
http://www.tomshardware.com/news/amd-project-freesync-vesa-adaptive-sync,27160.html

Here's a quote, straight from AMD's FAQ:
"DisplayPort Adaptive-Sync is an ingredient DisplayPort feature that enables real-time adjustment of monitor refresh rates required by technologies like Project FreeSync. Project FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video."

Also from the interview with Robert Hallock, Technical Communications Officer at AMD:

"Could you please explain the difference between AMD FreeSync and VESA Adaptive-Sync?
– VESA DisplayPort Adaptive-Sync is a new component of the DisplayPort 1.2a specification that allows a graphics card to control the refresh rate of a display over a DisplayPort link. As it seems there is some confusion, I want to emphasize that DisplayPort Adaptive-Sync is not FreeSync. By itself, DisplayPort Adaptive-Sync is a building block that provides a standard framework a source device, e.g. a graphics card, can depend on to execute dynamic refresh rates.

DisplayPort Adaptive-Sync is an important development in our industry, however, because there now exists an industry-standard framework that dynamic refresh rate technologies, like Project FreeSync, can rely on to deliver end-user benefits: no tearing, minimal input latency, smooth framerates, etc. Make no mistake, providing dynamic refresh rates to users still takes a lot of ‘secret sauce’ from the hardware and software ends of our products, including the correct display controllers in the hardware and the right algorithms in AMD Catalyst.
"

I think the emphasized parts make the situation quite clear.
 
@melmac:

???

My conclusion was that Freesync =/= Adaptive Sync. As a confirmation, what conclusion did YOU reach? I'm getting somewhat mixed signals here. For example, on my initial point 6, which discussed Freesync, you replied with a discussion about Adaptive Sync. Even more confusing is that (IMO) you implied that Freesync (or its implementation) "has nothing to do with AMD" (?!). On point 7, where I talk about AMD and Freesync, you reply with a discussion about VESA and Adaptive Sync. So to me, those gave the impression that you consider them one and the same, since you use them interchangeably, back and forth. But in your latest reply you seem to be making a distinction between them?

Or are you depicting the Adaptive Sync as the "hardware" portion, and Freesync as the "software" portion, or something along those lines? But even that would still fall inside the "implementation" concept.

And just to make sure: when you talk about "their method", to me that's pretty much synonymous with "implementation", "integration", "solution", etc. (in this context, naturally).

Also, you do understand I'm using quotes straight from AMD... And IMO, they make a pretty clear distinction between Freesync and Adaptive Sync, I even emphasized the parts that confirm this.
 
6. To implement Freesync in a monitor, the manufacturer would first have to make arrangements with AMD for it
Nope, it has nothing to do with AMD. They can decide if they want to make the monitor adaptive sync or not themselves.

I'm talking about Freesync and its implementation, and you said AMD has nothing to do with it. This also links to the next point: the way I see it, monitor manufacturers need to talk to AMD if they want to use the Freesync trademark and advertise their products as compatible. And this isn't special to Freesync; it applies to all trademarks. At some point in the value chain, somebody has to make an agreement about using the trademark, and possibly even pay for it, unless the trademark holder has made a public statement allowing free use (potentially with light/obvious restrictions).

For example, monitor manufacturers have to pay not just for using an HDMI port, but also for using the logo. If AMD says no to using Freesync, manufacturers can't legally advertise it as such (Adaptive Sync would be a separate agreement, btw). The monitor might adhere to all the technical requirements, and might even work as-is (possibly after a patch) if AMD/nVidia don't impose any sort of certification check. With big and trustworthy manufacturers this usually isn't a problem, as AMD actually WANTS more manufacturers to use their tech. But that doesn't prevent AMD from charging for the usage of their trademark, if they wanted to.

I might be misunderstanding something, but I always took Freesync to require something extra from the monitor as well. Not necessarily anything more than a few extra lines of code in the firmware (like a certification code). At CES, I remember one monitor being advertised as Adaptive Sync compatible (a quick search suggests it was the Asus MG279Q), while others were advertised as Freesync compatible.

And no, I didn't assume any similarity to G-Sync, as I don't know that much about G-Sync's internal architecture to begin with.
 
(just a minor correction: they weren't "questions", but statements of how I saw the situation -- after the numbered statements, I gave a few questions, though)

Hmm, I think there's a disagreement, then. In my view, I still see Freesync as an implementation of the Adaptive Sync technology.

I see Freesync as something that's supposed to ENHANCE the Adaptive Sync technology. Adaptive Sync is the base requirement, but Freesync adds some extra "secret sauce" to give it an edge over the base functionality. As for G-Sync, it apparently takes advantage of specialized hardware to do things slightly differently, while achieving similar results.

The trademark discussion is indeed more about the right to advertise it as such, but also about the right to use the proprietary code, not just the logo. As for who pays for the logo usage: it would definitely be the manufacturers. AMD is "selling" a product to the market, Freesync. Monitor manufacturers are selling their product, the monitors. Freesync would be a positive feature they can flaunt in their products. I see no reason why AMD would pay the manufacturers instead. Whether AMD would even charge for it is another matter. Granted, AMD has an incentive to make it easy and cheap for manufacturers to become "part of the family", to make Freesync a ubiquitous technology, but it would make little sense to actually pay manufacturers to use it. That would be more akin to bribing, actually. :D

And this links to the part where I said "AMD can still mess up their own pricing". But that's not even all of it. There's a huge temptation to recoup R&D costs via GPU price hikes. They know the industry/market is now interested in the tech, and they could quite safely keep prices 5-10% above what they would normally charge. If they get greedy, they will add 30%, and thus "mess up". Another point would be the cost of Freesync certification. Is it an expensive proprietary certification, or is there a standards body that can certify it while doing the other standard tests (CE, for example)?

And just to make sure, when I say "a few lines", I'm not talking about "Freesync_enabled=1", I'm talking about something on the order of a few kilobytes. There could be some proprietary algorithms, for example, that work in concert with the Freesync-compatible GPU. Then again, it could be just a certification check, so the GPU knows how to proceed.

As for Adaptive Sync royalties:
Yes, and I stated this in point 3. But bear in mind, manufacturers still can't advertise their products as compatible without a prior agreement. This could be something as simple as a certified test verifying that the product conforms to the standard, though.
 
@melmac:

Bear with me a moment longer as I try to grasp this.

Hmm, ok. I think I get it now. So for now, I can accept that there is NOTHING of AMD in the monitor itself: no proprietary code, no info bits, whatsoever. So there's no technical aspect to license, either. I had misunderstood this part. Which is actually quite disappointing, as it effectively means that Freesync can't bring anything MORE to the table than what Adaptive Sync provides, so there's no "enhancement" like I initially thought there was. That would also mean there's no need for any technical Freesync certification, only for Adaptive Sync.

Nevertheless, I still don't see how Freesync wouldn't be defined as an implementation of Adaptive Sync, sorry. And I also don't see how exclusivity to AMD's GPUs and APUs would change that definition in any way. Or are we perhaps disagreeing on the term "implementation" itself? Wiki's definition is exactly how I've understood the situation (the monitor not containing anything AMD-proprietary doesn't change this):
http://en.wikipedia.org/wiki/Implementation#Computer_science

And I also disagree on the trademark issue. That's not how I've understood trademarks to work, at all. Usually the royalties are paid to the trademark holder, not the other way around... Wiki agrees with me on this, as well:
https://en.wikipedia.org/wiki/Royalties#Trade_mark_royalties

So maybe what you're talking about with "agreements" is something else, as I think it would definitely fall outside trademark agreements, at least. If the monitor manufacturer advertises their products as "Freesync-compatible", then that part is under a trademark agreement: they will need permission to use AMD's trademark, unless AMD has given a public statement (or something similar) defining the requirements for using that specific trademark. Which would still count as permission, though.

And Freesync is AMD's product, and AMD is indeed trying to "sell" it to the market. I used quotation marks around "selling" (not only in this reply) because it's not selling in the traditional sense; it's more like selling an ecosystem, a brand, an image, a functionality. It's something AMD wants the market (users and manufacturers) to adopt as widely as possible, and to create interest/hype around.

As for R&D cost recouping:
The R&D costs don't just magically stop after you release a product into the wild. Yes, in a perfect world products would indeed be "complete"/"final" on release, but in the real world (especially in the computer industry) products have bugs, issues, fixes and improvements. There are probably people working on Freesync at this very moment.
 
Trademark, etc.:
Ok, so that's just some other "agreement" discussion. What I've been saying is that, if AMD so chooses, they COULD initiate royalties for Freesync (i.e. point 7). That holds true even if there's no technical AMD property inside the monitor. If they want to make Freesync more ubiquitous and increase their GPU sales, they will probably want to keep the trademark costs low, though.

And this is just my personal opinion, but I think that for monitor manufacturers the "Freesync-compatible" branding is more profitable than leaving it out.

As for Freesync=product:
Where did I argue against this? I even said it in my last post. They are using this feature to sell more cards. They are a business after all.
-->
AMD isn't selling a product to the market in this case. They pushed for the Adaptive sync technology to be included in the display port standard. Of course they are using this feature to sell more cards, the more people that buy an adaptive sync monitor, the more people might buy AMD cards.

There were slightly mixed signals there, so I just wanted to make sure we're on the same page about Freesync being a product to be "sold".

R&D, etc.:
The point was that they've just brought Freesync to the market. From this point forward, they will be recouping the material and labour costs that went into bringing it to market. Or are you suggesting that the laptop functionality was already all they had to do, and they just copy-pasted it to the desktop side of things? I would disagree with that. I'd guess they had a whole department working on it, and not just over a weekend.

Also, they naturally can't divide the R&D costs equally between a £20 card and a £200 card. So bringing up a bargain-bin product as an example doesn't prove the point, I'm afraid.
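
Just to illustrate that last point with completely made-up numbers (these are not AMD's figures, and splitting by revenue is only one possible rule), here's a quick sketch of spreading a fixed R&D cost over a lineup in proportion to each card's revenue rather than equally per unit:

[code]
# Made-up numbers, purely illustrative: spreading a hypothetical R&D cost
# across a lineup in proportion to revenue instead of equally per card.

rd_cost = 2_000_000  # hypothetical R&D spend to recoup (not a real figure)
lineup = {  # model: (price in £, units sold) -- all invented
    "budget card": (20, 500_000),
    "mid-range card": (200, 300_000),
    "high-end card": (400, 100_000),
}

total_revenue = sum(price * units for price, units in lineup.values())
for name, (price, units) in lineup.items():
    share = price * units / total_revenue  # this model's share of total revenue
    premium = rd_cost * share / units      # per-card premium for this model
    print(f"{name}: ~£{premium:.2f} per card ({premium / price:.1%} of its price)")

# An equal per-unit split would instead be ~£2.22 on every card, which is
# over 11% of the £20 card's price but well under 1% of the £400 card's.
[/code]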

Price-hike being small:
That's why I said they COULD mess up their pricing, not WOULD. I even gave examples of what I would think would be fairly acceptable and what would be over-the-top. It was way back in the thread.

And actually, I had already read the WHOLE thread before I even made the previous reply. The whole discussion spiraled out from "AMD can still mess up pricing" and "Freesync = implementation", both of which still hold true from my point of view. Granted, part of the argument's momentum disappeared with the no-technical-component point, but that still doesn't change the bottom line.
 
Trademarks:
But they are not paying for the technology, they are paying for the trademark. And stupid or not, trademark royalties just work that way. And like I said, I'm not saying they WILL charge for it, I'm saying they COULD charge for it. I think I've emphasized this enough.

Being selective:
And I wasn't being selective: whether there is anything of AMD in there or not, the branding is still more beneficial than not. But like I said, it's just my personal opinion. Furthermore, if they don't want to mention AMD, it's not like they HAVE to; they can still advertise the regular "Adaptive Sync compatible" or "variable refresh rate compatible".

Product:
Umm, didn't I already say myself that it's "not in the traditional sense of selling a product"...? From a company's point of view, even a service is a product. Freesync is indeed a product; it can't be packaged and shipped, but it's still a product. That's why I used the quotation marks around "selling".

R&D:
"It costs AMD next to nothing"
"There is nothing that AMD has to recoup"
"as they have to pay these people anyway"
(!!!)

... Ok, I'm not even going to go into that. Let's just say I disagree VERY strongly with your view of what typical R&D costs. And I think that last quote tells every economist/engineer that it would also take a relatively big effort to explain. :D
(or maybe it's me who's wrong, so let's just leave it at that)

Pricing:
I didn't base the price hikes on R&D. Those were just my own opinions on how much they could and could not get away with. I think I was quite clear on that.

Bluntness:
And sorry to be blunt myself, but you haven't brought any piece of evidence that would convince me otherwise.
 
@melmac:

Trademark:
It doesn't matter which one you think might potentially gain more. That's not how TRADEMARK royalties work. If there is some other kind of agreement, that's their business. But it's outside the trademark royalty issue.

And for the record, I did consider it (the part about who benefits more), but like I already stated, I just disagree with it. You do understand you are disagreeing with my opinion, just the same?

"Adaptive Sync -compatible" or "variable refresh rate compatible":
... Are you arguing with me on a technicality, or something? Fine, they'll say it "supports"/"uses" (or whatever word you want to put in there) Adaptive Sync / variable refresh rate. Either way, the point was that they can still flaunt Adaptive Sync / variable refresh rate, whether they mention Freesync or not. Or are you suggesting they wouldn't want to mention them at all?

Product/service/etc:
It would seem I'd also have to explain the wider concept of a product, which I'm not going to do. In short, when you talk about a product, you seem to mean a physical object. I'm talking about a product in the marketing sense, from a company's perspective: that includes not only physical objects, but pretty much anything you can put a price on. Anything you can collect money from is essentially a product for the company holding the rights to it.

R&D:
I wasn't kidding when I said I won't go further into that. If it is indeed that AMD's R&D for Freesync was 0€, then yes, you are correct, they have nothing to recoup. I strongly disagree with that assessment, though.

Pricing:
Like I said, "on how much they could and could not get away with". If you want a term for it, then "consumer/market psychology/behaviour". Freesync has created hype around itself, which AMD can take advantage of.

Links:
?? As I recall, those links were to prove that Freesync =/= Adaptive Sync. Which they did prove. Although they weren't really needed, because I had simply misread your replies as suggesting the two were the same thing.

And I don't remember finding anything that would actually DENY the monitor component. But because there's no mention of it either, I took your word for it instead. Even if there WERE a physical component in the monitor, those articles would probably still hold true. So at the moment I'm simply trusting your word on that (combined with the logic that such a component would presumably have been mentioned if it existed).

But I agree with Marine-RX179, this is going too far off-topic, so let's try to wrap this up. In essence, we are disagreeing on so many fronts that I don't think it's worthwhile to continue when we have such radically different views. So I'd suggest we just agree to disagree.
Meanwhile, to the topic at hand:
It seems the operating frequency ranges are a bit narrower than I had anticipated. From the pre-order thread, there are a few details (only Acer monitors, though):

Acer XB270HA = 55 - 144Hz
Acer XB270HU = 30 - 144Hz
Acer XB280HK = 55 - 75Hz
Acer XG270HU = 50 - 144Hz

The 30-144Hz is quite alright, but that 55-75Hz is quite ... depressing. Anyone buying these monitors should definitely check the ranges before purchase, so there are no inconvenient surprises afterwards.
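
For anyone who wants to sanity-check their typical framerates against those windows, here's a quick throwaway check; the ranges are copied from the list above, and the 45fps figure is just an arbitrary example:

[code]
# Quick throwaway check: does a given framerate fall inside each monitor's
# reported variable refresh window? Ranges taken from the pre-order thread.

ranges = {
    "Acer XB270HA": (55, 144),
    "Acer XB270HU": (30, 144),
    "Acer XB280HK": (55, 75),
    "Acer XG270HU": (50, 144),
}

fps = 45  # e.g. a demanding game dipping to 45 fps (arbitrary example)
for model, (low, high) in ranges.items():
    status = "covered" if low <= fps <= high else "outside the range"
    print(f"{model}: {status} at {fps} fps")
[/code]

With a dip like that, only the XB270HU would still be inside its window.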
 
Trademarks and other agreements:
Like I said, from my point of view sticking a "Freesync" sticker on the monitor is more beneficial than not doing so. You seem to disagree with that, and that is fine. In your case, the "other agreements" would come into play; in mine, trademark agreements. Well, actually your case would also need the trademark agreements, even if the fees were set at 0€. Even more so: if AMD wants to maximize adoption, then 0€ would be the ideal sum in my case as well. I also pointed this out earlier, by the way.

"COULD":
Well yeah, the initial phrases were "... AMD can still mess up ..." and "... COULD initiate royalties or other costs ..." (the capital emphasis in the latter was even in the original). You're the one trying to make it sound like a definite future action... I'm just saying they have the option to do so.

390x for £1500:
Is there something stopping them from doing it? It's certainly not probable, but it's far from impossible. If you had said such a thing, then yes, I would probably have corrected it to "they most likely won't", with reasons.

Freesync in monitor:
No. Just to make sure: I am still taking your word for it (combined with the logic that no such component is mentioned in the articles). But you didn't actually show any evidence for it; there's just a lack of evidence for my stance, which points to your explanation being the more plausible one. If AMD came out tomorrow and stated the opposite, I can't say I'd be flabbergasted.

R&D:
Above, I said I took your word for it. But when you said "as they have to pay these people anyway", I actually started to doubt the whole thing again.

I have no idea how AMD's R&D is structured, specifically. Only AMD knows. But I know how R&D works in general. That earlier statement implied you don't. And it wouldn't be such an easy task to explain.

There's http://en.wikipedia.org/wiki/Research_and_development for R&D in general, but that doesn't cover nearly enough. I don't think there was any one particular article that taught me the basics of R&D, let alone its cost structure. I think the R&D cost issue requires a wider understanding of the basics of economics as a whole. As such, I wouldn't waste time reading individual articles.

If you really want something to look at, chapter 2.3.1.2 of the PDF linked below gives a short example overview (no theory or explanation as such, though) of a cost formula for R&D:
http://www.springer.com/cda/content...7233239-c2.pdf?SGWID=0-0-45-351708-p107940657

If you want more, then use search words like "product life cycle cost structure". Add "R&D" and "in computer industry", if you want more specific results.

Product:
Like I said, explaining the wider concept of "product" is something I'm not very fond of, either. But just for kicks I tried Wikipedia (I didn't expect to find anything useful in this case), and there's actually a fairly suitable page for it:
http://en.wikipedia.org/wiki/Product_(business)

Driver price:
Me personally? I can't. AMD can, they have the financial data. There are usually people who are specifically calculating how much input (money, material, man hours, etc.) the company is using on different areas of operation. Likewise there are people analyzing and estimating what would be the optimal input in the future, as well.
As for the topic at hand:
I'm a little disappointed that there weren't any "consumer-priced" IPS monitors. Something like 1920x1080 24"/27" for £250 would have been nice, even at 60Hz (30-60Hz operational range). £450 for 120Hz, I guess (30-120Hz).

But indeed, I'm beginning to think that the overall usefulness will be very dependent on the operational range, which seems to vary quite drastically. Additionally, now that I think about it: which is technically correct in this case, 30-60Hz or 30-60fps?

And just as a sidenote, the smaller 21:9 LG is rumored to go for 319€, the bigger one for 549€, it seems.
 
If using a logo reduces my market by 80% --> not beneficial
If using the same logo helps the owner more than me --> not gonna pay for it myself, so the only remaining option would indeed be the owner

Yes, I agree with those. I'm just disputing whether that is the case with Freesync.

Could:
It wouldn't matter if I thought Freesync was a dairy product or a space shuttle. The trademark holder can still, if they so choose, charge for the trademark; that's their right as the trademark owner, and it's protected by law. Nobody is forcing anybody to take up that offer, but if you don't take the offer, you can't use the trademark either.

R&D:
Nope, I didn't miss it. You do understand that integrating external and internal technologies/features/processes/whatever into your company's products doesn't happen automatically? They don't just magically appear, fully finished, overnight. It takes man-hours, facilities, material costs (not the production material costs here), and so forth. Those all count towards the overall R&D.

Look, just because you keep asking extra questions doesn't mean I'm going to keep answering them. I told you I'm not fond of explaining such a wide concept. It's commendable that you want to know more, but I can't do the research for you. I have other things to do as well, and they're higher up the priority list, I'm sorry.

So I will say this once again, and leave it at that, I won't repeat it anymore:
"If it is indeed that AMD's R&D for Freesync was 0€, then yes, you are correct, they have nothing to recoup. I strongly disagree with that assessment, though."

Product:
In that case, I can't help you. Same as above, I can't do the research for you.

Links:
Well, thanks for the links, but like I said, I already took your word for it. There's no need to reassure me. Nevertheless, I took a brief glance at them, but didn't notice the part that says there is nothing extra in the monitor. Feel free to quote that part, if you like.

Wrapping up:
This is starting to go in circles. I've repeated many things far too many times already. We're not moving forward. If there is nothing new, I'd suggest we wrap this up quickly. Like NOW.
Ranges:
Hmm, at least HDTVs allow computers to set 24Hz/25Hz. I'm not sure whether it's done with internal frame multiplication to 48Hz/50Hz, 72Hz/75Hz or 96Hz/100Hz (most probably it is, as I can also enable motion interpolation with visible differences).
 
Market impact + Product + R&D:
Like I said, we'll just have to disagree on those. I didn't want to patronize, but I have no intention of becoming your tutor, either.

As for the evidence:
If that is your best evidence, then you have a weak argument. It would indeed prove that there is no strict necessity for a proprietary hardware component. But there's also the possibility that, without a proprietary hardware component, the functionality is merely limited while remaining compatible.

Furthermore, wouldn't that firmware upgrade in itself make it possible to include AMD's enhanced proprietary algorithms, as well, for example?

As such, I wouldn't call that the best evidence. Rather, I would say the best evidence is the lack of any mention of such a component (hardware or software).
Personally, I would very much like the same variable refresh rate functionality to come to HDTVs as well. Unfortunately, the usefulness for regular TV watching would be so limited that there's really no incentive for companies to cater to such a small market. Then again, if it were implemented in game consoles, there's a chance the TV manufacturers (Sony, at least) would follow suit.
 
Mm-hmm. I'm not sure what you think we would find out in a few months that would change, prove or disprove anything that has been discussed here. Theory is theory, that's hard to change, unless I've misquoted some theory. And I'm not sure I've made even a single definite claim - mostly possibilities and personal opinions (both of which I have clearly labelled as such, btw). You, on the other hand, seem to deal more in definitives than conditionals.

But ta-ta, in any case.
There's also the option of the tech being incorporated into next-gen HDMI, for example. Even the latest major HDMI update (1.4 > 2.0) took 4 years, so the next-gen consoles might just have a big enough time frame to adopt a next-gen HDMI (assuming the usual 6-7 year console interval). It's still a niche market, though, and currently only Sony would have the added benefit of complementary products.
 