What statistics?
You've not posted anything that is backed up by real numbers. You are literally just making things up based on nothing, versus looking at real numbers and basic economics.
I can't post the address of the aggregator site where I got the numbers for the FreeSync vs. G-Sync model variants, but I can post a screenshot if you want. Are you disputing the 6:1 ratio?
I also gave the Acer XZ321Q vs. Acer Z321Q links for a price-difference comparison. Unsurprisingly, the discussion suddenly dropped at that point.
I also provided calculations to show the logic behind my theory of why G-Sync is mostly found on £500+ monitors. Nobody picked up on that, either.
By comparison, all you've been doing is sticking to "nVidia sells more GPUs".
They both increased market share? AMD just went from 30% to 27% in discrete GPU sales, in a quarter where they had a major product release. People buying APUs, even if they did pair them with FreeSync, aren't affecting G-Sync sales in the slightest.
Just because nVidia focuses only on selling discrete GPUs doesn't mean we have to limit AMD to that as well. It's called market cannibalization; you should look it up. And they did still increase their shipments, even in discrete GPUs.
As for the major product release: stock is still in shortage, and the problem is that it's not the gamers who were/are getting the cards, but the miners.
Read here if you want to know why that's a bad thing. Anyway, this resulted in more people being driven towards nVidia. Also, as noted earlier: some people only want AMD to succeed so that nVidia would lower their excessive prices.
As for APU & FreeSync pairing: if they pair them with FreeSync, then it's not a G-Sync. By your logic, people buying shoes from Nike is not affecting Adidas sales in the slightest.
Let's be clear here: I'm not saying FreeSync is going to die off. It can't, really, as it's already done and out there, and there is no benefit to removing it.
Well, that's where we disagree. Obsolete technologies die off. Look at the previously mentioned 3D, for instance. FreeSync isn't immune to it either; it's just in a far better position than G-Sync.
You are saying G-Sync will die; however, none of the data supports that.
What do you mean, "none of the data supports that"? The manufacturers flocking to FreeSync supports it. The G-Sync price premium supports it. You do know how standards wars go, right?
I'm not going to get into a nonsense argument over what might happen in 5 or 10 years' time; it's irrelevant. FreeSync is having no effect on G-Sync, and that was its entire purpose. Here we are 2 years on, and it's done nothing for AMD.
What you consider "done nothing for AMD", others see differently: without FreeSync, AMD would be in a far worse position. And as a side note, I'm not giving G-Sync more than five years... (unless the previously mentioned "evolvement" or a price drop happens).
Just looking at your claim that there is no margin in G-Sync: the G-Sync version is more expensive by $100-200, the FPGA costs $25 in bulk... but no margin? If that were even remotely true, it would also mean there is no margin on any monitor and the whole industry is about to go out of business. It's complete and utter nonsense to say there is no margin on G-Sync monitors.
And that's not how manufacturing works. Do you know why we had flickering PWM-driven LED backlights? Because the component that would have allowed a no-PWM design would have cost 50 cents instead of 10 cents (*). So $25 is actually a huge deal, and that's only the MATERIAL cost. Then there are nVidia's engineering costs that have to be recouped, in addition to the extra engineering by the monitor manufacturers. Read more here (this doesn't even take into account the R&D, which is a major cost in the computer industry):
https://www.investopedia.com/terms/p/production-cost.asp
https://www.investopedia.com/ask/an...en-production-cost-and-manufacturing-cost.asp
After that, there is the profit margin expected by potential investors, shareholders or the business owner. That is based on the risk-free interest rate, combined with the industry-specific profit margin expectation. Read more here:
http://www.inetstart.com/how-to-set-your-pricing-and-profit-margins-on-computer-products.html
Also, $200 isn't nearly enough. Even the example pair I provided earlier had a £270 ($360) difference.
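To put rough numbers on this cost-stacking point (every figure below is an assumption picked purely for illustration, not an actual BOM, licensing or margin number), here's a back-of-the-envelope sketch of how an added factory cost gets multiplied by each party's margin, plus VAT, before it reaches the shelf:

```python
# Back-of-the-envelope cost stacking -- all inputs are assumptions for illustration.
fpga_bom          = 25.0   # assumed module material cost per unit (the $25 figure above)
nvidia_nre        = 20.0   # assumed nVidia engineering/licensing recouped per unit
extra_engineering = 15.0   # assumed extra design/validation work by the monitor maker

added_factory_cost = fpga_bom + nvidia_nre + extra_engineering

# Each party in the chain applies its own margin on top of its costs,
# and VAT lands on the final retail price.
manufacturer_margin = 1.30   # assumed 30% margin at the monitor manufacturer
distributor_margin  = 1.10   # assumed 10% at distribution
retailer_margin     = 1.25   # assumed 25% at retail
vat                 = 1.20   # 20% VAT (UK rate) on the retail price

retail_premium = (added_factory_cost
                  * manufacturer_margin
                  * distributor_margin
                  * retailer_margin
                  * vat)

print(f"Added cost at the factory gate: ${added_factory_cost:.2f}")   # $60.00
print(f"Premium by the time it hits the shelf: ${retail_premium:.2f}")  # ~$129
```

Even with those deliberately modest assumed inputs, a $60 cost adder more than doubles by the time it reaches the shelf; plug in whatever nVidia actually charges for the module and its R&D recoupment, and a £270 gap stops looking surprising.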
Also, re-read my earlier explanation:
"Or are we talking about different concepts/terms? When I'm talking about margin, I'm talking about the "extra" that is left after the costs are deducted from the selling price. G-Sync monitors have higher manufacturing and engineering costs, and to recoup these costs, they need to be kept at a higher price point, or be sold at a loss. Now, who is going to take that hit? nVidia? No. The manufacturer? No. The retailer? No. So who do we have left? Yes, it's indeed the consumer, by paying the nVidia tax. Who SHOULD take the hit? nVidia, because manufacturers and retailers don't really have a personal stake in the matter; they can just manufacture and sell FreeSync monitors instead, which is what they are increasingly moving towards. And while nVidia can foot the bill on consumers, the consumers will naturally direct their interest elsewhere, a.k.a. FreeSync."
Value is entirely subjective; the person buying ascribes value to something. So more people value nVidia cards.
"Best value for money". Yes, you can purchase a card that offers 10000 points in a benchmark for £500, or you can purchase a card that offers 12000 points for £800. For those that NEED the extra 2000 points, and have some excess cash, it is indeed better value, because they NEED it. But the first one gives 20 points/£, whereas the latter gives 15 points/£. Then there is the crowd that is willing to take 6000 points for £200 (30 points/£).
Similarly, a £480 monitor with FreeSync offers value for some, whereas the same monitor with G-Sync for £750 offers value for others. Some people are willing to pay extra for the lock-in they got suckered into. Then there are people who refuse to pay the price premium.
If we are talking about value for money in computer hardware, the natural starting point is indeed performance/£. But certainly, if someone prefers to have it in black, then they can prioritize their purchasing options accordingly.
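To put the card arithmetic from above in one place (the tier labels are just mine; the point and price figures are the ones from the example):

```python
# Points-per-pound comparison using the example figures above.
cards = {
    "budget":    (6000,  200),   #  6000 benchmark points for £200
    "mid-range": (10000, 500),   # 10000 points for £500
    "high-end":  (12000, 800),   # 12000 points for £800
}

for name, (points, price) in cards.items():
    print(f"{name:>9}: {points / price:.0f} points/£")
# budget: 30, mid-range: 20, high-end: 15
```

The same division applies to the monitor example: identical panel, £480 with FreeSync vs. £750 with G-Sync.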
And not just more: they waited for Vega to come out and then jumped on nVidia, to the tune of 4 times as many extra cards sold.
Like I stated earlier, some people only want AMD to succeed because they want nVidia to lower their prices. Also, stock shortages. Please pay attention.
(*) OK, that's only part of the reason; partly it was that some people thought 120Hz flicker with LED would be no worse than 120Hz on CRT or CCFL -- which was totally incorrect.