Missing the point. Consumers are not choosing FreeSync; they are buying cheap screens because they are cheap and then buying Nvidia GPUs to go with them.
Nope, I don't think I'm missing the point. And I'm not actually even disagreeing with the above sentence. But as EsaT so eloquently noted earlier, and as I tried to drive home myself, though you apparently ignored it: "at the moment"
We all agree that nVidia is indeed currently selling more GPUs. But with their present G-Sync strategy, they are undermining their future GPU sales. Currently people keep ignoring G-Sync monitors and instead purchase FreeSync monitors, simply because G-Sync is indeed too expensive. And because of FreeSync's zero price premium, there is no reason NOT to go for FreeSync if you're not willing to pay extra for G-Sync. Furthermore: good luck finding a post-2015 100Hz+ monitor without FreeSync or G-Sync (hint: BenQ Zowie TN monitors).
But monitors are a long-term purchase, whereas GPUs are a noticeably shorter-term one. So if the customer already has a FreeSync monitor, then at the next GPU upgrade, the red team suddenly starts to make more sense. For some people, that upgrade phase has already begun, and the red team will keep picking up more and more of those purchases. The way things are going, we can expect this trend to accelerate.
All in all, the logical next step for nVidia's G-Sync strategy is to start supporting FreeSync / Adaptive Sync.
Or if I'm still missing the point, please do elaborate.
Lol at gsync monitors being low margin.
I find it really funny that someone can accuse gsync of having low margins, and then the next person comes up with a £200 monitor as proof that freesync is thriving.
How are these two things interconnected? The low margin was presented as a reason why G-Sync prices won't drop. FreeSync, on the other hand, has zero premium, so it doesn't affect margins in the first place.
Or are we talking about different concepts/terms? When I'm talking about margin, I'm talking about the "extra" that is left after the costs are deducted from the selling price. G-Sync monitors have higher manufacturing and engineering costs, and to recoup these costs, they need to be kept at a higher price point, or be sold at a loss. Now, who is going to take that hit? nVidia? No. The manufacturer? No. The retailer? No. So who do we have left? Yes, it's indeed the consumer, by paying the nVidia-tax. Who SHOULD take the hit? nVidia, because manufacturers and retailers don't really have a personal stake in the matter, as they can just manufacture and sell FreeSync monitors. Which is what they are increasingly moving towards. And while nVidia can make consumers foot the bill, the consumers will naturally direct their interest elsewhere, a.k.a. FreeSync.
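To put that margin arithmetic in concrete terms, here's a minimal sketch; every figure in it is made up purely for illustration, not taken from any actual bill of materials:

```python
# Margin = selling price minus total cost. All figures are hypothetical.
base_cost = 250          # panel, electronics, assembly (£)
gsync_module_cost = 150  # assumed extra hardware/licensing cost for G-Sync (£)
target_margin = 50       # what the manufacturer needs to fund operations (£)

freesync_price = base_cost + target_margin                   # £300
gsync_price = base_cost + gsync_module_cost + target_margin  # £450

# Holding the margin constant, the module cost lands on the consumer 1:1.
print(f"FreeSync: £{freesync_price}, G-Sync: £{gsync_price}, "
      f"nVidia-tax: £{gsync_price - freesync_price}")
```

The point being: with the margin held constant, the only place the extra cost can go is the sticker price.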
Currently all monitors are low margin. Manufacturers wouldn't need to resort to shoddy QC if they had proper margins to fund their operations. In the last decade or so, while the technology was relatively stable, manufacturers raced to the bottom on price. But now that new technologies are being introduced, which naturally necessitate higher price tags, they're trying to re-educate consumers into accepting the higher prices. That sort of thing takes a lot of time, though.
people have been saying "gsync is dead" since day one; however, it is still here, with no big price drops and more monitors announced for release
Have they said it IS DEAD, or WILL DIE? There's a difference. I thought the consensus was that G-Sync is overpriced, and nVidia will lose the battle because of it. Meaning it WILL die.
there isn't a gsync version of a 75Hz VA monitor though, because they aren't throwing gsync into every monitor just for the sake of it - even the HDR gsync models have been delayed due to the panels not being up to spec - gsync is going for the premium market and keeping to specs instead of being thrown into everything just because
if people want freesync over and above having a consistent experience, then there is a market for that; it's not the market nvidia are aiming at, though
He means they are fussy about which panels they match it to; some of the cheap panels with F-Sync and poor ranges are basically too cheap and not good spec-wise.
Gsync is curated to provide a consistent experience instead of just being thrown onto everything because it ticks a box.
... prefers F-Sync's poor Hz ranges and the poorer panels used in some cases.
Honestly they throw it into any cheap panel to get the numbers up.
Now, this would otherwise be a sound argument, except FreeSync can be found in the high-end, the mid-range AND the low-end. More choice is a good thing. Surely we're not giving G-Sync extra credit for a LIMITED selection?
Also, I thought it was universally agreed that G-Sync monitors carry the nVidia-tax. It's not about "quality", it's about recouping costs. If someone disagrees, then please tell me the differences between these two:
Acer XZ321Q
Acer Z321Q
What justifies the price difference? The panels are the same (Samsung's LTM315HP01, I think?). The FreeSync counterpart actually has BETTER features. So unless there is some panel binning / cherry-picking going on in the background, that "premium quality" claim holds no water.
(otherwise identical monitors from the same manufacturer are hard to come by, so this pair is my favourite comparison point - anyone is welcome to add more to confirm or refute)
The more probable reason why G-Sync is mostly found on £500+ monitors is that the premium's relative share of the price is smaller when the base price is high. For example, if you slap a £200 premium on a £300 monitor, the premium makes up 40% of the total price, which accentuates the price disparity when the consumer is doing comparisons. But slap even a £300 premium on a £700 monitor, and it's still only 30% of the total. Whereas with FreeSync, that £200-£300 can go towards other positive features (better resolution, higher refresh rate, better panel, QC).
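For anyone who wants to check those percentages, here's the same calculation as a short sketch (the prices are the illustrative figures from the paragraph above, not real market data):

```python
# Premium as a share of the total sticker price: premium / (base + premium).
def premium_share(base_price: float, premium: float) -> float:
    return premium / (base_price + premium)

print(f"{premium_share(300, 200):.0%}")  # 40% of a £500 monitor
print(f"{premium_share(700, 300):.0%}")  # 30% of a £1000 monitor
```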
Who the hell wants to try running 8k today? 4k is not even ideal for gamers yet.
I know some will say Web/Photo work etc but...
I think the original message was that Panos' claim - that all Pascal cards should technically support FreeSync because of their full-blown DP1.4 - is apparently incorrect, and a counter-point was introduced in the form of DSC/8k. It doesn't matter whether anyone needs 8k or not. If the port is indeed supposed to be "full blown" DP1.4 but is not, then we can't straight up assume FreeSync capability, either.