I find the lack of G-Sync Monitors, Disturbing

Wow, two pieces of anecdotal evidence. We're on a roll. I never said everyone with an Nvidia GPU was going to buy a G-Sync monitor; I said 30% market share on GPUs indicates that if FreeSync monitors are outselling G-Sync, it isn't because people are buying them for FreeSync. And then you provide an example of exactly that.

And yeah, ^ this. G-Sync is curated to provide a consistent experience instead of just being thrown onto everything because it ticks a box.
 
Yes, I knew you misread my post, but he doesn't prove anything other than that he is anti G-Sync and prefers FreeSync's poor Hz ranges and the poorer panels used in some cases.

Honestly, they throw it into any cheap panel to get the numbers up.

Laptops have used basically the same panels for years.
 
At the hardware level, all Pascal cards are capable of it, because they all come with full-blown DP 1.4 and HDMI 2.0b ports, and all the standards apply.

I'm not sure you're correct there. According to Linus Tech Tips, Nvidia's implementation of DP 1.4 doesn't support Display Stream Compression, which means that two DP cables are required for Dell's 8K monitor. So I wonder what else is missing?
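For what it's worth, a rough back-of-the-envelope check (approximate numbers, ignoring blanking intervals and protocol overhead) shows why 8K60 won't fit down a single DP 1.4 link without DSC, hence the two cables:

```python
# Rough sketch of the 8K60 bandwidth problem on DP 1.4 without DSC.
# Figures are approximate and ignore blanking/overhead details.

width, height = 7680, 4320        # 8K UHD
refresh_hz = 60
bits_per_pixel = 24               # 8-bit RGB, no HDR

required_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
dp14_payload_gbps = 25.92         # DP 1.4 HBR3 payload over 4 lanes (after 8b/10b)

print(f"8K60 needs roughly {required_gbps:.1f} Gbit/s of pixel data")
print(f"One DP 1.4 cable carries about {dp14_payload_gbps} Gbit/s")
print("Fits on one cable without DSC?", required_gbps <= dp14_payload_gbps)
```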
 
Who the hell wants to try to run 8K today? 4K isn't even ideal for gamers yet.

I know some will say web/photo work etc., but...
 
Yes, I knew you misread my post, but he doesn't prove anything other than that he is anti G-Sync and prefers FreeSync's poor Hz ranges and the poorer panels used in some cases.

Honestly, they throw it into any cheap panel to get the numbers up.

Laptops have used basically the same panels for years.

Anecdotal evidence means it comes purely from what someone says - it is basically worthless in this context compared with market evidence like sales figures and prices/discounts.

If you did a survey of thousands of people then it might be of some limited value, but a sample size of 2 is completely irrelevant.
 
Video editing, art, and design, really.

Why not quote my whole post and see that you just said exactly what I said someone would say... :rolleyes:

They seem to have done OK until now without 8K. It's once again an attempt to create a market in a slowing market to drive sales, just like 4K and HDR were initially pushed well before they were needed, right up to today.

Edit: how on earth did I forget the curve????
 
Missing the point. Consumers are not choosing FreeSync; they are buying cheap screens because they are cheap and then buying Nvidia GPUs to go with them.
Nope, I don't think I'm missing the point, and I'm not actually even disagreeing with the above sentence. But like EsaT so eloquently noted earlier - and this is the point I tried to drive home, which you apparently ignored: "at the moment".

We all agree that nVidia is indeed currently selling more GPUs. But with their present G-Sync strategy, they are undermining their future GPU sales. Currently people keep ignoring G-Sync monitors and instead purchase FreeSync monitors, simply because G-Sync is indeed too expensive. And because of FreeSync's zero price premium, there is no reason NOT to go for FreeSync if you're not willing to pay extra for G-Sync. Furthermore: good luck finding a post-2015 100Hz+ monitor without FreeSync or G-Sync (hint: BenQ Zowie TN monitors).

But monitors are a long-term purchase, whereas GPUs are noticeably shorter term. So if the customer already has a FreeSync monitor, then during the next GPU upgrade, the red team suddenly starts to make more sense. For some people, that upgrade phase has already begun, and the red team will keep receiving more and more purchases. With the way things are going, we can expect this trend to increase.

All in all, for nVidia's future G-Sync strategy, the logical solution is to start supporting FreeSync / Adaptive Sync.

Or if I'm still missing the point, please do elaborate.
Lol at G-Sync monitors being low margin.
I find it really funny that someone can accuse G-Sync of having low margins and then the next person comes up with a £200 monitor as proof that FreeSync is thriving.
How are these two things connected? The low margin was presented as a reason why G-Sync prices won't drop, whereas FreeSync carries zero premium, so it doesn't affect margin in the first place.

Or are we talking about different concepts/terms? When I'm talking about margin, I'm talking about the "extra" that is left after the costs are deducted from the selling price. G-Sync monitors have higher manufacturing and engineering costs, and to recoup these costs, they need to be kept at a higher price point, or be sold at a loss. Now, who is going to take that hit? nVidia? No. The manufacturer? No. The retailer? No. So who do we have left? Yes, it's indeed the consumer, by paying the nVidia tax. Who SHOULD take the hit? nVidia, because manufacturers and retailers don't really have a personal stake in the matter, as they can just manufacture and sell FreeSync monitors. Which is what they are increasingly moving towards. And while nVidia can pass the bill on to consumers, the consumers will naturally direct their interest elsewhere, a.k.a. FreeSync.
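To put some numbers on that (purely hypothetical figures - the real panel, module and licensing costs obviously aren't public), here's a minimal sketch of the margin argument:

```python
# Hypothetical numbers only, to illustrate the margin argument above;
# actual build and G-Sync module costs are not public.

def margin(selling_price, total_cost):
    """Margin as a fraction of the selling price."""
    return (selling_price - total_cost) / selling_price

freesync_cost = 250      # assumed build cost; FreeSync adds essentially nothing
gsync_cost = 250 + 150   # assumed extra for the G-Sync module and engineering

print(f"FreeSync monitor sold at £300: {margin(300, freesync_cost):.0%} margin")
print(f"G-Sync monitor sold at £300: {margin(300, gsync_cost):.0%} margin (a loss)")
print(f"G-Sync monitor sold at £500: {margin(500, gsync_cost):.0%} margin")
```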

Currently all monitors are low margin. Manufacturers wouldn't need to resort to shoddy QC if they had proper margins to fund their operations. In the last decade or so, when the technologies were relatively stable, manufacturers raced to the bottom on price. But now that new technologies are being introduced, which naturally necessitate higher price tags, they're trying to re-educate consumers to accept the higher prices. But that sort of thing takes a lot of time.

people have been saying "gsync is dead" since day one, however it is still here, no big price drops and more monitors announced to be released
Have they said it IS DEAD, or that it WILL DIE? There's a difference. I thought the consensus was that G-Sync is overpriced, and nVidia will lose the battle because of it. Meaning it WILL die.

There isn't a G-Sync version of a 75Hz VA monitor, though, because they aren't throwing G-Sync into every monitor just for the sake of it - even the HDR G-Sync models have been delayed due to the panels not being up to spec - G-Sync is going for the premium market and keeping to specs instead of just throwing it into everything just because.

If people want to get FreeSync over and above having a consistent experience then there is a market for that; it's not the market Nvidia are aiming at, though.
He means they are fussy about which panels they pair it with; some of the cheap panels with FreeSync and poor ranges are basically too cheap and not good spec-wise.
G-Sync is curated to provide a consistent experience instead of just being thrown onto everything because it ticks a box.
... prefers FreeSync's poor Hz ranges and the poorer panels used in some cases.

Honestly, they throw it into any cheap panel to get the numbers up.
Now, this would otherwise be a sound argument, except FreeSync can be found in the high end, the mid range AND the low end. More choice is a good thing. Surely we're not giving G-Sync extra credit for a LIMITED selection?

Also, I thought it was universally agreed that G-Sync monitors carry the nVidia tax. It's not about "quality", it's about recouping costs. If someone disagrees, then please tell me the differences between these two:
Acer XZ321Q
Acer Z321Q

What justifies the price difference? The panels are the same (Samsung's LTM315HP01, I think?). The FreeSync counterpart actually has BETTER features. So unless there is some panel binning / cherry picking going on in the background, that "premium quality" claim holds no water.
(otherwise identical monitors from the same manufacturer are hard to come by, so this pair is my favourite comparison point - anyone is welcome to add more to confirm or refute)

The more probable reason why G-Sync is mostly found on £500+ monitors is that the premium's relative share is smaller when the price is high. For example, if you slap a £200 premium on a £300 monitor, the premium makes up 40% of the total price and makes the price disparity all the more obvious to the consumer when he's doing comparisons. But if you slap even a £300 premium on a £700 monitor, it's still only 30%. Whereas with FreeSync, that £200-£300 can be spent on other positive features (better resolution, higher refresh rate, better panel, QC).
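Spelled out with the same illustrative premiums as above (example figures, not actual costs):

```python
# Premium as a share of the final price, using the example figures above.

def premium_share(base_price, premium):
    return premium / (base_price + premium)

print(f"£200 premium on a £300 monitor: {premium_share(300, 200):.0%} of the total price")
print(f"£300 premium on a £700 monitor: {premium_share(700, 300):.0%} of the total price")
```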

Who the hell wants to try to run 8K today? 4K isn't even ideal for gamers yet.

I know some will say web/photo work etc., but...
I think the original point was that Panos' claim - that all Pascal cards should technically support FreeSync because of their full-blown DP 1.4 - is apparently incorrect, and a counter-point was introduced in the form of DSC/8K. It doesn't matter whether anyone needs 8K or not. If it's indeed supposed to be a "full blown" DP 1.4 but is not, then we can't straight up assume FreeSync capability, either.
 
If you spin it out long enough then yeah, eventually it might die off; in 10 or 20 years we might not even be using monitors anymore, but I don't think that qualifies as FreeSync having "won" at anything.
 
If you spin it deep enough, then even a mere mention of G-Sync in the drivers could be interpreted to mean that G-Sync is still technically "alive", and thus hasn't "lost".

In 5 years' time, I'm expecting that G-Sync has either
A) evolved into something more multipurpose than what it is now
or
B) been driven into oblivion, like the 3D fad of 5 years ago

This is actually quite a good analogy, as 3D has been partly revived/continued with VR. So just as VR is to 3D, G-Sync had better be something "different" in the future (A). If not, it will follow the fate of regular 3D (B).

With FreeSync vs. G-Sync, I will consider the party that is at some point deemed irrelevant to be the one that "lost", and the other party by definition to be the one that "won".
 
Nvidia increased GPU shipments by 30% this quarter while AMD didn't... if that doesn't make FreeSync irrelevant, then I don't know what does.

Just let that sink in - AMD had a major product release, yet people flocked to Nvidia - and in the face of heavy "AMD + FreeSync = cheaper than G-Sync" advertising. Obviously people aren't buying that schtick.

Your big long word salads based on no actual information completely ignore all the real-life indicators. It is hilarious and I would love to read more fan fiction from you.
 
Nvidia increased GPU shipments by 30% this quarter while AMD didn't... if that doesn't make FreeSync irrelevant, then I don't know what does.

Just let that sink in - AMD had a major product release, yet people flocked to Nvidia - and in the face of heavy "AMD + FreeSync = cheaper than G-Sync" advertising. Obviously people aren't buying that schtick.

Your big long word salads based on no actual information completely ignore all the real-life indicators. It is hilarious and I would love to read more fan fiction from you.

That's not the whole story though, is it - AMD & Intel also saw GPU shipments increase, Nvidia's market share dropped, AMD's didn't & Intel's increased. You can even read that as Nvidia's GPUs being more likely to break and need replacing, if you really want to :(

Full story is here: https://wccftech.com/nvidia-amd-intel-gpu-market-share-q3-2017/

In fact, Nvidia's market share is on the decline, and has been since Q1'17.

Just playing Devil's Advocate really, I have no particular love for either manufacturer. I go with the best overall performance/price whenever I upgrade & it's usually Nvidia these days. (Hackintosh means limited AMD choice)
 
Just playing Devil's Advocate really, I have no particular love for either manufacturer. I go with the best overall performance/price whenever I upgrade & it's usually Nvidia these days. (Hackintosh means limited AMD choice)

Sadly, judging from how people post, a lot of them are rabid fanboys. Even if one costs 10x the amount, they'll still go for that brand.
 
That's not the whole story though, is it - AMD & Intel also saw GPU shipments increase, Nvidia's market share dropped, AMD's didn't & Intel's increased. You can even read that as Nvidia's GPUs being more likely to break and need replacing, if you really want to :(

Full story is here: https://wccftech.com/nvidia-amd-intel-gpu-market-share-q3-2017/

In fact, Nvidia's market share is on the decline, and has been since Q1'17.

Just playing Devil's Advocate really, I have no particular love for either manufacturer. I go with the best overall performance/price whenever I upgrade & it's usually Nvidia these days. (Hackintosh means limited AMD choice)

Lol, Nvidia's market share just increased, particularly in the discrete market where they actually operate, but it also went from 16% to 19% even including APUs.

The total discrete market increased 29% and Nvidia's sales increased 29%, so the increase in discrete sales all went to Nvidia.

Wccftech are just pulling their data from the Jon Peddie Research website, so the original story is here: https://www.jonpeddie.com/store/market-watch

If the APU/GPU report is just out, the GPU-only one will be along shortly.
 
Lol, Nvidia's market share just increased, particularly in the discrete market where they actually operate, but it also went from 16% to 19% even including APUs.

The total discrete market increased 29% and Nvidia's sales increased 29%, so the increase in discrete sales all went to Nvidia.

Wccftech are just pulling their data from the Jon Peddie Research website, so the original story is here: https://www.jonpeddie.com/store/market-watch

If the APU/GPU report is just out, the GPU-only one will be along shortly.

I take it back - I looked at the blue line and thought it was Nvidia, not Intel!!!

I won't edit my above post, even though it contains bad info. I had no idea AMD & Nvidia were that close with regard to market share - about a 6% difference!

That will change though, a lot. The new consoles have AMD GPUs...
 
Lol, yeah, pretty close between Nvidia and AMD. No need to be a rabid fanboy to the point of mental illness. Who cares, it's just a bloody GPU; I'll pick whatever is best value for money - and at the moment that is not Nvidia. And lol at "ticking a box" for FreeSync/Adaptive Sync: it works, it works better than V-Sync, and there is no royalty or lock-in to a brand. If Intel release a FreeSync GPU, that just means fully open source / no royalties.

And Intel is wasting both of them.
 
I take it back - I looked at the blue line and thought it was Nvidia, not Intel!!!

I won't edit my above post, even though it contains bad info. I had no idea AMD & Nvidia were that close with regard to market share - about a 6% difference!

That will change though, a lot. The new consoles have AMD GPUs...

The interesting one will be the discrete GPU report - my maths skills are failing me on working out a 30% gain on 70% market share, but I think that puts AMD back into the low 20s on discrete share. Which means people waited for Vega and then jumped heavily on Nvidia cards, so FreeSync isn't really doing its job of holding back Nvidia sales.
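For what it's worth, the share arithmetic being reached for here looks roughly like this; the inputs are just the figures quoted earlier in the thread (Nvidia at roughly 70% of discrete beforehand, Nvidia units +30%, total discrete units +29%), treated as assumptions rather than verified JPR numbers:

```python
# Sketch of the market-share arithmetic, using the thread's own figures as
# assumptions -- not verified Jon Peddie Research data.

def new_share(old_share, own_growth, market_growth):
    """Share after one quarter if your units grow by own_growth while the
    whole market grows by market_growth (both as fractions)."""
    return old_share * (1 + own_growth) / (1 + market_growth)

nvidia_after = new_share(0.70, 0.30, 0.29)
print(f"Nvidia discrete share: {nvidia_after:.1%}")
print(f"Everyone else: {1 - nvidia_after:.1%}")
# Note: the result is very sensitive to the assumed starting split and to
# whether both growth figures refer to the same baseline quarter.
```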


I fail to see how consoles or TVs having FreeSync affects G-Sync sales, as the percentage of people who use a monitor with a console, or a TV with a PC, is tiny. FreeSync doing well in alternative markets doesn't kill off a product in a completely unrelated field.

I'll pick whatever is best value for money - and at the moment that is not Nvidia.

There are literally an extra 3 million card sales in the last 3 months that disagree.

Ad hominem attacks don't change the data.
 