I find the lack of G-Sync Monitors, Disturbing

Man of Honour
Joined
13 Oct 2006
Posts
91,164
Which means people waited for vega then heavily jumped on nvidia cards

It's funny how many did that when they could have just bought a 1070 18 months earlier :s Not that I can blame them for holding out, but it's funny to see so many jump once the Vega cards were actually out.

Those prophesying G-Sync dying off or FreeSync "winning" any time soon aren't actually reading the numbers: with nVidia heavily dominant in the discrete GPU space and on the rise, it will keep fuelling G-Sync sales.
 
Associate
Joined
12 Sep 2006
Posts
758
Yeah, I was pretty disappointed after all their hype.

Currently I'm thinking, if you are upgrading both:

vega+freesync monitor = best performance/cost
nvidia+gsync monitor = best performance

if you don't care about the monitor, then both Nvidia & AMD are <very roughly> in the same ballpark performance/cost wise; driver updates seem to be skewing that all over the place currently too. For every game where Nvidia is significantly faster, there's another where AMD takes the crown.
 
Associate
Joined
6 Apr 2011
Posts
710
Location
Finland
@andybird123, in general:
I see you're still trying to push the "nVidia sells more GPUs" line -- WHICH I ALREADY AGREED TO. Everybody agrees with it. Is that the only straw you have, or why are you clutching at it like your life depended on it?

Well, because you're recycling your messages, I'll echo with mine:
"We all agree that nVidia is indeed currently selling more GPUs. But with their present G-Sync strategy, they are undermining their future GPU sales. Currently people keep ignoring G-Sync monitors and instead purchase FreeSync monitors, simply because G-Sync is indeed too expensive. And because of FreeSync's zero price premium, there is no reason to NOT go for FreeSync, if you're not willing to pay extra for G-Sync."

... I think that answer actually covers quite a lot of the points you were trying to make (again). Or would you like to put yet another spin on dressing the same message up as something else? My Ctrl+C fu is quite strong; I can keep going.

------------------

Individual points:
Nvidia increased GPU shipments by 30% this quarter while AMD didn't... if that doesn't make FreeSync irrelevant then I don't know what does.
... No, they both increased their shipments. Actually, they both increased their market share as well; nVidia just increased theirs more. Furthermore, how exactly would that make FreeSync irrelevant in the first place? With manufacturers introducing new FreeSync monitors 6 to 1 against G-Sync, you'll have to provide considerably better arguments to refute FreeSync's standing. Sorry, but it's indeed not FreeSync's irrelevance that's on the table. Or were you just trying to steer the discussion off towards nVidia vs. AMD?

If you want to side-step, I might just as well start talking about bunny revolution: bunnies like carrots, so pears are irrelevant. Pears are green, and nVidia is considered the "green team", which means nVidia is irrelevant. How about you "just let that sink in".

Your big long word salads based on no actual information completely ignore all the real life indicators. It is hilarious and I would love to read more fan fiction from you.
Cute. I'm presenting statistics, and you're saying there is no actual information, yet you basically offer no other message than "nVidia sells more GPUs". Also, be careful with the ad hominem yourself. Saying my writings are fan fiction might not be direct slander, but you're quite on the edge there. Either present a direct argument on where I'm wrong, or stop circling around.

The total discrete market increased 29% and nVidia's sales increased 29%, so the increase in discrete sales all went to nVidia.
Ugh, that's not how math works. Please read that again, maybe you just didn't realize how completely counter-intuitive that was.

Just in case, let me give you an example:
If the total market (=100%) is 1000 units, nVidia holds 98% of the market (980 units), AMD holds 1% (10 units), and "others" hold 1% (10 units), and then the total market increases by 20% (from 1000 to 1200), during which AMD manages to increase THEIR sales by 20% (from 10 to 12), does that mean that 200 units equals 2 units? Of course not: the percentages match, but the absolute gains are wildly different.

Please make a distinction between market share and sales. When one side raises its market share, the others have to collectively lose some. It is ALWAYS 100% in total. But everyone's unit shipments and/or sales can indeed increase simultaneously.
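The share-versus-sales distinction above can be sketched in a few lines of Python, using the hypothetical figures from the example (1000-unit market, AMD at 1%):

```python
# Hypothetical figures from the example above: the total market and AMD
# both grow 20%, yet the absolute gains differ by two orders of magnitude.
total_before, total_after = 1000, 1200   # total market, +20%
amd_before, amd_after = 10, 12           # AMD units, also +20%

market_gain = total_after - total_before  # 200 extra units overall
amd_gain = amd_after - amd_before         # only 2 of them are AMD's

# Market share is relative, so it can stay flat while unit sales rise:
share_before = amd_before / total_before  # 0.01, i.e. 1%
share_after = amd_after / total_after     # still 0.01, i.e. 1%

print(market_gain, amd_gain, share_before, share_after)
```

Both percentages are identical, yet AMD's absolute gain is 2 units against 200 for the market as a whole, which is exactly why "shipments up" and "share up" are separate claims.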

I fail to see how consoles or TVs having FreeSync affects G-Sync sales, as the % of people who use a monitor with a console, or a TV with a PC, is tiny. FreeSync doing well in alternative markets doesn't kill off a product in a completely unrelated field.
He wasn't actually talking about G-Sync vs. FreeSync market shares or sales, he was talking about AMD vs. nVidia. Pay attention. As for how consoles' and TVs' FreeSync support affects G-Sync sales: indeed, it doesn't, at least not directly, in the short term. But it increases FreeSync's ubiquitousness, which has a halo effect. Essentially, it makes FreeSync (/Adaptive Sync) the new de facto standard.

there are literally an extra 3 million card sales in the last 3 months that disagree
That doesn't actually mean they're better value for money by default. It just means more people bought nVidia cards. For example, Apple products aren't better value for money, but they've always had good marketing and brand image, and thus the loyal fans buy them. For more info, you can check:
https://en.wikipedia.org/wiki/Veblen_good
https://en.wikipedia.org/wiki/Giffen_good
https://en.wikipedia.org/wiki/Status_symbol

It's funny how many did that when they could have just bought a 1070 18 months earlier :s Not that I can blame them for holding out, but it's funny to see so many jump once the Vega cards were actually out.
Unfortunately, lots of people want AMD to succeed just so nVidia would drop their prices, which we can all agree were high because of AMD's long period of non-competitiveness. It also didn't help that the cryptominers kept (/still keep?) Vega in constant shortage, thus driving the gaming populace towards nVidia. From what I've seen in my country, Vega cards are still low on stock.

Those prophesying G-Sync dying off or FreeSync "winning" any time soon aren't actually reading the numbers: with nVidia heavily dominant in the discrete GPU space and on the rise, it will keep fuelling G-Sync sales.
With G-Sync's current price premium...? Yeah, I think I'll put my money on FreeSync, sorry. Granted, if they were on equal ground, nVidia's GPU sales would indeed help G-Sync monitors skyrocket.
 
Soldato
Joined
3 Dec 2012
Posts
2,718
Location
Northern Ireland
As I mentioned on page 2, I bought the AOC AG322QCX and was worried about moving to this from my G-Sync Acer XB270HU, but I needn't have been. I loved the idea of having G-Sync, but a while back I thought I'd disable it just to see what I was gaining, and I couldn't tell any difference. I appreciate there are those out there who can, but at 120-140fps I just can't. When I was running my old GTX 970 and struggling to hit 60fps at 1440p, then YES, I could see what G-Sync was offering me, but running a GTX 1080 now and easily hitting 120+fps, I feel the tangible benefit decreases. My AOC cost me £350 on sale; I have it up and running now and it's buttery smooth with no G-Sync, at least to my eyes.
 
Soldato
Joined
30 Nov 2011
Posts
11,376

What statistics?
You've not posted anything that is backed up by real numbers. You are literally just making things up based on nothing, versus looking at real numbers and basic economics.

They both increased market share? AMD just went from 30% to 27% in discrete GPU sales, in a quarter where they had a major product release. People buying APUs, even if they did pair them with FreeSync, isn't affecting G-Sync sales in the slightest.

Let's be clear here: I'm not saying FreeSync is going to die off; it can't really, as it's already done and out there, and there is no benefit to removing it. You are saying G-Sync will die, however none of the data supports that.

I'm not going to get into a nonsense argument over what might happen in 5 or 10 years' time; it's irrelevant. FreeSync is having no effect on G-Sync, and that was its entire purpose. Here we are 2 years on and it's done nothing for AMD.

Just looking at your claim that there is no margin in G-Sync: the G-Sync version is more expensive by $100-200, the FPGA costs $25 in bulk... but no margin? If that is even remotely true, then it also means there is no margin on any monitor and the whole industry is about to go out of business. It's complete and utter nonsense to say there is no margin on G-Sync monitors.

Value is entirely subjective; the person buying ascribes value to something. So more people value nVidia cards. And not just more: they waited for Vega to come out and then jumped on nVidia, to the tune of 4 times as many extra cards sold.
 
Associate
Joined
6 Apr 2011
Posts
710
Location
Finland
What statistics?
You've not posted anything that is backed up by real numbers. You are literally just making things up based on nothing. Vs looking at real numbers and basic economics.
I can't post the address of the aggregate site where I got the numbers for the FreeSync vs. G-Sync model variants, but I can post a screenshot, if you want. Are you disputing the 6:1 ratio?

I also gave the Acer XZ321Q vs. Acer Z321Q links for price difference comparison. Unsurprisingly, the discussion suddenly dropped at that.

I also provided calculations to show the logic behind my theory why G-Sync is mostly found on £500+ monitors. Nobody picked up on that, either.

All you've been doing is sticking to the "nVidia sells more GPUs", by comparison.

They both increased market share? AMD just went from 30% to 27% in discrete GPU sales. In a quarter where they had a major product release. People buying APU's, even if they did pair them with freesync, isnt affecting gsync sales in the slightest.
Just because nVidia only focuses on selling discrete GPUs doesn't mean we have to limit AMD to that as well. It's called market cannibalization; you should look it up. And they indeed still increased their shipments, even in discrete GPUs.

As for the major product release: stock is still in shortage. The problem is, it's not the gamers that were/are getting the cards, but the miners. Read here if you want to know why that's a bad thing. Anyway, this resulted in more people being driven towards nVidia. As noted earlier: some people only want AMD to succeed so that nVidia would lower their excessive prices.

As for the APU & FreeSync pairing: if they pair them with FreeSync, then it's not G-Sync. By your logic, people buying shoes from Nike wouldn't affect Adidas sales in the slightest.

Lets be clear here, I'm not saying freesync is going to die off, it cant really as its already done and out there, there is no benefit to removing it.
Well, that's where we disagree. Obsolete technologies die off. Look at the previously mentioned 3D, for instance. FreeSync isn't immune to it either; it's just in a far better position than G-Sync.

You are saying gsync will die, however none of the data supports that.
What do you mean, "none of the data supports that"? The manufacturers flocking to FreeSync supports it. The G-Sync price premium supports it. You do know how standards wars go, right?

I'm not going to get in to a nonsense argument over what might happen in 5 or 10 years time, its irrelevant, freesync is having no effect on gsync and that was its entire purpose. Here we are 2 years on and its done nothing for AMD.
What you consider "done nothing for AMD", others see that without FreeSync, AMD would be in a far worse position. And as a side note, I'm not giving G-Sync more than five years... (unless the previously mentioned "evolvement" or price-drop happens)

Just looking at your claim there is no margin in gsync - the gsync version is more expensive by $1-200, the fpga costs $25 in bulk... but no margin? If that is even remotely true then it also means there is no margin on any monitor and the whole industry is about to go out of business. Its complete and utter nonsense to say there is no margin on gsync monitors.
And that's not how manufacturing works. Do you know why we had flickering PWM-driven LED backlights? Because the component that would have allowed flicker-free operation would have cost 50 cents instead of 10 cents (*). So $25 is actually a huge deal, and that's only the MATERIAL cost. Then there are nVidia's engineering costs that have to be recouped, in addition to the extra engineering by the monitor manufacturers. Read more here (this doesn't even take into account R&D, which is a major cost in the computer industry):
https://www.investopedia.com/terms/p/production-cost.asp
https://www.investopedia.com/ask/an...en-production-cost-and-manufacturing-cost.asp

After that, there is the profit margin expectancy for potential investors, shareholders or business owner. That is based on risk-free interest rate, combined with industry-specific profit margin expectancy. Read more here:
http://www.inetstart.com/how-to-set-your-pricing-and-profit-margins-on-computer-products.html

Also, $200 is not even nearly enough. Even the example pair I provided earlier had £270 ($360) difference.

Also re-read my earlier explanation:
"Or are we talking about different concepts/terms? When I'm talking about margin, I'm talking about the "extra" that is left after the costs are deducted from the selling price. G-Sync monitors have higher manufacturing and engineering costs, and to recoup these costs, they need to be kept at a higher price point, or be sold at a loss. Now, who is going to take that hit? nVidia? No. Manufacturer? No. Retailer? No. So who do we have left? Yes, it's indeed the consumer, by paying the nVidia-tax. Who SHOULD take the hit? nVidia, because manufacturers and retailers don't really have a personal stake in the matter, as they can just manufacture and sell FreeSync monitors. Which is what they are increasingly moving towards to. And while nVidia can foot the bill on consumers, the consumers will naturally direct their interest elsewhere, a.k.a. FreeSync."
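The compounding effect being argued here, where a modest component cost snowballs into a much larger retail premium, can be sketched with a simple cost-plus model. Every margin percentage below is hypothetical, purely for illustration; only the $25 bulk module cost comes from the discussion above:

```python
# A $25 module cost doesn't add $25 at retail: each party in the chain
# marks the running cost up to hit its own (hypothetical) margin target.
module_cost = 25.0  # bulk module/FPGA cost, per the figure quoted above

# Illustrative margin targets for each stage of the chain (made up):
stage_margins = [
    ("module vendor", 0.25),
    ("manufacturer",  0.25),
    ("distributor",   0.15),
    ("retailer",      0.30),
]

price_delta = module_cost
for stage, margin in stage_margins:
    # price = cost / (1 - margin) gives the stage its margin on the sale
    price_delta /= (1 - margin)

print(f"retail premium from a $25 part: ${price_delta:.2f}")
```

Under these made-up margins the $25 part alone accounts for roughly $75 of retail price difference before any engineering or R&D recoupment is layered on top, which is the mechanism the paragraph above is describing.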

Value is entirely subjective, the person buying ascribes value to something. So more people value nvidia cards.
"Best value for money". Yes, you can purchase a card that offers 10000 points in a benchmark for £500, or you can purchase a card that offers 12000 points for £800. For those who NEED the extra 2000 points and have some spare cash, the latter is indeed better value, because they NEED it. But the first gives 20 points/£, whereas the latter gives 15 points/£. Then there is the crowd that is happy to take 6000 points for £200 (30 points/£).

Similarly, a £480 monitor with FreeSync offers value for some, whereas the same monitor with G-Sync for £750 offers value for others. Some people are willing to pay extra for the lock-in they got suckered into. Then there are people who refuse to pay the price premium.

If we are talking about value for money for computer hardware, the initial starting point is indeed performance/£. But certainly, if someone prefers to have it black, then they can prioritize their purchasing options accordingly.
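The points-per-pound comparison above works out as a quick calculation (scores and prices are the hypothetical figures from the text, not real products):

```python
# Hypothetical (benchmark points, price in GBP) pairs from the example above.
cards = {
    "budget":    (6000, 200),
    "mid-range": (10000, 500),
    "high-end":  (12000, 800),
}

# Raw value per pound falls as absolute performance rises:
for name, (points, price) in cards.items():
    print(f"{name}: {points / price:.0f} points per GBP")
# budget: 30, mid-range: 20, high-end: 15
```

This is the whole argument in miniature: the fastest card and the best points-per-pound card are different products, so "more people bought X" says nothing about value per pound on its own.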

And not just more but they waited for vega to come out and then jumped on nvidia to the tune of 4 times as many extra cards sold.
Like I stated earlier, some people only want AMD to succeed because they want nVidia to lower their prices. Also, stock shortages. Please pay attention.

(*): OK, that's only part of the reason, partly it was that some people thought 120Hz flicker with LED would be no worse than with 120Hz on CRT or CCFL -- which was totally incorrect
 
Soldato
Joined
6 Sep 2016
Posts
9,528
I also gave the Acer XZ321Q vs. Acer Z321Q links for price difference comparison. Unsurprisingly, the discussion suddenly dropped at that.

And that is why Nvidia lost a sale from me, and no doubt from many other people as well. Who in their right mind would spend £300 more for the same monitor, just to get G-Sync?

There are definitely people buying Nvidia GPUs who won't be buying G-Sync monitors because of the extra cost. They'll just use 144Hz with v-sync on or off instead.

FreeSync more or less does the same thing. Sure, G-Sync may offer more, but FreeSync does enough: it syncs the monitor's refresh rate to the GPU's frame rate.

Also, the cost of Nvidia GPUs is a bit insane. £600+ on a GPU? Yeah, right.
 
Soldato
Joined
30 Nov 2011
Posts
11,376

You're just regurgitating the same nonsense. You have no information on the manufacturing cost of a G-Sync monitor, so you have no idea what the margins are. A G-Sync monitor does not cost $100 more to produce, it just doesn't, so if a G-Sync monitor is selling for $100-200 more than the exact same one without, there is MORE margin, not less. Manufacturers wouldn't be onboard if there was nothing in it for them. If they weren't selling, prices would drop or manufacturing would end. Nearly 3 years after FreeSync's release, neither has happened, which means your assumptions are wrong.

How many models of FreeSync monitors exist has no bearing on G-Sync sales. As I've pointed out repeatedly, a FreeSync monitor being sold to an nVidia card user, APU user, or console user is in no way a lost sale for G-Sync. An Adidas sale is not a lost sale for a wellington boot.

Your examples are completely made up based on nothing, real world indicators say you are wrong. Your data points simply don't support your conclusions.

FreeSync hasn't killed G-Sync; it hasn't even improved AMD's market share. In that respect it has completely failed. Vega isn't sold out anywhere; again, it's nonsense to claim poor sales are a result of limited stock. AMD increased sales by about half a million, with a major product release, against a competitor who sold an extra 3 million with no new product. How anyone can claim that's anything other than an abject failure on AMD/FreeSync's part baffles me. FreeSync generates no revenue, it doesn't increase AMD's sales, and it serves no purpose for the company that spent the R&D on it.

And 3 years in, you think they haven't already recouped the R&D costs on G-Sync? That's cute, but it also has to be wrong.

Even the monitor pair you've chosen is available at other prices elsewhere, which proves the retailer is setting the price based on what they think they can sell it for, not on anyone's cost price.
 
Soldato
Joined
29 May 2007
Posts
4,898
Location
Dublin
It can only be because Nvidia charges so much more than they need to for the G-Sync module. It’s one of the reasons why I stick with AMD. FreeSync monitors will always be better value. Nvidia’s whole closed garden ecosystem and pricing etc. is bad for everyone. Apart from Nvidia and their shareholders of course.
 
Associate
Joined
6 Apr 2011
Posts
710
Location
Finland
You're just regurgitating the same nonsense.
Mr. "nVidia sells more GPUs" doesn't have much credibility to say that, I'm afraid...

You have no information on the manufacturing cost of a gsync monitor so you have no idea what the margins are. A gsync monitor does not cost $100 more to produce, it just doesnt, so if a gsync monitor is selling for $100-200 more than the exact same without, there is MORE margin, not less.
I didn't say it costs $100 more to PRODUCE. I gave an example pair of otherwise identical monitors where the G-Sync counterpart costs £270 more for the end-user.

Also, did you even look at the links I gave, or at least the text accompanying them? Please contact one of your friends who has some economics and/or business classes under their belt, and ask how much impact an individual $25 extra material cost has. Also ask how R&D cost recouping works. Or better yet, show them this discussion and ask their opinion on what to answer next. Or if you don't want to bother them with something like this, search Google for a bit, because it seems to me you are in way over your head on this topic. And yes, I know I'm being condescending at the moment. Indeed, I'm doing it on purpose.

Manufacturers wouldnt be onboard if there was nothing in it for them.
That 6:1 ratio indicates that they AREN'T as onboard as you claim, and that there isn't as much in it for them as there is with FreeSync. And your insistence on "nVidia sells more GPUs" now works against you here, because if nVidia indeed sells more GPUs, then why aren't manufacturers releasing more G-Sync monitors instead?

If they werent sellong prices would drop or manufacfuring would end - nearly 3 years after freesync is released and neither has happened which means your assumptions are wrong.
Dropping the price would be a viable action if they had enough margin to sacrifice, or were emptying EoL stock. But if they are releasing new monitors at high prices while the competing standard is more ubiquitous, it indicates that they CAN'T lower the price. Also, manufacturing doesn't just stop dead. I repeat: you do know how standards wars go, right? Anyway, that increasing 6:1 ratio is a sign of the trend manufacturing is moving towards. Side note: it seems to me you're using "which means", "proves", etc. a little too lightly. Do you have a degree in economics, retail, business or any similar field?

How many models of freesync monitors has no bearing on gsync sales. As I've pointed out repeatedly, a freesync monitor being sold to an nvidia card user, APU user, console user, is in no way a lost sale for gsync. An addidas sale is not a lost sale for a welington boot.
Your shoe analogy doesn't hold water (pun intended). FreeSync and G-Sync monitors are used for the same end purpose; Adidas shoes and wellingtons are not. A FreeSync purchase means there will be no G-Sync purchase. If a customer needs running shoes AND rubber boots, he has to buy both. With FreeSync and G-Sync, you only purchase one. A purchased FreeSync monitor is indeed a lost sale for G-Sync, just like an Adidas shoe purchase is a lost sale for Nike.

Your examples are completely made up based on nothing, real world indicators say you are wrong. Your data points simply don't support your conclusions.
Please elaborate. Which example? Which real world indicators are you talking about that I haven't already rebutted? Which data point and which conclusion? If you want to say you disagree with everything I'm saying, that's your right. But if you try to say I'm WRONG, then you have to back it up with something, otherwise your statement doesn't carry any weight.

Freesync hasnt killed gsync, it hasnt even improved AMD's market share. In that respect it has completely failed.
Who said it HAS killed G-Sync? I said FreeSync will win the standard war. As for the non-improved market share and "failing", I already gave my answer:
"What you consider "done nothing for AMD", others see that without FreeSync, AMD would be in a far worse position." Also, like stated earlier, AMD DID increase their market share. Just not in some individual category of your choosing, on an individual quarter.

Vega isnt sold out anywhere, again its nonsense to claim poor sales are a result of being limited stock. AMD increased sales by about half a million, with a major product release, against a competitor who sold an extra 3 million with no new product. How anyone can claim thats anything other than an abject failure on AMD/freesyncs part baffles me.
Like you yourself stated, their sales INCREASED. And are you denying the miner craze and/or stock shortage, by any chance? Search "vega crypto mining". Or if you are denying the effect of such shortages on a competitor's sales, read the links I provided earlier. Also, nobody is denying that AMD could have done better. Same principle as with "What you consider 'done nothing for AMD', ...".

Freesync generates no revenue, it doesnt increase AMD's sales, it serves no purpose for the company that spent the r&d on it.
Please read my earlier messages, this relates to how nVidia is undermining their future GPU sales by letting FreeSync take the lead, and which will drive them to eventually have to change their strategy.

And 3 years in you think they havent already recouped r&d costs on gsync? Thats cute, but also has to be wrong.
I'm not even sure they'll ever get into the black with this project. So no, I don't think they have recouped their R&D costs yet, and I think they shouldn't stubbornly keep trying. They should instead write the project off as something that increases/maintains profits in their main business segment, and then price it to be more competitive with FreeSync. But that strategy might already be too late, so they'll have to weigh whether it's time to switch to FreeSync/Adaptive Sync.

Even the monitor pair you've chosen is available at other prices elsewhere, which proves the retailer is setting the price based on what they think they can sell it for, not what anyones cost price is.
Are you actually serious with that statement? That's because different retailers have different profit margin targets/requirements, different supplier contracts, middlemen, etc. Do you know how the supply chain, investors, shareholders and other interest groups work? If a retailer cuts prices, it means they want to move merchandise. That's a viable short-term tactic, but for long-term profitability a retailer can't just sell everything at a loss or at near-zero margin.
 
Soldato
Joined
30 Nov 2011
Posts
11,376
Again, no evidence to back up any of your claims, showing a massive disregard for basic business sense; all the indicators prove the opposite.

The only thing your ratio proves is that FreeSync is cheap to implement and AMD are happy to devalue their trademark by slapping it on any old tat, like allowing manufacturers to slap FreeSync-HDR on monitors that aren't actually HDR.

It's a bit difficult to actually be condescending when the indicators all point in the opposite direction. But good luck in whatever you choose to do; I just hope it isn't business, as you don't appear to be able to read the market, prices, or shipments and draw any kind of realistic conclusion.

Retailers haven't "cut" prices; there is a £100 spread on one of the monitors you chose, without discounts, which means the 3-4 retailers have deliberately chosen different price points from release. So one of those retailers is making £100 more by choice. It doesn't prove G-Sync is low margin; it proves the opposite. The one with the highest price is the one that claims to have the best links direct to manufacturers, so as close to no supply chain as you can get.
 
Soldato
Joined
18 Oct 2002
Posts
19,338
Location
Somewhere in the middle.
I own a G-Sync IPS and I can tell you that its image quality is no higher than my cheap Korean non-G-Sync IPS.

I like G-Sync but I can't defend it. Its price markup is a disgrace. About 5 years after Korean 1440p IPS monitors started overclocking to 100Hz, we still have to pay over 3 times the price for a newer G-Sync variant.

55 inch HDR OLED TVs cost about ten percent more than some superwide Asus using ancient technology. Winds me right up lol.
 
Associate
Joined
12 Sep 2006
Posts
758
Bring back that 55" curved OLED and give it FreeSync!

I'd buy that in a heartbeat.

a 43" would be better for my desk tbh, but i'd suffer ^^
 
Associate
Joined
12 Sep 2006
Posts
758
Yeah, I kinda feel there is something just round the corner. Wish I'd just grabbed a 34" 2 years ago, but there were so many panel issues all over the place it put me off. I don't think it's much better these days :(
 
Associate
Joined
28 Nov 2012
Posts
266
Location
Zurich, Switzerland
Loads more ppl would have gsync if they fixed the panel lottery. Gimme a dell ultrasharp with gsync.
The problem is, if they "willingly" fixed that issue, these monitors would cost twice what they cost today. It would mean selling A, B and C class monitors. So they are just bundling them up and acting like they don't understand what the fuss is all about. "Oh, bleeding? Sorry, we didn't know, send it back and we'll send you another one." I will report back on how my 32" IPS Predator is once I get home today :)
 
Man of Honour
Joined
13 Oct 2006
Posts
91,164
About 5 years after Korean 1440p IPS monitors started overclocking to 100Hz, we still have to pay over 3 times the price for a newer G-Sync variant.

There is a huge difference between the motion clarity of a 100+Hz "gaming" monitor and those overclocked Korean, etc. panels though, and that doesn't come cheap; some of them also had quite high input latency (though some were quite good).
 