I find the lack of G-Sync Monitors, Disturbing

And that is a good thing, which you can't seem to get your head around.

More FreeSync monitors are a good thing: more choice, not less, and more affordable.

Once TVs become FreeSync, I think that would be the end of G-Sync. Nvidia will drop G-Sync and use an adaptive sync system instead.

I never said choice was a bad thing; I just said it doesn't prove G-Sync is about to die off.

In fact, if anything I am being pro-choice: people have the choice to buy an AMD GPU and monitor to have cheaper FreeSync, or an Nvidia GPU and G-Sync to have a more curated experience with actual specifications to stick to. The fact that people are actually not choosing AMD and FreeSync seems to be completely irrelevant to the discussion though, so let's hope we have less choice in the future.
 
I've used Dell monitors for years and still have an old 2408 model that's working perfectly, set up in portrait alongside an also now old 2711. Wouldn't mind upgrading in the next year to a G-Sync. The 2716DG has been around a while now (too long IMO), so I'm hoping they release a new model for 2018. Wouldn't mind going 4K, but I still think GPUs are not powerful enough, at least with one, to power a 4K panel. And there's a severe lack of choice in 4K G-Sync.
I want something of high quality that doesn't look too "gamey", something more professional looking.
 
I find the lack of G-Sync Monitors, Disturbing
I still feel my Asus ROG Swift PG348Q G-SYNC is a great monitor and have no interest in changing it.

But when it does come time to change, I'm guessing my next monitor will be a 4K 120Hz G-SYNC one.
 
IMO the reason there aren't a lot of G-Sync monitors around is that there are still problems with G-Sync. It works awesomely in certain situations, i.e. full-screen, high-FPS shooters. But if you're not doing that and you're playing slow games, you'll suffer from flicker. There are also issues if you have two monitors, play games in full-screen windowed mode, or have the Windows desktop trying to grab control. All these problems cause flicker and low FPS in games.

There are workarounds for the issues I've mentioned.

Nvidia has been working on this issue for the last year.

I hope a G-Sync v2 comes out that will build on the foundation of the G-Sync technology.
 
More word salad, nom nom nom.

Still banging on about the "models prove it" trope, which still proves nothing except how cheap FreeSync is to implement, not how well it is selling.
Still trying to evade every single question and request for clarification, I see... If you think those "word salad" sentences somehow work as a defense, then I can tell you now that you're wrong. On the contrary, it always tells me that I've hit the bullseye and you have nothing more to add.

Also, as long as you keep trying to sneak in false information, I'll keep refuting it piece by piece. Like I said, my Ctrl-C fu is strong. There are plenty of discussion paths that have simply stopped because you chose to ignore them. Then at a later point you've tried to sneak some of them back in, without success. It's not my fault you don't want to defend your position with anything but "Nuh-uh!".

This also applies to the 6:1 ratio (or what you apparently call the "models prove it" trope). As to what it "proves", what it doesn't, and what it indicates, please read my earlier comments. If you want to present counter-arguments, you'll indeed need to propose something more than "no it doesn't", or even worse, insisting on anti-proofs of things nobody claimed to be the case. For example: I never said that the ratio PROVES FreeSync monitors are selling better -- actually, I haven't said that it would even INDICATE it. There is PROBABLY a correlation, but whether the ratio is higher or lower, there are too many factors to say anything more precise. My personal guess would be that on average, across the whole market of new monitor sales, the ratio would be around 8:1, with brick-and-mortar stores definitely keeping it high in FreeSync's favor, whereas at enthusiast-focused retailers (like OcUK) the ratio would probably be closer to 4:1, or maybe even 3:1. It's a shame the Steam Hardware Survey doesn't ask about this, so we could at least get the gaming community's ratios.

Actually, now that we're on the topic of "A proves B" and unnecessary anti-proofs, how about we do this:
You list ALL the claims I've made of something "proving" something (with quotes!), and then we'll see where we stand. In my view, you're the one that keeps saying "stuff A proves stuff B" -- and like I said before, I also think you're using that word a little too willy-nilly. Using logic, market theory or even facts/statistics as a counter-argument doesn't necessarily PROVE anything, but it can indeed undermine the original claim.

FreeSync monitors have had to have big discounts all over the place, not just this weekend. Yawn.
I haven't seen any abnormally big discount sprees among FreeSyncs, nor does the price-aggregation site I'm using as a source show any such out-of-the-ordinary activity. Maybe you're just confused because there is indeed the 6:1 ratio to consider, so by sheer numbers the discounts would naturally be more frequent as well?

Nice to see you finally admit that your conclusions are completely made up though.
... A conclusion is by definition "a position or opinion or judgment reached after consideration". My words are indeed mostly interpretations of various factors and aspects, as they should be. Why, are you copying your conclusions from a book or something? If you copy someone else's words, it would be advisable to give credit where credit is due, instead of presenting them as your own thoughts.

I never said choice was a bad thing; I just said it doesn't prove G-Sync is about to die off.

In fact, if anything I am being pro-choice: people have the choice to buy an AMD GPU and monitor to have cheaper FreeSync, or an Nvidia GPU and G-Sync to have a more curated experience with actual specifications to stick to.
Well, you've been presenting lack of choice (or, as you call it, "curated experience") as some sort of great positive aspect of G-Sync. And frankly, if you yourself consider that a positive aspect of G-Sync, then that's OK. But, while not directly saying it, you've also been insinuating that FreeSync is instead for "low-end monitors" (example quote: "by slapping it on any old tat"), so it kind of needs to be pointed out that FreeSync is indeed for low-end, mid-range AND high-end.

And actually, no, I don't think you've yet said that it doesn't prove that G-Sync is about to die off. You said that the "models prove it" trope doesn't prove anything, which I took as referring to the 6:1 ratio. While connected to the topic, that is indeed a separate issue from model variance and requirements.

And just as a pre-emptive note: I don't think anybody is actually saying that more choice WOULD prove that G-Sync is dying off, either. But we, or at least I, are saying that it's a contributing factor all the same.

As for "actual specifications":
It's not that FreeSync doesn't have spec requirements; they are just more lax, so manufacturers can choose to make monitors for a larger audience and give the choice to the customers.

The fact that people are actually not choosing AMD and FreeSync seems to be completely irrelevant to the discussion though, so let's hope we have less choice in the future.
I'm not sure whether you're trying to be witty or something, but that sentence doesn't make any sense.
 
Hahaha, well, completely ignore the facts and it becomes much easier to believe your own ramblings, I guess.

I never said FreeSync is only for the low end. I just said the variance of putting it on low-end stuff that doesn't meet the specs is damaging to the brand as a whole, and it doesn't mean that people are buying FreeSync for FreeSync's sake if the monitor is just a really cheap monitor.

Same as slapping HDR on a monitor that struggles to do 400 cd/m²: it's either misleading, or people are not buying that monitor for HDR, as it can't actually do HDR.
 
Hahaha, well, completely ignore the facts and it becomes much easier to believe your own ramblings, I guess.
Again, please elaborate on what "facts" you are talking about.

Or are you talking about the definition of "conclusion"? A conclusion is not a fact in itself -- unless we are talking about a conclusion in the mathematical sense, like algebra, etc., of course. For example, if A > B and B > C, then yes, we can indeed CONCLUDE that A > C, and state it as a FACT. But if A > B > C > D > E > F > G, we can't conclude that A+B+C > D+E+F+G, at least not without further information.
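To make that concrete with a quick counterexample (numbers picked by me purely for illustration): take A=7, B=6, C=5 and D=4.9, E=4.8, F=4.7, G=4.6. The chain A > B > C > D > E > F > G holds, yet A+B+C = 18 while D+E+F+G = 19, so the comparison of the sums goes the other way.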

I am indeed using facts to form my conclusions, and so are you, hopefully. But essentially these conclusions are only our interpretations of, and opinions on, the factors presented to us. It's another issue whether they are correct.

For example:
"Germany is a country in Europe" -- that is a fact.
"G-Sync will lose the standard war" -- that is a statement or opinion.
"HD-DVD lost the standard war" -- that is a fact.

We are both trying to present and explain different aspects (be they facts, theories or opinions) that illustrate and fortify the conclusions we've made. That's how argumentation works. (Side note: your "I don't want to talk about it anymore" approach hasn't improved your stance in the slightest.)

You can bring facts to the table ("6:1 ratio", or "nVidia sells more GPUs"), but what you present as an effect of those facts might only be an opinion instead.

Example of a fact drawn from the "6:1 ratio":
thus far manufacturers have been more interested in implementing FreeSync than G-Sync

Example of an opinion/interpretation/conclusion drawn from the "6:1 ratio":
FreeSync is cheaper to implement than G-Sync

While the latter interpretation is most probably true, the 6:1 ratio by itself doesn't PROVE it. And on the former: if I had said that the nVidia tax is the reason why manufacturers have released more FreeSync monitors, that would be an interpretation, and while probably true, it wouldn't be a fact anymore. At least not until a manufacturer or two comes forward and presents it as a reason.

I never said FreeSync is only for the low end. I just said the variance of putting it on low-end stuff that doesn't meet the specs is damaging to the brand as a whole
Exactly, you didn't say it directly, and I said as much myself:
"But, and while not directly saying it, you've also been insinuating that FreeSync is instead for "low-end monitors" (example quote: by slapping it on any old tat), so it kind of needs to be pointed out that FreeSync is indeed for low-end, mid-end AND high-end."
Maybe you should start using the quote feature, so you can double-check how your reply aligns with the original text before hitting the "Send" button.

Also, what do you mean by "the variance of putting it on low-end stuff that doesn't meet the specs"? It's not that FreeSync's own specs aren't met; it's that the specs are more lax than those of G-Sync HDR, HDR10 or Dolby Vision. That's just differentiation. Same goes for "damaging the brand": G-Sync can try playing the premium-only game, but that doesn't mean FreeSync should follow its lead. I was also about to say that time will show which strategy is better, but considering their current market positions, maybe this is nVidia's attempt to regain momentum. A bad judgment call, in my opinion. HDR and the technology required for it are in their infancy, so mandating stricter requirements (and the resulting extra cost) will alienate some manufacturers and customers.

it doesn't mean that people are buying FreeSync for FreeSync's sake if the monitor is just a really cheap monitor
Again, nobody said anything to the contrary. For nVidia, the bad part is that they ARE being bought, regardless of the reason.

Same as slapping HDR on a monitor that struggles to do 400 cd/m²: it's either misleading, or people are not buying that monitor for HDR, as it can't actually do HDR.
I don't see this any differently from how there were FullHD and HD-ready. Manufacturers want to provide cheaper transition-period products for those who are content with them. Nothing is stopping them from releasing products that adhere to higher standards. And I already requested this earlier, but could you elaborate on what problems the FreeSync 2 monitors faced? Links, please.
 
Some hefty strawmen being thrown around. Good laughs though, thanks.

Obviously, as a multibillion-dollar corporation, Nvidia have no idea how to sell product and their strategy is deeply flawed. You should tell them they have it all wrong; I'm sure with their (not) falling sales they will listen to some random on a forum who doesn't appear to have a clue.

I didn't say it directly or indirectly. You are inferring meaning that isn't there, just like you are doing with "data" that "proves" nothing.
 
IMO the reason there aren't a lot of G-Sync monitors around is that there are still problems with G-Sync. It works awesomely in certain situations, i.e. full-screen, high-FPS shooters. But if you're not doing that and you're playing slow games, you'll suffer from flicker. There are also issues if you have two monitors, play games in full-screen windowed mode, or have the Windows desktop trying to grab control. All these problems cause flicker and low FPS in games.

There are workarounds for the issues I've mentioned.

Nvidia has been working on this issue for the last year.

I hope a G-Sync v2 comes out that will build on the foundation of the G-Sync technology.

Weird, I've never had any of these issues you mention with G-Sync, irrespective of the game.
 
Some hefty strawmen being thrown around. Good laughs though, thanks.

Obviously, as a multibillion-dollar corporation, Nvidia have no idea how to sell product and their strategy is deeply flawed. You should tell them they have it all wrong; I'm sure with their (not) falling sales they will listen to some random on a forum who doesn't appear to have a clue.

I didn't say it directly or indirectly. You are inferring meaning that isn't there, just like you are doing with "data" that "proves" nothing.
You're the one inferring non-existent things.

Like Apple, Nvidia might be laughing their ass off while lighting cigars with £50 notes, but the scarcity of G-Sync monitor models tells us that monitor makers aren't seeing any of those profits.
If something gives better-than-usual profits, it draws in new entities trying to take their share.
Just like what has happened with all these gaming products like headsets, with new brands "growing like mushrooms in rainy weather", to use a Finnish saying.

So unless Nvidia starts actually covering a good part of the extra cost of G-Sync for monitor makers, G-Sync monitors are going to stay a market niche, one likely to start shrinking in a year or two.

Remember that Intel has announced discrete GPU plans, and they're never going to start paying Nvidia to be able to support G-Sync.
Just adding Adaptive-Sync/FreeSync support to integrated GPUs could already increase its demand greatly, because variable refresh rates have the biggest benefits at lower frame rates.
Heck, a variable monitor refresh rate would benefit even movie/video watching because of varying video framerates.
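As a rough illustration (my own arithmetic): 24 fps film on a fixed 60 Hz screen needs 3:2 pulldown, holding alternate frames for 3 and 2 refreshes, which shows up as judder. A variable refresh rate monitor could instead simply run at 48 Hz and hold every frame for exactly two refreshes.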

So while G-Sync isn't going anywhere in the next few years, it's hard to see any kind of long-term future for it as a closed proprietary standard.
 
Weird, I've never had any of these issues you mention with G-Sync, irrespective of the game.

That's good that you've never had an issue with it. However, if you do a search, the flicker issue is well known: it's all over the Nvidia forums and YouTube, and there is a video of experts who ran proper tests to try to figure it out. That's why we know the issue is built into G-Sync, i.e. it's not some error. It's doing what it's supposed to do, but that isn't a desired result in some situations.

Here is a video of some people testing the flicker and why it happens: https://www.youtube.com/watch?v=ujgRjsmwtgY
 
That's good that you've never had an issue with it. However, if you do a search, the flicker issue is well known: it's all over the Nvidia forums and YouTube, and there is a video of experts who ran proper tests to try to figure it out. That's why we know the issue is built into G-Sync, i.e. it's not some error. It's doing what it's supposed to do, but that isn't a desired result in some situations.

Here is a video of some people testing the flicker and why it happens: https://www.youtube.com/watch?v=ujgRjsmwtgY

The only time I've ever had flicker with G-Sync is when the FPS goes down to 0 in menus, etc., and even then it seems to happen on some drivers and not others. As mentioned, it only happens in normal circumstances if there is a complete stall in the rendering pipeline, and if that is happening, G-Sync is the least of your issues.

I know some people have had some issues with particular setups, especially multi-monitor ones, but personally I've had no issues at all.

I currently have a Dell S2716DG alongside a Dell U2913WM and have no problems at all stemming from G-Sync and multi-monitor. Older incarnations could have issues with G-Sync and windowed modes (something FreeSync does even worse on, though that is due to inadequacies in the OS itself rather than anything at the driver or hardware level, and G-Sync's additional hardware actually makes it easier to work around the OS issues with windowed and borderless-window modes), but those issues shouldn't be present in more recent incarnations of the software.
 
That's good that you've never had an issue with it. However, if you do a search, the flicker issue is well known: it's all over the Nvidia forums and YouTube, and there is a video of experts who ran proper tests to try to figure it out. That's why we know the issue is built into G-Sync, i.e. it's not some error. It's doing what it's supposed to do, but that isn't a desired result in some situations.

Here is a video of some people testing the flicker and why it happens: https://www.youtube.com/watch?v=ujgRjsmwtgY

To me, that video suggests it is a problem inherent in all LCD panels, and perhaps G-Sync is heightening it, but it is not the cause. Also, the summing-up at the end was that it's not really a big issue. Looking at the forums where people are complaining about constant flickering, with some saying drivers have caused it or cured it, maybe that's a different problem altogether.
 
Looking at the forums where people are complaining about constant flickering

Some of that might be due to 144Hz flickering in some cases - sometimes fixed by just using the adaptive or full-performance setting rather than the optimal power management setting, which still seems a bit buggy.
 
Some useful information and views in this thread. Thanks, folks (currently sat on the fence between Vega and GTX for a new build, which will include a FreeSync 1600p monitor).
 