
Is G-Sync worth it?

The "wait and see" people have a couple advantages.

In the case of format wars, a better chance not to be on the losing side (aka HD-DVD).

Cheaper prices.

Being an early adopter is great if you have tons of wonga, for sure. For a lot of us tho, a £500+ monitor is not an investment you want to make lightly.

But there is no 'loser'

G-SYNC will be here till the end with nVidia GPUs.
Adaptive-Sync is an open standard and will also remain around as long as hardware vendors support it.

G-SYNC has been out years now, we are well beyond early adoption.
 
Also, G-Sync appears able to push well beyond what standard scalers are capable of. Take the 1440p/165Hz Swift: it only kicks out 165Hz on Maxwell, so Nvidia are certainly pushing the boundaries of what we'd perceive as normal.

I know in real-world terms that extra 20Hz means sod all, but IIRC 1440p/144Hz was at the absolute peak of what could be done with DP 1.2, so 165Hz must be well beyond that, which means Nvidia have an ace up their sleeve somewhere.

Fun times ahead. I wouldn't be too worried about what you purchase and ending up vendor-locked for a few years; let's face it, neither camp has released a really terrible card for quite some time.
 
Really, this is SUCH a non-issue. If you buy into G-SYNC or FreeSync you lose next to nothing by not being able to swap GPU vendors, in the grand scheme of things. Who in their right mind would dump a G-SYNC monitor to swap to AMD? What tangible benefit is AMD going to offer over an Nvidia GPU that also makes up for the loss of G-SYNC?

I would say that I tend to keep a monitor for longer than a graphics card, but that is not personally true for me at the moment. I have yet to find a monitor I am 100% happy with.

But, the general idea is people see monitors as a big investment as they are often seen as something that will last them through many builds. I like having options open but in all honesty I'm probably going to go G-Sync if the X34 turns out to be okay.

Edit:

Fun times ahead. I wouldn't be too worried about what you purchase and ending up vendor-locked for a few years; let's face it, neither camp has released a really terrible card for quite some time.

Also a good point :p
 
I know in real-world terms that extra 20Hz means sod all, but IIRC 1440p/144Hz was at the absolute peak of what could be done with DP 1.2, so 165Hz must be well beyond that, which means Nvidia have an ace up their sleeve somewhere.

Is it a case of the panel getting overclocked?

the G2460PF supports a 30Hz-160Hz range when FreeSync is in use ;)
 
Good point, guys, about the longevity of monitors. IIRC I've had my BenQ in sig for over seven years now. I've lost count of the number and type of GPUs I've had in that time, and this is only the second monitor I've owned.
 
Good point, guys, about the longevity of monitors. IIRC I've had my BenQ in sig for over seven years now. I've lost count of the number and type of GPUs I've had in that time, and this is only the second monitor I've owned.

I'd imagine that's more typical than the people here claiming to change monitors every 2 years...

So vendor lock-in is definitely an issue in the real world.

Plus, when have consumers ever benefited from two or more competing hardware standards? Especially if one side eventually "loses" and you happen to have bought into that technology.
 
Well I'd buy into the one that 'wins' if I felt the need.

Wasn't prepared to wait it out and miss out on what is an excellent tech.
 
Is it a case of the panel getting overclocked?

But that AOC monitor is 1080p, and 1080p @ 160Hz (around 10Gbps) is well within the realms of what should be possible over DP 1.2a.

Isn't DP 1.2a's maximum video bandwidth somewhere in the 17Gbps range? IIRC the 1440p/144Hz monitors are right on the edge of that. A calculator I found for working out bandwidth (not sure how reliable it is) puts 1440p/144Hz @ 8-bit right around 16Gbps, while 1440p/165Hz @ 8-bit comes in at over 18Gbps.

So it begs the question: just what are Nvidia using on the latest G-SYNC scalers and Maxwell cards' DP ports?
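The back-of-the-envelope maths above can be sketched out quickly. This is only a rough model: the 25% overhead factor is an assumption picked to line up with the calculator figures quoted in the thread (covering blanking intervals on top of the active pixels), not an exact CVT-R2 timing calculation, and 17.28 Gbps is DP 1.2's effective data rate after 8b/10b encoding on four HBR2 lanes.

```python
# Rough sketch of the DP 1.2 bandwidth arithmetic discussed in this thread.
# ASSUMPTION: a flat 25% overhead stands in for real blanking-interval
# timings, chosen to roughly match the calculator figures quoted above.

DP12_DATA_GBPS = 17.28  # DP 1.2 effective data rate after 8b/10b encoding


def required_gbps(width, height, refresh_hz, bpp=24, overhead=1.25):
    """Approximate video bandwidth a mode needs, in Gbps."""
    return width * height * refresh_hz * bpp * overhead / 1e9


for w, h, hz in [(2560, 1440, 144), (2560, 1440, 165), (1920, 1080, 160)]:
    need = required_gbps(w, h, hz)
    verdict = "fits" if need <= DP12_DATA_GBPS else "exceeds"
    print(f"{w}x{h} @ {hz}Hz: ~{need:.1f} Gbps ({verdict} DP 1.2's "
          f"{DP12_DATA_GBPS} Gbps)")
```

Under those assumptions 1440p/144Hz squeaks in at ~15.9 Gbps, 1440p/165Hz lands at ~18.2 Gbps (over the limit, hence the speculation about Nvidia's scaler), and 1080p/160Hz needs only ~10 Gbps, consistent with the AOC point above.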
 
Pass, I don't know mate, but you could be onto something. At the end of the day, as you said, the choice is there and everyone can get a piece of Adaptive-Sync no matter what brand they choose. :)
 
Very much so, wonderful tech.

Tbh I know very little about monitors and the bits that come with it, I'm just making assumptions based on the initial hype on 1440/144 and it being right on dp1.2's limit.

I also can't see either going away any time soon. AMD are playing the VESA card, getting it implemented (eventually) at little to no extra cost, while Nvidia will gun it solo, giving them the freedom not to have to wait for standardisation, albeit at a cost.

I feel those waiting for one or the other to die out are just wasting valuable time away from variable refresh :)

Now we just need DP 1.3 to hurry up, or someone to jump onto Thunderbolt 3 (40 Gbps!!) to keep things moving along :D
 