That does not sound right... admittedly I'm on a BenQ currently, but mine are all 2.2.
Got my replacement monitor already. Plugged it in and so far zero dead/stuck/bright pixels, so I'm happy.
Only thing is the courier swapped out my pristine new box for a dirty old battered one, so I'm going to drive back to the depot and get mine back.
There was even a driver's note from ASUS saying to keep the battered box.
My OCD kicking in.
Bang on 2.2 for me on my Swift so that doesn't sound right at all.
^^ Don't set it to 2.2 in the NVIDIA control panel - that slider tweaks the gamma the monitor is already displaying rather than setting the value explicitly. Somewhere around 0.8-0.92 should give the best results.
Nexus, do you use this application for your monitor whilst gaming? That's assuming you have an ICC profile:
http://forums.guru3d.com/showthread.php?t=386325
From comparing side by side with my Dell, 0.86+ is fairly negligible for that - below that you quickly start to get noticeable banding on gradients and so on.
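A rough sketch of why pushing the driver gamma slider too low causes that banding. This assumes, as a simplification of the real LUT pipeline, that the adjustment maps each 8-bit channel through out = in ** gamma and requantises back to 8 bits; the function name is mine, not anything from the driver:

```python
# Simplified model: driver gamma tweak applied per 8-bit channel, then
# rounded back to 8 bits. Collapsed levels = visible banding on gradients.

def surviving_levels(gamma: float) -> int:
    """Count distinct 8-bit output levels left after a gamma adjustment."""
    return len({round(255 * (v / 255) ** gamma) for v in range(256)})

for g in (1.0, 0.92, 0.86, 0.70):
    print(f"gamma {g}: {surviving_levels(g)} of 256 levels survive")
```

At gamma 1.0 every level survives; the further below 1.0 you go, the more neighbouring input levels round to the same output value near the top of the range, which shows up as stepping on smooth gradients.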
Apologies for not reading through 167 pages to find the answer, which I'm sure has been answered many times!
Does a single GTX 980 cut it with this monitor, or should I consider SLI or wait for the non-reference 980 Tis?
I've not tested it, but I think with the new settings, if you have G-Sync enabled, the V-Sync on/off option works like this:
VSync on: G-Sync active, framerate won't exceed the refresh rate.
VSync off: G-Sync active until you exceed the refresh rate, disabled when rendering above it (so you'd get tearing at higher framerates).
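The behaviour described above can be sketched as a small decision function. This is just my reading of the new driver option, not NVIDIA's actual logic, and the function and return labels are made up for illustration:

```python
# Sketch of how the G-Sync + V-Sync combination is described above.
# Assumption (mine, untested): G-Sync governs everything at or below the
# refresh rate; above it, the V-Sync toggle decides cap vs. tearing.

def frame_behaviour(fps: float, refresh_hz: float, vsync_on: bool) -> str:
    """Return what happens for a given framerate with G-Sync enabled."""
    if fps <= refresh_hz:
        return "gsync"      # monitor refresh follows the GPU, no tearing
    if vsync_on:
        return "capped"     # framerate held at the refresh rate
    return "tearing"        # G-Sync inactive above refresh, frames tear

print(frame_behaviour(90, 144, vsync_on=False))   # gsync
print(frame_behaviour(200, 144, vsync_on=True))   # capped
print(frame_behaviour(200, 144, vsync_on=False))  # tearing
```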
Wasn't the point of G-Sync that you get no screen tearing no matter what the framerate is?
Why would they make this change? It seems to make G-Sync significantly worse, or am I missing something?