Correction, G-Sync is here and now!
For the few people who happened to have the correct monitor type for the DIY kit, maybe.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It doesn't make it any less incorrect.
Didn't say it did. If you happened to have the proper type of monitor and felt confident enough to install the DIY kit, then yes, it's here for those people, who are comparatively few in the grand scheme of things.
I didn't say you did, and retailers were supplying the monitors pre-fitted, so the warranty was intact and no end-user fiddling was needed.
How much did that tack onto the price?
I will bow out. I wasn't here to argue over semantics.
Thanks, but Orangey already pulled me up on it and corrected me, so there wasn't any need to do the same really.
Wasn't "arguing" over anything, wasn't aware that some places were installing the units themselves.![]()
But it's not getting to the market quicker.
G-Sync was on the market in half the time? The Nvidia video for the DIY kit is from January 2014, and by all accounts G-Sync was announced in September/October 2013.
We're 6 months from AMD's conception?
I can't see G-Sync gaining too much momentum unless it's drastically better, and Nvidia should be able to use the VESA standard anyway. For all we know we'll end up with G-Sync monitors as well as Adaptive-Sync ones, with Nvidia users able to use both.
Nvidia stated a while back that they would look at Adaptive-Sync and adopt it if it proved to be a better solution than G-Sync.
I'm still leaning in favour of the hardware solution. It's just taken AMD 6 months to get 4K working somewhat properly, and that's what fuels my fears of a software-based solution from either side.
I think a lot of the issues are bandwidth/signal related and should hopefully disappear with DP 1.3a anyway.
But having NV's scaler will most likely give it the advantage... or not. Time will tell. Despite the general anger towards them for charging for it, I very much doubt they didn't anticipate this happening. They're not stupid.
This whole thing just reminds me of TriDef. Sorry fans...
It's nice that DM has blocked me, as that'll save the whole "it's nothing like TriDef, here's a technical rundown on why, whilst I overlook the fact that it's similar due to AMD outsourcing" reply.
Because Project FreeSync obviates the need for v-sync, gamers especially sensitive to input latency — a delay between mouse movement and cursor movement — will also see a distinct increase in responsiveness.
Finally, disabling v-sync would typically introduce nasty horizontal tearing, but Project FreeSync also eliminates tearing as a rule. Project FreeSync is a "best of all worlds" solution from the perspective of smoothness, image quality and responsiveness.
How does Project FreeSync utilize DisplayPort™ Adaptive-Sync to determine the period of time a frame is displayed to the user?
An AMD Radeon™ graphics card compatible with Project FreeSync uses the DisplayPort™ Adaptive-Sync specification to automatically determine the minimum and maximum refresh rates supported by a dynamic refresh-ready system. Using this approach, no communication must occur to negotiate the time the current frame remains on-screen, or to determine that it is safe to send a new frame to the monitor.
By eliminating the need for ongoing communication with pre-negotiated screen update rates, Project FreeSync can execute highly dynamic changes in frame presentation intervals without incurring communications overhead or latency penalties.
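To put the FAQ wording into something more concrete, here's a rough Python sketch of the scheduling idea (purely illustrative, not AMD's actual driver code). The 40-144 Hz range, render_frame and present_frame are all made-up stand-ins; the point is that the refresh window is learned once up front, and every frame after that is presented as soon as it's ready, with the only wait being to respect the panel's maximum refresh rate.

```python
# Minimal sketch of variable-refresh frame pacing (not AMD driver code).
# Assumption: the display's supported refresh range (e.g. 40-144 Hz) was
# already read once via the Adaptive-Sync capability report.

import time

MIN_HZ, MAX_HZ = 40, 144          # range reported once by the monitor
MIN_INTERVAL = 1.0 / MAX_HZ       # can't refresh faster than this (~6.9 ms)
MAX_INTERVAL = 1.0 / MIN_HZ       # can't hold a frame longer than this (25 ms)

def render_frame():
    """Stand-in for GPU rendering; returns when the frame is ready."""
    time.sleep(0.012)             # pretend a frame takes ~12 ms

def present_frame():
    """Stand-in for scanning the finished frame out to the display."""
    pass

last_present = time.monotonic()
for _ in range(10):
    render_frame()
    elapsed = time.monotonic() - last_present

    if elapsed < MIN_INTERVAL:
        # Frame finished faster than the panel's max refresh rate allows:
        # wait just long enough, then present. No tearing, minimal latency.
        time.sleep(MIN_INTERVAL - elapsed)
    # If a frame ever took longer than MAX_INTERVAL, the previous frame would
    # simply be shown again so the panel gets refreshed in time. Either way,
    # no per-frame handshake is needed: the valid window was learned once.

    present_frame()
    last_present = time.monotonic()
```

Once the minimum and maximum are known there's nothing left to negotiate per frame, which is where the latency saving in the quote comes from.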