With FreeSync the GPU tells the display its frame rate and when to refresh, so you do rather wonder: this must be faster, right?
Great if you have a fixed FPS. But unless AMD GPUs can see into the future, there will still be latency and overhead between pushing X fps on the GPU and the display being told to drive that rate, won't there?
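To put rough numbers on what variable refresh can and can't buy you, here's a quick Python sketch. It's purely illustrative: the frame times and the 60 Hz fixed rate are assumptions, and it only models how long a finished frame waits for scanout, not the render latency itself, which no sync scheme can remove.

```python
# Back-of-the-envelope sketch (not real driver code): compare how long a
# finished frame waits before scanout on a fixed 60 Hz panel vs. a variable
# refresh (G-Sync / Adaptive-Sync style) panel told to refresh on demand.
import random

FIXED_REFRESH_HZ = 60.0
FIXED_PERIOD_MS = 1000.0 / FIXED_REFRESH_HZ  # ~16.67 ms between scanouts

def wait_fixed(frame_done_ms):
    """Frame waits for the next fixed refresh tick before it is displayed."""
    ticks_elapsed = int(frame_done_ms / FIXED_PERIOD_MS)
    next_tick = (ticks_elapsed + 1) * FIXED_PERIOD_MS
    return next_tick - frame_done_ms

def wait_variable(frame_done_ms):
    """With variable refresh the panel is told to refresh as soon as the frame
    is ready, so (ignoring transmission/scanout overhead) the wait is ~0."""
    return 0.0

random.seed(42)
t = 0.0
fixed_waits, vrr_waits = [], []
for _ in range(1000):
    t += random.uniform(18.0, 26.0)  # frame time in ms, roughly 45 fps (assumed)
    fixed_waits.append(wait_fixed(t))
    vrr_waits.append(wait_variable(t))

print(f"avg extra wait, fixed 60 Hz     : {sum(fixed_waits)/len(fixed_waits):.2f} ms")
print(f"avg extra wait, variable refresh: {sum(vrr_waits)/len(vrr_waits):.2f} ms")
```

So the saving is the wait-for-the-next-tick part, typically a few milliseconds on average at these frame rates; the time it takes the GPU to render the frame in the first place is untouched.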
Just to reiterate the last line of angry birds' post... you shouldn't be asking these questions when latency has already been addressed in Blur Busters' testing! People seem to be fishing for a reason, but I think NV simply aren't all that concerned about FreeSync.
NV are very savvy at blocking out foreign hardware, but they're not known for blocking out their own users. I don't want to hear irrational discrete PhysX scenarios either, lol.
http://www.overclock.net/t/1514521/...ync-competitor-adaptive-sync/60#post_22885997

Nvidia likes voltage locking its cards, not allowing AIB partners to do their thing and make awesome third-party products. Just look at what happened to the EVBot. They also like artificially restricting 4-way SLI to only the highest-end card at the time, so even if your cards could have done it at one point, they shut it off in the drivers later. Example: I had GTX 670s at launch that said 4-way SLI on the box, but they killed the feature off immediately because reasons.
NVIDIA's G-SYNC technology is rivaled by AMD's Project FreeSync, which is based on a technology standardized by the Video Electronics Standards Association (VESA) as Adaptive-Sync. The technology lets GPUs and monitors keep display refresh rates in sync with GPU frame rates, so the resulting output appears fluid. VESA's technology does not require special hardware inside standards-compliant monitors and is royalty-free, unlike NVIDIA G-SYNC, which is based on specialized hardware that display makers have to source from NVIDIA, which makes it a sort of royalty.
When asked by Chinese publication Expreview whether NVIDIA GPUs will support VESA Adaptive-Sync, the company said it wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, not DisplayPort 1.2a, a requirement of VESA's new technology. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so it's not as if NVIDIA is slow to catch up with new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive-Sync?
I just saw this on TPU, although it is conjecture ATM:
http://www.techpowerup.com/205656/n...ve-sync-tech-to-rake-in-g-sync-royalties.html
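For what it's worth, the "keep display refresh rates in sync with GPU frame rates" part the article describes boils down to something like the sketch below. It's a rough illustration only: the 40-144 Hz range is an assumed panel capability, and real drivers do a lot more (low-framerate compensation, overdrive tuning, etc.), but the basic idea is that the driver clamps each frame's refresh interval into whatever range the display advertises.

```python
# Illustrative sketch of the clamping a variable-refresh driver has to do:
# the panel advertises a supported refresh range (the values here are
# assumptions), and the per-frame refresh interval is clamped into that range.

PANEL_MIN_HZ = 40.0   # assumed minimum refresh rate the panel supports
PANEL_MAX_HZ = 144.0  # assumed maximum refresh rate the panel supports

def refresh_interval_ms(frame_time_ms):
    """Pick the refresh interval for a frame that took `frame_time_ms` to
    render, limited to what the panel can actually do."""
    min_interval = 1000.0 / PANEL_MAX_HZ   # can't refresh faster than this
    max_interval = 1000.0 / PANEL_MIN_HZ   # can't hold a frame longer than this
    return min(max(frame_time_ms, min_interval), max_interval)

for ft in (5.0, 12.0, 22.0, 40.0):
    print(f"frame time {ft:5.1f} ms -> refresh interval {refresh_interval_ms(ft):5.1f} ms")
```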
I assume a firmware update is all that's required, so the cards could be patched later?
I read that as well, and the anti-Nvidia spin from the writer was clear to see, for what was basically "Nvidia don't support DP 1.2a atm but do support HDMI 2".
I do tend to agree a patch/driver/BIOS update would be all that's needed.
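On the "could it just be patched in" question: whether a driver update would be enough depends partly on what the connected display actually advertises. If anyone wants to poke at the sink side of it, here's a heavily hedged Python sketch for Linux. It assumes the kernel exposes the DP AUX channel as /dev/drm_dp_auxN (needs CONFIG_DRM_DP_AUX_CHARDEV and usually root), and that the "MSA timing parameters ignored" bit at DPCD offset 0x007 is the flag the open-source drivers use to spot Adaptive-Sync-capable sinks; treat the offset and bit as my recollection of the drm headers, not gospel.

```python
# Heavily hedged sketch: read one DPCD byte from a DisplayPort sink on Linux
# and check the "MSA timing parameters ignored" bit, which (as I understand the
# open-source drivers) signals an Adaptive-Sync-capable sink. Device path,
# offset and bit are assumptions to verify on your own system.
import os

DPCD_DOWN_STREAM_PORT_COUNT = 0x007   # DPCD register offset (assumed)
MSA_TIMING_PAR_IGNORED = 1 << 6       # capability bit (assumed)

def sink_reports_adaptive_sync(aux_dev="/dev/drm_dp_aux0"):
    """Return True if the sink sets the MSA-timing-ignore capability bit."""
    fd = os.open(aux_dev, os.O_RDONLY)
    try:
        value = os.pread(fd, 1, DPCD_DOWN_STREAM_PORT_COUNT)[0]
    finally:
        os.close(fd)
    return bool(value & MSA_TIMING_PAR_IGNORED)

if __name__ == "__main__":
    print("sink advertises Adaptive-Sync capable timing:",
          sink_reports_adaptive_sync())
```

That only tells you about the monitor's side, of course; whether NV's display silicon and firmware could drive it is the part none of us can check from the outside.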