
Nvidia promises support for G-Sync competitor Adaptive-Sync

Maybe the overhead or polling AMD speak of is the issue AnandTech picked up on with Sleeping Dogs.

"I’ve been wanting to play Sleeping Dogs ever since it came out, and the G-Sync review gave me the opportunity to do just that. I like the premise and the change of scenery compared to the sandbox games I’m used to (read: GTA), and at least thus far I can put up with the not-quite-perfect camera and fairly uninspired driving feel. The bigger story here is that running Sleeping Dogs at max quality settings gave my GTX 760 enough of a workout to really showcase the limits of G-Sync.

With v-sync (60Hz) on I typically saw frame rates around 30 - 45 fps, but there were many situations where the frame rate would drop down to 28 fps. I was really curious to see what the impact of G-Sync was here since below 30 fps G-Sync would repeat frames to maintain a 30Hz refresh on the display itself.

The first thing I noticed after enabling G-Sync is my instantaneous frame rate (according to FRAPS) dropped from 27-28 fps down to 25-26 fps. This is that G-Sync polling overhead I mentioned earlier. Now not only did the frame rate drop, but the display had to start repeating frames, which resulted in a substantially worse experience. The only solution here was to decrease quality settings to get frame rates back up again. I was glad I ran into this situation as it shows that while G-Sync may be a great solution to improve playability, you still need a fast enough GPU to drive the whole thing."

http://www.anandtech.com/show/7582/nvidia-gsync-review/2
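
To make the frame-repetition behaviour described in that quote concrete, here is a rough sketch in Python (my own illustration, not NVIDIA's actual logic), assuming a hypothetical panel with a 30-144 Hz variable refresh range: once the GPU's frame time exceeds the panel's maximum hold time (1/30 s), the previous frame has to be shown again while the new one is still being rendered.

```python
# Rough illustration (not NVIDIA's actual algorithm) of frame repetition on a
# variable-refresh panel when the GPU delivers frames more slowly than the
# panel's minimum refresh rate. Panel limits are hypothetical.

PANEL_MIN_HZ = 30                 # panel must be refreshed at least 30 times a second
MAX_HOLD_S = 1.0 / PANEL_MIN_HZ   # longest one frame can stay on screen (~33.3 ms)

def refreshes_per_gpu_frame(frame_time_s: float) -> int:
    """Panel refreshes needed while one GPU frame is the newest available.

    If the GPU frame arrives within the panel's hold window it is drawn once;
    if the GPU is slower than the panel's minimum refresh rate, the previous
    frame is repeated until the new one turns up.
    """
    refreshes = 1          # drawing the new frame itself
    remaining = frame_time_s
    while remaining > MAX_HOLD_S:
        refreshes += 1     # repeat the old frame to keep the panel refreshed
        remaining -= MAX_HOLD_S
    return refreshes

for fps in (60, 45, 28, 25):
    frame_time = 1.0 / fps
    print(f"{fps:>3} fps -> frame time {frame_time * 1000:5.1f} ms, "
          f"{refreshes_per_gpu_frame(frame_time)} panel refresh(es) per GPU frame")
```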

Because the GPU still has to wait for the display, you can still get high latency with G-Sync. 28 fps wouldn't be smooth compared with 60 fps or higher on G-Sync.

With FreeSync telling the display its frame rate and when to refresh, you do wonder whether this must be faster, right?
Look at the issue with V-Sync, which also needs the display to tell the GPU when to release a frame.

So for testing FreeSync vs G-Sync, Sleeping Dogs and CS:GO seem to be the best games to use so far.
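
As a rough illustration of the difference being argued here (my own sketch, not either vendor's implementation): with fixed-refresh V-Sync a finished frame waits for the next scheduled refresh, whereas a variable-refresh display can be refreshed as soon as the frame is ready, within the panel's limits.

```python
# Rough sketch (not vendor code) of the extra wait a completed frame can incur
# under fixed-refresh v-sync at 60 Hz, versus refreshing the panel as soon as
# the frame is ready on a variable-refresh display. Panel min/max refresh
# limits are ignored for simplicity.
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms between fixed refresh slots

def vsync_wait_ms(frame_ready_ms: float) -> float:
    """Time a finished frame waits for the next fixed 60 Hz refresh slot."""
    next_slot = math.ceil(frame_ready_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
    return next_slot - frame_ready_ms

def adaptive_wait_ms(frame_ready_ms: float) -> float:
    """With adaptive refresh the display is driven when the frame is ready."""
    return 0.0

for ready_ms in (20.0, 25.0, 33.0, 35.0):
    print(f"frame ready at {ready_ms:4.1f} ms: "
          f"v-sync waits {vsync_wait_ms(ready_ms):4.1f} ms extra, "
          f"adaptive waits {adaptive_wait_ms(ready_ms):4.1f} ms")
```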
 
It would have been silly for nVidia to not support this. G-Sync is here and now and has been for some time, so the best of both worlds would be good :)
 
Of course they had a choice. This is Nvidia, remember - the guys who cut off their nose to spite their face where using a secondary Nvidia card for PhysX is concerned.

In all honesty, I am surprised, but pleased if they do.

As others have said, it will only be good for the consumer.
 
"With FreeSync telling the display its frame rate and when to refresh, you do wonder whether this must be faster, right?"

Great if you have a fixed FPS. Unless AMD GPUs can see into the future, there will be latency and overhead between pushing X fps on the GPU and driving that on the display as part of being told that FPS?
 
"Great if you have a fixed FPS. Unless AMD GPUs can see into the future, there will be latency and overhead between pushing X fps on the GPU and driving that on the display as part of being told that FPS?"

The main claim for FreeSync about low overhead seems to be that there is no two-way handshake - but neither is there with G-Sync.

As with the other "problem" that shankly is posting about - the monitor tested (and that was over 10 months ago) was only rated down to 30 fps, and they were testing settings that dropped down to the mid-20s fps. As magic as G-Sync/FreeSync is, if you drop below what the monitor is physically capable of maintaining, you are going to hit problems.

As usual, the AMD FAQ is full of unsubstantiated hyperbole - I would like to see the monitor that can go down to 6 fps and still hold a frame without artifacting.

The GPU doesn't wait for the display, shanks, not sure where you are getting that from. With G-Sync the GPU spits out frames as quickly as it can, and the G-Sync module itself uses a lookaside buffer to decide what it needs to do - because it is look-aside and not inline, any latency is minimised (around 1 ms according to Blur Busters' tests).
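
To put that roughly 1 ms figure in perspective, here is a quick back-of-the-envelope sketch (my own arithmetic, taking the Blur Busters ballpark from the post above at face value) comparing it with ordinary frame times:

```python
# Back-of-the-envelope comparison of the ~1 ms G-Sync module latency reported
# by Blur Busters (figure quoted in the post above) against the time one frame
# takes anyway at common frame rates.

MODULE_LATENCY_MS = 1.0   # approximate look-aside overhead from Blur Busters' tests

for fps in (144, 60, 45, 30):
    frame_time_ms = 1000.0 / fps
    overhead_pct = 100.0 * MODULE_LATENCY_MS / frame_time_ms
    print(f"{fps:>3} fps: frame time {frame_time_ms:5.1f} ms, "
          f"~1 ms module latency is roughly {overhead_pct:4.1f}% of one frame")
```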
 
Just to reiterate the last line of angry bird's post... you shouldn't be asking these questions when latency has already been addressed in Blur Busters' testing! People seem to be fishing for reasoning, but I think generally NV just aren't all that concerned about FreeSync.

NV are very savvy at blocking out foreign hardware; they're not known for blocking out their own users. I don't want to hear irrational discrete PhysX scenarios either, lol.
 

Nvidia likes voltage-locking its cards, not allowing AIB partners to do their thing and make awesome third-party products. Just look at what happened to the EVBot. They also like artificially restricting 4-way SLI to only the highest-end card at the time, so even if your cards could have done it at one point, they shut it off in the drivers later. Example: I had GTX 670s at launch that said 4-way SLI on the box, but they killed the feature off immediately because reasons.
http://www.overclock.net/t/1514521/...ync-competitor-adaptive-sync/60#post_22885997
 
I just saw this on TPU, although it is conjecture ATM:

http://www.techpowerup.com/205656/n...ve-sync-tech-to-rake-in-g-sync-royalties.html

NVIDIA's G-SYNC technology is rivaled by AMD's project Freesync, which is based on a technology standardized by the video electronics standards association (VESA), under Adaptive Sync. The technology lets GPUs and monitors keep display refresh rates in sync with GPU frame-rates, so the resulting output appears fluid. VESA's technology does not require special hardware inside standards-compliant monitors, and is royalty-free, unlike NVIDIA G-SYNC, which is based on specialized hardware, which display makers have to source from NVIDIA, which makes it a sort of a royalty.

When asked by Chinese publication Expreview on whether NVIDIA GPUs will support VESA adaptive-sync, the company mentioned that NVIDIA wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, and not DisplayPort 1.2a, a requirement of VESA's new technology. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting a suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so it's not like NVIDIA is slow at catching up with new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive Sync?

I assume a firmware update is all that is required so the cards can be patched later??
 
1.2a has an identical pinout to 1.2, does it not? I don't see why the 'a' revision cannot be retrospectively patched into 1.2 connectors.

The question is, how long before Adaptive-Sync even makes it to market, and how long after that for monitors to appear.
 
I read that as well, and the anti-Nvidia spin from the writer was clear to see, for what was basically "Nvidia don't support DP 1.2a atm but do support HDMI 2".

I do tend to agree a patch/driver/BIOS update would be all that's needed.

OTOH, would that be up to NV or the OEMs that make the cards? So will it be a case of needing to go for a company which is known for its better support?
 
It would be Nvidia. I would imagine a driver update, and drivers always come from Nvidia; AIBs don't like to support BIOS updates if they can possibly avoid it, due to the potential for customers to brick their cards.

But yeah, there's no rush. Nvidia supporting Adaptive-Sync now would only potentially harm G-Sync sales, whereas a rumour that they might does just enough not to put people off buying the 970/980.
 
Nvidia not going to support VESA Adaptive Sync

There's a bit of controversy breaking on the web which I wanted to write a few words on.
Last week some website posted that Nvidia would be adding VESA Adaptive Sync support to their graphics cards; this misinterpretation or rumor was quickly debunked by Nvidia's Brian Del Rizzo (Senior PR Manager at Nvidia).

source
 
This is an outrage!
 