
AMD's Low Framerate Compensation For Nvidia?

No point worrying about it, it reads like it's always better on whichever one the poster's using. ;):p

I'm so confused :p

Sounds like you wouldn't know what's what between G-Sync/FreeSync, as you don't play down there anyway?

Anyway, anything at 30fps just isn't enjoyable no matter what tech we're using.

Exactly, G-Sync/FreeSync is superb; it smooths out the fps, and things are much harsher without it, but it can't mask sitting at 30/40fps, I still know it's there.

It handles most of the sudden fps drops, and I definitely notice fewer of those stutters, that's for sure, but 80+fps is where I want to be, the higher the better when running 144Hz.
 
Frames are doubled/tripled to bring it back into GSync range, so 21 FPS becomes 42 FPS (and thus 42Hz). That's the long and short of it anyway :p
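The frame-multiplication idea above can be sketched roughly like this (a toy model in Python; the 30-144Hz window and the smallest-multiple rule are assumptions, real driver heuristics are more involved):

```python
def lfc_refresh(fps, vrr_min=30, vrr_max=144):
    """Return (multiplier, effective_refresh_hz) for a simple low-framerate
    compensation scheme: repeat each frame enough times to lift the
    effective refresh rate back inside the VRR window."""
    if fps >= vrr_min:
        return 1, fps  # already inside the window, no repetition needed
    mult = 1
    while fps * mult < vrr_min and fps * (mult + 1) <= vrr_max:
        mult += 1
    return mult, fps * mult

# 21fps content: each frame shown twice, panel refreshes at 42Hz
print(lfc_refresh(21))  # (2, 42)
```

So 21fps doubles to 42Hz, while something like 10fps would need tripling to reach the 30Hz floor.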
 
No, I was confused by Gregster saying G-Sync works down to 1fps, yet according to Asus, G-Sync on the Swift currently only works down to 30Hz.

Yes, the VRR floor is 30Hz, but theoretically speaking, everything down to 1fps is still being handled by G-Sync, just not in the sense of being inside the VRR window. It's about as much use as spam windscreen wipers. :D
 
Frames are doubled/tripled to bring it back into GSync range, so 21 FPS becomes 42 FPS (and thus 42Hz). That's the long and short of it anyway :p

It doesn't become frames per second though; 21fps just gets displayed at 42Hz.
The image might look smooth, but interaction remains unplayable with the high input latency at those frame rates.

Just like a movie at 24fps looks great, but try interacting with it and it becomes a different story.
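To put numbers on that point: the frame time, and hence the floor on how stale your input can be, depends on the content fps, not on the multiplied refresh rate. A quick check (plain 1000/fps arithmetic, ignoring any pipeline latency):

```python
def frame_time_ms(fps):
    """Time each newly rendered frame is on screen, in milliseconds."""
    return 1000.0 / fps

# Frame-doubled 21fps refreshes the panel at 42Hz, but a new frame
# (and thus a response to your input) still only arrives every ~48ms.
for fps in (21, 42, 80, 144):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per new frame")
```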
 
As it is right now, there is no difference between the two as freesync has catched up. Now the only difference is the price tag which is around £150 extra for g-sync?
 
As it is right now, there is no difference between the two as freesync has catched up. Now the only difference is the price tag which is around £150 extra for g-sync?

Pretty much, though the price is closer to £200 usually.

Freesync is absolutely wonderful, as is gsync.
 
As it is right now, there is no difference between the two as freesync has catched up. Now the only difference is the price tag which is around £150 extra for g-sync?

caught up?

and yeah, almost, except on those monitors where the gsync one supports a wider range... oh, and ULMB... and 3D
 
ULMB and 3D work through G-Sync, Andy, or is it still one or the other?

3D, almost no one is using anymore, and each time ULMB is mentioned it's the same story, almost no one's using it. That's not to say it's useless.

IMO, I'm not convinced they're much of a USP, more an excuse to bump the cost, but Nvidia seems to be more desirable; if they didn't sell, the price would come down.

They can each surpass the other in different areas, and at the end of the day it's moot unless you're purchasing at the same time you upgrade your GPU.
 
I'm still sitting on my cash, waiting for manufacturers to get their act together and sort out the QC on their IPS monitors, or for a good VA monitor with sync tech to be released :rolleyes:
 
caught up?

and yeah, almost, except on those monitors where the gsync one supports a wider range... oh, and ULMB... and 3D

You do know ULMB is just a monitor thing, don't you? My BenQ XL2730Z has it, and even my older non-FreeSync BenQ has it. On BenQ it's called Blur Reduction, and it can't be used with FreeSync enabled because of how the strobing of the backlight works.

Here are people talking about the two.
http://www.overclock.net/t/1531157/ulmb-mode-vs-blur-reduction
 
It seems LightBoost, aka ULMB, still has issues with brightness and colour when enabled. It was one of the reasons I didn't use it on my older BenQ monitor. Just remembered that one didn't have BenQ Blur Reduction; it was LightBoost, and I used a hack from Blur Busters to enable it on AMD.


Updated Conclusion 17/3/15: Since this article was originally published we've seen a fairly decent uptake of blur reduction backlights by some of the main manufacturers. Natively supported blur reduction modes like those BenQ, Eizo and LG have introduced make using them so much easier. They also seem to offer better performance than LightBoost 'hack' methods, helping maintain a more reliable colour and picture setup. More recently still we've seen NVIDIA's new ULMB mode coupled with their G-sync technology, offering blur reduction options for quite a few new G-sync enabled displays. There are still some issues inherent to blur reduction strobed backlights, most notably their limiting brightness and the appearance of strobe cross-talk. We hope more manufacturers will adopt blur reduction modes in the future, and also figure out a way (like Eizo did in fact) to provide high brightness possibilities when they are being used. We would also like to see more user control over not only the pulse width (provided fairly often), but also the strobe timing, to give users that bit more control over the feature.
http://www.tftcentral.co.uk/articles/motion_blur.htm
 
You do know ULMB is just a monitor thing, don't you? My BenQ XL2730Z has it, and even my older non-FreeSync BenQ has it. On BenQ it's called Blur Reduction, and it can't be used with FreeSync enabled because of how the strobing of the backlight works.

Here are people talking about the two.
http://www.overclock.net/t/1531157/ulmb-mode-vs-blur-reduction

Is it not the same with G-Sync though? As far as I know you cannot use ULMB and G-Sync together either, but maybe some new monitors can, not sure.

ULMB is not an Nvidia thing, and 3D I personally have zero interest in. Therefore, for me there is no difference between FreeSync and G-Sync now that FreeSync has caught up, apart from the price tag :D
 
Is it not the same with G-Sync though? As far as I know you cannot use ULMB and G-Sync together either, but maybe some new monitors can, not sure.

ULMB is not an Nvidia thing, and 3D I personally have zero interest in. Therefore, for me there is no difference between FreeSync and G-Sync now that FreeSync has caught up, apart from the price tag :D

These blur reduction technologies can never be used with FreeSync or G-Sync, at least not with backlight strobing.
Because of how the display needs to strobe, the backlight has to flicker in step with a fixed refresh rate.

Because FreeSync and G-Sync need to change the refresh rate on the fly, this can't work with backlight strobing.
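That conflict can be illustrated with a toy duty-cycle calculation (Python; the 1ms pulse width is an assumed example value): the strobed backlight fires one fixed-width pulse per refresh, so perceived brightness is pulse width divided by the refresh interval. Hold the pulse constant while VRR varies the interval, and the brightness swings frame to frame, which the eye reads as flicker.

```python
def duty_cycle(pulse_ms, refresh_hz):
    """Fraction of each refresh interval the strobed backlight is lit."""
    interval_ms = 1000.0 / refresh_hz
    return pulse_ms / interval_ms

# Fixed 1ms strobe pulse while VRR swings the refresh rate around:
for hz in (40, 60, 90, 144):
    print(f"{hz:3d} Hz -> backlight lit {duty_cycle(1.0, hz):.1%} of the time")
```

With a fixed refresh rate the duty cycle is constant, so the panel looks steadily dim; with VRR it would bounce around with the frame rate, hence strobing and VRR being one-or-the-other on current monitors.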

ULMB is Nvidia's tech, it came from LightBoost, and it's locked to Nvidia, but as many review sites have said, it affects the brightness and colour.

Other methods out there do a better job and leave it open to users to pick and choose how much they want to strobe the light, etc.; Blur Busters offers a utility app for other monitors. But sadly not for Nvidia, because again it's locked, and what Nvidia thinks is best is what you get.
 