DisplayPort 1.2a Specification Change Request for Industry Standard Variable Refresh Rate

If you are not noticing something then by definition it is not there.

Example: this is how large the Andromeda Galaxy would look to us if we could see in the ultraviolet end of the spectrum. In reality it is bigger in the sky than the Moon, but we don't see anything like this. To us it simply isn't there in this form.
[Image: the Andromeda Galaxy's full extent in the sky compared with the Moon]

That is a terrible assumption to be making - I cannot see gamma radiation, so would it be fine for me to just wander into a nuclear reactor? Just because you cannot see something does not mean it isn't there! Also, I'd like to know the source of that picture as I'm not sure that it's accurate - given that Andromeda is ~2M light years away from us.
 
Ironically, where G-SYNC/FreeSync/whatever-SYNC will be most useful is at the budget end (people using IGPs and low-end graphics cards) and maybe at the extreme high end (4K level), IMHO.
 
That is a terrible assumption to be making - I cannot see gamma radiation, so would it be fine for me to just wander into a nuclear reactor? Just because you cannot see something does not mean it isn't there! Also, I'd like to know the source of that picture as I'm not sure that it's accurate - given that Andromeda is ~2M light years away from us.

I'd like to know as well. I look at M31 through my 12" Newtonian and it is a smudge even at 200x, so I'm not sure where that pic is from, lol; it doesn't quite add up.
 
A 120hz monitor doesn't stop screen tear. I wish people would stop spreading this. It's a 100% fact that to stop screen tear, the display and GPU must be in sync with each other.

What a 120hz monitor does do is hide the effect of screen tear better than 60hz does.
It does not remove it; it's very much still there, you're just not noticing it.

Hiding the effect to the degree that you aren't affected by it is all that matters.

As for saying it's still there: tearing doesn't occur on every frame without g-sync or v-sync. A frame can only tear if the screen buffer is updated part way through a screen refresh.
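
To put that condition in concrete terms, here's a quick Python sketch (my own toy model, nothing official): a tear can only appear on a given refresh if the buffer swap lands part way through that refresh's scanout.

```python
# Toy model of the tearing condition described above (illustrative only).
REFRESH_HZ = 120
SCANOUT_MS = 1000.0 / REFRESH_HZ  # ~8.33 ms per refresh at 120hz

def tears_on_refresh(swap_time_ms, refresh_start_ms):
    """True if a buffer swap lands inside this refresh's scanout window."""
    return refresh_start_ms < swap_time_ms < refresh_start_ms + SCANOUT_MS

# A swap at 12.5 ms lands mid-way through the second refresh
# (which scans out from ~8.33 ms to ~16.67 ms), so that frame tears.
print(tears_on_refresh(12.5, SCANOUT_MS))        # True
# A swap aligned to a refresh boundary (what v-sync enforces) does not.
print(tears_on_refresh(SCANOUT_MS, SCANOUT_MS))  # False
```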

Simply put, just about everything you can think of looks better at 120hz. Screen blur is drastically reduced at 120hz, and screen tear is vastly reduced due to smaller differences between frames, though only when combined with higher frame rates. A higher frame rate on a 60hz screen makes no difference, but on a 120hz screen the difference between two given frames is reduced drastically.

Screen blur is by far the most important improvement, though. VR's current goal is basically low persistence, and one of the biggest advantages of OLED, if not the biggest, is lower persistence. Screen blur is the single biggest issue with gaming, and the lower the frame rate and the lower the refresh rate, the worse it is.

There is of course nothing stopping you using v-sync or triple buffering at 120hz. With both, input lag is significantly reduced, the potential drop in frame rate from v-sync is significantly reduced, and screen blur is still massively improved. v-sync at 120hz is 100% tear free, has vastly reduced screen blur and lower input latency (with whichever type of v-sync you enable), and if the frame rate drops due to performance, the next step down isn't 30fps like it is on a 60hz/60fps v-sync setup.
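
That "next step down" point is just integer division; here's a rough Python sketch (assuming plain double-buffered v-sync, no triple buffering) of how sustained frame rates quantize to refresh/n:

```python
# Sustained frame rates under double-buffered v-sync quantize to
# refresh / n, since frames can only be presented on refresh boundaries.
def vsync_steps(refresh_hz, count=5):
    return [refresh_hz / n for n in range(1, count + 1)]

print(vsync_steps(60))   # [60.0, 30.0, 20.0, 15.0, 12.0]
print(vsync_steps(120))  # [120.0, 60.0, 40.0, 30.0, 24.0]
# Miss 60fps on a 60hz screen and you fall to 30fps; miss 120fps on a
# 120hz screen and you only fall to 60fps.
```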

^ Not only that, but why do people also assume it's only really beneficial under the refresh rate...

Those with enough GPU grunt are probably dying to give this a go... tearing is, in my experience, far worse above the refresh rate, and triple buffering / V-Sync, or for that matter adaptive V-Sync, do not eradicate it by any stretch.

Rubbish, v-sync ABSOLUTELY eradicates screen tearing; it is the entire point and it works perfectly. By its very design, adaptive v-sync turns v-sync off... which is when you get screen tearing, and triple buffering with v-sync is also completely tear free.

Why do people assume it's only really beneficial below the refresh rate? Errm, because if you're at 60fps on a 60hz screen, g-sync/v-sync/no sync will look identical. It will also look identical at 120fps on 120hz, or 30fps on 30hz.

Most people aren't saying that; they are saying it benefits more noticeably the lower the frame rate goes. The gap between frames is 16.67ms at 60hz, so that is literally the biggest time difference g-sync can improve on in the best case; on a 120hz screen this is reduced to 8.33ms, and at 144hz the biggest difference is lower still.
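
Those gaps are just 1000ms divided by the refresh rate, if anyone wants to check the numbers:

```python
# The refresh interval bounds how much g-sync can improve on a synced frame.
for hz in (60, 120, 144):
    print(f"{hz}hz -> {1000 / hz:.2f} ms between refreshes")
# 60hz -> 16.67 ms, 120hz -> 8.33 ms, 144hz -> 6.94 ms
```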

There is an incredibly obvious reason Nvidia has used 144hz screens yet not once run them at anything above 60hz. If the difference between 144hz and 60hz were minor, they wouldn't go out of their way to reduce the refresh rate. The pretty logical conclusion is that Nvidia themselves think showcasing the differences at a higher refresh rate would make it look like a minimal improvement.
 
Biggest difference at 60fps, yes; however, reviewers and videos show a difference at higher refreshes as well.

Of course marketing are going to use the best case and mass-market appeal to showcase it; showing a 760 getting 290 levels of smoothness is a good marketing tool.

It doesn't automatically mean that there is no benefit in other situations.

vsync gets rid of tearing but introduces stutter and lag

In and of itself, an actual retail (not hacked) 1440p 120hz monitor is attractive; add in 1440p 3D and g-sync options and it starts ticking a lot of boxes.
 
It CAN introduce stutter and lag, CAN. If you have a rock solid 30/60/120 (90?) fps then there is no stutter, no lag, and no difference from g-sync; it will be updating at precisely the same times as g-sync. Likewise, there is no noticeable lag from v-sync in general, only from triple-buffered v-sync, which in and of itself reduces stutter at non-multiples of 30fps.

At 120hz the potential stutter and potential lag are significantly reduced compared to 60hz, making the downsides of v-sync or triple-buffered v-sync vastly smaller.

Either way, he said v-sync didn't eradicate tearing, and it absolutely does; it's the entire point of it, and it does 100% eliminate tearing.
 
I particularly like the question "how do you know freesync will be as good as g-sync?" Well, how do you know g-sync will be as good as freesync?

The answer is that the people who have seen both say they both look impressive. I saw multiple people jump on Anandtech's comments and purposefully misquote them to say it wasn't as good, which is absolutely NOT what Anandtech (nor any other website covering it) said. The FUD is pretty clear in terms of which side it comes from.

Most people have been saying it's nice, but it probably won't be a huge deal at higher frame rates. As I pointed out above, how many people have 144Hz screens, can afford a massively more expensive version of them under a year later, AND have such low-end cards that gaming below 60fps is a huge problem? I would wager it's VERY few people to start with.

I feel he was merely pointing out that G-Sync/FreeSync is being accused of this, that and the other, but no one knows what is what. A few reviewers have seen what G-Sync can do, so we tend to listen to what they have to say.

Pretty much, greg, yeah :)


"how do you know g-sync will be as good as freesync?"

Surely we have to look at it as I originally positioned the question: will freesync be as good as, or better than, g-sync? G-sync has effectively set the benchmark; it's already on the market for consumers to try, and freesync isn't. The same goes for reviews: freesync has only been viewed in controlled conditions (that I know of), in the form of the windmill demo. The same was originally true of g-sync with the pendulum demo, but then a fistful of reviewers were sent units to get some real-world testing done, and they collectively pretty much said it's a brilliant piece of tech :)

G-Sync has set the standard; now FreeSync (or whatever it ends up being called) has that bar to aim for/surpass.
 
We need a DP standard that allows for more than a measly 9 foot cable distance at max bandwidth.
 
Rubbish, v-sync ABSOLUTELY eradicates screen tearing; it is the entire point and it works perfectly. By its very design, adaptive v-sync turns v-sync off... which is when you get screen tearing, and triple buffering with v-sync is also completely tear free.

Why do people assume it's only really beneficial below the refresh rate? Errm, because if you're at 60fps on a 60hz screen, g-sync/v-sync/no sync will look identical. It will also look identical at 120fps on 120hz, or 30fps on 30hz.

Most people aren't saying that; they are saying it benefits more noticeably the lower the frame rate goes. The gap between frames is 16.67ms at 60hz, so that is literally the biggest time difference g-sync can improve on in the best case; on a 120hz screen this is reduced to 8.33ms, and at 144hz the biggest difference is lower still.

There is an incredibly obvious reason Nvidia has used 144hz screens yet not once run them at anything above 60hz. If the difference between 144hz and 60hz were minor, they wouldn't go out of their way to reduce the refresh rate. The pretty logical conclusion is that Nvidia themselves think showcasing the differences at a higher refresh rate would make it look like a minimal improvement.

I didn't mean to include V-Sync in that equation, TBH; it defeats the object. V-Sync does eradicate it, but at a high cost in some instances. It's terrible when used in FPS games. You might remember some of those from when you were younger and played games, perhaps?

If you care to try running 3 GPUs at any point, I'd like you to see just what tearing is. I think maybe you would benefit from the practical element instead of just reading, chap. People are commenting a lot, saying "Yeah, well, I'm rarely below 60FPS so this doesn't concern me."

Head to desk.
 
That is a terrible assumption to be making - I cannot see gamma radiation, so would it be fine for me to just wander into a nuclear reactor? Just because you cannot see something does not mean it isn't there! Also, I'd like to know the source of that picture as I'm not sure that it's accurate - given that Andromeda is ~2M light years away from us.

http://www.slate.com/blogs/bad_astronomy/2014/01/01/moon_and_andromeda_relative_size_in_the_sky.html

My post was in response to Shankly's point that just because you don't see it doesn't mean it doesn't exist, and that therefore we should do something about it (i.e. G-Sync). He was referring to screen tearing, and in that context my argument is that if it isn't noticeable then it doesn't matter. I brought up the image of Andromeda as another example of something that does exist, but since we don't see it, it doesn't matter.

Anyone gaming at <60FPS will see a benefit, but at 1920x1080 most high-end GPUs get way above that. One of the G-Sync reviews touches on the point that the benefit is almost negligible at higher FPS.
 
ONE of them; several others point out that there is a clear benefit whenever you are not able to stay locked to the monitor's refresh rate 100% of the time, e.g. with a 120hz monitor you are going to get stuttering because of v-sync doing 120-60-120-60 every time you get a dip below 120.
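
Here's a quick Python sketch of that 120-60-120-60 judder (a deliberately simplified model of double-buffered v-sync, ignoring pipelining): a frame that takes even slightly longer than 8.33ms to render misses the next refresh and has to wait for the one after.

```python
import math

# Simplified double-buffered v-sync on a 120hz screen: a finished frame
# is held until the next refresh boundary before it can be shown.
REFRESH_MS = 1000.0 / 120  # ~8.33 ms

def displayed_after(render_ms):
    """Refresh boundary (in ms) at which a frame taking render_ms is shown."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render in (8.0, 9.0, 8.0, 9.0):
    print(f"render {render:.1f} ms -> shown at the {displayed_after(render):.2f} ms boundary")
# 8 ms frames make the next refresh (~8.33 ms); 9 ms frames miss it and
# wait ~16.67 ms, so frame delivery alternates: the stutter described above.
```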
 
Not sure if that's good news or not

It's good news. 1.2a supports vBlank timings, some screens already support vBlank, and AMD already have it in their drivers, which is how they were able to demonstrate it at CES 2014.

I think what AMD are trying to achieve is vBlank added as a standard to DP 1.2a, so that it's standard amongst all screens.
 
ONE of them; several others point out that there is a clear benefit whenever you are not able to stay locked to the monitor's refresh rate 100% of the time, e.g. with a 120hz monitor you are going to get stuttering because of v-sync doing 120-60-120-60 every time you get a dip below 120.

My post was in the context of screen tearing. 120Hz with v-sync off vs G-Sync is where the reviews stated the benefit was negligible, because screen tearing becomes almost unnoticeable at such high FPS. Input lag is exactly the same with G-Sync as with v-sync off.
 