Nvidia G-Sync - eliminates stutter and screen tearing

True Audio is just a marketing name for 3D sound; it's been around for years. The same way PhysX is just a marketing name for a physics engine.
 
Serious question here.

Do you guys who are saying this is the next big thing really see a lot of tearing, or have a lot of input lag?

Maybe it's just me, but I cannot remember the last time I saw any tearing at all...

I get terrible screen tearing, probably because I use a very old TV, most noticeably in STALKER and The Hunter. I'm not sure exactly how old my TV is, but it is at least as old as my Xbox 360, which I got quite near release. Not only do I need vsync to be on in games, but I also need it on in some other applications; on a really bad day I will need it on YouTube videos. Also, I play a lot of racing simulators, so you can imagine that any input lag is a major disadvantage for me.

If this G-Sync is all it's cracked up to be and AMD don't offer something similar, then I will not even be considering AMD for my next upgrade, unless there is an incredible difference in price/performance in AMD's favour.
 
What frequency are you running your display at? I've run mine at 75 Hz and still see it loads.

Also, the removal of tearing is a biggie, especially if they can do it at 120 Hz too.

Run my monitor at 120 Hz, mate. Even so, I didn't see any major tearing or stuttering even before I got my monitor, unless it was some form of frame drop because of the amount of action on screen.
 
Looks awesome; this plus upcoming 4K is very good for PC gamers!

Seriously, what is all the enthusiasm with 4K gaming about?

Around £4000 on a monitor, and then a further £3000 on an SLI GTX TITAN setup, just to be able to play a modern game at 4K resolution on medium settings, or at under 30 frames per second if you want to crank the eye candy up...

...erm, no thanks.

When a good 4K screen costs around £200 and I can run most games maxed at 60 fps with a single high-end graphics card, then I might be interested. Until then, 4K gaming is just an unrewarding money sink... and several reviewers have said the same thing. (I would rather game at 1080p on Ultra than 4K on Medium.)
 
I don't see the big issue with vsync, maybe because I don't play a lot of online FPS, but unless my fps is really bad or triple buffering is somehow not enabled, I don't see any lag/stutter.
 
With this being proprietary, I can't see it being good for end users.
Surely adaptive vsync is something that screen manufacturers could implement themselves?
 
LOL, so this is more game-changing than Mantle? Don't think so.

Whatever this is, it will be added to RadeonPro in no time, just like dynamic vsync.

Deluded. It isn't something that can be fixed purely through software; if you'd read into it, you would know that. Mantle only works with a few games, whereas G-Sync improves all games.
 
Yes, you're correct, I missed the part where we need a new monitor for this.
I would take the high performance of Mantle over anything related to vsync anyway.

But on another note, it does indeed look good. But I don't use vsync anyway at 120 Hz.
 
So in their demo they didn't show (AFAIK) the variable framerate with vsync vs G-Sync, only vs no vsync, and also with a fairly small frame rate differential.

At 120 Hz the time between frames (and between skipped frames) becomes dramatically smaller, and vsync would eliminate the tearing.

They compared vsync in the absolute best case for G-Sync, and in the worst case (in the demo, nowhere near worst case in real life) against no vsync at all. Neither situation happens in real gaming, so the demo is all but irrelevant. The closest to a real situation is the last part of the demo, but with vsync still enabled on the left.

The variable framerate G-Sync WAS impressive, no question, but vsync wasn't remotely as bad as they made out in the early part of the demo, and comparing the later part against no vsync at all is a joke.

In reality, they are saying: if we bung more stuff on screen and give you 30% less frame rate, this will potentially still keep it looking good. Mantle is saying: we'll give you potentially 30% more frame rate, preventing the problem in the first place.

Ultimately, 120 Hz, a good screen and a good card will prevent 99% of tearing and "stutter".

Also worth pointing out that their variable framerate was completely unrealistic. Have you ever seen a game move between 45 and 60 fps completely steadily like the video? Or is it more likely to be a solid 60 fps, then drop to 30 fps for some big explosion, then jump about all over the place?

If it's steadily increasing from 45-46-47... 58-59-60 fps, then the time between frames stays very, very close from one frame to the next. When you start jumping from 30-58-42-64-35, the differences in frame times will be drastically bigger than what they showed in the demo.
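To put rough numbers on that, here is a back-of-envelope sketch of my own (the helper names and the two example framerate sequences are mine, not anything from the demo): frame time is just 1000/fps, so you can compare how much the pacing shifts between consecutive frames in each case.

```python
# Rough sketch (my own numbers, not from the demo): frame time in ms is
# 1000 / fps, so we can compare how much pacing changes frame-to-frame
# for a steady ramp versus a jumpy real-game framerate.

def frame_times_ms(fps_sequence):
    """Convert a list of per-frame framerates into frame times in ms."""
    return [1000.0 / fps for fps in fps_sequence]

def worst_pacing_jump_ms(fps_sequence):
    """Largest change in frame time between two consecutive frames."""
    times = frame_times_ms(fps_sequence)
    return max(abs(b - a) for a, b in zip(times, times[1:]))

steady_ramp = list(range(45, 61))        # 45, 46, 47 ... 60 fps, like the demo
jumpy_game = [60, 30, 58, 42, 64, 35]    # swings like a real game

print(f"steady ramp: worst pacing jump ~{worst_pacing_jump_ms(steady_ramp):.2f} ms")
print(f"jumpy game:  worst pacing jump ~{worst_pacing_jump_ms(jumpy_game):.2f} ms")
# steady ramp: ~0.48 ms (45 -> 46 fps is 22.2 ms -> 21.7 ms)
# jumpy game:  ~16.67 ms (60 -> 30 fps is 16.7 ms -> 33.3 ms)
```

On the steady ramp the worst frame-to-frame shift is about half a millisecond; on the jumpy sequence it is a full 16-17 ms, which is exactly the kind of swing the demo avoided showing.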

Ultimately, versus a 120 Hz screen (in 120 Hz mode) you're talking about a circa 8 ms maximum worst-case scenario where the screen would be slower than G-Sync, sometimes less (it depends on when the frame is ready on the graphics card). The steady frame rate pacing that the demo showed (which is completely unrealistic) gives the absolute best-case scenario of near-identical frame pacing from one frame to the next, so G-Sync will look worse in real games with framerates all over the place. As G-Sync gets worse, it starts to approach a 120 Hz screen's maximum frame-time wait anyway: if a frame isn't ready when a refresh happens, it has to wait for the next refresh, which is only 8 ms away at most; at 60 Hz that is doubled to just over 16 ms, meaning you miss one refresh and it could be 32 ms between new frames.

When the framerate is that steady from one frame to the next, G-Sync can give frame times of, say, 16.7 ms at 60 fps, then 16.9 ms at 59 fps, 17.2 ms at 58 fps, and so on, with only a minor frame-pacing difference between each frame all the way down to 45 fps. That is what makes it look so steady while the 60 Hz monitor is consistently dropping an extra 16 ms between frames.

But when the frame rate goes from 60 straight to 30 and back to 60, G-Sync would give you frame times of 16 ms, 32 ms, 16 ms. A 120 Hz screen would be... the same. It would update every 16 ms but would refresh the same image twice (which would visually look no different; one point to make on that later), the next new frame would appear 32 ms later, and then the next frame at a 60 fps rendering speed would be 16 ms after that.
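Here is a minimal toy model of that comparison, under my own simplifying assumption that vsync just holds a finished frame until the next refresh boundary (ignoring buffering and driver queues) and that G-Sync shows it the moment it is ready; the function names and render times are illustrative only.

```python
# Toy model (my own assumptions): with a fixed refresh and vsync, a finished
# frame is held until the next refresh boundary; with g-sync it is shown as
# soon as it is rendered. Render-completion times below mimic 60 fps, then a
# 30 fps hitch, then 60 fps again.
import math

def shown_fixed(render_done_ms, refresh_hz):
    """When each frame appears on a fixed-refresh screen with vsync."""
    period = 1000.0 / refresh_hz
    return [math.ceil(t / period) * period for t in render_done_ms]

def shown_gsync(render_done_ms):
    """Idealised g-sync: the panel refreshes the moment the frame is ready."""
    return list(render_done_ms)

render_done = [17.0, 34.0, 67.0, 84.0]  # ms: 60 fps, 60 fps, 30 fps hitch, 60 fps

for hz in (60, 120):
    shown = shown_fixed(render_done, hz)
    waits = [round(s - r, 1) for s, r in zip(shown, render_done)]
    print(f"{hz:3d} Hz vsync: extra waits {waits} ms")
print(f"g-sync:       extra waits {[0.0] * len(render_done)} ms")
# At 60 Hz a frame that just misses a refresh waits up to ~16.7 ms extra;
# at 120 Hz the worst case is ~8.3 ms; idealised g-sync adds no wait at all.
```

Under that assumption the 60 Hz screen can add up to roughly a whole refresh (~16.7 ms) of wait, the 120 Hz screen about half that (~8.3 ms), and G-Sync none, which is where the "circa 8 ms worst case" above comes from.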


One thing to mention is overdrive and the various monitor speed-up modes. With a monitor set for really heavy overdrive (which generally overshoots and makes everything look a bit poor), and assuming each refresh causes the algorithms to process, each refresh can cause more distortion of the image; two refreshes of the same image can look different due to overdrive, while G-Sync could potentially prevent ghosting/overdrive artefacts by refreshing less often.

Aside from overdrive, there is no graphical reason (only screen-processing reasons) that a vsynced screen showing the same frame for 32 ms by refreshing twice would look any different to a G-Sync screen showing one frame for 32 ms. There is no flicker on LCDs; the refresh (without screen processing) would look identical for 32 ms, as would the G-Sync screen.

I'd actually be very interested to see G-Sync compared on two monitors (I wonder if they can run only in G-Sync mode, though?) where, in normal mode, one is noticeably worse with artefacts and ghosting/overdrive issues. If G-Sync made a lot more difference on a screen that had worse image processing, I wouldn't be surprised.

Ultimately I think G-Sync could be a great move forwards, but it will be a massively smaller improvement than people would think from that demo. That was a 100% best-case scenario for G-Sync: they simply didn't use vsync in a way that mattered, and they ran it with vsync refreshing less than half as often as that particular screen was capable of. Put that demo at 144 Hz and then try to say vsync is terrible.

Also, in that worst-case scenario above it will offer identical lag to vsync; they are telling porkies about it reducing lag significantly. But I would think in general G-Sync can't be worse than vsync, and it will most often be better, just very rarely as different as that demo tried to show.
 
High performance of Mantle?

Can you show me those figures, please?

AMD and DICE have already said:
AMD has also announced Mantle, a “low-level high-performance console-style” graphics API for the PC.

Time will tell how much of a boost, but you can bet it's going to perform better, much better, than DX11.
I mean, just look at what devs get out of the PS3 with BF3/4, then think how much they can bring out of PC hardware.

It's quite scary, really.
 
I think it's manufacturers exploring new ways to increase revenue. GPUs are not selling as well as they would really like, despite the sales figures, because for a few years now there have been no games driving GPU sales. There are no games that need you to update your GPU to be able to play; I'm talking about the general populace, not enthusiast PC gamers. It's not likely to change anytime soon, and I'm talking games in double figures, not one single game.

So they are looking at other ways to drive sales. As Custom PC said this month, monitor resolutions have stood still for many, many years, so that's an area they can push. 4K screens may be for the well-off at the moment, but once they have come down in price you are going to need a fast GPU to drive them. Likewise with this tech: if it takes off, it's another revenue stream for Nvidia.

Personally I don't see that much tearing or stuttering. I don't play BF3, and I probably won't bother with BF4 or the new COD.

The last game I saw tearing in was Borderlands 2, and I just put vsync on and it was sorted.
 
Not a big deal really, but pretty cool. Guess these monitors will be 500+ for 24 inch?

Carmack said the tech is very inexpensive, so I wouldn't have thought so. Probably an extra 20-30 quid for the small device that goes in the monitors, and that's it. There may be external ones, without needing to buy a new monitor, which might cost a little more, say 50 quid?
 