
Nvidia Gsync announced 18/10/2013

But that's just it, it doesn't cache frames; it displays the frame as fast as it can be fed from your GPU. The monitor WON'T stall the flow of frames being sent to it. The way I interpret it is like this:

on the asus monitor in the demo

30-144 FPS ------> frame displayed the instant it's sent from the GPU, no matter what the FPS is
1-30 FPS ------> last frame in the memory of the G-Sync board is redisplayed (unsure how this will look, or indeed if it's correct)

Obviously the 1-30 fps bit doesn't sound that great, but arguably when you're running at those sorts of frame rates there's not much you can do but lower detail or invest in something more powerful.
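A minimal sketch of that interpretation, purely as a model (the 30 Hz floor and 144 Hz ceiling are assumptions taken from the demo monitor, and `display_plan` is a hypothetical helper, not anything Nvidia has published):

```python
# Rough model of the refresh behaviour described above.
# Assumed numbers: 30 Hz minimum refresh, 144 Hz panel maximum.
MIN_HZ = 30
MAX_HZ = 144

def display_plan(fps):
    """Describe how the panel would handle a steady frame rate under this model."""
    if fps > MAX_HZ:
        return "frames arrive faster than the panel can refresh; capped at max"
    if fps >= MIN_HZ:
        # Variable refresh: scan-out starts the instant the GPU delivers a frame.
        return f"each frame shown once, {1000.0 / fps:.1f} ms apart"
    # Below the panel's minimum refresh: the board's frame buffer
    # redisplays the last frame until a new one arrives.
    repeats = MIN_HZ // fps  # approximate redisplays per new frame
    return f"last frame redisplayed ~{repeats}x while waiting for the GPU"

for fps in (20, 45, 120):
    print(fps, "->", display_plan(fps))
```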

I didn't think it worked at under 30 fps, and couldn't find anything to say it did, or in fact anything about 30 fps or under at all.

Nice find :)
 
You can clearly see the tearing on the left monitor without G-Sync; there is still some tearing with it, but much smaller, and both still look un-smooth to me because of the low frame rate. Still an improvement nonetheless.

You do realise that you can't possibly make any determination of the actual frame rate and rendering quality from a video recording limited to 30 fps?

Is it just me, or has the average IQ on this forum taken a sharp drop recently? People's understanding of basic computer and video principles here is appalling. Also, why are people talking about AMD's Mantle in relation to this? Unless Mantle can magically reconfigure the hardware inside your monitor, it will have no bearing on the issue of monitor refresh syncing.
 
You do realise that you can't possibly make any determination of the actual frame rate and rendering quality from a video recording limited to 30 fps?

Besides, the tearing that a video recording limited to 30 fps can introduce is beside the point, when I know first hand what 30 fps looks like without tearing, and it still looks un-smooth.

And the fact that Fraps was running in the corner showing 40-50 fps.
 
You can clearly see the tearing on the left monitor without G-Sync; there is still some tearing with it, but much smaller, and both still look un-smooth to me because of the low frame rate. Still an improvement nonetheless.

There's all sorts going on here though: you have a 30 fps YouTube video showing 120 fps footage slowed down, with demos running at 45-55 fps... showing on your 60/120 Hz monitor :D. Somewhere in that lot you are losing a lot of the impact, I'm pretty sure of that :). Also, when Eurogamer/Digital Foundry use these high-speed captures and slow them down, even vsync'd footage looks like it's tearing slightly, but it's beyond the range of being noticeable.
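As a rough illustration of why a 30 fps capture can't evenly sample higher-rate content (the 45 fps figure is taken from the demo range above; the arithmetic is a simplified model, not a measurement from the video):

```python
# Which source frames does a 30 fps capture grab from 45 fps content?
# Simplified model: each capture frame shows the most recent source frame.
SOURCE_FPS = 45
CAPTURE_FPS = 30

# Capture frame t happens at time t / 30 s; the source frame on screen
# then is floor(t / 30 * 45).
picked = [int(t / CAPTURE_FPS * SOURCE_FPS) for t in range(10)]
print(picked)  # which source frames survive into the capture

gaps = [b - a for a, b in zip(picked, picked[1:])]
print(gaps)    # alternating 1-frame / 2-frame steps: the capture skips
               # source frames unevenly, so it judders even when the
               # source is perfectly smooth.
```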
 
http://www.eurogamer.net/articles/digitalfoundry-nvidia-g-sync-the-end-of-screen-tear-in-pc-gaming

A good read. This is good **** from Nvidia, but I won't give up my IPS screen for a TN panel.

It will work with IPS.

http://www.geforce.co.uk/hardware/technology/g-sync#source=pr

Q: What are the resolutions of G-SYNC monitors?
A: NVIDIA G-SYNC enabled monitors will be available in a variety of resolutions from 1920x1080, to 2560x1440 to 4Kx2K. The ASUS VG248QE NVIDIA G-SYNC enabled monitor has a max resolution of 1920x1080.
 
You do realise that you can't possibly make any determination of the actual frame rate and rendering quality from a video recording limited to 30 fps?

Is it just me, or has the average IQ on this forum taken a sharp drop recently? People's understanding of basic computer and video principles here is appalling. Also, why are people talking about AMD's Mantle in relation to this? Unless Mantle can magically reconfigure the hardware inside your monitor, it will have no bearing on the issue of monitor refresh syncing.

Amen brother... amen. +1
 
There's all sorts going on here though: you have a 30 fps YouTube video showing 120 fps footage slowed down, with demos running at 45-55 fps... showing on your 60/120 Hz monitor :D. Somewhere in that lot you are losing a lot of the impact, I'm pretty sure of that :). Also, when Eurogamer/Digital Foundry use these high-speed captures and slow them down, even vsync'd footage looks like it's tearing slightly, but it's beyond the range of being noticeable.

See post before yours.
 
You do realise that you can't possibly make any determination of the actual frame rate and rendering quality from a video recording limited to 30 fps?

Is it just me, or has the average IQ on this forum taken a sharp drop recently? People's understanding of basic computer and video principles here is appalling. Also, why are people talking about AMD's Mantle in relation to this? Unless Mantle can magically reconfigure the hardware inside your monitor, it will have no bearing on the issue of monitor refresh syncing.

+1
 
However, as pointed out in a previous post, there is nothing to stop AMD from adopting the same technology.

Except maybe a patent.

Now, US patent law obliges holders of significant patents to license them if they are deemed essential to the industry, but can you see AMD paying a royalty to Nvidia for every GPU they sell that supports this?
 
Aye, I was typing it as you replied.

And you may think that I would not notice 50 fps vsync looking less smooth than 60 fps.

With the Xbox you could set DOA to vsync 60 fps or vsync 50 fps. I was messing around with the settings just before I went to work, and when I got home I started DOA and, even just looking at the demonstration screen, I noticed something was not quite right. By mistake I had set it to 50 fps; I noticed the difference right off the bat.
 
You won't have to; they are doing it in 1080p through 1440p to 4K.
I don't know of any TN 1440p panels, so they would have to be IPS?

It will work with IPS.

http://www.geforce.co.uk/hardware/technology/g-sync#source=pr

Q: What are the resolutions of G-SYNC monitors?
A: NVIDIA G-SYNC enabled monitors will be available in a variety of resolutions from 1920x1080, to 2560x1440 to 4Kx2K. The ASUS VG248QE NVIDIA G-SYNC enabled monitor has a max resolution of 1920x1080.


Sounds great then. I like having vsync on, but you can feel the input lag compared to having it off, more so with an IPS panel. This is some good stuff Nvidia have been working on that really benefits the gamer.
 
And you may think that I would not notice 50 fps vsync looking less smooth than 60 fps.

With the Xbox you could set DOA to vsync 60 fps or vsync 50 fps. I was messing around with the settings just before I went to work, and when I got home I started DOA and, even just looking at the demonstration screen, I noticed something was not quite right. By mistake I had set it to 50 fps; I noticed the difference right off the bat.


But can your TV play at 50 Hz?
 
Is it just me, or has the average IQ on this forum taken a sharp drop recently? People's understanding of basic computer and video principles here is appalling. Also, why are people talking about AMD's Mantle in relation to this? Unless Mantle can magically reconfigure the hardware inside your monitor, it will have no bearing on the issue of monitor refresh syncing.

First mentioned 18th Oct 2013, 23:59, in the "AMD Radeon R9 290X with Hawaii GPU pictured, has 512-bit 4GB Memory" thread:

Buy what you think is best and let others decide based on the information come retail. All this brand smooching is so old now.
Mantle, G-Sync and ShadowPlay are all intriguing.



First mentioned 19th Oct 2013, 08:56, in this thread:
I reckon AMD's Mantle will offer the better solution; there's a lot more we haven't been told about that :eek:. Yes, a bold statement.
But until we have it to try, we won't know.

So they are going to be talked about in both threads, rightly or wrongly.
 
Yes obviously.

No, not obviously. You could still be in 60 Hz mode, forcing the game to run at higher settings which are then capped at 50 fps, running triple-buffered on a 60 Hz display, which is possibly why you perceive it as not being smooth: precisely for the reason Nvidia have explained, you're seeing 10 duplicate frames. THAT is what gives the impression of it not being smooth; 50 fps is not inherently jerky, but running at 50 fps on a 60 Hz display IS.

But then you appear to have a 50 Hz display mode, so that can't possibly be it... obviously ;)
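The duplicate-frame arithmetic in the post above can be sketched like this (a simplified model assuming a perfectly steady 50 fps feed and a fixed 60 Hz vsync'd display):

```python
# How a steady 50 fps feed maps onto a 60 Hz vsync'd display:
# each refresh shows whichever game frame was most recently completed.
REFRESH_HZ = 60
GAME_FPS = 50

# Refresh r happens at time r / 60 s; the newest finished game frame
# at that moment is floor(r / 60 * 50).
shown = [int(r / REFRESH_HZ * GAME_FPS) for r in range(REFRESH_HZ)]

# Count refreshes that repeat the previous frame over one second.
dupes = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(dupes)  # 10 refreshes per second redisplay the previous frame --
              # the periodic hitch that reads as "not smooth".
```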
 