Nvidia G-Sync announced 18/10/2013

It's a great idea, but it's not going to sway me to an Nvidia card. It's still going to come down to pricing, performance, drivers et al.

Also I love my U2713HM and am not in a rush to sell it.
 
I was looking at the G-Sync thread over on overclock.net, and supposedly only 144Hz monitors can run this, meaning a lot of monitors with IPS panels will be excluded! :(

The module also costs $175:

http://www.geforce.com/hardware/technology/g-sync/faq

That is around £130 including VAT.

Also, it is stated that they will eventually try to get the cost down to around $130, which is close to £100.
 
I was looking at the G-Sync thread over on overclock.net, and supposedly only 144Hz monitors can run this, meaning a lot of monitors with IPS panels will be excluded! :(

The module also costs $175:

http://www.geforce.com/hardware/technology/g-sync/faq

That is around £130 including VAT.

After using an IPS screen I wouldn't use anything else now; every other screen I look at just looks washed out and dull.

£130 is also a significant expense.

:(
 
I think it is a cool feature for those that need it: no tearing, which is important, and smoother transitions to lower frame rates. But I'm not interested in playing at less than 60fps, there's the cost on top, and no TN thanks, so I'll stick to free Vsync. The only benefit to me would maybe be less output lag.
 
No need to worry, I noticed from the get-go :) I was playing BF2 at 2560x1600, 60fps vsynced with no drops back in the day, and it was glorious!

Today's games add so many bull****ty effects that you wouldn't even notice without a side-by-side comparison (DOF, motion blur, SSAO, etc., etc.) to drag your framerate into the dumps and break you out of 60fps heaven. I like my games crystal clear and smooth as silk, tyvm.

Tbh I put up with it in newer games, SC2 atm, which has a horrendous engine. lol, 4v4s end up hovering around 20fps :) Though a new CPU in the mail should put that game in its place.

Sorry :) I know you understand the principle, but it was a general suggestion for others to try rather than being specific to you. Sorry for not making that clear ;)
 
Exactly. Even with triple buffering, dropping to just 59fps for the slightest amount of time causes one frame to be duplicated, which causes a slight hitch. You can view it for yourself in pretty much any game by limiting your framerate to 59 (or 119 on a 120Hz display); Skyrim is quite good for trying this. Next, find a long wall and get very close to it, select crouch and walk mode so you're moving very slowly, then press your strafe key left or right to move along the wall... see that? If not, try a few times. See that odd skip every second or so?

You can thank/hate me later that I've drawn your attention to it; you'll even notice it when you're playing normally from now on haha ;)

P.S. Make sure you have vsync on, because otherwise you'll just see tearing. Oh, and obviously your GPU has to be able to render the capped fps in the first place, i.e. 59 or 119.
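
For anyone who'd rather see the arithmetic than squint at a wall, here's a minimal sketch (plain Python, my own illustration rather than anything from Nvidia's material) of why 59fps on a 60Hz vsynced display repeats exactly one frame per second:

```python
REFRESH_HZ = 60   # display refreshes per second
FPS = 59          # capped render rate

def duplicated_refreshes(seconds=5):
    """Count refreshes where no new frame was ready, so the previous one repeats."""
    duplicates = 0
    for refresh in range(1, seconds * REFRESH_HZ + 1):
        frames_done_now = int(refresh / REFRESH_HZ * FPS)         # frames finished by this tick
        frames_done_before = int((refresh - 1) / REFRESH_HZ * FPS)
        if frames_done_now == frames_done_before:                 # nothing new to show
            duplicates += 1                                       # the old frame is shown again
    return duplicates

print(duplicated_refreshes(5))   # -> 5: one duplicated frame per second
```

That once-a-second repeat is the odd skip in the wall test above.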

The 59fps hitching seems to be a problem on NV cards; I have seen the hitching on YouTube with people having that issue. I don't get that problem at 59fps, or switching from 60 to 59, but maybe it's more of a problem on single GPUs, which I have not used for a long time.
 
I was looking at the G-Sync thread over on overclock.net, and supposedly only 144Hz monitors can run this, meaning a lot of monitors with IPS panels will be excluded! :(

The module also costs $175:

http://www.geforce.com/hardware/technology/g-sync/faq

That is around £130 including VAT.

Also, it is stated that they will eventually try to get the cost down to around $130, which is close to £100.

I know sod all about monitors, so apologies if I read this wrong...

The PCPer article on G-Sync suggests future monitors using G-Sync won't be limited by standard horizontal and vertical refresh cycles (the article is called something like "the death of the monitor refresh rate"). Rather, the pixel refresh will be limited by GtG response time, which would see a 4ms monitor able to push a theoretical 250Hz refresh over DisplayPort 1.2. So refreshing per pixel in sync with the GPU should allow good IPS panels to 'refresh' up to their GtG limit rather than the traditional 60Hz/frame.
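
To spell out where that 250Hz figure comes from (just a back-of-envelope calculation, taking the quoted 4ms GtG response time at face value):

```latex
f_{\max} \approx \frac{1}{t_{\mathrm{GtG}}} = \frac{1\,\mathrm{s}}{0.004\,\mathrm{s}} = 250\,\mathrm{Hz}
```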

I think the Asus 144Hz was chosen as it's already compatible with G-Sync, so minimal fuss getting them out to the masses.
 
If this is available only on Nvidia graphics cards, I wonder what AMD's answer will be to this tech. If G-Sync works as well as they claim, then it really is a game changer and will give them a major upper hand in image quality and perceived performance.

I think people don't fully appreciate how important this is. It should really become an industry standard with monitors and graphics cards.
 
I think it is a cool feature for those that need it: no tearing, which is important, and smoother transitions to lower frame rates. But I'm not interested in playing at less than 60fps, there's the cost on top, and no TN thanks, so I'll stick to free Vsync. The only benefit to me would maybe be less output lag.

Until you see one in the flesh ;) It's going to be one of those things that will be hard to resist once you do see it (well, judging by what the experts are saying, anyway).

This looks like it will be a major benefit to lower/midrange cards, as it will give the impression of a high, constant frame rate. I think this will be like seeing/playing on a 120Hz monitor for the first time :p
 
The 59fps hitching seems to be a problem on NV cards; I have seen the hitching on YouTube with people having that issue. I don't get that problem at 59fps, or switching from 60 to 59, but maybe it's more of a problem on single GPUs, which I have not used for a long time.

The principle is the same, though, whichever card you use. When showing 59fps on a 60Hz cycle, one frame has to be shown twice to maintain sync. Now, how noticeable that is I accept is down to the individual, which is why I gave the Skyrim example as the best way to show it (although any game can be used; you have to strafe slowly at constant velocity using a keyboard or joypad to see it).
 
If this is available only on Nvidia graphics cards, I wonder what AMD's answer will be to this tech. If G-Sync works as well as they claim, then it really is a game changer and will give them a major upper hand in image quality and perceived performance.

I think people don't fully appreciate how important this is. It should really become an industry standard with monitors and graphics cards.

I reckon AMD's Mantle will offer the better solution; there's a lot more we haven't been told about it :eek: Yes, a bold statement.
But until we have it to try, we won't know.
 
Never had any serious tearing issues after I upgraded to my 27" Samsung 120Hz monitor, and I have always hated vsync with a passion. This tech sounds cool, but for the time being that is it.

The worst thing about low fps (like 30 and 40) is not the monitor but the horribly laggy controls you get in most games like shooters and whatnot. The only game to do it somewhat well was Crysis 1, and it didn't actually feel so bad.
 
I reckon AMD's Mantle will offer the better solution; there's a lot more we haven't been told about it :eek: Yes, a bold statement.
But until we have it to try, we won't know.

Isn't that also the case with the 290/290X and now the 780 Ti? Hey, this could even be why the 290X has been kept so secret, with the lack of specs/benches, as they knew Nvidia had a rival.
 
I reckon AMD's Mantle will offer the better solution; there's a lot more we haven't been told about it :eek: Yes, a bold statement.
But until we have it to try, we won't know.

Mantle is game-specific; G-Sync works in every game with no developer support needed.

IF G-Sync makes dips to 40fps imperceptible, as the guys at the demo were saying, Mantle will need to give a minimum frame rate boost of 50% to be on a level playing field.
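
To make that 50% explicit (assuming 60fps is the bar being matched):

```latex
\frac{60\ \mathrm{fps}}{40\ \mathrm{fps}} = 1.5 \;\Rightarrow\; \text{a 50\% uplift in minimum frame rate}
```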

That is huge!

Vsync with no input lag is also a pretty big deal
 
Just watched the video and it's very impressive. The animation demo is something that would definitely make me think about going back to Nvidia. However, as pointed out in a previous post, there is nothing to stop AMD from adopting the same technology.

This would cause havoc for users wanting to move to Nvidia for this new tech, only to discover AMD have done a similar thing and then want to move back.

I think the next 12 months could be interesting!
 
I don't think many people understand the basics of what G-Sync does. There's a reason why Linus, Ryan and all the other respected review sites are singing the praises of this and calling it "game changing".

It stores frames when the fps is over the monitor's refresh rate and delivers them in order, so no more do we see tearing (which I didn't on a 120Hz monitor, in fairness). Tearing is something that I do get on my new 1440p 60Hz monitor (even with it overclocked), so it's V-Sync and input lag, or tearing and no input lag... Which of the two do I prefer is what I ask myself.

Another point from reading and listening on this: when you are trotting along the roadside on your merry way, minding your own business, and suddenly 3 tanks and 3 helis are on you and the **** is hitting the fan, we no longer need SLI Titans to try and keep the frames up to stop that stutter (and we have all seen it at some stage, regardless of AMD/Nvidia). The monitor will alter its refresh rate to match the speed the GPU is delivering frames at, so apparently this eliminates any form of stutter seen during the drops.

If the reviewers were awestruck at the presentation and heralding it as a game changer, you can bet it isn't something that will end up swept under the carpet.

It WILL work for panels from 60Hz 1080p all the way to 4K monitors with different refresh rates. At the moment, it is done for the 144Hz Asus VG monitor, and the DIY kit will be available for that from release (don't know when that is, sorry). They have said they will be implementing it into all monitors over time, and you can also buy a DIY kit to do it at home and save a few quid. Most of us are not scared of a screwdriver, but you would lose your warranty.
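
Here's a toy model of that refresh-rate-follows-the-GPU idea (my own sketch of the general principle in Python, not Nvidia's actual implementation). It compares when frames actually reach the screen on a fixed 60Hz vsynced panel versus a panel that scans out the moment a frame is finished:

```python
import math

REFRESH_S = 1 / 60   # fixed panel tick: ~16.7ms

def present_times(frame_time_s, n, variable_refresh):
    """Times (in seconds) at which each of n frames appears on screen."""
    shown, t = [], 0.0
    for _ in range(n):
        t += frame_time_s                              # GPU renders the frame
        if not variable_refresh:                       # classic vsync:
            t = math.ceil(t / REFRESH_S) * REFRESH_S   # idle until the next fixed tick
        shown.append(t)                                # frame hits the screen
    return shown

# A GPU producing steady 25ms frames, i.e. capable of 40fps:
print(present_times(0.025, 4, variable_refresh=False))
# -> ~[0.033, 0.067, 0.100, 0.133]: 33ms steps, so you see 30fps plus extra lag
print(present_times(0.025, 4, variable_refresh=True))
# -> ~[0.025, 0.050, 0.075, 0.100]: the full 40fps, each frame shown as soon as it's drawn
```

Which is presumably why a dip that stutters down to 30fps under vsync can just look like slightly slower but smooth 40fps with G-Sync.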
 
Yeah, I don't know why it seems to be downplayed so much on here; this thing is HUGE for PC gaming. It doesn't have to be adopted by developers, it will just work for most games. All the review sites are singing its praises, sure, but watch the developer Q&A at the end of the Nvidia presentation with Tim Sweeney (Epic, Unreal), John Carmack (id, Quake) and Johan Andersson (DICE, Battlefield). I'd say they are probably the most respected devs in the gaming community, and they had nothing but positives to say about it. They also get asked about Mantle; Sweeney and Carmack say it's outright bad for PC gaming, while Andersson obviously defends it, as he's working on it with AMD.

Tim Sweeney said:

“If you care about gaming, G-SYNC is going to make a huge difference in the experience.”

John Carmack added:

“Once you play on a G-SYNC capable monitor, you’ll never go back.”

Johan Andersson concluded:

“Our games have never looked or played better. G-SYNC just blew me away!”
 
So all it is, is a buffer in a sense? Frames get stored somewhere (memory etc.) and used up when needed?

Same way as a video buffer/YouTube etc.?
 
So all it is, is a buffer in a sense? Frames get stored somewhere (memory etc.) and used up when needed?

Same way as a video buffer/YouTube etc.?

In a sense, but without knowing the full details I wouldn't like to say. For sure, when the frame rate is higher it is acting as a buffer, storing and delivering frames in order to eliminate tearing, but how the magic at low frame rates works is beyond me. Dropping the monitor's refresh rate to the exact rate of the GPU makes it smooth... apparently :D
 
Yeah, I don't know why it seems to be downplayed so much on here; this thing is HUGE for PC gaming. It doesn't have to be adopted by developers, it will just work for most games. All the review sites are singing its praises, sure, but watch the developer Q&A at the end of the Nvidia presentation with Tim Sweeney (Epic, Unreal), John Carmack (id, Quake) and Johan Andersson (DICE, Battlefield). I'd say they are probably the most respected devs in the gaming community, and they had nothing but positives to say about it. They also get asked about Mantle; Sweeney and Carmack say it's outright bad for PC gaming, while Andersson obviously defends it, as he's working on it with AMD.

People were hanged and burned for daring to say the world was not flat but round... :D
 