The 290X thread is hilarious: people are downplaying something that industry insiders have seen and praised, while calling Mantle the game changer when nobody has seen it and only one company is on board so far.
The principle is the same, though, whichever card you use. When showing 59 fps on a 60 Hz refresh cycle, one frame has to be shown twice to maintain sync. How noticeable that is, I accept, comes down to the individual, which is why I gave the Skyrim example as the best way to show it (although any game can be used; you have to strafe with a keyboard or joypad at a constant, slow velocity to see it).
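To make the arithmetic concrete, here is a minimal Python sketch (illustrative numbers only, nobody's actual driver code) of why a 59 fps game on a 60 Hz vsynced display must repeat exactly one frame each second:

```python
# With vsync on a fixed 60 Hz display, each refresh shows the most
# recently completed game frame. At 59 fps, frames arrive slightly
# slower than refreshes, so once per second a frame is shown twice.

REFRESH_HZ = 60
GAME_FPS = 59

def displayed_frames(seconds=1):
    """Return the frame index shown at each refresh."""
    shown = []
    for r in range(REFRESH_HZ * seconds):
        t = r / REFRESH_HZ                # time of this refresh
        shown.append(int(t * GAME_FPS))   # latest frame finished by t
    return shown

frames = displayed_frames()
repeats = sum(1 for a, b in zip(frames, frames[1:]) if a == b)
print(repeats)  # 1 duplicated frame per second at 59 fps
```

That single repeated frame is the periodic hitch people notice while strafing at constant speed.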
In a sense, but without knowing the full details I wouldn't like to say. When frame rates are high, it is clearly acting as a buffer, storing and delivering frames in order to eliminate tearing, but how the magic works at low frame rates is beyond me. Dropping the monitor's refresh rate to the exact rate of the GPU makes it smooth... apparently.
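The "drop the refresh rate to match the GPU" point can be illustrated with a small sketch (hypothetical 45 fps numbers, not a real implementation): on a fixed 60 Hz display each frame must wait for the next refresh boundary, producing uneven gaps, while a variable-refresh display scans out the moment a frame is ready:

```python
import math

# Compare on-screen timing of a 45 fps game on a fixed 60 Hz vsynced
# display vs. a display whose refresh follows the GPU's frame delivery.
FPS, HZ = 45, 60

def fixed_refresh_times(n):
    """Each frame waits for the next 60 Hz refresh boundary."""
    return [math.ceil((i / FPS) * HZ) / HZ for i in range(n)]

def variable_refresh_times(n):
    """The display refreshes the moment each frame is finished."""
    return [i / FPS for i in range(n)]

def intervals(times):
    return [round(b - a, 4) for a, b in zip(times, times[1:])]

print(intervals(fixed_refresh_times(5)))     # uneven: mix of 1/60 and 2/60 gaps
print(intervals(variable_refresh_times(5)))  # even: constant 1/45 gaps
```

The uneven 1/60 vs. 2/60 alternation is the judder vsync introduces at non-divisor frame rates; matching the refresh to the GPU makes every gap identical.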
Speaking during the announcement, DICE rendering architect Johan Andersson said that it is "essentially impossible to design a game for a fixed frame rate today," due to the different environments used in today's AAA games. By using G-Sync, he continued, developers will no longer have to worry about hitting a baseline performance of 60 frames per second, with the game appearing smooth at all times, regardless of the frame rate.
The combination of technologies like GeForce Experience, having a ton of GPU performance and G-Sync can really work together to deliver a new level of smoothness, image quality and experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.
The difference is incredibly obvious, and G-Sync made 40 FPS look incredibly smooth without tearing or lag.
The technology NVIDIA is showing here is impressive in person, and that is really the only way to understand the difference. High-speed cameras and captures will help, but much as with 3D Vision, this is a feature that needs to be seen to be appreciated. How users will react to that roadblock remains to be seen.
Until you see one in the flesh it's going to be one of those things that, once you do see it, will be hard to resist (judging by what the experts are saying, anyway).
This looks like it will be a major benefit to lower/midrange cards, as it will give the impression of a high, constant frame rate. I think it will be like seeing/playing on a 120 Hz monitor for the first time.
The market has a lot more midrange/lower end cards too. The percentage of people who have rigs that can hold 60 frames in all titles is tiny.
Sounds like you are sorted, Final8y, and you should stick with AMD. G-Sync will make no difference to some folks.
I think it is a cool feature for those that need it: no tearing, which is important, and smoother transitions at lower frame rates. But I'm not interested in playing at less than 60 fps, and with the extra cost on top, and no thanks to TN panels, I'll stick with free vsync. The only benefit to me would maybe be less output lag.
29th Apr 2013, 16:16
Tearing is a bad experience for me personally, but it seems the majority accept varying amounts of it. If they manage to solve it without vsync, then that will be the new thing on the block, with what came before being painted as unplayable and a flat-out bad experience, and reviewers asked to focus on it, depending on who gets there first.
The idea is really cool. Not sure about caching frames; it just sounds like triple buffering to me. I wonder if this is something that could be hacked for AMD users, like the LightBoost hack, Strobelight. Also, everyone is saying how it'll be good for people with low-end cards, but those guys presumably don't have a tonne of cash, so how are they gonna spring for a new monitor...
Also all this proprietary stuff is starting to get on my nerves tbh. Really splitting up the community