Nvidia G-Sync announced 18/10/2013

The 290X thread is hilarious: people are downplaying something that those in the industry have actually seen and praised, while calling Mantle the game changer when nobody has seen it and only one company is on board so far.
 
The 290X thread is hilarious: people are downplaying something that those in the industry have actually seen and praised, while calling Mantle the game changer when nobody has seen it and only one company is on board so far.

To be honest, does this not show how ignorant people on here can be? :mad:

I love the concept of this, and I feel people with lower-end cards/systems will benefit the most, and I love that :D
 
The principle is the same though, whichever card you use. When showing 59 fps on a 60 Hz cycle, one frame has to be shown twice to maintain sync. How noticeable that is to people, I accept, is down to the individual, which is why I gave the Skyrim example to best show it (although any game can be used; you have to strafe with a keyboard or joypad slowly, at constant velocity, to see it).
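To put a rough number on the quoted point: at 59 fps on a fixed 60 Hz refresh there are 60 scanouts but only 59 new frames each second, so exactly one refresh per second has to repeat the previous frame. A toy Python sketch of that (my own simplification, nothing from the thread or from Nvidia):

```python
# Toy sim: scan out a 59 fps source on a fixed 60 Hz display with vsync and
# count how many refreshes end up repeating the previous frame.
REFRESH_HZ = 60
SOURCE_FPS = 59

refresh_interval = 1.0 / REFRESH_HZ    # ~16.7 ms between scanouts
frame_interval = 1.0 / SOURCE_FPS      # ~16.9 ms between finished frames

shown = []                             # index of the frame shown at each vblank
for tick in range(REFRESH_HZ):         # one second of refreshes
    vblank = tick * refresh_interval
    shown.append(int(vblank / frame_interval))  # newest frame finished by now

repeats = sum(a == b for a, b in zip(shown, shown[1:]))
print(f"refreshes that repeat a frame: {repeats} per second")  # -> 1
```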

It's not about the principle, it's about an issue. As I said, I was watching someone on YouTube using Vsync and it was switching from 60 to 59 rapidly, and you could clearly see the stuttering and hitching. So while in principle it should be the same, there is clearly another issue going on, as I don't get the stuttering and hitching when I drop from 60 to 59, and Skyrim is not the norm in behaviour.

Just as, in principle, playing BF3 on the lowest settings with a solid 60 fps under Vsync should feel no more responsive than on Ultra settings with a solid 60 fps under Vsync, but it does, because on Ultra the buffer takes longer to draw. So while it's still 60 fps, a much older image is being handed to the screen; on low settings the buffer can be drawn much faster, so the old one can be flushed without ever being displayed, because a more up-to-date one is ready in time.
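Here's a rough sketch of that latency point, assuming the GPU keeps rendering back to back and the newest finished frame is picked at each vblank (roughly the triple-buffered behaviour described above); the 3 ms and 15 ms render times are made-up stand-ins for low and Ultra settings:

```python
# Rough illustration, my own simplification: the screen still updates every
# 16.7 ms either way, but the frame picked at each vblank was started more
# recently when rendering is cheap, so it reflects newer input.
VBLANK = 1 / 60                          # 16.7 ms refresh interval

def displayed_frame_age(render_ms):
    """Age (ms) of the newest completed frame at a vblank, assuming the GPU
    renders frames back to back and the most recent finished one is shown."""
    render = render_ms / 1000
    t = 10 * VBLANK                      # look at an arbitrary vblank
    frames_done = int(t / render)        # frames finished by time t
    start_of_newest = (frames_done - 1) * render
    return (t - start_of_newest) * 1000

print(displayed_frame_age(3))            # ~4.7 ms old on low settings
print(displayed_frame_age(15))           # ~16.7 ms old on Ultra at this vblank
```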
 
The market has a lot more midrange/lower end cards too. The percentage of people who have rigs that can hold 60 frames in all titles is tiny.
 
In a sense, but without knowing the full details I wouldn't like to say. For sure, when the frames are higher it is acting as a buffer to store and deliver them in order to eliminate tearing, but how the magic at low frame rates works is beyond me. Dropping the monitor's refresh rate to the exact rate of the GPU makes it smooth... apparently :D

I think from what I've read the actual board has memory on it, 768/1556 MB depending on who's guessing, and that the frame in the buffer is displayed immediately between 30-144 fps (depending on your monitor, obviously), but when your framerate drops below 30 it will display the image in that buffer again, so how this will look will be interesting to see :). Though if you're in danger of dropping below 30 fps, arguably you would be better off dropping detail levels or upgrading by that stage.

I'm really looking forward to seeing how SLI performs with this; in theory it should really iron out the frame pacing/input lag problems associated with multi-card setups :D
 
Can't remember the last time I suffered from bad screen tearing, it'll be interesting to see how this pans out.

Though I can see this turning into Gsync vs Mantle all over the forums.
 
http://www.gamespot.com/articles/nv...-screen-tearing-lag-and-stutter/1100-6415660/
Speaking during the announcement, DICE rendering architect Johan Andersson said that it is "essentially impossible to design a game for a fixed frame rate today," due to the different environments used in today's AAA games. By using G-Sync, he continued, developers will no longer have to worry about hitting a baseline performance of 60 frames per second, with the game appearing smooth at all times, regardless of the frame rate.

http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness
The combination of technologies like GeForce Experience, having a ton of GPU performance and G-Sync can really work together to deliver a new level of smoothness, image quality and experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.

http://www.tomshardware.com/news/g-sync-geforce-gtx-780-ti-shadowplay,24760.html
The difference is incredibly obvious, and G-Sync made 40 FPS look incredibly smooth without tearing or lag.

http://www.pcper.com/news/Graphics-...Sync-Variable-Refresh-Rate-Monitor-Technology
The technology that NVIDIA is showing here is impressive when seen in person; and that is really the only way to understand the difference. High speed cameras and captures will help but much like 3D Vision was, this is a feature that needs to be seen to be appreciated. How users will react to that road block will have to be seen.

I did look on semiaccurate but sadly no news at all about yesterday.

Also, the Asus VG248QE will be $399 (£296.99 incl. tax) with the G-Sync PCB inside. That monitor sells for £289.99 on here without the G-Sync board inside... pretty good, I say :)
 
When you've got a DICE employee praising Nvidia tech right after AMD have given them 8 million for Mantle, you know Nvidia has done something good.
 
Until you see one in the flesh ;) it's going to be one of those things that, when you do see it, will be hard to resist (well, judging by what the experts are saying, anyway).

This looks like it will be a major benefit to lower/midrange cards, as it will give the impression of a high, constant frame rate. I think this will be like seeing/playing on a 120 Hz monitor for the first time :p

It will make no difference to me because, while some flaws like tearing are eliminated, the fundamentals are not changed: 30 still images updated per second are still 30 still images per second, which flaws like tearing etc. only make worse. The only difference will be no tearing and better responsiveness in the controls.

On a 30" monitor an object taking 1 second to move from one side to the other will jump laterally nearly 1 inch per update, 60fps= half inch per update unless there is some motion interpolation hardware or software which looks at the frame before and after then adds in between frames, that's not what Gsync is doing unless i missed that feature, which is why i dont watch fast action movies at the cinema even with motion blur i can still see the jump between frames laterally, the screen is so big lateral moment between frames is even bigger.

I had Race Driver: Grid on PC and on console. When I had the console plugged into a 19" screen at 30 fps, it looked nearly as smooth as on the PC at 60 fps on my 2560x1600 30", but when I plugged the console into my 50", that 30 fps (bigger lateral movement between frames) looked awful, and the motion blur that could not be turned off nearly made me sick.

Getting rid of tearing and adjusting the Hz of the monitor does not change the fundamental amount of movement that has occurred between frames.
 
The market has a lot more midrange/lower end cards too. The percentage of people who have rigs that can hold 60 frames in all titles is tiny.

Which is a very good point, and anything that can improve the experience at lower fps is a good thing, but anyone claiming it will eliminate the need for high fps is mistaken.
 
Sounds like you are sorted Final8y and you should stick with AMD. G-Sync will make no difference to some folks ;)

Depending on the context.

I think it is a cool feature for those that need it: no tearing, which is important, and smoother transitions to lower frame rates. But I'm not interested in playing at less than 60 fps, plus there's the cost on top, and no TN thanks. I'll stick to free Vsync; the only benefit to me would maybe be less output lag.


Hence, at my frame rate of 60 fps.

But in the context of 30 fps, it's no good to me at all.
 
The idea is really cool. Not sure about caching frames; it just sounds like triple buffering to me. I wonder if this is something that could be hacked for AMD users, like the LightBoost hack, Strobelight. Also, everyone is saying how it'll be good for people with low-end cards, but those guys presumably don't have a tonne of cash, so how are they going to spring for a new monitor...

Also, all this proprietary stuff is starting to get on my nerves tbh. It's really splitting up the community.
 
The idea is really cool. Not sure about caching frames; it just sounds like triple buffering to me. I wonder if this is something that could be hacked for AMD users, like the LightBoost hack, Strobelight. Also, everyone is saying how it'll be good for people with low-end cards, but those guys presumably don't have a tonne of cash, so how are they going to spring for a new monitor...

Also, all this proprietary stuff is starting to get on my nerves tbh. It's really splitting up the community.

Getting rid of tearing without Vsync is a good thing and something that is needed. I don't remember anyone saying that they don't use Vsync because they don't like how it looks; most people like how it looks, but how it feels is a real problem for many people. Personally, I have got accustomed to it, so I notice it but my gaming skills overcome it, just like when I used to play QuakeWorld at 12 fps @ 320x200 on my Amiga against PC users who had hundreds of fps and still came in the top 3, until user HangTime on this forum joined the server.

The proprietary stuff does not bother me, besides having to find a compatible monitor and the added cost, which do bother me.
 
The idea is really cool. Not sure about caching frames; it just sounds like triple buffering to me. I wonder if this is something that could be hacked for AMD users, like the LightBoost hack, Strobelight. Also, everyone is saying how it'll be good for people with low-end cards, but those guys presumably don't have a tonne of cash, so how are they going to spring for a new monitor...

Also, all this proprietary stuff is starting to get on my nerves tbh. It's really splitting up the community.

But that's just it: it doesn't cache frames, it displays each frame as fast as it can be fed from your GPU... the monitor WON'T stall the flow of frames being sent to it. The way I interpret it is like this:

On the Asus monitor in the demo:

30-144 fps ------> the frame is displayed the instant it's sent from the GPU, no matter what the fps is
1-30 fps ------> the last frame in the memory of the G-Sync board is redisplayed (unsure how this will look, or indeed if it's correct)

Obviously the 1-30 fps bit doesn't sound that great, but arguably, when you're running at those sorts of frame rates, there's not much you can do but lower detail or invest in something more powerful.
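A rough way to picture that interpretation (purely my own toy model of the behaviour described above; the 30 and 144 fps limits are guesses about the demo panel, not confirmed specs):

```python
# Toy model of the behaviour described above. MIN_FPS/MAX_FPS are assumed
# limits of the demo panel, not confirmed specs.
MIN_FPS = 30
MAX_FPS = 144
MAX_HOLD = 1.0 / MIN_FPS        # ~33 ms: longest the panel waits before rescanning
MIN_INTERVAL = 1.0 / MAX_FPS    # ~7 ms: frames can't be scanned out faster than this

def scanouts(frame_done_times):
    """Times (s) at which the panel refreshes, given when the GPU finishes
    frames; when starved it re-scans the frame held in the G-Sync buffer."""
    out, last = [], 0.0
    for t in frame_done_times:
        while t - last > MAX_HOLD:          # no new frame in time: repeat old one
            last += MAX_HOLD
            out.append((round(last, 3), "repeat"))
        last = max(t, last + MIN_INTERVAL)  # otherwise scan the new frame out
        out.append((round(last, 3), "new"))
    return out

# ~45 fps for three frames, then a ~90 ms stall forces two repeated scanouts.
print(scanouts([0.022, 0.044, 0.066, 0.155]))
```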
 
Proprietary is here to stay, I'm afraid; both companies need other selling points beyond a couple of percent of performance gain over the other card that you won't even notice.
 