
Do games need NVIDIA G-SYNC support for it to work?

If a company invents something then it is up to them how they use it. They have spent the money to design it, and they have paid their engineers every day for however many years to design and build something that does what it does. So in my opinion they can use it however they like. I don't know why some of you find that hard to grasp.

You don't buy a Bosch washing machine and then complain because it doesn't use Samsung's bubble technology. So why would you buy an AMD card and then complain that it can't drive some Nvidia tech?

If they make G-Sync then, regardless of whether in theory it could work on AMD cards too, they can use it how they like.
If AMD make something (e.g. Mantle) then they can use it how they like, and people with Nvidia cards can't complain. That's how this world works.

As the end user, you have no right to complain about what either company does. Yes, it's annoying when you buy one card and then wish you could use a feature that a different manufacturer's card has. But that doesn't give anyone the right to complain about it.
 
Someone didn't read the thread, did they? DM called Greg out for being biased before anyone else did. Nice attempt at picking a fight with me again, though.

For someone accusing others of not reading it correctly and then blaming me, I would point out Stanners' post, in which he called me biased first. Wrong... yet again.

Dave, the main point here is that it isn't the G-Sync module; Nvidia came up with that, and they patented that and THAT ALONE.

Monitors have been refreshing screens for years, in fact, you know, since screens were invented. They already have variable refresh rate control, they just don't do it well. It is impossible for Nvidia to patent anything but the chip they are making to do this, because the idea is so basic and prevalent; it is just being done in a different way.

If/when monitor makers simply incorporate the same functions into their own controllers, it will be different silicon and different code, and nothing to do with Nvidia. If we were talking about something vastly more complex than a monitor's controller using the already existing frame buffer to trigger a refresh, then maybe Nvidia could patent that, but that is all G-Sync fundamentally is, ignoring their own chip's implementation.

GPUs updating frames whenever they are ready is exactly what all GPUs do, and have always done, with v-sync off; it is precisely what causes screen tearing. Nothing changes GPU-side: it sends frames as they are ready rather than waiting to be in time with the screen's refresh. And a monitor using the buffer being updated as the event that triggers the refresh is something so basic that Nvidia can't patent it, no matter how much people want them to. Essentially, in the way you've worded it, they haven't invented anything new in what the chip actually does; only the chip itself is new. It's like Nvidia making an ARM chip: they can't patent the ARM parts inside it, because those ideas aren't theirs, aren't new and aren't unique, but they can patent their specific piece of silicon and stop anyone making an identical one.

The thing is, the entire controller for a monitor isn't stupidly cheap, but it's not that expensive either. I can just about understand an upgrade cost, but a new monitor with the controller already in it? You're essentially talking about using controller A instead of controller B, so why are you paying for both? If you've bought a monitor with controller A and buy controller B to stick on it, fair enough, but on a brand new monitor the cost of controller B should be offset by controller A not being used or bought in the first place. That is where the joke is. It's basically an alternative controller, and it should cost next to nothing extra (though I assume the cost is Nvidia wanting to make a profit, which isn't unfair, just that it won't be competitive).
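To make that timing point concrete, here's a minimal Python sketch (purely illustrative, nothing to do with how any real controller or driver is implemented): with a fixed-clock screen a finished frame has to wait for the next refresh tick (v-sync) or tear into the current scan, whereas a controller that treats "buffer updated" as the refresh trigger can start showing it straight away.

```python
# Purely illustrative sketch, not real hardware or driver code: compares how long a
# finished GPU frame sits around before the screen starts showing it on a fixed-clock
# monitor (v-sync style) versus a monitor that refreshes when the buffer is updated.
import random

def average_wait(variable_refresh, refresh_hz=60, frames=1000, avg_frame_ms=22.0):
    period = 1000.0 / refresh_hz          # fixed refresh interval, e.g. ~16.7ms at 60Hz
    t = 0.0                               # time at which the GPU finishes each frame
    total_wait = 0.0
    for _ in range(frames):
        t += random.uniform(0.5, 1.5) * avg_frame_ms   # GPU finishes frames "when ready"
        if variable_refresh:
            wait = 0.0                    # the buffer update itself triggers the refresh
        else:
            next_tick = (int(t // period) + 1) * period  # wait for the next fixed tick
            wait = next_tick - t
        total_wait += wait
    return total_wait / frames

print("fixed 60Hz clock : %.1f ms average wait before display" % average_wait(False))
print("variable refresh : %.1f ms average wait before display" % average_wait(True))
```

A real panel still has minimum and maximum refresh intervals, of course; the point of the sketch is just that "refresh on buffer update" is a scheduling change in the monitor's controller, not a new rendering idea on the GPU side.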

The more important thing is that low persistence is the far more interesting feature; G-Sync (and, I assume, the new controller) seems like it is going to offer a few extra low persistence frame rate settings. Again, low persistence is something already available on several screens, including the Asus one, for both Nvidia and AMD.

Low persistence is doable in monitors already, as is updating at different frame rates; nothing G-Sync does is new or original, only the specific implementation in that specific controller is patentable. In the same way, the controllers in a Samsung 120Hz screen and an Asus 120Hz screen are all but identical in function and features, yet both will be patented and are actually marginally different at the silicon level.


I've said before that the push for devs to make more use of low frame rate situations is just bad for gaming, though. Read up on low persistence and watch some high-fps videos of it in action. Pushing the industry towards games that run at 85-120fps, which all but eliminates every downside of gaming, is the far, far better move.

I'd take stagnation in how games look for a couple of years (not in the content) so hardware gets to a 100fps base; then, as hardware improves, we add graphical detail and stay at 100fps. Ghost/blur-free gaming is THE best scenario for all gamers. Dropping more frequently to sub-40fps rates that are slightly smoother but still pretty crap is not the way the industry should head.
 

OK, erm?

Who's that aimed at?
 

If the screen is running at 35Hz to accommodate a 35fps game, wouldn't that cause ghosting?
 

It won't cause it; it will just be the same as any normal screen setting. Ghosting basically comes from the pixel response, how long the pixels take to transition when the screen refreshes, not from the time between frames, so variable refresh rate mode will have the normal amount of ghosting and that won't change at any Hz.
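To put rough numbers on that (the ~5ms pixel response below is just an assumed typical figure, not from any particular panel): the gap between refreshes gets much longer at 35Hz, but the transition time that actually produces the ghost trail stays the same.

```python
# Assumed ~5ms grey-to-grey response, purely for illustration: the smear comes from this
# transition time, not from how long the screen waits between refreshes.
response_ms = 5.0
for hz in (35, 60, 120):
    frame_ms = 1000.0 / hz
    print(f"{hz:3d}Hz: {frame_ms:5.1f}ms between refreshes, "
          f"pixels still take ~{response_ms:.0f}ms to settle on each change")
```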

You can do low persistence at a low frame rate; it wouldn't cause ghosting, it would just be crap. Like, awful. It would probably kill people :p

Basically, low persistence uses the strobe-like effect the screens can already do in 3D mode to pulse the backlight rather than keep it on.
So let's say a frame at 120Hz is around 8ms; instead of the light being on for basically the entire 8ms, it's only on for around 1.5-3ms, depending on the screen doing it.
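Rough arithmetic behind those numbers (the 1.5-3ms pulse lengths are the estimate from the post above, not a measured spec):

```python
# How much of each frame the backlight is actually lit when it is strobed instead of
# held on; pulse lengths are the post's 1.5-3ms estimate, not a spec for any panel.
for hz in (60, 85, 120):
    frame_ms = 1000.0 / hz
    for pulse_ms in (1.5, 3.0):
        lit = pulse_ms / frame_ms * 100.0
        print(f"{hz:3d}Hz: frame {frame_ms:5.2f}ms, {pulse_ms:.1f}ms pulse "
              f"-> lit {lit:4.1f}% of the frame")
```

The shorter that lit fraction, the less sample-and-hold blur you see, which is the whole appeal.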

So low fps won't cause ghosting, but I can only assume (having not seen it at low fps) that it will be like playing a game on a CRT at 20Hz... it will induce essentially horrible flicker that will probably burn your eyes out of their sockets :p

Ultimately, low persistence basically returns us to CRT levels of ghosting, i.e. little to none, but reintroduces the flicker. If you remember the good old CRT days, the difference between gaming at 60Hz and 120Hz when flicker was involved was literally night and day. It's strictly a high refresh rate situation, not necessarily high fps: you could do 20fps on a 120Hz CRT and not get flicker, because it was the light pulsing as quickly as possible that made the flicker imperceptible, not the fps.

So basically, if you game in low persistence mode you would either want a very high, solid fps, in which case you could use variable refresh rate (potentially in the future, anyway), or stick with a single 85Hz+ refresh rate, where the fps doesn't matter as much. Low persistence + variable refresh rate + low fps will be horrendous.
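A quick way to see why that last combination is so bad (same arithmetic, using the post's own rough rule of thumb that flicker stops being obvious at around 85Hz and above): if the strobe is locked to a refresh rate that follows a low frame rate, the gaps between light pulses get long enough to flicker horribly.

```python
# Gap between backlight pulses when the strobe follows the refresh rate; the ~85Hz
# cut-off is the post's rough rule of thumb for when flicker becomes clearly visible.
for hz in (120, 85, 60, 40, 20):
    gap_ms = 1000.0 / hz
    note = "fine" if hz >= 85 else "visible flicker"
    print(f"strobe at {hz:3d}Hz -> {gap_ms:5.1f}ms between pulses ({note})")
```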

So in general, the "best", absolute premium way to game is high fps + high refresh rate (at least 85Hz, but realistically 120Hz+) + low persistence mode. Low fps will simply look poo, as it does now for most people. Variable refresh rate improves the sub-60fps bracket, sometimes not by much, sometimes significantly, but variable refresh rate at low fps, AFAIK, is just not going to work with low persistence for anyone, so ghosting/overdrive/that kind of stuff will always still be an issue.

Even Carmack said low persistence is the best thing; I'm not really sure how the industry moving towards bigger frame drops and lower fps with higher visual quality is supposed to be good for low persistence, it's moving in completely the opposite direction.
 

Got it, thanks :)
 
The industry is moving that way because a lot of the market is on low-to-mid-range GPUs that simply can't push 60fps in new games.

They should be focusing on all-round optimization, though, not on proprietary technologies that help one side or the other.
 