Do games need to support NVIDIA G-SYNC for it to work??

I remember Nvidia saying the same about PhysX back in 2008. I see that went well as well, Matt :D

Well, AMD don't have a history of making things proprietary. We can take it from their statement that they won't; otherwise, why go to the trouble of making it so blatantly public? If they do make it proprietary, it will end up in the same state as PhysX: rather useless, rarely used and unavailable to half the community.
 
Well AMD don't have a history of making anything.

^^That would have sufficed :D

We all know AMD like to jump on everybody else's work and then claim it should be an open standard :D

Joking aside, though: what have AMD actually brought to the design world? I know they had TressFX (which I love) in Tomb Raider, but I'm finding it hard to think of anything else. Some help please :)
 
:D

Right that's it for me. I've been summoned. :p
 
I know your Nvidia bias is laughable and your logic generally failing, but can you really not see the difference between code that CAN'T work on other hardware and code that can, but that you prevent from working on other hardware?

Nvidia would gain precisely nothing from Mantle; they can't possibly, as it's different hardware with different shaders. I have no problem with NVAPI, which by the way is of precisely no use to AMD hardware; they've had it for YEARS. Do you see anyone complaining about it?

Be honest (if that is possible): a screen currently refreshes at set intervals, while a screen under G-Sync only refreshes after the frame buffer is updated. Do you think that is only possible on Nvidia hardware? Do you think that should be locked to Nvidia GPU users?

Here is another point: Nvidia could get together with Asus and have them release a screen with precisely no extra hardware, at no extra cost, that benefits Nvidia. Instead, just so AMD users can't have access to that screen, they are charging you $100-120 extra to use it.

Are you happy paying $100 more for something that needs no extra hardware or software, just so I'm prevented from using it? That is where Nvidia and AMD differ: AMD offer benefits to everyone where possible and, where not possible, only to AMD users.

Nvidia will happily screw Nvidia users if it screws AMD users more; it's completely pathetic.
 
Cough. AMD brought us tessellation. Nvidia, just to screw AMD users, added tessellation in Crysis 2 (or 3, I forget which; must be 2?) to scenes in which no users, AMD or Nvidia, could benefit, just to hurt AMD performance more, even though it impacted Nvidia performance as well. They added tessellation at extreme levels to flat objects like the bollards; this hurt Nvidia performance, but not as much as AMD performance. Nvidia paid a game dev (the price being added to your graphics cards) to add workload with literally no visual improvement at all, which hurt your own graphics card's performance, purely because it hurt AMD users' performance more.

Another one: GDDR5. AMD co-developed it with Samsung, and I believe whatever card you have now benefits from that.
 
I have been saying forever that tech should be open and that proprietary is bad for us. I said a long time ago that we would need separate GPUs to see all the candy/frames in games within a couple of years. I don't care for the fanboy talk of "Nvidia are the best" and "AMD are the best".

Nvidia said that PhysX was open, AMD said it wasn't, blah blah blah... We could go on all night about who said and did what, but that doesn't give "US", the consumers, what we want, and mostly I read that we all want open. Devs want it, we want it, but the money men don't.

I am positive that Nvidia could get this G-Sync working with AMD GPUs, though it would require a little more than a couple of lines of code and some manpower. That doesn't stop me wanting to see my AMD brothers enjoying the same experience.
 
Can we all just get off this proprietary nonsense and try to have a grown-up conversation?

Suarez7, why is it always you who starts the bias accusations? Just for once it would be nice to have a conversation without it sinking into the playground-rivalries gutter.


I really just can't see monitor companies locking themselves into Nvidia only; $100 extra to tell a screen not to refresh is laughable. We're talking about basically a few lines of code done differently inside the screen's firmware: refresh every n ms versus sending a "refresh now" signal when the frame buffer is loaded. It's basically nothing at all in terms of software. The existing hardware might not allow something that simple, but bringing out new screens with alternative controllers that can deal with a variable framerate would be pretty trivial.
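The "refresh every n ms" versus "refresh now when the buffer is loaded" contrast really is only a few lines of logic. Here's a rough simulation of the idea; the timings and function names are entirely hypothetical, not taken from any real driver or firmware, and it only shows why frames wait for the next tick under a fixed interval but not under on-demand refresh:

```python
# Hypothetical sketch: fixed-interval refresh vs "refresh when the frame
# buffer is ready", simulated with frame render times in milliseconds.

def fixed_refresh(render_times_ms, interval_ms=16.7):
    """Each frame is shown at the next fixed tick after it finishes rendering."""
    shown_at = []
    t = 0.0
    for rt in render_times_ms:
        t += rt
        # the frame must wait for the next scheduled refresh tick
        ticks = int(t // interval_ms) + 1
        shown_at.append(ticks * interval_ms)
    return shown_at

def variable_refresh(render_times_ms):
    """The screen refreshes the moment the frame buffer is complete."""
    shown_at = []
    t = 0.0
    for rt in render_times_ms:
        t += rt
        shown_at.append(t)
    return shown_at

frames = [10.0, 20.0, 14.0]
print(fixed_refresh(frames))     # frames wait for the next tick: judder
print(variable_refresh(frames))  # frames appear the moment they are ready
```

The gap between the two lists is exactly the stutter a fixed refresh interval introduces whenever frame times don't divide evenly into it.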

It does kind of scream of Nvidia though, doesn't it: come up with something insanely basic that requires essentially no hardware, add hardware anyway, charge Nvidia users (but no one else) extra for something that is all but free, and lock it in.

Seriously, the question still kind of stands: why wasn't this something that LCDs have done essentially since they were created?

I've said in recent threads that things like industries getting together to pick a cable/standard for 4K would be better for every user, but people just can't get together or do the smart thing.

Refresh rates for LCDs seem to be based around the idea that CRTs had refresh rates, rather than refresh rates being strictly needed in the same sense. Considering the GPU side has been capable of updating info as required forever, why have screens not been capable until now? Industries do very stupid things, move insanely slowly, don't get together to work sensibly, and do utterly daft things for years just because that's how they used to do them.

As I understand it, and I do not pretend to be all-knowing :)
Monitors, even the most modern ones, simply refresh at a constant rate.

What's needed, and my guess is this is how G-Sync works, is a device or two along the signal line: an output encoder to embed a signal in the output, and an input decoder hard-wired into the panel's refresh rate mechanism, which would need to be able to run at a variable refresh rate. The input decoder receives instructions from the output encoder and relays them to the variable refresh rate mechanism, adjusting the refresh rate according to the instructions received.

My guess is that yes, the outgoing signal could be generated by software, even embedded very simply into the driver, in just the same way AMD's LCD OverDrive simply tells the LCD panel to run at a higher refresh rate if it can. The technology already exists within CCC; all that really needs to be done here is for AMD to modify their existing technology to signal a variable refresh rate. It's then just down to monitor vendors to design monitors which ensure they understand that signal and act upon it.
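The encoder/decoder relay described above can be sketched in a few lines. This is pure speculation to illustrate the idea; the class names and the message format are invented, not taken from any real G-Sync, NVAPI or driver interface:

```python
# Speculative sketch of a "refresh now" relay between GPU and monitor.
# All names and the command string are invented for illustration only.

class OutputEncoder:
    """GPU/driver side: emits a 'refresh now' command when a frame is done."""
    def __init__(self, link):
        self.link = link
    def frame_complete(self):
        self.link.append("REFRESH_NOW")

class InputDecoder:
    """Monitor side: drives the panel's refresh mechanism on command."""
    def __init__(self, link):
        self.link = link
        self.refreshes = 0
    def poll(self):
        while self.link:
            cmd = self.link.pop(0)
            if cmd == "REFRESH_NOW":
                self.refreshes += 1   # trigger one panel scan-out

link = []                  # stands in for the DisplayPort/DVI signal line
gpu = OutputEncoder(link)
panel = InputDecoder(link)
for _ in range(3):         # three frames finish rendering at arbitrary times
    gpu.frame_complete()
panel.poll()
print(panel.refreshes)     # -> 3
```

The point of the sketch is that neither side needs exotic hardware; the panel just has to act on a command instead of a fixed clock.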
 
Someone didn't read the thread, did they? DM called Greg out for being biased before anyone else. Nice attempt at picking a fight with me again, though.
 
I see the nVidia hardware a bit like a parasite (hah): it takes a regular monitor and sits on top of the hardware, governing how it works differently to how it's supposed to, rather than the monitor manufacturers having to modify their designs.

Currently, if you signal a monitor in real time to adjust its update rate, it retrains with a black screen for a moment, which rules out any software implementation at this point; hopefully in the future monitor makers will start to implement this as a basic operating feature.
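That retraining behaviour is the crux. Here's a toy model of the distinction; everything here is invented for illustration and does not reflect any real monitor firmware, but it shows why a mode change blanks the screen while a per-frame "refresh now" command wouldn't need to:

```python
# Toy model: changing a monitor's fixed mode forces a link retrain (black
# screen), while an on-demand refresh would involve no mode change at all.

class Monitor:
    def __init__(self, rate_hz=60):
        self.rate_hz = rate_hz
        self.blackouts = 0

    def set_mode(self, rate_hz):
        """Classic path: a new fixed rate means the link retrains."""
        if rate_hz != self.rate_hz:
            self.blackouts += 1   # visible black-out while retraining
            self.rate_hz = rate_hz

    def refresh_now(self):
        """Hypothetical variable-refresh path: no mode change, no retrain."""
        pass  # scan out the current frame immediately

m = Monitor()
m.set_mode(120)       # one black-out while the link retrains
for _ in range(100):
    m.refresh_now()   # a hundred on-demand refreshes, zero black-outs
print(m.blackouts)    # -> 1
```

Trying to fake variable refresh via repeated mode changes would black the screen out on every frame-time change, which is why it has to be a native feature of the controller.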
 
Yep, I was just typing this myself and noticed the thread had updated.

I'm not quite sure why people seem to think that Nvidia should be developing new products, in this case a panel controller, and then giving them away for free.

Nvidia have several patents related to dynamic monitor refresh rates, so I don't foresee anyone else copying this anytime soon.
 
Those patents, what are they?
 
I really don't know enough about how monitors work, but there are a few interesting things here.

1. Monitors support a range of input frequencies, but like you said, a change in res or input frequency triggers several seconds of black screen while the monitor adjusts.

2. This kit is supposed to be installable with just a screwdriver, right? No soldering or rewiring.

3. The panel controller and the LCD panel itself are separate, so maybe this kit replaces the panel controller... That would be my best guess.

So essentially G-Sync would simply be a much more capable LCD panel controller. The controller does everything from upscaling to controlling brightness and contrast and selecting the input frequency.
 
Screens black out because they are not designed to be adjusted in that way; they run at a constant set rate, and telling them to run at a different rate is telling them to do something they don't want to do.

Panel vendors could easily design screens that accept variable refresh rate commands if they wanted to. If they did, the screens could be rolled out with universal compatibility, with control residing with the GPU vendor, and there would be no need for the G-Sync device.
 
I don't think it's anything to do with the panel. The panel is just the LCD matrix. I'm pretty sure what you're describing would be the job of the panel controller.

Monitor vendors buy panels and typically put their own controllers in them. That's why two different vendors can make monitors using the same panel and they can vary quite a lot.
 
Yes, most parts of any electrical device are bought in, including the LCD panel and possibly the controllers.

My point is that whoever makes those controllers could easily make them conform to variable refresh rates; it's then up to the GPU vendors to conform to the controllers. Result: universal "G-Sync" monitors.
 