
Do games need NVIDIA G-SYNC support for it to work?

I love GTA IV and, without putting spoilers in, I had to run the last mission with FRAPS on, limiting to 30fps, because any higher made it impossible to complete. The game was a carbon copy of the console version, right down to how it was meant to run frame-wise. Rockstar didn't even give an afterthought to us PC gamers, because they never even patched it.

Rant over, and the point is... maybe these are the kind of games that just will not work. Lazy ass ports will need a miracle to get working, and no matter what tool you use, it just ends up making it worserer. :D
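
For what it's worth, an external frame cap like that FRAPS limit is conceptually tiny. Here is a rough Python sketch of the idea; render_frame() and all the timings are stand-ins, not FRAPS's actual implementation:

```python
# Hedged sketch of an external 30fps cap, like the FRAPS limit mentioned
# above: pad every frame out so it takes at least 1/30th of a second.
# render_frame() and all timings are stand-ins, not FRAPS's real code.

import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS        # 33.3ms per frame at 30fps

def render_frame():
    time.sleep(0.01)                   # stand-in for the game's render work

for _ in range(5):
    start = time.monotonic()
    render_frame()
    leftover = FRAME_BUDGET - (time.monotonic() - start)
    if leftover > 0:
        time.sleep(leftover)           # sleep away the rest of the budget
    print(f"frame took {time.monotonic() - start:.3f}s")
```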

GTA:SA was hilarious: if you disabled framerate control and had a fast PC plus a 120+Hz screen, you'd crater and die just jumping up and down on the spot, as the falling distance calculation was tied to the number of frames that passed, and vsync factored in somewhere too. :(

EDIT: I think they did different calculations depending on whether vsync was on or not, but assumed vsync was always 60Hz or less.
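
That class of bug is easy enough to demonstrate. Below is a made-up Python sketch of frame-tied fall distance, not Rockstar's actual code; every number is invented for illustration:

```python
# Made-up sketch of the bug class described above, NOT GTA:SA's actual
# code: fall distance accumulated per *frame* rather than per *second*,
# so a fast PC at 120+fps turns a small hop into a lethal drop.

FALL_PER_FRAME = 0.05     # per-frame increment, implicitly tuned for ~60fps
AIRBORNE_SECONDS = 0.5    # jumping up and down on the spot
DAMAGE_THRESHOLD = 3.0    # accumulated "fall" at which you take damage

def apparent_fall(fps):
    frames_airborne = int(AIRBORNE_SECONDS * fps)
    return frames_airborne * FALL_PER_FRAME   # bug: should scale by dt

for fps in (30, 60, 120, 144):
    fall = apparent_fall(fps)
    verdict = "crater and die" if fall >= DAMAGE_THRESHOLD else "fine"
    print(f"{fps:>3}fps: apparent fall {fall:.2f} -> {verdict}")
```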
 
I do think G-Sync is a step forward, but I can't see it staying Nvidia-only. PhysX is proprietary by nature, and the G-Sync name itself might be, but the tech behind G-Sync isn't. It's not really doing things differently, just smarter: it's a screen relinquishing control of its refresh rate. Actually, I wouldn't even go that far.

Currently, let's say you have a 60Hz screen and a GPU (any brand) with vsync disabled: the GPU can send every frame out the millisecond it's finished. Without G-Sync, the screen simply refreshes at the earliest point it can for a given Hz setting; at 60Hz, that's every 16.667ms. Sometimes those frames overlap a refresh and you get tearing.

Ultimately G-Sync is primarily screen-side, and all it is, is the screen NOT refreshing the second it's capable of, but waiting until the screen's buffer has been updated.

I'm not sure anything driver-side/GPU-side has to change in the slightest when you think about it. The only change is the monitor going from refreshing at a set rate to refreshing only when the screen's buffer is updated with a new image. The GPU just sends out frames as they're ready, as it does currently without vsync enabled (outside of some frame pacing control, which already happens in the drivers anyway).
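
To make the two behaviours concrete, here is a toy timing model in Python. The numbers and the tear test are assumptions for illustration, not how scanout really works:

```python
# Toy timing model of the two refresh strategies just described.
# Frames finish at irregular times; a fixed 60Hz panel starts scanning
# out every 16.667ms regardless, so a buffer swap landing mid-scanout
# shows as a tear. A G-Sync-style panel instead begins its refresh only
# when a completed frame arrives, so swaps never land mid-scanout.

import random

random.seed(1)

REFRESH_MS = 1000.0 / 60.0   # fixed panel refresh interval
SCANOUT_MS = 14.0            # assumed time spent actually painting the panel

frame_done = []
t = 0.0
for _ in range(1000):
    t += random.uniform(12.0, 24.0)   # an uneven 42-83fps, no vsync
    frame_done.append(t)

# Fixed refresh: tear whenever a frame completes during the scanout window.
tears = sum(1 for ft in frame_done if (ft % REFRESH_MS) < SCANOUT_MS)

print(f"fixed 60Hz: {tears}/{len(frame_done)} buffer swaps land mid-scanout")
print("g-sync-ish: 0, since each scanout is triggered by a finished frame")
```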

So I fail to see how Nvidia can patent it, control it, or prevent AMD from using it, when the entire thing comes down to the screen choosing to refresh in a different way.

The G-Sync name itself is likely trademarked, but the tech is all screen-side. Nvidia can't possibly hold a patent on a screen maker choosing when to refresh; AMD may just have to call it something else.

Interesting analogy. I would agree with you on patents: you can't patent G-Sync for what it does, but you can patent the technology that makes it happen, assuming the technology (the hardware) is yours to patent. If it's existing hardware used in a certain way to achieve the goal, then you can patent that design.

All AMD would need to do is design it differently to do the same thing.
 
What I should say is, I can see Nvidia getting Asus and co to use their own chip, and by giving Nvidia control of it (with Nvidia, I'd think, effectively subsidising the cost of the screen-side control) they can lock it to whoever they want. So G-Sync, that chip made by Nvidia, and their hardware and software lock on particular G-Sync screens is entirely possible; but the idea that the basic concept could be locked up or patented is really just daft. Nvidia don't have to let anything they don't want through a chip they've manufactured that sits in a particular screen, but they can't stop monitor makers from implementing the feature in other screens.

The fundamental idea is so plainly easy and simple that any screen could do it (not necessarily existing ones, though depending on how their control chips work, it would be interesting to know if any could with a pure firmware upgrade) without adding $100 to the cost and locking it to Nvidia.

The only thing Nvidia can patent is their software and their particular chip. The idea that they can patent when monitor manufacturers choose to refresh their panels is absurd. Though then an even more complicated question appears: if Nvidia pay Asus $10 million, provide the chips free, and as part of that agreement Asus refuse to make an alternative panel that just controls refresh timing itself... then they could lock AMD out that way. If only a few companies are making 120-144Hz screens (which is the case) and there are only a few models, Nvidia buying those companies off so they don't make an alternative could screw AMD.
 
If Nvidia control that chip they add to screens, and the screen makers let them, I can't see G-Sync panels working with anything else. They also seem to be (not sure on this) locking out their own customers with older cards by making it 7xx series (or all Kepler?) cards only, which is a bit of a kick in the teeth, seeing as cards have been pushing out each frame the millisecond it's finished since the dawn of graphics cards.

Agreed; then AMD would have to put their own chip inside the monitor, which may add more cost to the monitor, or they would have to have AMD and Nvidia versions separately.
 
Can't do it in software currently, AFAIK: telling the monitor to change the refresh rate in real time causes the monitor to retrain, giving a couple of seconds of black screen.
 
Sorry to go off-topic here but does AMD have anything like this with CCC?

What, this?

Yes and No... half?

What that does is overclock/overvolt the LCD to reduce input lag and ghosting. It's not quite the same as syncing the LCD's refresh rate to the game, but it does have many of the same benefits.

 
Agreed; then AMD would have to put their own chip inside the monitor, which may add more cost to the monitor, or they would have to have AMD and Nvidia versions separately.

I really just can't see monitor companies locking themselves in to Nvidia only; $100 extra to tell a screen not to refresh is laughable. We're talking basically a few lines of code difference inside the screen's software: refresh every n ms, versus the loading of the screen buffer sending a "refresh now" signal. It's basically nothing at all in terms of software. Existing hardware might not allow something that simple, but bringing out new screens with alternative controllers that handle a variable framerate fine would be pretty trivial.
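
Purely to illustrate that "few lines of code" point, here is a rough sketch of the two controller loops. The Panel class, its methods, and every timing value are invented; real scaler firmware is obviously far more involved:

```python
# Hedged sketch of the 'few lines of code' difference described above.
# Panel, its methods and all timings are made up for illustration.

import time

class Panel:
    MIN_REFRESH_S = 1.0 / 144          # fastest the panel can physically repaint

    def scan_out(self):
        print(f"repaint at {time.monotonic():.3f}s")

    def wait_for_frame(self):
        time.sleep(0.02)               # stand-in for 'GPU says buffer updated'

def fixed_refresh(panel, hz=60, repaints=5):
    # Classic controller: repaint on a fixed schedule whether or not a new
    # frame arrived, so late frames tear and early ones wait.
    for _ in range(repaints):
        panel.scan_out()
        time.sleep(1.0 / hz)

def refresh_on_frame(panel, repaints=5):
    # G-Sync-style controller: repaint only when a finished frame lands,
    # subject to the panel's minimum refresh interval.
    for _ in range(repaints):
        panel.wait_for_frame()         # block until the buffer is updated
        panel.scan_out()
        time.sleep(Panel.MIN_REFRESH_S)

panel = Panel()
fixed_refresh(panel)
refresh_on_frame(panel)
```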

It does kind of scream of Nvidia though, doesn't it: come up with something insanely basic that requires essentially no extra hardware, add hardware anyway, charge Nvidia users (but no one else) extra for something that is all but free, and lock it in.

Seriously, the question still kind of stands: why wasn't this something LCDs did essentially since they were created?

I've said in recent threads, on things like picking a cable/standard for 4K, that industries getting together years ago would have been better for every user, but people just can't get together or do the smart thing.

Refresh rates for LCDs seem to be based around the idea that CRTs had refresh rates, rather than a fixed refresh rate being strictly needed in the same sense. Considering the GPU side has been capable of updating info as required forever, why have screens not been capable until now? Industries do very stupid things, move insanely slowly, don't get together to work sensibly, and do utterly daft things for years just because that's how they used to do them.
 
I look forward to your AMD version if it's so simple, DM. :rolleyes: I take it you have not seen the Asus 3D monitors with built-in Nvidia 3D Vision then? Releasing a screen that 'locks' them into Nvidia would be pointless and would cost the vendor in the long run. I can see it released as a screen that works on both AMD and Nvidia, but with G-Sync only working on Nvidia (for obvious reasons). Why do people keep insisting that Nvidia are the bad guys, when all I see is a good thing? If AMD had developed this and implemented it into a screen or a DIY module, that would also be a good thing.
 
I read the walls of text because, like Rroff, he knows a lot more than most of us. The problem is people don't always like what he's saying, so they shut off or scream bias.
 
I don't think people read the walls of text anyway, Greg. The bias shown is laughable.

I seriously see good in this tech, and if it makes gaming more enjoyable and realistic, I am all for it, but some just seem to look to turn anything good Nvidia have done into a bad thing, because it is Nvidia...

Sure it is proprietary, but I don't see these same people having a dig at AMD for making Mantle proprietary. If they did, I would give their posts more credibility.
 
Sure it is proprietary, but I don't see these same people having a dig at AMD for making Mantle proprietary. If they did, I would give their posts more credibility.

I hate to rain on your parade, Gregos, but... I highlighted the important part in bold for ya, old chap. ;)

“I think at this stage it makes sense for us to develop Mantle, at least in its current form, because nobody knows our hardware at the lowest level best than we do. So for us to have to do that for alternative graphics hardware [would be] almost impossible,” said Ritche Corpus, AMD’s director of software alliances and developer relations, in an interview with VR-Zone web-site.

“The plan is, long term, once we have developed Mantle into a state where it’s stable and in a state where it can be shared openly [we will make it available]. The long term plan is to share and create the spec and SDK and make it widely available. Our thinking is: there’s nothing that says that someone else could develop their own version of Mantle and mirror what we’ve done in how to access the lower levels of their own silicon. I think what it does is it forges the way, the easiest way,” explained Mr. Corpus.

Sauce
http://www.xbitlabs.com/news/graphi...t_Will_Become_Widely_Available_to_Others.html
http://vr-zone.com/articles/mantling-alliances-ritchie-corpus-amd-interview/58215.html

I look forward to the Nvidia statement saying G-Sync/PhysX won't be proprietary/patented. :D
 
Long term, lmao. They'll open it up in a few years, when there are already other options and it doesn't have as much benefit as it would have had at release.

Mantle won't even faze Nvidia; they're busy developing bigger and better things.
 