
AMD: You'll hear more from us on G-Sync soon

http://www.tomshardware.com/reviews/amd-ama-toms-hardware,3672-7.html

Q. G-Sync got tongues wagging in spite of the fact that people wouldn't be able to see the difference on a compressed YouTube video. Is AMD considering a similar solution, or working towards one that's more open than G-Sync? Also, AMD still does not support PLP (portrait-landscape-portrait) monitor setups for gamers - will that ever change?

A. You'll hear more from us on G-Sync soon. Bezel compensation is designed to treat the bezels of matched-size/resolution displays as an object game content passes behind, rather than an object that chops game content in half. The feature is not intended to support mixed-size or mixed-resolution configurations. With respect to PLP, that is a feature we have in development, but I don't have an ETA at this time.
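The bezel-compensation idea above can be sketched with some simple arithmetic. This is a rough illustration only, not AMD's actual implementation; the function names, panel width, and bezel figure are all made up for the example. The render target is widened by the bezel gaps, and the columns that fall "behind" a bezel are simply never shown, so objects appear to pass behind it rather than being chopped in half.

```python
# Illustrative sketch of bezel compensation for a 3x1 multi-monitor setup.
# All names and numbers here are hypothetical, not a real driver API.

def virtual_width(panel_width_px, num_panels, bezel_px):
    """Total render-target width, including the hidden bezel columns."""
    gaps = num_panels - 1
    return panel_width_px * num_panels + bezel_px * gaps

def visible_slices(panel_width_px, num_panels, bezel_px):
    """(start, end) column ranges of the virtual surface each panel displays.
    Columns in between land 'behind' a bezel and are never scanned out."""
    slices = []
    x = 0
    for _ in range(num_panels):
        slices.append((x, x + panel_width_px))
        x += panel_width_px + bezel_px
    return slices

# Three 1920-wide panels with ~60 px of combined bezel between each pair:
print(virtual_width(1920, 3, 60))    # 5880 virtual columns
print(visible_slices(1920, 3, 60))   # [(0, 1920), (1980, 3900), (3960, 5880)]
```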

Looks like they are looking to respond to G-Sync. Let's hope it's also cost-effective :)
 
That's what I mean by fragmentation: in the future, a change of GPU is likely going to mean a change of monitor too if you want to keep your feature set.

Honestly, I'd rather they not come up with these features that aren't PC-like. If it isn't going to work on all hardware, let's not have it at all. It's not progress when you are constantly being locked in to one company.
 
Again,

I'll point out some fundamentals for people.

A GPU without v-sync enabled sends each frame to the monitor as soon as it's ready. Nvidia GPUs using G-Sync are running in the same mode GPUs have been capable of since GPUs were first being made.

Hardware side, there is nothing required on the GPU, at all, full stop.

Monitor side, very simple again: you have a buffer for the screen to read and display, you have a timing program, and you have a control program that says "update every 8/16/32/whatever milliseconds" depending on what mode you're in. Monitor side, you need software to say: don't update every X ms, update when the buffer is refilled.
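The two controller behaviours described above can be shown with a toy simulation in logical milliseconds (no real hardware, all names are illustrative). A fixed-clock controller makes a finished frame wait for the next tick; a refill-driven controller shows it immediately:

```python
# Toy simulation of the two monitor-controller modes described above.
# `frame_done_times` are the moments (in ms) the GPU finishes frames; each
# function returns the times those frames first appear on screen.
# Hypothetical sketch only, not any vendor's actual firmware.

def fixed_refresh_display_times(frame_done_times, interval_ms=16):
    """Classic controller: scan out on a fixed clock, ready or not.
    A finished frame waits for the next tick, arriving up to interval_ms late."""
    shown = []
    for t in frame_done_times:
        if t % interval_ms == 0:
            shown.append(t)                                  # landed on a tick
        else:
            shown.append((t // interval_ms + 1) * interval_ms)  # wait for next tick
    return shown

def on_demand_display_times(frame_done_times):
    """G-Sync-like controller: scan out the moment the buffer is refilled."""
    return list(frame_done_times)

frames = [10, 30, 55]                        # GPU finishes frames at these ms
print(fixed_refresh_display_times(frames))   # [16, 32, 64] -- variable lag
print(on_demand_display_times(frames))       # [10, 30, 55] -- no added lag
```

The whole difference between the modes is that one `if`: wait for a clock tick, or display when told.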

And there we have G-Sync. Its simplicity is rather impressive, actually, and the fact it hasn't been done till now is hilarious. I've made many posts about how an industry does things not because it should, but because it used to do them that way, so it still does. Refresh on LCDs has essentially always been down to the fact that old monitors used refresh rates... and no one bothered to change it. Same way 4K screens have all kinds of odd connection methods and poor refresh rates... because you couldn't get the industry giants in a room to agree on a standard (AMD stepped up, wrote a standard and gave it to VESA fairly recently, so we should have 120 Hz / DisplayPort / one cable as a 4K standard amongst most screens in the future).

G-Sync is Nvidia only enabling the mode on screens whose makers they deem to have paid them enough. It's also the chip on the monitor: to enable G-Sync mode, Nvidia requires their own monitor controller and their own software, and will disable it for any Nvidia users not using the right screen/monitor controller.

There is NOTHING, at all, that Nvidia can possibly do to stop every single other monitor chip being updated to include a "G-Sync"-like mode, nothing. It may need updated ASICs; it may be doable in firmware on some chips and not on others, who knows really. The actual application is ludicrously simple.

Think of G-Sync as the mode that all screens have had the potential for since LCDs started, and no one thought to enable... because they are stupid. Then think of Nvidia's G-Sync like anything else they do: insisting their own customers pay three times the actual price for the privilege of not having the option locked out by Nvidia drivers.

The only thing Nvidia can patent is their driver code and the actual physical chip they make. They can in no way prevent monitor controllers having the same mode added, and as said, on the GPU hardware side, the hardware has been compatible since day one of GPUs being available.
 

Be that as it may, perhaps AMD can convince other display manufacturers to make displays that refresh on input, without the need for software-based proprietary lockouts.
That would upset Asus as much as Nvidia.
 
AMD stealing another idea from Nvidia...:mad:






AMD has a lot of burners on the go atm, I wouldn't expect an update anytime soon. :p

Really, honestly, this is a nothing update. This is a situation where, whenever the screen makers put the mode into monitors, AMD will support it.

Turn off v-sync mode... sorted, you just did everything you'll need on the AMD side to support "G-Sync". G-Sync will be patented and locked to Nvidia; the generic version will be called something else. Like 3D is what any 3D screen can do, but Nvidia needs 3D Vision to work... nothing different, just their branded version.

R-Sync maybe, who knows. AMD could support it now; all it needs is monitors that can go "duh, I'll just update when the buffer tells me it's been filled, not constantly for no reason", and it will be ready for the masses... without the £150 fee.

I would pretty much die laughing if someone supports this before Nvidia manage to ship their version, but my general feeling is most if not all monitors will need a new version of their chips (rather than firmware) to add it. Not because it's complex, but because the cheapest way to make an ASIC is with nothing extra you don't need at all: fixed function rather than programmability. Nvidia probably has time to market in its favour, most likely due to going FPGA rather than ASIC (bigger, slower, and more expensive, but programmable rather than more-or-less fixed function). By the time Nvidia can do an ASIC version to bring costs down, or... knowing Nvidia, increase profit, every other company that makes ASICs for monitors will have their versions out as well.

It's interesting, and I wouldn't say no to G-Sync, but given the option of slightly lower settings at 90 fps+ with low persistence, or slightly higher quality settings that dip well below 60 fps often enough for G-Sync to matter... I'd choose the former every time.
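The frame-time arithmetic behind that trade-off is worth spelling out (illustrative numbers only):

```python
# Frame-time arithmetic behind the 90 fps vs sub-60 fps trade-off above.

def frame_time_ms(fps):
    """Milliseconds each frame is on screen at a given frame rate."""
    return 1000.0 / fps

print(round(frame_time_ms(90), 1))   # 11.1 ms per frame at 90 fps
print(round(frame_time_ms(60), 1))   # 16.7 ms per frame at 60 fps
print(round(frame_time_ms(45), 1))   # 22.2 ms -- a dip below 60 doubles the
                                     # per-frame delay compared to 90 fps
```

The jump from 11.1 ms to 22.2 ms per frame is exactly the kind of variation that variable refresh smooths over, and exactly what a locked high frame rate avoids entirely.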

In a few years we might be doing most of our gaming on Oculus... with decent-res, decent-quality screens it could be great, but read up on where they want it to go: 90 fps+ and low persistence on OLEDs, which should become THE gold standard of gaming. Low persistence and variable frame rate really just don't work together.
 
The only thing Nvidia can patent is their driver code and the actual physical chip they make. They can in no way prevent monitor controllers having the same mode added, and as said, on the GPU hardware side, the hardware has been compatible since day one of GPUs being available.

They can't even do that: the chip they use is an off-the-shelf FPGA, an "Altera Arria V GX" going by what AnandTech said, and you can tell the markings on the chip are not their own.

I do love the idea of G-Sync. I wish it was open, like I wish Mantle was open, or that there was an OpenCL version of hardware physics, so all these features would get greater penetration in games.

Although I do understand why they don't want to open them up: their time and money were spent on these technologies. Would you open them up to your competitors? I probably wouldn't. Nvidia only opened up SLI to other chipsets when they pulled out of their own chipset business.

I wish game devs plus the likes of Intel/AMD/Nvidia/Microsoft/the OpenGL org got around a table to hammer out standards, much like how companies do to hammer out new WiFi/USB/optical or memory standards. A pipe dream atm.
 
Tom Petersen said AMD can't run it, as there is a chip on the GPU that enables G-Sync; it isn't just software-based on the GPU side.

I imagine AMD would have to license said chip if they were given access by Nvidia to run G-Sync.
 