
So much for G-Sync?

They've got it introduced as a standard in the DP specification.

This is down to them, and the people who benefit in the future would not have done so without AMD.

This I have to totally agree with.

But if it already existed and they got it introduced as the standard, they didn't actually invent it, did they?

Oh, and DM, this is me bringing something to the thread: an actual question.
 
Okay, I am confused now. I thought that when AMD announced Freesync, Nvidia said the reason they never approached VESA about making an open standard for variable refresh was that it already existed in the current standard?

*EDIT*

Yeah they did:

He first said, of course, that he was excited to see his competitor taking an interest in dynamic refresh rates and thinking that the technology could offer benefits for gamers. In his view, AMD's interest was validation of Nvidia's work in this area.

However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.

That said, Nvidia won't enable G-Sync for competing graphics chips because it has invested real time and effort in building a good solution and doesn't intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."

What gives? :confused:
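For what it's worth, the "extension of the vblank interval" Petersen mentions boils down to simple timing arithmetic: the blanking period is held open until the next frame is ready, so the effective refresh rate follows the render rate. A minimal sketch of the idea, with assumed panel limits and frame times (my own illustration, not anything published by Nvidia or AMD):

```python
# Rough sketch of how extending the vblank interval gives a variable
# refresh rate. Panel limits and frame times are illustrative assumptions.

PANEL_MAX_HZ = 144   # assumed maximum refresh of the panel
PANEL_MIN_HZ = 30    # assumed minimum; an LCD still has to be refreshed eventually

def effective_refresh_hz(frame_render_ms):
    """Hold vblank open until the frame is ready, within the panel's limits."""
    min_interval_ms = 1000.0 / PANEL_MAX_HZ   # can't refresh faster than the panel allows
    max_interval_ms = 1000.0 / PANEL_MIN_HZ   # must refresh before the panel times out
    interval_ms = min(max(frame_render_ms, min_interval_ms), max_interval_ms)
    return 1000.0 / interval_ms

for ms in (7.0, 16.7, 22.0, 40.0):
    print(f"frame took {ms:4.1f} ms -> panel refreshes at ~{effective_refresh_hz(ms):5.1f} Hz")
```

In other words, instead of the frame waiting for a fixed refresh tick, the next scan-out waits for the frame; that is the trick both G-Sync and the proposed DP approach rely on.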
 
You do know it's the Nvidia guys in discussions like these who consistently try to push that agenda... when they have nothing substantial to add to a conversation, or any "Nvidia pushed standards for everyone" to counter with, they try to mock and put down.

As usual you bring nothing to a thread, well done.

This^^


The world's forums come and read this one, if only to learn how not to contribute.
 
Okay, I am confused now. I thought that when AMD announced Freesync, Nvidia said the reason they never approached VESA about making an open standard for variable refresh was that it already existed in the current standard?

*EDIT*

Yeah they did:



What gives? :confused:

Nvidia saw only £$£$£$ and pushed for G-Sync, and even though Nvidia can use Freesync, let's see if they add it to their drivers or whether they'll continue to be stuck up and only allow G-Sync.

Time will tell.
 
Nvidia saw only £$£$£$ and pushed for G-Sync, and even though Nvidia can use Freesync, let's see if they add it to their drivers or whether they'll continue to be stuck up and only allow G-Sync.

Time will tell.

His point/question went completely over your head, but don't let that prevent Nvidia bashing :p

His post is insinuating that AMD's implementation should just work, as in... Now? As everything's meant to be in the standard we're already using.
 
His post is insinuating that AMD's implementation should just work, as in... Now? As everything's meant to be in the standard we're already using.

To be honest my post is more insinuating that I'm totally confused and struggling to make sense, lmao, but I think you captured part of what I am confused over.
 
To be honest my post is more insinuating that I'm totally confused and struggling to make sense, lmao, but I think you captured part of what I am confused over.

It would be better to say the quote is the insinuation :p

Nvidia are saying everything is already there, as of this moment in time, for Freesync to work as AMD have explained it.
 
This I have to totally agree with.

But if it already existed and they got it introduced as the standard, they didn't actually invent it, did they?

Oh, and DM, this is me bringing something to the thread: an actual question.

Could you point to someone in this thread claiming AMD invented it?

Okay, I am confused now. I thought that when AMD announced Freesync, Nvidia said the reason they never approached VESA about making an open standard for variable refresh was that it already existed in the current standard?

What gives? :confused:

Standards take a while to do; it's that simple. It's always seemed from the start that vblanking was "coming", that it was in the works, and Nvidia didn't wait for official support: they made a simple chip that could do it straight away and got there first.

Everything that vblanking uses is "in" DisplayPort, as there was always a basically unused communication channel within DisplayPort. This is the method used to send a message to the screen telling it the new refresh rate under vblanking. All that needs to happen is for it to become a standard, and monitors wanting to be marked compatible with that standard need to listen at the other end of that cable.

Everything physically was there to enable vblanking within DisplayPort; there was just nothing pushing monitor makers to listen to that signal and then do something with it. It pushes monitor makers to do more than the bare minimum... well, it doesn't, it just raises the bare minimum. Supporting more things means work and effort, and those things are contrary to profit (or so most companies think).
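To picture that "listening at the other end of the cable" point, here's a toy sketch (purely illustrative; the class names, message shape and panel ranges are my own assumptions, not the actual DisplayPort signalling): a monitor that ignores the timing message keeps ticking at a fixed rate and makes a late frame wait for the next tick, while one that listens simply adopts whatever interval the GPU announces, clamped to its supported range.

```python
# Toy model of the "monitor listens at the other end of the cable" idea.
# Class names, message shape and panel ranges are assumptions for
# illustration, not the real DisplayPort Adaptive-Sync signalling.

class FixedMonitor:
    """Ignores the timing message: refreshes on a fixed 60 Hz tick."""
    interval_ms = 1000.0 / 60

    def on_timing_message(self, frame_interval_ms):
        pass  # the scaler doesn't listen, so nothing changes

    def display_time(self, frame_ready_ms):
        # a late frame has to wait for the next fixed refresh tick
        ticks_elapsed = int(frame_ready_ms // self.interval_ms) + 1
        return ticks_elapsed * self.interval_ms


class AdaptiveMonitor(FixedMonitor):
    """Listens to the timing message and retimes its refresh to match."""
    min_interval_ms = 1000.0 / 144   # assumed fastest the panel can go
    max_interval_ms = 1000.0 / 30    # assumed slowest before it must refresh anyway

    def on_timing_message(self, frame_interval_ms):
        self.interval_ms = min(max(frame_interval_ms, self.min_interval_ms),
                               self.max_interval_ms)

    def display_time(self, frame_ready_ms):
        # scan-out starts as soon as the frame is ready (within panel limits)
        return max(frame_ready_ms, self.min_interval_ms)


frame_ready_ms = 22.0   # the GPU finished this frame 22 ms after the previous refresh
for monitor in (FixedMonitor(), AdaptiveMonitor()):
    monitor.on_timing_message(frame_ready_ms)
    print(f"{type(monitor).__name__}: frame ready at {frame_ready_ms} ms, "
          f"shown at {monitor.display_time(frame_ready_ms):.1f} ms")
```

The point of the new standard is essentially just getting the hardware on the monitor side to implement the equivalent of that "listen and retime" step at all.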

Vblanking was already part of the eDP standard. I've got work to do so I won't break out my google-fu right now, so I don't know who came up with vblanking or who added vblanking support to the eDP standard. It could have been AMD, it could have been Intel, or anyone else.

AMD works with the industry on standards because that is what they've done for quite some time. People pushing performance and wanting to make and sell GPUs not surprisingly want monitor standards to be pushed forward to make use of those GPUs, making more people want to buy newer GPUs for the new features.

AMD spearheaded getting vblanking approved into the VESA standard for DP (again, I don't know if they did the same for eDP), which leads to monitor makers reacting with an "urrrgghh, effort, I guess we'll support that then because we have to". So the next round of monitors will likely bother to listen and adjust to those signals.

Standards are the most tedious and ridiculous thing in the industry. We should have DP way beyond where it is now. Would I take a double-thickness DP cable for 120Hz 4K compatibility? **** yeah. Until recently it was dual HDMI or dual DVI (lol) cables to get 60Hz at 4K. Neither standard had officially moved on in the past five or six years, and this is some pretty simple stuff IF, and only IF, all these guys get in a room and just do it. Politics is holding back sooooo many advancements in the industry.

I mean, the actual physical difficulties of making 10nm chips or OLED screens are huge, but making a cable that can do 120Hz at 4K is EASY, like really easy... yet the latter still isn't done. Pure politics, not difficulty.
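To put rough numbers on the "4K at 120Hz is easy" claim, here is my own quick arithmetic, ignoring blanking and protocol overhead and assuming 8-bit colour, so treat the figures as approximate:

```python
# Rough bandwidth arithmetic for 4K @ 120 Hz with 8 bits per colour channel.
# Blanking and protocol overhead are ignored, so real requirements are higher.

width, height = 3840, 2160
bits_per_pixel = 24        # 8-bit RGB
refresh_hz = 120

required_gbps = width * height * bits_per_pixel * refresh_hz / 1e9
print(f"raw pixel data for 4K @ {refresh_hz} Hz: ~{required_gbps:.1f} Gbit/s")

# DisplayPort 1.2 (HBR2, 4 lanes) carries roughly 17.3 Gbit/s of usable data,
# which is why a single cable was stuck at 4K 60 Hz until the standard moved on.
dp12_effective_gbps = 17.28
print(f"fits within DP 1.2? {required_gbps <= dp12_effective_gbps}")
```

So the raw pixel data alone needs somewhere around 24 Gbit/s, which is a signalling and standards problem rather than anything physically exotic.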
 
So basically (asking this as a question, not a statement), is this like the Mantle thing, where one manufacturer gets fed up with the industry being slow and gives it a boot up the backside?
 
Could you point to someone in this thread claiming AMD invented it?

Maybe, just maybe, instead of deflecting, someone could actually answer the question.



I haven't once said that someone in this thread stated that AMD invented this; I am just asking the question.

But if it already existed and they got it introduced as the standard, they didn't actually invent it, did they?
 
Maybe, just maybe, instead of deflecting, someone could actually answer the question.



I haven't once said that someone in this thread stated that AMD invented this; I am just asking the question.

Well, I answered the question as best I could; you just didn't read, or didn't like, the answer. I ALSO pointed out that you're attempting to bring up something separate that hasn't been claimed by anyone, as if getting the answer you want proves something else... the problem is it wouldn't mean anything.

As I said, I don't know who invented vblanking; it COULD be AMD. AMD already implement vblanking through eDP (a superset of DP), and they've spearheaded getting the same thing pushed into the normal DP standard, thereby pushing all future monitors to support it.

Who invented vblanking? I don't know, and judging from the general question and what you appear to want the answer to be, apparently you don't either.

If AMD invented it, got it put in eDP and then also pushed for it to be put in DP at a later date, what difference would that make?

Maybe someone else invented it; maybe it was a collaboration between AMD and someone else, or between multiple entities not including AMD. What did you want to do with the answer? What was the point of the question?
 
Whilst, of course, there is no direct proportionality between wealth and invention (to an extent), it also stands to reason that a company which has invented a whole range of technologies (as you claim several times that AMD has) should make quite a bit of capital from those ventures. Ergo, if AMD really had played such a huge part in everything you have said they have, then they would be in a considerably better financial position right now.

You missed the point.

AMD don't get any money for pretty much anything they did to push the industry forward; if they did, the industry would not have moved forward.
 
Well, seeing as nobody knows who invented vertical blanking, I have looked it up, and the only reference I have found is that a Robert N. Hurst of the General Electric Company is listed as the inventor, back in October 1990. The patent then goes on to say that it was made under a NASA contract, so it looks like it's safe to say it had nothing to do with AMD, or ATI as it would have been back then.


http://www.google.com/patents/US5140420


Embedded DisplayPort patent
Named inventors: Yong Hwan Moon, Hong Jun Yang, Sang Ho Kim, Yong Woo Kim

Company: Silicon Works Co., Ltd.


http://www.google.com/patents/US20130278591



Anyway, I will say it for those that are thinking it and for those that don't want it to be so.

AMD did not invent Adaptive Sync, nor Embedded DisplayPort.

All AMD has done is stick a daft name on an existing technology and submit it to the VESA people, for which they should be applauded. But personally I don't think they would have done anything if Nvidia hadn't announced G-Sync first.


As for what I wanted the answer to be.....really :rolleyes:
 
It got very quiet in here all of a sudden ;)


Okay, I am confused now. I thought that when AMD announced Freesync, Nvidia said the reason they never approached VESA about making an open standard for variable refresh was that it already existed in the current standard?

*EDIT*

Yeah they did:
He first said, of course, that he was excited to see his competitor taking an interest in dynamic refresh rates and thinking that the technology could offer benefits for gamers. In his view, AMD's interest was validation of Nvidia's work in this area.

However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.

That said, Nvidia won't enable G-Sync for competing graphics chips because it has invested real time and effort in building a good solution and doesn't intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."


What gives? :confused:


This is a very interesting find from ubersonic.

It says that monitor manufacturers will have to add a scaler into current monitor designs, and this is definitely going to cost, so I cannot see how there won't be a premium for Adaptive Sync monitors.
What will be really funny is if monitor manufacturers decide to use the G-Sync scaler instead of designing a new one :p. Of course, as Petersen says, I cannot see Nvidia allowing their scaler to be used with non-G-Sync-certified monitors, assuming it can even be made to work the way Adaptive Sync works.

Certainly interesting times ahead.

Will AMD create a scaler for the monitor manufacturers to use, or will they let someone else do the work? In my opinion they do have a tendency to let others do the work for their additional features (TriDef for 3D springs to mind), but we will have to wait and see.
 
Nvidia do their own work; AMD let others do it and then scream about it being free. That's the reason they've always been second best, imho.
 