AMD freesync coming soon, no extra costs.... shocker

Don't mistake PR exercises for the truth...

nVidia did in fact offer other companies access to PhysX on the condition that they also ported CUDA support. No one took them up on that offer, nVidia had a temper tantrum and locked it all down. There is much more that went on behind the scenes than we'll likely ever see evidence of in the public domain.

i.e. "A clear indication that Nvidia and AMD did not even contact each other in regards to PhysX implementation on AMD." - no its just an indication that no one at AMD had any interest in talking to Roy Taylor about it for whatever reason.

It is a bit ridiculous that people seem to think it's a case of nVidia phoning up AMD and saying "hi AMD, would you like a slice of this PhysX thing?" :S
 
i.e. "A clear indication that Nvidia and AMD did not even contact each other in regards to PhysX implementation on AMD." - no its just an indication that no one at AMD had any interest in talking to Roy Taylor about it for whatever reason.

Well, Roy can talk to himself now that he is VP of Global Channel Sales at AMD :D
 
lol. It's just funny people think it was so black and white - it was far more speculative communication than that, but AMD (and others) didn't follow it up once it became apparent nVidia was making CUDA support a pre-condition for access to PhysX.
 
Don't mistake PR exercises for the truth...

nVidia did in fact offer other companies access to PhysX on the condition that they also ported CUDA support. No one took them up on that offer, nVidia had a temper tantrum and locked it all down. There is much more that went on behind the scenes than we'll likely ever see evidence of in the public domain.

i.e. "A clear indication that Nvidia and AMD did not even contact each other in regards to PhysX implementation on AMD." - no its just an indication that no one at AMD had any interest in talking to Roy Taylor about it for whatever reason.

It is a bit ridiculous that people seem to think it's a case of nVidia phoning up AMD and saying "hi AMD, would you like a slice of this PhysX thing?" :S

This is the first I have heard of it. Are you sure you're not misunderstanding this?

This needs a link so we can verify it.
 
Well, I guess that puts me straight and the articles I linked were not good enough for some to consider.

Anyway, massively OT now, so I will stop talking about PhysX :)
 
@Final8y,

I had a vested interest in Hybrid PhysX for years and followed the PhysX lockout very closely, so I know where to find the info. Up until recently I couldn't be ***** engaging on the matter too much due to the grief it entails.

I am NOT anti-PhysX (as my posting history in regards to PhysX proves), but I'm fed up with the false claims bandied about more often now that Mantle (and now FreeSync) is on the horizon, and as this thread shows, PhysX implementation is used as an excuse for defence in discussion.

Over the years, I even helped lots of oCuk AMD users get Hybrid PhysX up and running whenever I could; now it is near impossible to achieve due to the lockout, and my patience eventually ran out with it.


I'm well aware of the PR spin (others obviously believe you can fit a trumpet up your **** :eek:).

I'll hold my hands up and admit the RT quote was purposefully dropped in to rub salt in the wounds of the non-believers. ;)

The real juice was the preceding info I posted, culminating with Nvidia's "cease and desist or we'll sue your ass".

Nvidia also wanted pence in the pound for each AMD GPU with PhysX enabled - nothing wrong with that, but a tidy sum to help fill your competitor's pockets.
 
Well, I guess that puts me straight and the articles I linked were not good enough for some to consider.

There was nothing to consider, as the info I posted came via some of your linked articles. That's the problem with those types of articles: they are often economical with the truth. :)
 
I don't usually join in these debates because select users from "teams" sadly boil it down to a playground. That said, I am curious about what you've stated here, drunkenmaster, as it piques my interest (I believe you said it earlier in the thread too, but forgive my lazy ass for not trawling through to find it).

In reality, freesync/g-sync will meet in the middle. In a situation like the pendulum demo g-sync would offer an advantage; how much would frankly be down to the AMD programming. Where frame rate change is fast, Nvidia will be having to drop/delay frames to keep that smooth frame rate change, which is a VERY GOOD THING; without it you wouldn't have smoothness. But this also means that missing one frame here and there will happen for AMD also, and I would think freesync/g-sync will meet in the middle there in more real game situations. And when the frame rate is steady as hell... then they should both prevent all tearing and offer essentially any frame rate, rather than being limited via v-sync and without the lag, and they'll be dead even there also.

So I'm curious about the statement that shifts from, say, 60 to 30 FPS will be done incrementally. Is there anything documented to say this? Because I find myself puzzled by said scenario.

If your game is running at 60 FPS (or any number for that matter) and then has a sudden drop to 30 FPS - what you're implying is that G-Sync will reduce the refresh rate slowly (to maintain smoothness), but if content is being rendered at 30 FPS how does one render at 59Hz and so on?

I know what you're thinking: Hz != FPS, and rightly so, which then means they would need to hold frames back to match 59Hz and so on (otherwise tearing would occur). My obvious question then is, wouldn't this introduce input lag, which is the very thing G-Sync aims to eliminate?

If it does indeed work this way then I am puzzled; G-Sync does not have to smooth out frame rate transitioning for it to be smooth (which is basically no input lag and no tearing/stutter) - they could simply drop to 30FPS and display at 30Hz?

Maybe I'm missing something though :)

Oh, and I know they showed the pendulum demo with FPS slowly dropping unrealistically, but it was my impression that scenario was there to demo the tearing-free animation and nothing more. (In fact, if I remember correctly, that's exactly what they were pointing out as the FPS began to fall, but feel free to correct me if otherwise.)
 
So is anyone excited to get freesync? I am looking forward to seeing it demoed in a quick shifting fps scenario and seeing how it holds up. If it does the same as G-Sync, this will be a plus all round.
 
So Nvidia wanted money for PhysX.

According to the extremetech link I posted above - though, as Rroff stated, it may be more spin.

So is anyone excited to get freesync?

Nope, I tend to play my games at a high enough FPS to have almost none of the problems portrayed; it's just a revenue avenue/lock-in/'get it up ye' for hardware manufacturers imho.
 
According to the extremetech link I posted above - though, as Rroff stated, it may be more spin.

I doubt we will ever know either way for sure on that aspect.

EDIT: Kind of like with GameWorks, etc. When it was announced there were various dev/tech conferences (like the recent CES stuff and so on) where it was put out there that they'd work with other parties. Various people, including AMD, wanted more information, and there was a period of speculative communication - I don't know the exact details, though I've seen 1-2 emails - during which it came to light that nVidia had certain prerequisites. Then things pretty much went cold, as no one wanted to potentially get tied to the tech and/or adopt a tech that overlapped with their own IP, and nVidia threw the toys out the pram. nVidia didn't literally phone up AMD and ask them if they wanted in.
 
So I'm curious about the statement that shifts from, say, 60 to 30 FPS will be done incrementally. Is there anything documented to say this? Because I find myself puzzled by said scenario.

No documentation, but I am a self-proclaimed genius :p

Seriously though, they haven't (and maybe won't) talk about how they do it, but essentially the smoothness can only come from frame smoothing. Remember that Nvidia have been talking about frame pacing and adding more and more hardware to monitor and control it for what, 3-4 years. With frame pacing, AMD and Nvidia hold frames and drop frames when it means a smoother experience. The converse of this was AMD with xfire putting out eleventy billion frames, but plenty weren't actually being seen and the image was less smooth. Now we see fewer frames, drop quite a few, but we get a much smoother game from it.

From both Nvidia patents, which talk quite specifically about monitoring the rate of change of the frame rate to determine what refresh rate to use, and from just coming to a logical conclusion: if the frame rate was jumping around from 60-30-60fps (just about the worst case scenario for g-sync, and "normal" for v-sync), then without frame smoothing you would expect both to look identical, because both would be updating frames at exactly the same times (with one extra refresh in the middle, with no effect, in the case of v-sync). That abrupt change in frame times is where the stutter comes from with v-sync.
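To make the frame smoothing idea concrete, here's a rough Python sketch (my own illustration - the step size, names and numbers are made up, not anything from Nvidia's patents or the G-Sync module): the displayed frame time chases the real frame time but is only allowed to move by a small amount each refresh, so a sudden 60 to 30fps drop becomes a ramp.

Code:
# Illustrative sketch only - not Nvidia's actual algorithm.
# The displayed frame time follows the real frame time, but may only change by
# max_step_ms per refresh, so a 60 -> 30 fps jump becomes a gradual ramp
# (which implies holding back / dropping frames while the ramp catches up).

def smooth_frame_times(real_frame_times_ms, max_step_ms=0.5):
    displayed = []
    current = real_frame_times_ms[0]
    for target in real_frame_times_ms:
        # Clamp the per-refresh change in displayed frame time.
        delta = max(-max_step_ms, min(max_step_ms, target - current))
        current += delta
        displayed.append(round(current, 1))
    return displayed

# 60 fps (16.7 ms) suddenly dropping to 30 fps (33.3 ms):
real = [16.7] * 5 + [33.3] * 40
print(smooth_frame_times(real)[:15])
# -> 16.7, 16.7, ... 17.2, 17.7, 18.2, ... instead of an abrupt 16.7 -> 33.3 step,
# and that abrupt step is exactly what makes v-sync stutter visible.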


If your game is running at 60 FPS (or any number for that matter) and then has a sudden drop to 30 FPS - what you're implying is that G-Sync will reduce the refresh rate slowly (to maintain smoothness), but if content is being rendered at 30 FPS how does one render at 59Hz and so on?
My obvious question then is, wouldn't this introduce input lag, which is the very thing G-Sync aims to eliminate?

Every company likes to make bold claims - they all do it - and most companies don't like explaining things in super detail to average users. But claiming it eliminates latency is a bit daft; it makes it MUCH smaller than triple-buffered v-sync, for sure. Latency won't be an issue: some latency, but less than basically most/all other methods, is a non-issue. g-sync really isn't there to eliminate latency to be honest; it's 98% about the smoothness in situations where smoothness is usually an issue.

Without being good at making pretty graphs/tables it's really hard to describe, but essentially: dropping frames.

Realistically the demo they showed was a very slow frame rate change, but the reality is the 16.67ms frame time change that induces stutter in v-sync is noticeable, while the probably 0.2-0.5ms frame time changes in the pendulum demo weren't. There will be a sweet spot in there that allows for faster frame rate change with 99% of the smoothness, i.e. 0.2ms awesome, 2ms only a fraction worse, 4ms noticeably worse but still decent, 8ms pretty meh, 10ms crappy and 16.67ms woeful.

So if the frame rate went from 60 to 30fps, I'd expect a quicker change than in the demo, around some sweet spot where you still feel 95% of the smoothness, but where the quicker change reduces the number of dropped/delayed frames and gets to the new frame rate sooner.

I mean 60 to 30fps is a 16.7ms frame time difference, so in 0.2ms steps that's around 80 frames, or roughly two seconds. If you could barely tell the difference at 4ms frame gap changes, then you'd be adjusted in 4-5 frames, well under a third of a second.
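Quick back-of-envelope check of that trade-off (Python, purely illustrative - the step sizes are just the figures from above, not documented g-sync behaviour):

Code:
# How long a 60 -> 30 fps transition takes if the displayed frame time is
# nudged by a fixed step each refresh. Illustrative arithmetic only.

def transition(step_ms, start_fps=60.0, end_fps=30.0):
    t, t_end = 1000.0 / start_fps, 1000.0 / end_fps   # 16.7 ms -> 33.3 ms
    frames, total_ms = 0, 0.0
    while t < t_end:
        t = min(t + step_ms, t_end)
        frames += 1
        total_ms += t
    return frames, total_ms / 1000.0

for step in (0.2, 2.0, 4.0, 8.0):
    frames, seconds = transition(step)
    print(f"{step} ms steps: {frames} frames, ~{seconds:.2f} s to settle at 30 fps")
# 0.2 ms steps: ~84 frames over ~2 s; 4 ms steps: 5 frames in ~0.14 s.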

If it does indeed work this way then I am puzzled; G-Sync does not have to smooth out frame rate transitioning for it to be smooth (which is basically no input lag and no tearing/stutter) - they could simply drop to 30FPS and display at 30Hz?

As above, people really need to understand where the stutter from v-sync comes from: the stutter they're eliminating is the change in frame times. It's what they've been fighting with frame pacing for years; it is THE key to smoothness. v-sync already drops to 30fps instantly - that is the fundamental problem with it.


Oh, and I know they showed the pendulum demo with FPS slowly dropping unrealistically, but it was my impression that scenario was there to demo the tearing-free animation and nothing more. (In fact, if I remember correctly, that's exactly what they were pointing out as the FPS began to fall, but feel free to correct me if otherwise.)

g-sync would be tear free full stop, no matter when it updates; it is by design able to prevent the screen buffer updating midway through the screen refreshing, which is how you get tearing. The slow change in frame rate didn't have any effect on tearing; a significantly faster change in frame rate would not tear under g-sync, and would tear even worse (probably) without v-sync.

The key to the demo was the smooth way the frame rate changed.
 
Appreciate the breakdown DM, what you're saying does make sense and you're right, the perceived lag is probably negligible (especially vs vsync/triple buffering). I hadn't considered the stutter being the instant drop, and I guess they went with what looks better (probably for the better really). We'll likely see further details of how all this works, along with freesync, when more details do spill out in the future, but I have a feeling you're on the money (makes sense to me so hey :D).

I probably shouldn't worry too much about the minor details; there are only some games where FPS can go haywire, but it'll be interesting to see how it fares.

One game I know will benefit hugely is Payday 2. For whatever reason that game on a 120Hz monitor, even running at 100+ FPS, is a god damn tearing mess, so much so that I had to enable vsync, and the input lag drove me insane. First world problems and all that.

Anywho thanks again :)
 
Why do you constantly put people into groups based on which GPU they own, DM? It completely undermines any point you're making. As soon as I see terms such as "nVidia people" I roll my eyes and stop reading.

You can be positive or negative about AMD or nVidia and be logical about it without being grouped based simply on what GPU you have. Statements like that just start the kind of brand wars which we're now seeing. So great work.
 
I spoke with Roy Taylor, Nvidia’s VP of Content Business Development, and he says his phone hasn’t even rung to discuss the issue. “If Richard Huddy wants to call me up, that’s a call I’d love to take,” he said.


A clear indication that Nvidia and AMD did not even contact each other in regards to PhysX implementation on AMD.



I'm not going to comment on the other info about this issue posted in this thread, as I don't know what happened back then, but one thing is worth pointing out: this quote from Roy Taylor has, to be honest, nothing to do with the issue at hand, seeing as at the time when this was all happening Roy Taylor was working for Nvidia.

Vice President of Telco Relations

NVIDIA

Public Company; 5001-10,000 employees; NVDA; Computer Hardware industry

March 2008 – November 2010 (2 years 9 months) Santa Clara, California

Established new division to develop relationships and explore technology development opportunities with the world’s leading telephone operators including China Mobile, Orange, Vodafone, Verizon, AT&T.

http://www.linkedin.com/in/roytaylor
 
If your game is running at 60 FPS (or any number for that matter) and then has a sudden drop to 30 FPS - what you're implying is that G-Sync will reduce the refresh rate slowly (to maintain smoothness), but if content is being rendered at 30 FPS how does one render at 59Hz and so on?

I know what you're thinking: Hz != FPS, and rightly so, which then means they would need to hold frames back to match 59Hz and so on (otherwise tearing would occur). My obvious question then is, wouldn't this introduce input lag, which is the very thing G-Sync aims to eliminate?

If it does indeed work this way then I am puzzled; G-Sync does not have to smooth out frame rate transitioning for it to be smooth (which is basically no input lag and no tearing/stutter) - they could simply drop to 30FPS and display at 30Hz?

Maybe I'm missing something though :)

Oh, and I know they showed the pendulum demo with FPS slowly dropping unrealistically, but it was my impression that scenario was there to demo the tearing-free animation and nothing more. (In fact, if I remember correctly, that's exactly what they were pointing out as the FPS began to fall, but feel free to correct me if otherwise.)

Try to remember how v-sync works. Take a typical game scenario where FPS goes drastically from high to low - say looking at a clear sky, then panning down to see a thick forest.

With v-sync on your FPS will be pegged at 60FPS, then drop to 30FPS. No transition in FPS, just an instant and dramatic drop. This drop causes massive lag and is very noticeable. Don't lose sight of the fact that the FPS may be ~40-50 FPS, but because of how v-sync works it can only show 30 FPS. If the FPS drops to 29 it shows as 20 with v-sync on.
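That quantisation is easy to sanity check. Rough Python sketch assuming plain double-buffered v-sync on a 60Hz panel (a simplification - drivers and triple buffering complicate things), where a finished frame can only be shown on a refresh boundary:

Code:
import math

# With double-buffered v-sync a new frame can only appear on a refresh boundary,
# so the displayed rate snaps down to refresh / (whole refreshes per frame).

def vsync_displayed_fps(render_fps, refresh_hz=60):
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

for fps in (60, 50, 45, 40, 31, 30, 29, 20, 16):
    print(f"rendering {fps} fps -> v-sync shows {vsync_displayed_fps(fps):.0f} fps")
# 40-50 fps all collapse to 30, and 29 collapses to 20 - the 60/30/20/15/12 ladder.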

To get an idea of how g-sync will work, just turn off vsync entirely and cap your FPS to your monitor's refresh rate. Of course g-sync will eliminate tearing, but you will not be stuck at 60, 30, 20, 15, 12 FPS.

The transition with g-sync (or v-sync off) will not be sudden and will be less noticeable because input lag does not change with FPS.
 
Jay Lebo is a Product Marketing Manager at AMD. His postings are his own opinions and may not represent AMD’s positions, strategies or opinions. Links to third party sites are provided for convenience and unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied.
- See more at: http://community.amd.com/community/...ng-the-work-for-everyone#sthash.0nMpKI7N.dpuf


Speaking casually:

We don't really tell PR an awful lot, and frankly some of the stuff they say is a bit backward, so just in case here is a disclaimer to avoid any further backlash or upset.

No info on FreeSync either
 
I'm not going to comment on the other info about this issue posted in this thread, as I don't know what happened back then, but one thing is worth pointing out: this quote from Roy Taylor has, to be honest, nothing to do with the issue at hand, seeing as at the time when this was all happening Roy Taylor was working for Nvidia.



http://www.linkedin.com/in/roytaylor

read the quote again .... Slowly :)
Quote:
I spoke with Roy Taylor, Nvidia’s VP of Content Business Development, and he says his phone hasn’t even rung to discuss the issue. “If Richard Huddy wants to call me up, that’s a call I’d love to take,” he said.
 