AMD freesync coming soon, no extra costs.... shocker

Hey Jock,

To be honest, the story is quite weak with little in the way of either. Another part of that story is:



So it seems from that, all nVidia were doing was going up against Intel, and they wanted AMD to join that fight.

Of course, there could have been far more to it than we will ever hear of, but taking that article as a whole, I read it as AMD just saying no. There was another article which shows nVidia offered AMD PhysX as well.

http://www.bit-tech.net/custompc/news/602205/nvidia-offers-physx-support-to-amd--ati.html

Maybe it is me reading this how I want to, but to me, nVidia did make the attempt.

Edit:

I did some further digging and again Bit-Tech confirm that nVidia were backing NGOHQ.com to get PhysX working on AMD (then ATI) hardware.

http://www.bit-tech.net/news/2008/07/09/nvidia-helping-to-bring-physx-to-ati-cards/1



Anyways, that is enough from me on this subject in this thread, as I am massively off topic and that is upsetting people :)

Even if true, I don't think you can compare PhysX with G-Sync as something that needs to be shared.
It's not a needed feature to me.
G-Sync/3D Vision/FreeSync/Mantle, these to me are the ones that divide.
I don't know anyone that bought an nVidia card just for PhysX; it's just a nice little extra... if you play Batman :p

Maybe AMD just didn't want it! More trouble than it's worth, etc.
 
I was just wondering if anyone could point me towards one of these free-sync supporting monitors that won't cost any extra?

I've tried looking but I can't find any.
I'm guessing other people must have found some, if they can confirm that there is no price increase for monitors supporting the required variable VBLANK feature?

I'm also confused when I see a few people mention it as being a free standard, when it is more likely a VESA standard. Historically, VESA standards are anything but free to manufacturers that want to use them (at least for those that aren't part of the consortium, or whatever it is), especially new standards, which I presume this is or will be.
I'm curious as to why Samsung (for example) would pay VESA to use this standard and then not increase the cost of the monitor that uses it. Still, good on them for doing that.
 
I have been reading up a bit but am still a little green, so if someone could verify a few things, that would be great.

The FreeSync demo running on those laptops was using eDP, and this isn't in any current desktop monitor?
FreeSync will still require a controller board (according to PCPer), which to me will add cost to the monitor, so it wouldn't be free (unless AMD pay for it, of course).
FreeSync requires DP 1.3 to work, and this isn't even guaranteed, as DP 1.3 isn't a finalised standard yet.

If someone could clarify whether these are correct or wrong, I would appreciate it :)
 
I think there is also a big question mark about the suitability of Freesync for gaming. Apparently the standard being referred to for Freesync usage works by 'guessing' what the next action should be and may be unsuitable in a variable frame rate environment.
 
Surely if that is true, it can't possibly work, as so many variables come into play when gaming. Drops from 120fps to 50fps are not uncommon in big explosion areas in BF4, for instance.
 
I'm not savvy enough to know or explain the technical detail. A particular quote from the original news article is as follows, and seems to have been made by someone who has some understanding of it:

FreeSync uses a VESA standard to change the VBI speculatively, depending on what the driver thinks the next VBLANK should be. There is software overhead, and it won't work for the most important frames: when the framerate fluctuates, you will still see tearing and stutter occasionally. If the app runs at a constant frame rate, like AMD's demo does, then the driver should be able to speculate properly and get the correct VBLANK configured. With the pendulum demo NVIDIA has, however, since the framerate can fluctuate, FreeSync won't work nearly as well.
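
To make the "speculation" in that quote concrete, here is a minimal sketch (purely illustrative; not AMD's actual driver code) of the naive prediction being described: assume the next frame will take as long as the last one, and see how far the guess misses when the frame rate suddenly changes.

```c
/* Illustrative sketch of the speculative VBLANK guess described in the
 * quote: predict the next interval from the last frame time, then check
 * how far off the guess is when the frame rate spikes mid-stream. */
#include <stdio.h>

int main(void)
{
    /* Hypothetical frame times in ms, with one sudden spike. */
    double frame_times[] = { 16.7, 16.7, 16.7, 33.4, 16.7 };
    double predicted = 16.7; /* initial guess */

    for (int i = 0; i < 5; i++) {
        double miss = frame_times[i] - predicted;
        printf("frame %d: predicted %.1f ms, actual %.1f ms, miss %+.1f ms\n",
               i, predicted, frame_times[i], miss);
        predicted = frame_times[i]; /* naive: assume next frame == last */
    }
    return 0;
}
```

On a constant frame rate the miss is zero every time, which is the point the quote makes about AMD's demo; the spike frame is where the guess fails.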
 
Ahhh, that would explain why the rather crude windmills were very static and not spinning around with varying frame rates like the G-Sync pendulum demo. Still, it is a start, but it seems a long way off to me.
 
Yes, except that is all complete nonsense. There is literally no question that Nvidia have to do a lot of heavy prediction and monitoring work... they already do this; it's called frame pacing.

AMD will need the same, and they already have it; it's also called frame pacing.

The "you will still see tearing" is COMPLETE rubbish, using vblank, despite guessing at the next frame rate you still have COMPLETE control over when you send your own frame, so it will either update the existing image in the frame buffer or you can choose to send a new one. It will eliminate tearing completely. as for the occasional stutter, it will of course have some, but to claim g-sync eliminates it is also stupid. If you're spinning around at 30fps in a game... you're going to get stutter even with perfect syncing, there is more than one reason for stutter, low fps is a very large reason for stutter during more dramatic movement(jumping/spinning), while running in a straight line will induce inherently little as the image will change much less.


There doesn't seem to be much point in trying to explain it; you have all decided for yourselves that Nvidia needs no prediction and will happily jump from 60 to 30fps with no stutter, because the frame time change supposedly makes absolutely no difference... even though that behaviour would be identical to v-sync.

Nvidia HAS to have frame smoothing; there are no two ways about this. That means they HAVE to be predicting when the next frame is coming, and predicting frame rate is mathematically simple.

We're not talking complex maths, just keeping track of a few averages. The entire point of G-Sync is that it limits the RATE of frame time change, and this lends itself very well to frame rate prediction.

If your current frame times are 16.67ms, and you already know that anything beyond 18ms wouldn't be smooth, then you already know you're not going to refresh later than, say, 18.5ms for the next frame. Without any actual calculation, you already know the very narrow window you will be in.
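
As a sketch of that "few averages" idea (my own illustration; the window size and the 1.5ms step cap are made-up numbers, not anything Nvidia or AMD have published):

```c
/* Track a short moving average of recent frame times and clamp how fast
 * the predicted refresh interval may change per frame, mirroring the
 * "limits the RATE of frame time change" behaviour described above. */
#define WINDOW 4
#define MAX_STEP_MS 1.5   /* hypothetical cap on interval change per frame */

static double history[WINDOW] = { 16.67, 16.67, 16.67, 16.67 };
static int idx = 0;

double predict_interval_ms(double last_frame_ms, double current_interval_ms)
{
    /* Record the latest frame time in a ring buffer. */
    history[idx] = last_frame_ms;
    idx = (idx + 1) % WINDOW;

    /* Average the recent frame times. */
    double avg = 0.0;
    for (int i = 0; i < WINDOW; i++)
        avg += history[i];
    avg /= WINDOW;

    /* Slew-limit the interval so it never jumps, e.g. from 16.67ms
     * straight to 33.3ms; it walks there a capped step at a time. */
    if (avg > current_interval_ms + MAX_STEP_MS)
        return current_interval_ms + MAX_STEP_MS;
    if (avg < current_interval_ms - MAX_STEP_MS)
        return current_interval_ms - MAX_STEP_MS;
    return avg;
}
```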

And this is all assuming AMD and the monitor makers don't implement even more control than current vblanking allows. I know particular users don't like to think about the future and the ability to change things.

Just because they used vblanking in the demo does NOT mean they are limited to this method forever.

As for a controller: vblanking is something very basic. It's not a $30 controller; it's one minor little feature in some existing controllers. A monitor needs some kind of controller to control it in the first place; no screen lacks one, otherwise it wouldn't work. New features are added to screens all the time; this doesn't suddenly make the controller cost $50 more, nor does supporting one tiny feature mean adding an entire extra chip.

In terms of software overhead, it's laughable; we're talking about keeping track of some basic frame times and doing some pretty basic calculations, and most of this (if not all) will be done through frame pacing hardware.

The reality is that not having absolute control of the frame rate isn't perfect, but it's not a huge issue anyway; it can still be made significantly smoother than 30-60fps runs on a current screen. Second, as I said at the time, the pendulum demo shows an entirely unrealistic frame rate change. When was the last game you saw where it dropped from 60fps to 30fps one frame at a time, then went back to 60fps at exactly the same slow, steady rate? Answer: never.

In reality, FreeSync and G-Sync will meet in the middle. In a situation like the pendulum demo, G-Sync would offer an advantage; how much would frankly be down to AMD's programming. Where frame rate change is fast, Nvidia will have to drop or delay frames to keep that smooth rate of frame time change, which is a VERY GOOD THING; without it you wouldn't have smoothness. But this also means that missing one frame here and there will happen for AMD too, so I would think FreeSync and G-Sync will meet in the middle in more realistic game situations. And when the frame rate is rock steady, both should prevent all tearing at essentially any frame rate, rather than being limited via v-sync and without the lag, so they'll be dead even there as well.
 
AMD said:
In our industry, one of the toughest decisions we continually face is how open we should be with our technology. On the one hand, developing cutting-edge graphics technology requires enormous investments. On the other hand, too much emphasis on keeping technologies proprietary can hinder broad adoption.

It’s a dilemma we face practically every day, which is why we decided some time ago that those decisions would be guided by a basic principle: our goal is to support moving the industry forward as a whole, and that we’re proud to take a leadership position to help achieve that goal.

The latest example of that philosophy is our work with dynamic refresh rates, currently codenamed "Project FreeSync”. Screen tearing is a persistent nuisance for gamers, and vertical synchronization (v-sync) is an imperfect fix. There are a few ways the problem can be solved, but there are very specific reasons why we’re pursuing the route of using industry standards.

The most obvious reason is ease of implementation, both for us from a corporate perspective and also for gamers who face the cost of upgrading their hardware. But the more important reason is that it’s consistent with our philosophy of making sure that the gaming industry keeps marching forward at a steady pace that benefits everyone.

It sometimes takes longer to do things that way — lots of stakeholders need to coordinate their efforts — but we know it’s ultimately the best way forward. This strategy enables technologies to proliferate faster and cost less, and that’s good for everyone.

The same philosophy explains why we’re revealing technology that’s still in the development stage. Now’s our chance to get feedback from industry, media and users, to make sure we develop the right features for the market. That’s what it takes to develop a technology that actually delivers on consumers’ expectations.

And Project FreeSync isn’t the only example of this philosophy and its payoffs. We worked across the industry to first bring GDDR5 memory to graphics cards — an innovation with industry-wide benefits. And when game developers came to us demanding a low-level API, we listened to them and developed Mantle. It’s an innovation that we hope will speed the evolution of industry-standard APIs in the future.

We’re passionate about gaming, and we know that the biggest advancements come when all industry players collaborate. There’s no room for proprietary technologies when you have a mission to accomplish. That’s why we do the work we do, and if we can help move the industry forward we’re proud to do it for everyone.

Source
http://community.amd.com/community/amd-blogs/amd-gaming/blog/2014/01/08/doing-the-work-for-everyone
 
It's worth noting that Intel, while having some of their own lock-in stuff and trying to cash in with things like Thunderbolt (is that the Intel one, or Apple's? God knows; it's daft as anything though), still work on MANY open standards, and Apple have too. Nvidia are consistently the only big-name player who refuses to play ball, and it slows down gaming advancement, as game devs and hardware companies don't like to support proprietary standards in general.


Nvidia always seem to miss the general point: if in 6 months I have a FreeSync screen, and they are refusing to support anything but G-Sync, then if I wanted an Nvidia card I'd have to get a new monitor as well (or would potentially want to). Whereas if they simply got together and helped push "freesync" forward first, it would have happened quicker, would potentially work better, and would be of huge benefit to ALL gamers, including all Nvidia card buyers.

Intel and AMD work on loads of open standards and have a very long history of doing so; Apple made OpenCL for the industry and handed it off.
 
l m a o

We gave you GDDR5 first, okies? Remember, guyz. I didn't realise we were talking about 2008.

It's designed to operate at 95°C, guys; it uses these thresholds to further performance, ok.

Was Roy whispering in his ear when this was put out?
 
Hi greg,

Sorry to jump back to the story you highlighted, but I have read a number of times on this forum, mainly from you, how Nvidia offered AMD PhysX and they turned them down, and it's nice to see the news story that you claim backs this up.

However, and to be honest maybe I'm being a little stupid, that's not what that news story is saying.

It's saying a third-party company (NGOHQ.com) may have reverse engineered the driver, against the EULA, to make it work. To be honest, if that was even only hinted at the time, no wonder AMD walked away from it.

Extending your quote by a few lines gives the following, which explains that:

"clearly a case of them downplaying the issue to evade possible action from NVIDIA on supporting the use of its proprietary technologies in violation of license agreements"

There was no way AMD were going to allow that to be added to their driver. Nvidia would have had the lawsuit ready to go the second that driver went out, and right about now we would have a single GPU company...

The way you make it sound, it's as if Nvidia went to AMD HQ, knocked on the door with a parcel marked PhysX, and AMD slammed the door on them, when in fact, based on that news link, it was just AMD passing on a court case, in the article's own words:

"After being denied support from a company they were banking the most on, they were left to their own plight against NVIDIA who have a history of an aggressive business model, even more so after it was known NGOHQ.com may have reverse engineered drivers, a clear violation of the EULA".

To be honest, that article just reads as "AMD dodges a bullet".

I was just about to respond to Gregster in this way, but I think you explained it better than I could, thanks.

Did you not read it first, Greg? No wait, you must have done, as you quoted a part of this paragraph.

The team sought support from AMD in the form of a Radeon HD4800 series sample, so they could work out a similar solution for it, but AMD rejected it telling it wasn't a venture worth investing product samples and PR information on, clearly a case of them downplaying the issue to evade possible action from NVIDIA on supporting the use of its proprietary technologies in violation of license agreements. After being denied support from a company they were banking the most on, they were left to their own plight against NVIDIA who have a history of an aggressive business model, even more so after it was known NGOHQ.com may have reverse engineered drivers, a clear violation of the EULA.
Just not all of it, so you quoted it completely out of context.
 
Lol you know it. Bed time now. This thread has been rancid for a while.

It was rancid from the baiting OP (which he happily admits to), and it was never going to be a good thread after he got all angry like he did, but at least most have tried to remain civil in spite of the OP flaming nVidia users.
 