
AMD freesync coming soon, no extra costs.... shocker

I don't suppose there's a way of checking which monitors support this VESA standard? I've just bought a 27" 1440p panel off eBay and I would love to know if there's any chance it'll run FreeSync; if so, a Radeon card is my next upgrade!

According to the Guru3D Nvidia response article, none of them do outside of laptops.
 
NVIDIA responds to AMD FreeSync

"Well it was bound to happen sooner rather then later, but Nvidia spoke about AMD FreeSync.
The interview is with Tom Petersen who spoke with the guys from techReport. So the credits and everything related to this post goes to them.
Basically Petersen says he's excited to see a competitor take an interest in this but he claims that as things now stand,
it will be close to impossible to implement free sync on a desktop display due to the use of different display architectures than on laptops."


source

However, the DisplayPort 1.3 standard, which looks like it will be properly finalised in 2-3 months, will support variable refresh rates, taking that argument right out of the window: http://www.pcper.com/reviews/Graphi...h-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync



@drunkenmaster: Thanks for the explanation of frame pacing. I can see how that would be very useful if the frame rate of the game dipped below the maximum time the monitor can hold a frame without refreshing it, since it has the potential to drastically reduce the delay between the frame being drawn and the frame being displayed. However, if the frame rate stays above this, I just can't see how frame pacing will help, unless you are physically delaying the game itself from drawing the next frame.

My thinking is that if, for example, one frame took 20ms, the next took 30ms, and the next took 25ms, then you would have 20ms worth of movement between the first and second frames, 30ms worth between the second and third, and 25ms worth between the third frame and the one after it. Surely, if this is the case, it would be better to display the frames as soon as they reach the monitor, rather than having the monitor or the graphics card sit on them for a while, because holding them means you get the wrong amount of movement for the time each frame is displayed. That would give you a less extreme version of the stutter you get with vsync when you are between 30 and 60fps (on a 60Hz monitor).

If frame pacing delayed the start of drawing each frame, so that the amount of movement didn't change much from frame to frame, then that would certainly help make the game look smoother, but this would have very little to do with the monitor and more to do with the game and the graphics card. I can also see this method being full of problems, since it is incredibly difficult to predict how long a frame will take to draw.
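To make the wrong-amount-of-movement point concrete, here's a minimal sketch (hypothetical numbers, assuming a 60Hz v-synced display; this is nobody's actual driver code): each frame carries a different amount of movement, but v-sync holds it on screen for a whole number of refresh intervals, so the apparent speed wobbles even when no frame is dropped.

```python
import math

REFRESH_MS = 100.0 / 6.0  # one refresh at 60 Hz, ~16.67 ms

def vsync_display_time(render_ms):
    """With v-sync, a finished frame waits for the next refresh boundary,
    so its on-screen time is a whole number of refresh intervals."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# The 20/30/25 ms frames from the post above.
for render_ms in [20.0, 30.0, 25.0]:
    shown_ms = vsync_display_time(render_ms)
    # The frame encodes `render_ms` worth of movement but stays on screen
    # for `shown_ms`, so apparent speed is off by this ratio.
    print(f"rendered {render_ms:.1f} ms, shown {shown_ms:.2f} ms, "
          f"apparent speed x{render_ms / shown_ms:.2f}")
```

All three frames end up shown for 33.33ms here, so the 20ms frame plays back at 0.60x speed and the 30ms frame at 0.90x, which is exactly the uneven-movement effect described above.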
 
That means that upcoming DP 1.3 panels COULD support variable refresh technology in an identical way to what we saw demoed with the Toshiba laptops today.

Could. Notebook eDP is fundamentally very different, and there's no way to know whether or not we will see it. You're still stuck with the scaler ASIC as the piggy in the middle.

I'll put this more succinctly for you and others.

Worst case for v-sync is 60fps to 30fps to 60fps. This is the absolute worst case possible (officially not, but it's the worst case Nvidia has showcased at any G-Sync demo for v-sync) of stutter: frame times go 16.67 - 33.33 - 16.67ms, because a frame that just misses one refresh is held for two.
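Here's a minimal sketch of why that worst case happens and what variable refresh changes (illustrative only; it assumes a 60Hz panel and render times hovering just either side of one refresh interval):

```python
import math

REFRESH_MS = 100.0 / 6.0  # 60 Hz refresh interval, ~16.67 ms

def vsync(render_ms):
    # v-sync: hold until the next refresh boundary, so a frame that just
    # misses 16.67 ms is shown for 33.33 ms (the 60 -> 30 fps flip).
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def variable_refresh(render_ms):
    # The G-Sync/FreeSync idea: the panel refreshes when the frame is ready.
    return render_ms

renders = [17.0, 16.0, 17.0, 16.0]  # hovering just either side of 60 fps
print([round(vsync(t), 2) for t in renders])            # [33.33, 16.67, 33.33, 16.67]
print([round(variable_refresh(t), 2) for t in renders]) # [17.0, 16.0, 17.0, 16.0]
```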

MjFrosty believes G-Sync is crap because you can't see such small times; you can't, he's debunked it, it's all a joke, you can't notice the difference at all.

I'm apparently pro-Nvidia because I said the technology is good, but MjFrosty flat out insists v-sync is perfect because you can't notice this difference at all...

Yup, the hate is strong from one of us, and it's pretty clear which one that is.


The real question is how G-Sync is smoothing between pacing, because people around the world are noticing a difference. You, on the other hand, were trying to depict how it's doing it within a ms time frame and how it's holding it between delayed frames, but nobody knows exactly how. If they did, I'm sure certain people would implement something similar. (That's your cue to tell me how AMD could have implemented it years ago and how we don't need it, as you were saying earlier)

You're once again clutching at straws and slowly going full circle to say you're now pro G-Sync lol. :rolleyes:

Like me, I don't think you realised that internal scaling over eDP isn't possible on desktops. I believe what NV is saying is true and that there is no way to manipulate it other than with panels that are VESA eDP 1.3 compliant at notebook level.

Carry on :)
 
The real question is how G-Sync is smoothing between pacing, because people around the world are noticing a difference. You, on the other hand, were trying to depict how it's doing it within a ms time frame and how it's holding it between delayed frames, but nobody knows exactly how. If they did, I'm sure certain people would implement something similar. (That's your cue to tell me how AMD could have implemented it years ago and how we don't need it, as you were saying earlier)

You're once again clutching at straws and slowly going full circle to say you're now pro G-Sync lol. :rolleyes:

Carry on :)

It's funny, because your argument before said nothing of the sort; in fact you specifically said we couldn't notice those kinds of differences.... Backing away from that and pretending you never said it, are we, because it was ridiculous nonsense?

"The real question is how G-Sync is smoothing between pacing"

What, is this supposed to be gibberish pretending to be knowledgeable? Pacing IS smoothing. How are they smoothing between the smoothing? I don't know; the real question is surely how they are smoothing between the smoothing BETWEEN the smoothing, isn't it?

Frame pacing is a pretty well explained field, from both sides, and roughly speaking we know how both do it, because there is really only one rough way to do it... by, you know, smoothing.... the frames......

I wonder if you can be honest. If you had a bunch of frames that took, I dunno, 25ms, 45ms, 30ms, 16ms, 55ms, 25ms, 30ms.... how would you "smooth" these so you got a smooth set of frames rather than a constantly changing rate? Would you attempt to smooth the difference by keeping track of a rough average, holding frames a couple of ms here and dropping a frame there?

I would, because what else would you do, add in 10 duplicate frames at one point and drop 30 frames elsewhere? There is only one goal in frame pacing... pacing the frames. Nvidia/AMD are on record everywhere explaining the goal; how they achieve it doesn't really matter, the goal is what matters. I was explaining the goal, not the method (though thanks to the Nvidia patents I have a very, very good idea of Nvidia's general method, if not the exact algorithm or parameters they would use).
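For what it's worth, here's a toy version of that rolling-average idea (a hypothetical sketch of the goal just described, explicitly not Nvidia's or AMD's actual algorithm):

```python
from collections import deque

def pace(frame_times_ms, window=4):
    """Naive frame pacing: present each frame no earlier than a rolling
    average of recent frame times, so the cadence changes gradually.
    A frame can be delayed, but never shown before it has rendered."""
    history = deque(maxlen=window)
    paced = []
    for t in frame_times_ms:
        history.append(t)
        target = sum(history) / len(history)  # rolling-average cadence
        paced.append(max(t, target))          # hold fast frames back
    return paced

raw = [25, 45, 30, 16, 55, 25, 30]  # the example sequence above
print([round(t, 1) for t in pace(raw)])
# -> [25, 45, 33.3, 29.0, 55, 31.5, 31.5]: fast frames are held towards
#    the average; the slow ones can't be sped up, only waited for.
```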

I'm not pro G-Sync, and I'm not clutching at straws. You claimed I was debunking it and now you're claiming I'm pro G-Sync; you change your argument ENTIRELY every other post to suit yourself, and it's all been gibberish.

My stance hasn't changed since I first saw it. It's got the potential to do a lot from 30-60fps, but the effect will be much lessened on a 120-144Hz screen (every single review agreed on this point); it will do relatively little above 60fps and almost nothing above 90fps.

It's better to have than have not; depending on your hardware and screen you may see almost no benefit, or loads.

I don't much care about FreeSync or G-Sync in use, but I like the technology; is that okay with you? I have 2x 290s and a 120Hz screen, so FreeSync will, outside of the odd insanely demanding game or something that won't run CrossFire, do very little for me; well, nothing, considering the screen I have almost certainly doesn't support it. The technology simply interests me; is that allowed?
 
Like me, I don't think you realised that internal scaling over eDP isn't possible on desktops. I believe what NV is saying is true and that there is no way to manipulate it other than with panels that are VESA eDP 1.3 compliant at notebook level.

Carry on :)

Actually I didn't, but the main point is that Nvidia didn't say no way; they said nearly impossible AS things stand, nothing more or less. G-Sync doesn't work on current monitors either... I'm not sure what the point is.

What do people think a laptop monitor is? Something that only works off battery power, or uses different pixels? Package any existing FreeSync-supporting panel in a desktop stand and you have a FreeSync monitor..... with AMD hardware capable of running and supporting either...

Desktop monitors don't have the options or the extra validation, because no one bothered; this doesn't mean it can't be done, in the slightest. Maybe it will happen entirely because Nvidia pushed them, or maybe it was coming anyway (which I believe is what VESA/AMD were surprised by), but this is all stuff that is painfully, ridiculously easy to package up as a desktop monitor.

It's a panel, :o, and a controller :o and a cable :o and AMD hardware which supports the cable and output and standard.

My theory a couple of months ago was that it would become an industry standard once the monitor makers had a chance to refresh their chips with the required options; now it turns out the controllers and chips already exist, just no one bothered to make a desktop screen out of them. It's easier than I thought it was going to be, not harder or impossible. They can merely repurpose existing technology rather than come up with new technology.

EDIT: more to the point, FreeSync/G-Sync benefit those with lower frame rates on lower-end hardware the most.

Laptops are where G-Sync/FreeSync will be MOST effective to begin with, and that is where it already works with AMD cards in already-released laptops; no Nvidia laptops support G-Sync. Funny, that.
 
This is hilarious. Let me quote something from the article that you've based these claims on, shall I.



I don't know if this really needs explaining for you, but I'll bold a few words anyway.

Do I really need to go further? Okay: "typically have a scaler"... so not all do, okay. "Nearly impossible"... another term for this is.... NOT impossible, or POSSIBLE, and "as things stand now".

As things stood 2 months ago, nothing in the world worked with G-Sync. How things stand today isn't particularly relevant to how things will stand tomorrow.

eDP and DP are almost identical; both (afaik) support variable VBLANK.
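As an aside, a back-of-envelope sketch of what variable VBLANK buys you: the panel keeps showing the last frame while the blanking interval is stretched, until the GPU has a new one ready. The figures below are the standard 1080p60 timing set, used purely for illustration, not a real adaptive-sync implementation.

```python
# Standard 1080p60 timings, illustrative only.
PIXEL_CLOCK_HZ = 148_500_000  # 1080p60 pixel clock
H_TOTAL = 2200                # pixels per scanline, incl. horizontal blanking
V_TOTAL = 1125                # lines per frame, incl. the nominal VBLANK

def frame_time_ms(extra_vblank_lines):
    """Adding blanking lines lengthens the frame without touching pixels."""
    return (V_TOTAL + extra_vblank_lines) * H_TOTAL / PIXEL_CLOCK_HZ * 1000

print(round(frame_time_ms(0), 2))    # 16.67 ms -> 60 Hz
print(round(frame_time_ms(563), 2))  # 25.01 ms -> ~40 Hz, frame held longer
```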

But based off an article that DOESN'T claim it's impossible NOW, that certainly doesn't say it can't be widely done in the future, and that actually has the Nvidia guy SPECIFICALLY SAY they hope G-Sync pushes the industry towards variable refresh rate scalers....... you claim it will never work on desktop?

Yeah but you only have to look at recent history to see where this will go...

Chances are AMD will simply cry about NVidia not allowing them to use the G-Sync module, rather than license the technology or get off their own backsides and provide it to their customers themselves. They expect everything NVidia invests in to be given to them for free; if NVidia did that they wouldn't stand a chance, because AMD sell their cards much cheaper due to their lack of investment.

Here's a hint: a laptop screen uses eDP, AMD support eDP, and it uses DisplayPort cabling. So if you took a "laptop" screen, put it on a stand and used the existing cable connection......errm.... nope, that would work, quite obviously.

That's why people are usually willing to pay extra for NVidia; they package things up nicely, much like Apple, without their customers having to resort to botching things together and relying on non-official/third-party support.
 
You've missed his point again. He is saying that IF AMD have added hardware and Nvidia haven't, then Nvidia cards will still not be compatible with FreeSync; the end result will still be that AMD will only work with FreeSync and Nvidia will only work with G-Sync. That one is an "open standard" becomes irrelevant IF Nvidia cards lack the hardware.

G-Sync is closed, FreeSync is open. An 'open standard' is relevant, as it's free to the end user with support via GPUs; the closed standard is locked to Nvidia at a cost to the user via extra hardware.

It's Nvidia's (and any other manufacturer's) choice to implement said hardware or not; the end result is in Nvidia's hands, not AMD's.


I didn't miss his point.

If it needs the hardware that is present in AMD GPUs then it won't work on non-AMD GPUs

Again, Andy, that statement is incorrect; ANY GPU will work if said manufacturer implements the required specs.

If it needs the hardware that is present in AMD GPUs then it won't work on non-AMD GPUs unless others implement the required hardware/software.

I wouldn't have bothered if he had written that, as it would be correct, going by the info we currently have.
 
G-Sync is closed, FreeSync is open. An 'open standard' is relevant, as it's free to the end user with support via GPUs; the closed standard is locked to Nvidia at a cost to the user via extra hardware.

It's Nvidia's (and any other manufacturer's) choice to implement said hardware or not; the end result is in Nvidia's hands, not AMD's.

And? What's the problem?

nVidia develop a bespoke nVidia solution end to end, GPU to monitor, designed from the ground up with this in mind. If you want that, you buy into nVidia and pay for it. If you don't, then... tough? Why is anyone entitled to it? Because a seemingly undeveloped, somewhat similar VESA standard exists that MIGHT do a similar thing?

I would bet solid money that the G-Sync solution will have better overall support in games and perform better from a gaming point of view, i.e. input lag, response times etc. As far as I can tell, G-Sync is being developed PURELY with gaming in mind.

Sure, eventually we can draw comparisons between G-Sync and Freesync, but it's far too early to make any meaningful ones now. The only thing we have is what we know about both technologies so far, and speculation of how it will play out.

My personal opinion? (Read: OPINION, not fact.) AMD have latched onto a VESA standard that has some uses in the mobile device space for power savings; a standard that has been entirely undeveloped for gaming use because it never had a gaming intention. G-Sync is announced, and there is a possibility that AMD can leverage a VESA standard to do a similar thing. It's open, can be supported universally without too much hassle, and presents a huge PR/marketing opportunity. It's nowhere near going to market, but they might as well leverage it in the fight against nVidia.

From a business point of view, solid tactics.

As for us gamers, yeah - Pinch of salt methinks. Writing off G-Sync and claiming Freesync is the holy grail of smooth gaming that costs us nothing now? Please!
 
You are deliberately misreading what he said (and me by association) to make a fight over nothing.

IF AMD cards have hardware specifically for this and NVidia currently don't, then the situation is that FreeSync is effectively locked to AMD. The end result is the same, and the reasons why are irrelevant to an end user: they either choose an AMD GPU + FreeSync monitor, or they choose an NVidia card and a G-Sync monitor.

What happens in 2 or 5 years' time is irrelevant to someone looking at buying a new GPU or monitor now.

As above, this is going to need entirely new monitors for FreeSync, so who knows how long it will take AMD to convince manufacturers to build them.

Thank you for somebody actually reading what I wrote, rather than just reading what they want to read.
It happens all the time on this forum, and I myself have been guilty of it on occasion. Of course there will now be a massive post telling me just how wrong I am, how what I said wasn't actually what I said, and how these words don't actually mean what I mean.
This thread isn't interesting anymore; it's like one of those threads on other forums that has adverts between the posts that are a pain to scroll through.
 
I didn't miss his point.

Again, Andy, that statement is incorrect; ANY GPU will work if said manufacturer implements the required specs.

If it needs the hardware that is present in AMD GPUs then it won't work on non-AMD GPUs unless others implement the required hardware/software.

I wouldn't have bothered if he had written that, as it would be correct, going by the info we currently have.

So which other current GPUs will this work on then, if it needs the hardware that is on the AMD GPUs? Oh, would that be none... so my original statement, at this point in time, is factually correct. Now, if I had written it in 5 months' time, who knows whether it would be correct, but right now it is correct.
 
It's funny, because your argument before said nothing of the sort; in fact you specifically said we couldn't notice those kinds of differences.... Backing away from that and pretending you never said it, are we, because it was ridiculous nonsense?

"The real question is how G-Sync is smoothing between pacing"

What, is this supposed to be gibberish pretending to be knowledgeable? Pacing IS smoothing. How are they smoothing between the smoothing? I don't know; the real question is surely how they are smoothing between the smoothing BETWEEN the smoothing, isn't it?

Frame pacing is a pretty well explained field, from both sides, and roughly speaking we know how both do it, because there is really only one rough way to do it... by, you know, smoothing.... the frames......

I wonder if you can be honest. If you had a bunch of frames that took, I dunno, 25ms, 45ms, 30ms, 16ms, 55ms, 25ms, 30ms.... how would you "smooth" these so you got a smooth set of frames rather than a constantly changing rate? Would you attempt to smooth the difference by keeping track of a rough average, holding frames a couple of ms here and dropping a frame there?

I would, because what else would you do, add in 10 duplicate frames at one point and drop 30 frames elsewhere? There is only one goal in frame pacing... pacing the frames. Nvidia/AMD are on record everywhere explaining the goal; how they achieve it doesn't really matter, the goal is what matters. I was explaining the goal, not the method (though thanks to the Nvidia patents I have a very, very good idea of Nvidia's general method, if not the exact algorithm or parameters they would use).

I'm not pro G-Sync, and I'm not clutching at straws. You claimed I was debunking it and now you're claiming I'm pro G-Sync; you change your argument ENTIRELY every other post to suit yourself, and it's all been gibberish.

My stance hasn't changed since I first saw it. It's got the potential to do a lot from 30-60fps, but the effect will be much lessened on a 120-144Hz screen (every single review agreed on this point); it will do relatively little above 60fps and almost nothing above 90fps.

It's better to have than have not; depending on your hardware and screen you may see almost no benefit, or loads.

I don't much care about FreeSync or G-Sync in use, but I like the technology; is that okay with you? I have 2x 290s and a 120Hz screen, so FreeSync will, outside of the odd insanely demanding game or something that won't run CrossFire, do very little for me; well, nothing, considering the screen I have almost certainly doesn't support it. The technology simply interests me; is that allowed?


It's not gibberish. You were talking about the differences in delay times and how they'd translate on screen? I think so, anyway; you inflate what could be said in a matter of words into pages, with flaming here, there and in between.

I was, let's say, 'speculating' (so that you don't pile on too much crap again) that the real question is how the pacing/smoothing (again you're getting hung up on terminology; stop being such a hermit, I meant between frames) is translated overall, and the time differences are what the scaler is there for in the first place. So how it is interpreting those signals and holding them is a question you should maybe ask Nvidia, instead of jabbering on about it to yourself lol!

All you are doing is saying "oh, I wonder how exactly it does this, as I don't understand how", and that somehow makes it almost void of purpose. I think you should just stop worrying about the unexplained. Let's not forget you're the one who came on here screaming bloody murder that it's a free alternative based on the same proposed VESA standard, when in fact it's probably not at all.

If you understood pacing as well as you're making out, then you should have realised that eDP 1.3 isn't comparable to G-Sync's scaling. Enough. You've gone from saying we don't need it, to saying it should be free ('look at this'), to saying you can understand its purpose. What else is there to say?
 

And nothing; it should be simple enough to grasp.

What's the problem?

Best direct that to bru/andy/others (generally Nvidia users), who either can't grasp, or are attempting to deflect from, the meaning of 'open standard'/free to the end user (in this case AMD's users) to suit a pov.

It's as simple as this: one is free, the other must be paid for. I can't help it if certain users try to paint an open standard as proprietary tech.

FreeSync/G-Sync hold little to no interest for me personally.

:)

@bru,

What you wrote doesn't add up; your statement is incorrect, unless every other gfx solution currently lacks the capabilities.
 
Let me get this straight: Nvidia say FreeSync will be nearly impossible for AMD to implement because most desktop panel vendors don't currently support it.

While at the same time most panel vendors don't currently support Nvidia's G-Sync.

What's the problem here?
 
So which other current GPUs will this work on then, if it needs the hardware that is on the AMD GPUs? Oh, would that be none... so my original statement, at this point in time, is factually correct. Now, if I had written it in 5 months' time, who knows whether it would be correct, but right now it is correct.

You will waste your time and life debating this, Bru, with some individuals purposefully reading what you type and then purposefully twisting it to suit their own agenda. The bottom line is nVidia have G-Sync in the here and now, and AMD demo'd a laptop. If AMD had this nailed, they would be showing it off with a 290/X on the big screen (like nVidia did).

I will be giving my review of G-Sync on the Asus monitor for those who are interested, and I can run side-by-side demonstrations with and without G-Sync :)

I also look forward to an AMD user giving his feedback on his own testing of freesync. :)
 
@bru,

What you wrote doesn't add up; your statement is incorrect, unless every other gfx solution currently lacks the capabilities.

If you read his very first post on the subject, he was referring to the article that was linked. The article says that AMD GPUs have had the hardware to support this for the last 2-3 generations; Bru asked about NVidia hardware, whether NV hardware is missing this, and if so...

For the end user it doesn't really make a difference whether FreeSync is an open standard or not: if only AMD support it, then the end result is that you need an AMD GPU and a FreeSync monitor to get FreeSync, or an NVidia GPU and a G-Sync monitor to get G-Sync.

People are getting all pro-AMD/pro-open-standard these days, but as has already been linked on these forums, PhysX was turned down by AMD, and they even refused to provide any support to a third party that was trying to make a tool to get PhysX working on AMD hardware (when Nvidia had been helping them).

It is a simple fact that if you want PhysX and/or G-Sync you have to buy Nvidia, and if you want FreeSync or Mantle you need to buy AMD. How this might change in the future is anyone's guess, but based on what we know right now, these are the facts.

Let me get this straight: Nvidia say FreeSync will be nearly impossible for AMD to implement because most desktop panel vendors don't currently support it.

While at the same time most panel vendors don't currently support Nvidia's G-Sync.

What's the problem here?


The difference is that six manufacturers have definitely signed up to provide G-Sync monitors, and we KNOW those are coming in the next few months.

AMD have not even started working with any manufacturers to get a FreeSync monitor up and running.
 
People are getting all pro-AMD/pro-open-standard these days, but as has already been linked on these forums, PhysX was turned down by AMD, and they even refused to provide any support to a third party that was trying to make a tool to get PhysX working on AMD hardware (when Nvidia had been helping them).

:rolleyes::rolleyes::rolleyes:
 
All Tomy is saying is that AMD, and possibly Intel, get G-Sync in another form on compatible screens for no extra cost, while with Nvidia you get G-Sync on compatible screens that cost $150 more.

The good thing is Nvidia could adopt the same version AMD and Intel would be using if they wanted to.

What's wrong with that?
 
@Greg

If that was aimed at me then that's bs.

@Andy

I'm well aware what works on what; I only picked up on that particular quote as bru tends to be negative about anything AMD, so I skipped the rest.

On the PhysX part: can you point out a source where AMD specifically declined Nvidia's offer, and what the conditions were, please? All I have ever seen is along the lines of Nvidia stating "we are open to other vendors using PhysX" while actively going out of their way to block PhysX on their own hardware when an AMD GPU is present in the system.
 