
AMD freesync coming soon, no extra costs.... shocker

I know that using comments on a tech article is a pretty poor source (especially since they are translated from another language), but if what they are saying is true then it will make FreeSync virtually useless for gaming, because it doesn't work very well with a constantly varying refresh rate (which would explain why AMD went with a demo that had a perfectly constant framerate):
http://be.hardware.info/nieuws/38467/ces-amd-toont-gratis-alternatief-voor-g-sync-freesync



I don't pretend to understand the technical details of how v-sync and g-sync work, and I freely admit that the source I'm getting this from is a bit crap, but if this is actually the case then it would certainly explain why Nvidia went the G-Sync route when they must have known that the VESA standard FreeSync uses was in development.

If this guy is talking from his behind then do please carry on as you were :)


Originally Posted by Buzz Buzz
Free Sync will not work as well as G-Sync.
FreeSync has to predict the frame rate and set the refresh rate / VBI dynamically. This costs performance because it has to be calculated, and V-sync must be on. With G-Sync the card can tell the monitor when to refresh again; FreeSync may not be able to do this!
However, the question will be whether the difference in quality is worth the price of G-Sync. Wait until objective comparisons are made with real setups.
Let's look at what he said.

Free Sync will not work as well as G-Sync.
FreeSync has to predict the frame rate and set the refresh rate / VBI dynamically. This costs performance because it has to be calculated
Well, I really don't know what he's driving at with predicting the frame rate; it gets that information from the GPU. There is no prediction involved. Predicting frame rates makes no sense; it defeats the whole objective of the VESA standard.

and V-sync must be on.
Why? That's just a blanket statement. Some form of frame rate limiting needs to be on for both G-Sync and FreeSync, as 200 FPS will not work on a 120Hz screen.

With G-Sync the card can tell the monitor when to refresh again; FreeSync may not be able to do this!
Again, blanket statement, an emotional one at that.

Well, anyway, that's an... interesting?... thing to say when he linked the AnandTech article, which said:

The system on the left is limited to 30 fps given the heavy workload and v-sync being on, while the system on the right is able to vary its frame rate and synchronize presenting each frame to the display's refresh rate.
Exactly like G-Sync. Everything Buzz Buzz said there is the opposite of Anand's quote. Did he even read it? Perhaps he was too emotional to read it.
 
I know that using comments on a tech article is a pretty poor source (especially since they are translated from another language), but if what they are saying is true then it will make FreeSync virtually useless for gaming, because it doesn't work very well with a constantly varying refresh rate (which would explain why AMD went with a demo that had a perfectly constant framerate):
http://be.hardware.info/nieuws/38467/ces-amd-toont-gratis-alternatief-voor-g-sync-freesync



I don't pretend to understand the technical details of how v-sync and g-sync work, and I freely admit that the source I'm getting this from is a bit crap, but if this is actually the case then it would certainly explain why Nvidia went the G-Sync route when they must have known that the VESA standard FreeSync uses was in development.

If this guy is talking from his behind then do please carry on as you were :)

He isn't, but he doesn't realise this is what g-sync must do.

If you watch the pendulum demo, the key to smoothness isn't the variable frame rate but the smooth change in frame rate.

Going from 60 to 30 to 60fps means frame times of 16.67ms, to 33.33ms, to 16.67ms.

Now think v-sync: it drops below 60fps, hits 30fps.... goes back to 60fps. This is precisely the stutter g-sync attempts to eliminate. It CAN'T do this by just varying the refresh rate; it has to smooth the frame rate. The pendulum demo goes from 60 to 59 to 58fps and so on. That means 16.67ms, to around 17ms, to around 17.2ms between each frame. With a smooth change, that tiny difference (sub-1ms) between frames is what creates the smoothness.
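Just to put numbers on that frame-rate ramp (it's just 1000 divided by the fps):

Code:
# fps -> frame time in ms, for the kind of gentle ramp the pendulum demo shows
for fps in (60, 59, 58, 57, 30):
    print(fps, "fps =", round(1000 / fps, 2), "ms per frame")
# 60 fps = 16.67 ms, 59 fps = 16.95 ms, 58 fps = 17.24 ms,
# 57 fps = 17.54 ms, and for comparison 30 fps = 33.33 ms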

You literally can't explain how this is possible unless g-sync is predicting the next frame time. If 10 frames in a row arrive 16.67ms apart but the next one drops to 30fps and is going to come 32ms later, how do you make that smooth?

Well, you have hardware keeping track. It knows the previous frames came 16.67ms apart, so the only way to maintain smoothness is to decide the biggest acceptable gap beyond 16.67ms, say 2ms max, and refresh with the same frame again. So it refreshes 18ms later. The next frame, let's say, comes in 32ms after the original frame, which is (32 - 18 = ) 14ms after that refresh. The hardware knows the last refresh interval was 18ms, and it knows the last frame took 32ms to produce. It wants to keep things smooth, but it knows frames are currently being produced 32ms apart, so it will hold this frame until, say, 20ms after the previous refresh and then show it. It maintains smoothness relative to the previous 18ms interval while working towards the 32ms frame time one small step at a time. When the frame rate increases it's just as important to smooth the frame rate the other way, though I suspect it can take bigger steps there; even with larger frame time differences, the faster frame rate itself likely compensates.
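Here's a rough Python sketch of the kind of smoothing logic I mean. It's my own guess, not Nvidia's (or AMD's) actual algorithm, and the 2ms maximum step is a made-up number:

Code:
# Toy model of the frame-smoothing idea described above. All numbers
# (the 2ms max step, the example frame times) are illustrative only.

MAX_STEP_MS = 2.0  # biggest change in refresh interval allowed per refresh

def next_refresh_interval(previous_interval_ms, time_since_last_frame_ms):
    """Decide how long to wait before the next refresh.

    If frames start arriving slower than we are refreshing, walk the
    refresh interval up towards the real frame time one small step at a
    time (and walk it back down the same way when the frame rate recovers),
    re-showing the old frame whenever a new one hasn't arrived yet.
    """
    target = time_since_last_frame_ms
    if target > previous_interval_ms + MAX_STEP_MS:
        return previous_interval_ms + MAX_STEP_MS   # ramp up gradually
    if target < previous_interval_ms - MAX_STEP_MS:
        return previous_interval_ms - MAX_STEP_MS   # ramp down gradually
    return target                                   # close enough, just track it

# 10 frames at 60fps (16.67ms), then the GPU suddenly takes 32ms per frame:
interval = 16.67
for frame_time in [16.67] * 10 + [32.0] * 5:
    interval = next_refresh_interval(interval, frame_time)
    print(round(interval, 2))
# prints 16.67 ten times, then 18.67, 20.67, 22.67... stepping towards 32ms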

Frame smoothing is the absolute main feature of g-sync, not variable frame rate. Frame prediction/tracking will be a monumental part of this for both companies. Without frame smoothing, g-sync is only as smooth as the frame rate change, which could be perfect or absolutely awful. I'm literally 100% certain g-sync has to do frame smoothing and will be doing loads of its own prediction to tell the GPU when it's best to send the next frame.

There will consistently be lots of marketing bull crap from both companies about it, because a simple explanation is what 99.9999% of users want; most won't read what I posted, let alone anything Nvidia/AMD tried to explain to most people. It's significantly more complex than what I've stated, but it's essentially impossible to produce g-sync or an AMD equivalent without a huge amount of calculating and tracking. I think/guess Nvidia has done this in hardware, and maybe only since Kepler, likely as an extension of their frame pacing features. Ultimately this is like 80% the existing frame pacing tech (monitoring frame rates and keeping them smooth) with 20% more work matching these changes to refresh rates in the smoothest way possible.

If it needs the hardware that is present in AMD GPUs then it won't work on non-AMD GPUs.

This is the original post you were responding to, and your response:

Quote:
Originally Posted by weldon855
It's a VESA standard, please tell me how you think AMD plan to lock this?
If it needs the hardware that is present in AMD GPUs then it won't work on non-AMD GPUs.

Didn't the original article say it was a proposed VESA standard? Or have I got confused while reading this thread?

It was specifically in reference to how AMD would lock this in to themselves, and your insistence that if AMD use hardware to make this work then it can't work on non-AMD GPUs. This is wrong, as I pointed out.

It's fairly obvious as well: there are pretty much thousands of standards that GPUs, CPUs, APUs, memory and HDDs all adhere to, which allow them to communicate with the same messages and enable the same things, yet the hardware that generates those signals is completely different.

I can plug any of a hundred HDDs into my computer; many/most of them will use different algorithms, memory types, controllers and buses, yet they all send the same messages to my motherboard and it can read them all despite the hardware in those devices being different. Likewise one HDD can send and receive data the same way with an Intel or AMD motherboard, which both have entirely different hardware for doing the same task.

If the monitor is using an industry standard, it does not matter how AMD generates their signal or how they determine what frame rate they want; Intel will be able to generate the same signal from their own hardware any way they please and use the same monitor with the same modes.

It's the entire reason for industry standards: to do precisely what you are saying they won't do. So no, you aren't right. AMD wouldn't be able to lock it in based on any info we've heard, and no, a monitor with this option wouldn't need an AMD GPU.

Back to HDDs again: how can AMD read and send data to an industry-standard SATA 3 HDD, and Intel do the same, using different hardware?

No wonder this industry is in such a mess when a standards body (i.e. VESA) is basically run by no one else but those MAD guys (AMD). Sometimes someone has to take charge and make things happen for REAL.

Yup, the more things change the more they stay the same.

They're all insane, greed plays a huge part in it.
HDMI usage costs a company $10k a year plus $0.04 per device. Sell 100 million monitors with HDMI ports and that is $4 million in fees in one year. Add up every laptop, monitor, console, Blu-ray player and TV that uses HDMI, multiply by the ports on each device, and work out what the HDMI consortium takes (I forget who else is in it besides Sony, there are others). There are basically hundreds of millions of dollars at stake each year over which format moves forward to become the de facto industry standard, which is why DisplayPort, which is royalty free (yes, there's a fee to join, but it's not insane for big businesses, numerous millions less, effectively free versus something like HDMI), has encountered such fierce opposition and slow adoption.
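Quick back-of-the-envelope using the figures above (real HDMI licensing has more tiers and conditions than this, so treat it as a rough illustration only):

Code:
# Rough maths with the numbers quoted above: $10k/year adopter fee
# plus $0.04 per HDMI device. Real licensing terms are more complicated.
annual_fee = 10_000          # per company, per year
per_device_royalty = 0.04    # per HDMI device sold
monitors_sold = 100_000_000  # the 100 million monitors example

total = annual_fee + per_device_royalty * monitors_sold
print(f"${total:,.0f} per year")   # $4,010,000 per year, from monitors alone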

AFAIK certification of DisplayPort devices has increased by something like 63% in the past year. It's finally getting there: Intel finally threw its support behind it, and Apple is one of the biggest pushers of it outside AMD. It's pretty much the only viable option for 4K (AFAIK still the only current single-cable connection that can push 4K above 30Hz).
 
Frame pacing, drunkenmaster :) What you're describing with hardware frame tracking is, I think, frame pacing.

Yup, I mentioned it if you read far enough :p

I would be very shocked if both AMD/Nvidia weren't using their implementations of frame pacing monitoring hardware to do most of the work. It will be slightly more complex as it's got to take into account refresh rate to determine which frames to drop, which to push, when to send them, etc. But frame pacing will be the foundation of it for sure.

Someone asked why, if AMD's frame pacing was fine, they'd put it in hardware. I'm absolutely not saying this is why, I just found it funny. Anything will run faster in fixed function than in software. With frame pacing the tuning would be a pain, but the actual calculations, keeping track of a rate of change and so on, are really pretty simple; it's not hardware intensive, but it could certainly be latency sensitive, and having everything done in hardware on the GPU without having to come back to the CPU would certainly save some clocks. Ultimately AMD/Nvidia will always put anything worthwhile into fixed function hardware; the entire industry has gone that way for anything they can offload.
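Something like this toy tracker is all the bookkeeping really amounts to. To be clear, this is my own illustration of the idea, nothing to do with either vendor's actual frame pacing code, and the smoothing weight is an arbitrary number:

Code:
# Toy frame-time tracker: cheap bookkeeping of the average frame time and
# how fast it is drifting, the kind of thing frame pacing needs to guess
# roughly when the next frame will arrive.

class FrameTimeTracker:
    def __init__(self, smoothing=0.1):
        self.smoothing = smoothing   # weight given to the newest sample
        self.avg_ms = None           # running average frame time
        self.trend_ms = 0.0          # how quickly the frame time is drifting

    def add_sample(self, frame_time_ms):
        if self.avg_ms is None:
            self.avg_ms = frame_time_ms
            return
        delta = frame_time_ms - self.avg_ms
        self.trend_ms = (1 - self.smoothing) * self.trend_ms + self.smoothing * delta
        self.avg_ms += self.smoothing * delta

    def predicted_next_ms(self):
        # crude prediction: current average plus the current drift
        return (self.avg_ms or 0.0) + self.trend_ms

tracker = FrameTimeTracker()
for ms in (16.7, 16.7, 17.0, 18.5, 21.0, 25.0):   # frame rate starting to drop
    tracker.add_sample(ms)
print(round(tracker.predicted_next_ms(), 1))       # ~19.4, a bit above the average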

But people really do need to realise that a variable refresh rate on its own can act exactly like bog standard rubbish v-sync: if a game goes 60-30-60-30 then g-sync would perform exactly the same as v-sync, which it doesn't, and that is the absolute worst case scenario, which is actually where g-sync is at its best.

Frame smoothing is the fundamental tech behind g-sync/freesync; variable refresh rate is essentially the next step in frame pacing. Frame pacing produces the smooth frame rates; variable refresh rate monitors allow it to show its full value.

The entire concept of frame pacing for any company is based around monitoring, calculating and prediction, so anyone suggesting this as a downside for AMD doesn't know what they are talking about.

Nvidia's patents linked to in the other thread are precisely about using some fixed function hardware to track the rate of frame rate change to calculate when to update refresh rate.
 
I would be very shocked if both AMD/Nvidia weren't using their implementations of frame pacing monitoring hardware to do most of the work. It will be slightly more complex as it's got to take into account refresh rate to determine which frames to drop, which to push, when to send them, etc. But frame pacing will be the foundation of it for sure.

Right, I don't see how it can work without it. It's like throwing an item at someone unsuspecting and then shouting "incoming" after it's landed on the floor behind him.
 
AMD's hardware has to comply with the VESA standard.

NOT: the VESA standard has to comply with AMD's hardware implementation.



I won't argue there.:p

You've missed his point again. He is saying that IF AMD have added hardware and Nvidia haven't, then Nvidia cards will still not be compatible with FreeSync; the end result will still be that AMD will only work with FreeSync and Nvidia will only work with G-Sync. That one is an "open standard" becomes irrelevant IF Nvidia cards lack the hardware.
 
You've missed his point again. He is saying that IF AMD have added hardware and Nvidia haven't, then Nvidia cards will still not be compatible with FreeSync; the end result will still be that AMD will only work with FreeSync and Nvidia will only work with G-Sync. That one is an "open standard" becomes irrelevant IF Nvidia cards lack the hardware.

Well, firstly, he was talking about AMD locking this tech down due to having this "hardware" in their own GPUs, making it NOT WORK on non-AMD GPUs. So that isn't what he was saying, as he was quite specific. Second, Intel already support the same option in hardware that AMD do, by all accounts (I get the gist that if you support eDP, which is currently a superset of DP 1.2, then it's almost certain the hardware supports the feature). There is nothing preventing Nvidia adding this hardware capability into future hardware.

There is EVERYTHING preventing AMD adding the capability to use g-sync into future hardware.

So does the open standard become irrelevant? No, because that is the point of an open standard: it's open. Nvidia can choose to support it whenever the hell they like; choosing not to support something is NOT the same as AMD not letting them do it. It's a free choice for Nvidia, hence open standard. It is NOT an option for AMD to freely choose to use the g-sync module and the Nvidia driver to use g-sync, hence not an open standard.
 
NVIDIA responds to AMD FreeSync

"Well, it was bound to happen sooner rather than later, but Nvidia has spoken about AMD FreeSync.
The interview is with Tom Petersen, who spoke with the guys from TechReport, so the credit for everything related to this post goes to them.
Basically Petersen says he's excited to see a competitor take an interest in this, but he claims that as things now stand
it will be close to impossible to implement FreeSync on a desktop display, because desktops use different display architectures than laptops."


source
 
All the "nVidia ripping their customers off" is nonsense. Nobody is holding a gun to their head saying they must buy new monitors or else they can't game.

It's a new feature, an optional feature and if you are interested in it and you want to try it then you weigh up the cost/benefits like you would do in any other scenario and make a decision. I have no idea why people are getting so worked up over it.
 
You literally can't explain how this is possible unless g-sync is predicting the next frame time. If 10 frames in a row arrive 16.67ms apart but the next one drops to 30fps and is going to come 32ms later, how do you make that smooth?


30Hz @ 33.3ms. The scaler then holds the frame in memory so that it can be compared to the incoming frame. If the new frame doesn't arrive in time it simply redraws the last frame. You're working on the assumption that G-Sync isn't fast enough, but then that is surely why we've got an upgraded scaler. One you keep pointing out we don't need.

Your hate is strong.
 
Well, firstly, he was talking about AMD locking this tech down due to having this "hardware" in their own GPUs, making it NOT WORK on non-AMD GPUs. So that isn't what he was saying, as he was quite specific. Second, Intel already support the same option in hardware that AMD do, by all accounts (I get the gist that if you support eDP, which is currently a superset of DP 1.2, then it's almost certain the hardware supports the feature). There is nothing preventing Nvidia adding this hardware capability into future hardware.

There is EVERYTHING preventing AMD adding the capability to use g-sync into future hardware.

So does the open standard become irrelevant? No, because that is the point of an open standard: it's open. Nvidia can choose to support it whenever the hell they like; choosing not to support something is NOT the same as AMD not letting them do it. It's a free choice for Nvidia, hence open standard. It is NOT an option for AMD to freely choose to use the g-sync module and the Nvidia driver to use g-sync, hence not an open standard.

You are deliberately misreading what he said (and me by association) to make a fight over nothing.

IF AMD cards have hardware specifically for this and Nvidia cards currently don't, then the situation is that FreeSync is effectively locked to AMD. The end result is the same, and the reasons why are irrelevant to an end user: they either choose an AMD GPU + FreeSync monitor, or they choose an Nvidia card and a G-Sync monitor.

What happens in 2 or 5 years' time is irrelevant to someone looking at buying a new GPU or monitor now.

As above, this is going to need entirely new monitors for FreeSync, so who knows how long it will take AMD to convince manufacturers to make them.
 
NVIDIA responds to AMD FreeSync

"Well, it was bound to happen sooner rather than later, but Nvidia has spoken about AMD FreeSync.
The interview is with Tom Petersen, who spoke with the guys from TechReport, so the credit for everything related to this post goes to them.
Basically Petersen says he's excited to see a competitor take an interest in this, but he claims that as things now stand
it will be close to impossible to implement FreeSync on a desktop display, because desktops use different display architectures than laptops."


source

According to this article, it's a fuss about nothing, as it can NOT be done on desktop screens using current hardware or software. Forget VBLANK etc.

The G-Sync module is the only way of doing what it says on the tin outside of laptops. So there is nothing similar actually coming from AMD, ever, as it cannot be done; they just want to take the attention away from G-Sync with a purposely non-informative demo so people/fanboys can fill in the blanks with speculation and not buy a G-Sync monitor.

For everything else, there's Masterca.. I mean, for laptops, there's FreeSync, but I am also expecting to see gaming laptops with a built-in G-Sync module.
 
I wasn't aware laptop LCDs were devoid of scalers. Makes sense though.

Completely different technology then. Well that's no surprise. Might be to some though.

eDP is a superset of DP 1.2 and doesn't do much extra; it's absolutely not entirely different technology. Think about the source, which also goes on to say dynamic refresh rate is possible today over current DisplayPort cables with no update to the VESA standard..... so it is possible then, Mr Nvidia?

30Hz @ 33.3ms. The scaler then holds the frame in memory so that it can be compared to the incoming frame. If the new frame doesn't arrive in time it simply redraws the last frame. You're working on the assumption that G-Sync isn't fast enough, but then that is surely why we've got an upgraded scaler. One you keep pointing out we don't need.

Your hate is strong.

Seriously, just what? Please explain the hate for Nvidia in saying frame smoothing is what makes g-sync good?

I'm not making any assumption that g-sync isn't fast enough; I literally said nothing close to it and nothing that can be interpreted as such. Second, explain what happens when the frame rate dips from 60fps to 30fps in your scenario.
G-sync is going along at a 16.67ms frame interval, then the next frame doesn't arrive for 33ms, so it just refreshes the same frame at 33ms, right..... then what, another frame comes in and it refreshes 16.67ms later?

Let me explain what would happen here with v-sync........

(I stopped short of implying it was the same thing, because what the first part was describing was basically v-sync, and it wouldn't be "g-sync" smooth.)

The ENTIRE point of g-sync is that if the next frame appeared after 34ms, on a 60Hz screen in your scenario it would have refreshed at 33ms and would then have to wait another ~15ms to show that new frame.

If you instead attempt to smooth out the frame rate by saying "the last 10 frames were 16.67ms apart, so to keep it smooth the next refresh has to be no later than 18ms", then when no new frame appears in that time you refresh the last one again.

So how long after the original frame can it show this new frame now, based on predicting and smoothing the frame rate?

First scenario: frames arriving 16.67ms apart, the same frame refreshed again 33.33ms later, the new frame arrives at 34ms, and the earliest it can be drawn is roughly 15ms later.

In the g-sync/smoothed scenario: first frame at 16.67ms; the 'frame hasn't arrived' timer triggers, so it refreshes the old frame at 18ms to keep the step between refreshes small; the new frame comes in at 34ms, which is now 16ms after the last refresh instead of under 1ms after it (as in the unsmoothed case). Since it can be shown 16.67ms after the last refresh, the new frame can actually be shown 0.67ms after it arrives instead of ~15ms later.
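To make those two timelines concrete, here's the same made-up example worked through in Python, assuming a 60Hz-class panel that can't refresh again until 16.67ms after its last refresh:

Code:
# Worked version of the two timelines above, using the same made-up numbers.
# Assumption: the panel cannot refresh again until 16.67ms after the last refresh.

MIN_REFRESH_GAP_MS = 16.67

def wait_for_new_frame(last_refresh_ms, new_frame_arrival_ms):
    """How long the newly rendered frame sits around before it can go on screen."""
    earliest_refresh = last_refresh_ms + MIN_REFRESH_GAP_MS
    return max(0.0, earliest_refresh - new_frame_arrival_ms)

# Without smoothing: the old frame was re-shown at 33.33ms, so the new frame
# arriving at 34ms has to wait almost a whole refresh period.
print(round(wait_for_new_frame(last_refresh_ms=33.33, new_frame_arrival_ms=34.0), 2))  # 16.0 (the ~15ms above)

# With smoothing: the old frame was re-shown early, at 18ms, so the new frame
# at 34ms only waits a fraction of a millisecond.
print(round(wait_for_new_frame(last_refresh_ms=18.0, new_frame_arrival_ms=34.0), 2))   # 0.67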

You can't do g-sync without a hell of a lot of tracking and monitoring to determine when to update and when not to. The entire point of it is bringing frame pacing style smoothing to the screen as well. There has to be some mechanism for smoothing out the refreshes, as the stutter on screen comes from larger changes in frame time, which, when you can control the refresh rate, show up as big differences in refresh times.

Watch the pendulum demo and see the exceptionally smooth frame rate change; that is the entire point. Jumping from 60 to 30 to 60 to 30 is what v-sync does; with no smoothing g-sync could only do the same. Smoothing is the very difference between the two technologies.
 
Why are you depicting milliseconds as if you'd be able to notice the difference between them? (Apart from worst case, maybe.) Literally, you're debunking the technology without seeing it first hand. Its pacing is clearly going to be much better than using V-Sync, given people's reactions from seeing it first hand.

This is me giving you the last word chap.
 
Why are you depicting milliseconds as if you'd be able to notice the difference between them? (Apart from worst case, maybe.) Literally, you're debunking the technology without seeing it first hand. Its pacing is clearly going to be much better than using V-Sync, given people's reactions from seeing it first hand.

This is me giving you the last word chap.

lol, please ask Nvidia that same question on Twitter. Apparently your argument is that milliseconds don't matter because we can't see them; that is golden. Also, in what way am I debunking the technology at all?

Read ANY review on AMD/Nvidia frame pacing and realise the entire thing about it is the same millisecond differences.


It's actually the perceived difference in rates that matters more than the rates themselves, but you will find that realistically the very biggest millisecond difference between g-sync and v-sync will be waiting one full frame refresh.

So you made up something I didn't say, again, decided it was pure hate when I said it was smart and good, then decided to base your argument on being unable to see such small millisecond differences........ despite g-sync being incapable of ever showing a difference much bigger than one frame. It's the smoothness of said frames that it's all about. The reduced latency over v-sync comes from that ~1ms vs ~15ms update to the screen of the newest frame drawn; the smoothness, 16.67ms - 18ms - 16.67ms, is actually what "makes" g-sync.

It's the entire point of it. Go and ask about frame time differences on any other forum, Nvidia's own included, and see what they say; read some frame pacing reviews of Nvidia cards, read Nvidia's interviews on frame pacing, and then come back and say a few milliseconds don't matter at all.

Again, because you don't seem to get it: v-sync jumping from 60-30-60 fps constantly is going 16.67ms - 33ms - 16.67ms. That is v-sync; the time differences are obviously very small, and apparently... you can see them. The change in frame time IS the stutter; that entire 16.67ms jump IS the problem. Can't see it? It's the entire thing g-sync is out to change, that 16.67ms change in frame time, and YOU are the one debunking g-sync by suggesting this change in frame time is not perceivable.
 
Why are you depicting milliseconds as if you'd be able to notice the difference between them? (Apart from worst case, maybe.) Literally, you're debunking the technology without seeing it first hand. Its pacing is clearly going to be much better than using V-Sync, given people's reactions from seeing it first hand.

This is me giving you the last word chap.

I'll put this more succinctly for you and others.

Worst case for v-sync is 60fps to 30fps to 60fps. This is the absolute worst case of stutter possible (officially it isn't, but it's the worst case Nvidia has showcased at any g-sync demo for v-sync); frame times go 16.67ms - 33ms - 16.67ms.

MjFrosty believes g-sync is crap because you can't see such small time differences. You can't, he's debunked it, it's all a joke, you can't notice the difference at all.

I'm apparently pro Nvidia because I said the technology is good, but MjFrosty flat out insists v-sync is perfect because you can't notice this difference at all.............

Yup, the hate is strong from one of us; it's pretty clear which one that is.
 
I don't suppose there's a way of checking which monitors support this VESA standard? I've just bought a 27" 1440p panel off eBay and I would love to know if there's any chance it'll run FreeSync; if so, a Radeon card is my next upgrade!
 
According to this article, it's a fuss about nothing, as it can NOT be done on desktop screens using current hardware or software. Forget VBLANK etc.

The G-Sync module is the only way of doing what it says on the tin outside of laptops. So there is nothing similar actually coming from AMD, ever, as it cannot be done; they just want to take the attention away from G-Sync with a purposely non-informative demo so people/fanboys can fill in the blanks with speculation and not buy a G-Sync monitor.

For everything else, there's Masterca.. I mean, for laptops, there's FreeSync, but I am also expecting to see gaming laptops with a built-in G-Sync module.

This is hilarious. Let me quote something from the article that you've based these claims on, shall I:

...generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

I don't know if this really needs explaining, but I'll bold a few words for you.

Do I really need to go further? Okay: "typically have a scaler"... so not all of them do. "Nearly impossible"... another term for that is NOT impossible, i.e. POSSIBLE. And "as things now stand".

As things stood two months ago, nothing in the world worked with g-sync. How things stand today isn't particularly relevant to how things will stand tomorrow.

eDP and DP are almost identical; both (AFAIK) support vblanking.

But based off an article that DOESN'T claim it's impossible NOW, certainly doesn't say it can't be done widely in the future, and actually has the Nvidia guy SPECIFICALLY SAY that they hope g-sync pushes the industry towards variable refresh rate scalers....... you claim it will never work on desktop?

I think I know which side you're on ;)


Here's a hint: a laptop screen uses eDP, AMD support eDP, and it uses DisplayPort cabling. So if you took a "laptop" screen, put it on a stand and used the existing cable connection...... errm.... nope, that would work, quite obviously.
 