AMD: FAQ for Project FreeSync is up

"There are three key advantages Project FreeSync holds over G-Sync: no licensing fees for adoption, no expensive or proprietary hardware modules, and no communication overhead.
The last benefit is essential to gamers, as Project FreeSync does not need to poll or wait on the display in order to determine when it’s safe to send the next frame to the monitor.
Project FreeSync uses industry-standard DisplayPort Adaptive-Sync protocols to pre-negotiate supported min/max refresh rates during plug’n’play, which means frame presentation to the user will never be delayed or impaired by time-consuming two-way handshakes."
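To make the "no communication overhead" point concrete, here is a rough sketch (not AMD's driver code; the class, function names and numbers are all made up for illustration) of the model the FAQ describes: the refresh window is negotiated once at plug-in, so presenting a frame afterwards needs no per-frame round trip to the display.

```python
# Illustrative sketch only -- not AMD's implementation. It models the FAQ's
# claim: negotiate the refresh window once at hotplug, then present frames
# using only locally cached limits, with no per-frame polling of the display.

class Display:
    """Capabilities the sink reports once, during plug'n'play (e.g. via EDID)."""
    def __init__(self, min_hz, max_hz):
        self.min_hz = min_hz
        self.max_hz = max_hz

def negotiate(display):
    """One-time handshake at hotplug: cache the valid frame-interval window."""
    return 1.0 / display.max_hz, 1.0 / display.min_hz  # (shortest, longest) s

def scanout_delay(render_time, window):
    """Decide when to scan out a frame using only the cached limits."""
    shortest, longest = window
    # Never faster than the panel allows, never slower than it tolerates
    # (a real panel would self-refresh past the long end of the window).
    return min(max(render_time, shortest), longest)

window = negotiate(Display(min_hz=30, max_hz=144))
for render_time in (0.004, 0.012, 0.050):   # 250 fps, ~83 fps, 20 fps
    print(f"rendered in {render_time * 1000:4.1f} ms -> "
          f"scanout after {scanout_delay(render_time, window) * 1000:.1f} ms")
```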



This will be interesting; maybe this will help with input lag? With G-Sync you must lower the frame rate to keep the input lag down.

Really hope this is the case. I want low input lag, like when using no sync, for the best performance possible.

"What is the supported range of refresh rates with FreeSync and DisplayPort Adaptive-Sync?
AMD Radeon™ graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz."
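Just arithmetic on the example ranges quoted above: each refresh range maps to a window of frame times the panel can honour without repeating or dropping frames.

```python
# Each (min, max) refresh range from the FAQ, converted to the window of
# frame times the panel can display directly. Pure arithmetic, nothing more.
ranges_hz = [(36, 240), (21, 144), (17, 120), (9, 60)]

for lo, hi in ranges_hz:
    fastest_ms = 1000.0 / hi  # shortest frame time the panel accepts
    slowest_ms = 1000.0 / lo  # longest hold before it must self-refresh
    print(f"{lo:>2}-{hi:<3} Hz -> frame times {fastest_ms:5.1f} to {slowest_ms:6.1f} ms")
```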


Another interesting plus. Doesn't G-Sync only perform better when the frame rate is above 40fps?
FreeSync seems to be able to go lower.

It is our current understanding that Project FreeSync does indeed support a wider range of dynamic refresh rates than NVIDIA G-Sync.
 
Well, the 4K G-Sync monitor costs no more than regular 4K monitors.

Also, for dynamic refresh rates during gaming you need a new-series card.

So potentially for people wanting to use it with their current monitor or card it's going to be expensive.

Only some users may require a new GPU to enable dynamic refresh rates while gaming. Please see this FAQ for more details.
 
It is our current understanding that Project FreeSync does indeed support a wider range of dynamic refresh rates than NVIDIA G-Sync.

Two questions about this - both linked.

Is there any benefit to the sync rate dropping as low as 9Hz/fps? Synced or not, 9fps will feel like ****.

Up to 240Hz... it's my understanding that at the moment AMD cards can't go north of 120Hz. Is this set to change? Or is the current restriction down to 144Hz monitors and not the cards?
 
Some monitors could support it, but they likely won't.

We know that some monitors already have the necessary hardware to support the DisplayPort Adaptive-Sync specification and, by extension, Project FreeSync. Such monitors would only require a firmware upgrade to enable support for a dynamic refresh range that’s dictated by the LCD. However, AMD has no expectation that monitor vendors will simply make new firmware available to customers.

From the perspective of the monitor vendor, user-initiated firmware upgrades represent an uncontrollable scenario that could trigger a rash of expensive RMAs. For example, the user could power down their monitor mid-upgrade and render the display inoperable—or perhaps a power outage achieves the same outcome.

I’m certainly not saying knowledgeable enthusiasts would do this, but not every user is a master or in control of the weather and power grid. For this reason, it is much more likely that new monitors will be released to ensure the quality and reliability of the product. I think that’s reasonable and to be expected.
 
So 7000 series owners need to upgrade to game using freesync?

When paired with a FreeSync-ready monitor, HD 7000 Series owners can enable Project FreeSync to save power by automatically reducing the refresh rate in a static scene. Project FreeSync can also be used to reduce the refresh rate to match video playback framerate, which can improve the smoothness of video. For gaming, however, an upgrade to one of the GPUs specified in this FAQ will indeed be required. This is not an arbitrary/software decision on AMD's part, as there are specific hardware requirements for dynamic refresh rates that are met only by the newer GPUs.
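A minimal sketch of the video-playback case described in that answer. The selection logic and function name are my own illustration, not AMD's driver code: the idea is to pick a refresh rate inside the panel's dynamic range that is an integer multiple of the content framerate, so every video frame is shown for a whole number of refreshes (no pulldown-style judder), and lower refresh rates also save power, as the answer notes.

```python
# Hypothetical refresh-matching logic for video playback -- an illustration
# of the concept in the answer above, not AMD's actual algorithm.

def match_refresh(content_fps, min_hz, max_hz):
    # Prefer the lowest integer multiple of the content rate that fits
    # inside the panel's dynamic range (lower refresh also saves power).
    for multiple in range(1, 11):
        candidate = content_fps * multiple
        if min_hz <= candidate <= max_hz:
            return candidate
    return max_hz  # no clean multiple fits; fall back to the panel maximum

print(match_refresh(23.976, 30, 60))  # 47.952 Hz (2x film rate)
print(match_refresh(30.0, 30, 60))    # 30.0 Hz
```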
 
The question I've been asking over here for a while now still hasn't been answered, so I'll ask again here. It stems from a quote in this article about the workings of FreeSync: http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html?cp=2

Specifically, the part claiming that FreeSync predicts how long the next frame will take to render and derives the vblank interval from that prediction.
Is this actually the case (or close to it)? Is AMD's FreeSync solution using some sort of prediction algorithm to guess how long it will take to render the next frame, and using that as the vblank interval?

It seems like a very odd method to use, which is why I ask whether you really are using it, rather than the method the article says NVIDIA is using, which is to make the monitor hold the vblank for as long as is needed.

TechPowerUP is unfortunately inaccurate in their speculation. I just added a new Q&A, inspired by this question, that should clear things up.
 
It still requires a controller module, and I can't see monitor manufacturers just popping these in for free or with no incentive. Don't misunderstand me, I want this to be a reality, but to be perfectly honest the Q&A is vague, with no solid info at all.

Practically every monitor has a “controller module” of some sort already. It’s called the “display scaler.” With NVIDIA G-Sync, the monitor does not use a standard scaler and instead uses NVIDIA’s proprietary display module. This module can add up to $100 to the cost of the monitor vs. the same display without. In contrast, FreeSync does not require the monitor or scaler vendors to abandon their existing R&D efforts. We know that many scalers are already compatible with the DisplayPort Adaptive-Sync specification.

Further, monitor vendors design many of their scalers with the DisplayPort specification in mind, and that specification now includes dynamic refresh rate. If their existing scaler technologies do not already support something like FreeSync, there is now a free and industry-standard solution to that problem, which makes FreeSync easy to place on the roadmap and integrate in a future product.
 
It's a VESA standard, so there won't be any beta. It should be very easy for AMD to actually implement too, via a simple driver patch, which I'm sure they've already got fully working; otherwise they wouldn't have pushed for the tech in the first place (if they couldn't do it, why push for it?)

Hi JamesJ,

Please see post #265
 
Either I missed this bit in the FAQ when writing my previous post, or it has been updated with this statement since I wrote that post:

"Project FreeSync uses industry-standard DisplayPort Adaptive-Sync protocols to pre-negotiate supported min/max refresh rates during plug'n'play, which means frame presentation to the user will never be delayed or impaired by time-consuming two-way handshakes."
I'm not sure how to interpret this. On the one hand, you could interpret this as saying that the graphics card just fires off new frames when the monitor is ready for them, since it knows how to do so. Presumably the monitor would just hold the frame for as long as it needs to with this system.
On the other hand you could say, especially with that last sentence, that the monitor is being told when it receives a frame that it needs to hold it for a time predicted by the graphics card (which the graphics card knows is an acceptable time since it has determined the monitor's minimum and maximum refresh rates), rather than the monitor constantly asking "do I need to keep holding this frame" and the graphics card replying with its answer.

Again, If I had to guess the method without frame time prediction seems more likely, but there might be some complex reason for requiring frame time prediction. Just a simple yes or no answer to "does freesync utilise a method for predicting how long a frame will take to render, or how long it will need to stay on the monitor" would be great :)
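To make my two readings concrete, here are two toy models. Neither is claimed to be what AMD actually ships, and the hold-window numbers are just examples:

```python
# Toy models of the two interpretations above -- illustration only, not
# AMD's actual mechanism; the hold window is an example, not from the FAQ.

MIN_HOLD, MAX_HOLD = 1 / 144, 1 / 30   # negotiated hold window, in seconds

def display_time_with_prediction(predicted, actual):
    """Reading 2: the monitor is told up front to hold the current frame for
    the GPU's *predicted* render time. If the real frame arrives late, it
    misses its slot and has to wait for the next one (stutter)."""
    hold = min(max(predicted, MIN_HOLD), MAX_HOLD)
    if actual <= hold:
        return hold
    return hold + min(max(actual - hold, MIN_HOLD), MAX_HOLD)

def display_time_hold_until_ready(actual):
    """Reading 1: the monitor simply holds the frame until the next one
    arrives, so a frame is never delayed by a wrong guess."""
    return min(max(actual, MIN_HOLD), MAX_HOLD)

# A frame predicted at 10 ms that really takes 14 ms to render:
print(display_time_with_prediction(0.010, 0.014))  # ~0.017 s before it shows
print(display_time_hold_until_ready(0.014))        # 0.014 s
```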

Hi Reaper,
Please see post #267
 
Hello everyone,

As you may have noticed, we provided answers to some of your questions. I hope they help clarify some of the discussion (questions) you had.

And by "we provided" I mean that I worked with my colleague Robert H (a.k.a. Thracks); I'm sure some of you know him already. So a big thank you to Robert and the rest of the AMD folks at HQ who helped as well.

Also, I'm still going to try my best to get you more answers to your other questions here. It will take me some time. As always, I'll keep you up to date on my progress.
 
@Warsam, just thought of a new question.

Will all new GPUs, regardless of price level, support Project FreeSync tech to some varying degree? Obviously the refresh of the 7000 series as the 280s doesn't work for gaming, but does for video. This was a point that Richard Huddy raised during an interview with Maximum PC over a month ago, but he couldn't clarify the position at the time. Do you have anything further to add to his comments? The competing product from NVIDIA will support all of their new GPUs due to their proprietary tech, as you know, so I wondered if you can compete on the same level using the VESA standard?
 
It is our current understanding that Project FreeSync does indeed support a wider range of dynamic refresh rates than NVIDIA G-Sync.

Great stuff :D One thing I noticed people doing for CS:GO on G-Sync was lowering the frame rate. Not something I would like to do. So if FreeSync lets you run 300fps and keeps input lag down, I'd be very happy.

Thanks sam
 
I think he said 240Hz was the maximum, if the monitor supports it. I think going past that 240fps number would then give no added benefit, unless I'm mistaken.

Yeah, just seen. 240Hz would be class, but I think that's a while off yet. I'll just have to see what it's like once reviews etc. start coming.

In CS:GO, with the frame rate synced to 120 vs 300, I notice a massive difference in smoothness and motion.
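Back-of-envelope frame-time numbers for that 120 vs 300 fps comparison; this ignores engine, OS and display latency, so it's only part of the story:

```python
# Frame time alone already differs by about 5 ms per frame between the two
# rates -- a rough lower bound on the latency difference, nothing more.
for fps in (120, 300):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
# 120 fps -> 8.33 ms per frame
# 300 fps -> 3.33 ms per frame
```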
 