Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Hello everyone,
I've looked at the various questions you've posted here and am currently working with my colleagues on providing the answers and clarifications you've asked for. I will provide an update within the next few days.
"There are three key advantages Project FreeSync holds over G-Sync: no licensing fees for adoption, no expensive or proprietary hardware modules, and no communication overhead.
The last benefit is essential to gamers, as Project FreeSync does not need to poll or wait on the display in order to determine when it’s safe to send the next frame to the monitor.
Project FreeSync uses industry-standard DisplayPort Adaptive-Sync protocols to pre-negotiate supported min/max refresh rates during plug’n’play, which means frame presentation to the user will never be delayed or impaired by time-consuming two-way handshakes."
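To make the quoted claim concrete, here is a minimal sketch (with made-up example numbers, not anything confirmed by AMD) of the "pre-negotiated range" idea: the refresh window is read once at plug-and-play time, and every later frame is presented using only that cached information, with no per-frame round trip to the display.

```python
# Hypothetical illustration of presenting frames within a refresh range
# that was negotiated once up front. The range values are example numbers.

MIN_HZ, MAX_HZ = 40, 144          # example range reported by the display
MIN_FRAME_MS = 1000.0 / MAX_HZ    # shortest allowed frame interval (~6.94 ms)
MAX_FRAME_MS = 1000.0 / MIN_HZ    # longest allowed frame interval (25 ms)

def present(render_time_ms: float) -> float:
    """Clamp a frame's interval into the negotiated window.

    If the GPU finishes early, it still waits out the minimum interval;
    if it runs long, the interval is capped at the display's self-refresh
    limit. No handshake with the monitor happens per frame.
    """
    return min(max(render_time_ms, MIN_FRAME_MS), MAX_FRAME_MS)

print(present(5.0))    # faster than 144 Hz: held to the ~6.94 ms minimum
print(present(12.0))   # inside the window: shown after its real 12 ms
print(present(40.0))   # slower than 40 Hz: capped at the 25 ms maximum
```

The point of the sketch is simply that once min/max are known, the GPU never needs to ask the display anything at frame-present time.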
This will be interesting; maybe this will help with input lag? With G-Sync you must lower the frame rate to keep the input lag down.
I really hope this is the case. I want low input lag, like when using no sync, for the best performance possible.
"What is the supported range of refresh rates with FreeSync and DisplayPort Adaptive-Sync?
AMD Radeon™ graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz."
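The refresh ranges quoted in the FAQ translate directly into per-frame timing windows; a quick conversion (just arithmetic on the quoted numbers, nothing beyond the FAQ itself) shows how wide those windows are:

```python
# Convert each min-max refresh range from the FAQ quote into the
# corresponding frame-time window in milliseconds.

ranges_hz = [(36, 240), (21, 144), (17, 120), (9, 60)]  # from the FAQ

for lo, hi in ranges_hz:
    fastest_ms = 1000.0 / hi   # frame time at the maximum refresh rate
    slowest_ms = 1000.0 / lo   # frame time at the minimum refresh rate
    print(f"{lo}-{hi} Hz -> frame times {fastest_ms:.2f} ms to {slowest_ms:.2f} ms")
```

For example, the 9-60 Hz range means a frame can legally stay on screen anywhere from about 16.7 ms up to about 111 ms.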
Another interesting plus. Doesn't G-Sync only perform well when the frame rate is above 40 fps?
FreeSync seems to be able to go lower.
Well, the 4K G-Sync monitor costs no more than regular 4K monitors.
Also, for dynamic refresh rates during gaming you need a newer-series card.
So potentially for people wanting to use it with their current monitor or card it's going to be expensive.
Some monitors could support it, but they likely won't.
Does anything need doing to the drivers or is it already in them?
So 7000 series owners need to upgrade to game using FreeSync?
The question I've been asking over here for a while now still hasn't been answered, so I'll ask again here. It stems from a quote in this article about the workings of FreeSync: http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html?cp=2
Specifically, this part
Is this actually the case (or close to it)? Is AMD's FreeSync solution using some sort of prediction algorithm to guess how long the next frame will take to render, and using that as the vblank interval?
It seems like a very odd method to use, which is why I ask whether you really are using it, rather than the method the article says NVIDIA uses: making the monitor hold the vblank for as long as needed.
And no 280X? What's the point of AMD even selling this card?
It still requires a controller module, and I can't see monitor manufacturers just popping these in for free or with no incentive. Don't misunderstand me, I want this to be a reality, but the Q&A is so vague, with no solid info at all, to be perfectly honest.
It's a VESA standard, so there won't be any beta. It should also be very easy for AMD to implement via a simple driver patch, which I'm sure they've already got fully working; otherwise they wouldn't have pushed for the tech in the first place (if they couldn't do it, why push for it?)
Either I missed this bit in the FAQ when writing my previous post, or it has been updated since writing that post with this statement:
I'm not sure how to interpret this. On the one hand, you could read it as saying that the graphics card simply fires off new frames when the monitor is ready for them, since it already knows the supported range; presumably the monitor would just hold each frame for as long as needed under this system.
On the other hand, especially given that last sentence, you could read it as the monitor being told, when it receives a frame, to hold it for a time predicted by the graphics card (which the card knows is acceptable, since it has determined the monitor's minimum and maximum refresh rates), rather than the monitor constantly asking "do I need to keep holding this frame?" and the graphics card replying with its answer.
Again, if I had to guess, the method without frame-time prediction seems more likely, but there might be some complex reason for requiring frame-time prediction. Just a simple yes-or-no answer to "does FreeSync use a method for predicting how long a frame will take to render, or how long it will need to stay on the monitor?" would be great!
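To make the two readings above easier to compare, here is a toy contrast of both models. Neither is confirmed as FreeSync's actual mechanism; the frame times and the naive "last frame's time" predictor are purely illustrative assumptions.

```python
# Toy comparison of the two hypothesized variable-refresh models.
# All numbers are made up for illustration.

FRAME_TIMES_MS = [14.0, 9.0, 22.0]  # hypothetical real render times

def hold_until_ready(frame_times):
    """Model A: the display holds the current frame until the GPU pushes
    a new one, so each displayed interval equals the real render time."""
    return list(frame_times)

def predicted_vblank(frame_times):
    """Model B: the GPU predicts each frame's duration (naively, from the
    previous frame) and schedules the vblank from that prediction. A long
    prediction delays a fast frame; a short one still waits for the frame."""
    shown = []
    prediction = frame_times[0]           # seed with the first frame
    for t in frame_times:
        shown.append(max(prediction, t))  # the later of prediction and reality
        prediction = t                    # naive predictor: last frame's time
    return shown

print(hold_until_ready(FRAME_TIMES_MS))   # [14.0, 9.0, 22.0]
print(predicted_vblank(FRAME_TIMES_MS))   # [14.0, 14.0, 22.0]
```

Under this toy model, the prediction-based scheme only differs when a prediction is wrong (the fast 9 ms frame is held for 14 ms), which is roughly why the poster finds Model A the more natural guess.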
Been looking at monitors lately; I want to upgrade to a 27" 144 Hz monitor. Guessing I should hold fire right now until the FreeSync ones are available?
I think Warsam said the drivers will be out when the first FreeSync monitor arrives.