AMD FreeSync coming soon, no extra costs... shocker

Most likely... the consoles are Jaguar APUs, and those are supported on the desktop, sooo...?

But based on what cores they use in the PC space, that'd be a no surely? (78/77)

But they don't have to be identical, and the R9 290 would have been about before the consoles launched too.

So yeah, really need AMD to comment :p
 

The PS4 GPU seems to have enhanced compute over GCN 1.0, so it's quite possible it supports the updated feature set.
 
I think console chips are always somewhat future-proofed / more programmable; the Xenos in the 360 was able to support new features post-launch.
 
Hmm, that's a good point then.
However, you can get revisions of consoles; the Xbox 360 launched without HDMI.

Honest question, do many TVs have a DisplayPort?
Surely most console users will use a TV rather than a monitor?

Just seems a small audience to do a revision for if TVs don't.
Also, does DisplayPort do sound?
 
The first discrete GPUs compatible with Project FreeSync are the AMD Radeon™ R9 290X, R9 290, R7 260X and R7 260 graphics cards. Project FreeSync is also compatible with AMD APUs codenamed “Kabini,” “Temash,” “Beema,” and “Mullins.” All compatible products must be connected via DisplayPort™ to a display that supports DisplayPort™ Adaptive-Sync

So unless it's one of those GPUs and has a DisplayPort, IMO you're out of luck.
 

I don't know many TVs that have DP (however, you'd hope they'd push for it more).

I don't even own a TV, monitors all the way :p

Some console gamers do opt for monitors over HDTVs, but I assume HDTVs dominate the sales.

DisplayPort does sound.

DP, as it stands right now, does everything HDMI does and more, and royalty-free. It's simply superior as far as I'm aware?
 
Well, that was Thracks' quote. Whether or not they're compatible is up in the air, but I wouldn't hold my breath; I'm sure the GPUs were sorted before release and are most likely not compatible. Time will tell.

If it was a simple thing to add, I think the 280X would have had it added instead of the cards just being renamed.
 
But the 290 would have been sorted well before launch too.
They may well not be compatible, but you can't judge them on their desktop sibling.
 
An interesting technical note on the FreeSync prototype shown in the video earlier is that it has a narrow variable refresh range of 40-60Hz. G-Sync goes lower than that, to 30Hz, and if you go below the variable threshold you get a stutter fest.

Wonder if this will change with different panels, or if it's a limitation of the spec?
 

That's due to the monitor used. I think if a better monitor is used the range increases.

Forbes said:
What is the supported range of refresh rates with FreeSync and DisplayPort™ Adaptive-Sync?
thracks said:
AMD Radeon™ graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort™ Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.
Source
http://www.forbes.com/sites/jasonev...adaptive-sync-gets-added-to-displayport-spec/

Anand said:
AMD’s release also contains an interesting note on supported refresh rates: “Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.” While the upper-bounds of those ranges are in-line with numbers we’ve seen before, the sub-30Hz refresh rates on the other hand are unexpected. As you might recall from our look at G-Sync, even though LCD monitors don’t suffer from anything quite like phosphor decay as CRT monitors do, there is still a need to periodically refresh an LCD to keep the pixels from drifting. As a result G-Sync has a minimum refresh rate of 30Hz, whereas AMD is explicitly promising lower refresh rates. Since the pixel drift issue is an underlying issue with the LCD technology there is presumably something in Adaptive-Sync to compensate for this – the display is likely initiating a self-refresh – though at the end of the day the variable refresh rate means that you can always set the refresh rate to a multiple of the targeted refresh rate and get the same results
Source
http://www.anandtech.com/show/8129/computex-2014-amd-demonstrates-first-freesync-monitor-prototype
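To make the "multiple of the targeted refresh rate" point from the Anand quote a bit more concrete, here's a rough sketch of the arithmetic. This is purely my own illustration, not AMD's driver logic, and the 40-144Hz panel limits in the example are made up: if the game's frame rate drops below the panel's minimum variable refresh rate, the same frame just gets scanned out more than once so the panel stays inside its supported window.

Code:
def scanout_plan(frame_time_ms, panel_min_hz, panel_max_hz):
    """Return (scanouts_per_frame, effective_refresh_hz) for one game frame."""
    frame_hz = 1000.0 / frame_time_ms
    if frame_hz > panel_max_hz:
        # Faster than the panel can refresh: cap at the panel maximum.
        return 1, panel_max_hz
    scanouts = 1
    while frame_hz * scanouts < panel_min_hz:
        scanouts += 1  # repeat the frame until the effective rate is in range
    return scanouts, frame_hz * scanouts

# Example: a 40ms (25 fps) frame on a hypothetical 40-144Hz panel gets
# scanned out twice, for an effective 50Hz refresh instead of a stutter.
print(scanout_plan(40.0, 40, 144))   # -> (2, 50.0)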
 
Nice information, Matt. I'm very interested in trying this vs sync off. If this can give me the same frame performance and response time as sync off, plus the added bonus of screen tearing removed, I'll be very impressed.
 
In terms of which cards will support gaming FreeSync, this is likely something to do with hardware-implemented frame pacing and possibly the XDMA implementation. Don't forget that if every GPU going forward has hardware frame pacing, and most people dumping money on new monitors will be on newer GPUs (particularly with the shift towards higher-res screens finally), then programming for current and future hardware becomes sensible, and writing something new to support old hardware that needs different code becomes difficult.

When you look at how much effort MS put into the XO chip to enable things to happen with very low latency for Kinect, latency-dependent things are often down to the hardware, and thus implementing good frame pacing (which is ENTIRELY prediction-based for Nvidia and AMD) is hard to do in software due to latency problems.

Anyway, software isn't always viable for extremely latency-sensitive situations like frame pacing; if it takes 3ms longer, that's the difference between missing a frame update and not. It's make or break.

AMD said that variable refresh has been doable for a while, and it has because... it's ridiculously easy.

Monitor side, we are talking about updating the screen after the frame buffer is full, or asking it to count a specific time before updating again; both insanely simple.

Cable side, DP simply has an extra channel, a simple piece of copper that can send a signal; that's it, nothing complex. It's just a case of monitors committing to listening to that signal. GPU side, the variable refresh itself is literally just a case of having that extra channel hooked up to the GPU, the same way some GPUs have a DVI output but are missing the analogue pass-through: the pins/holes are there, they just aren't connected. So AMD some generations ago made their GPUs completely capable of sending the required message down that cable; this is trivial stuff.
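A toy model of the monitor-side behaviour described above: wait for the GPU's "new frame" signal on that extra channel, but never hold the current image longer than the panel's minimum refresh rate allows. All of the names and numbers here are made up for illustration; this is not the actual DisplayPort Adaptive-Sync protocol.

Code:
import time

PANEL_MIN_HZ = 40                  # assumed panel limit for the example
MAX_HOLD_S = 1.0 / PANEL_MIN_HZ    # longest we can hold the current image

def monitor_loop(frame_ready, scan_out):
    """frame_ready(): True once the GPU has signalled a new frame on the
    extra channel; scan_out(): drives the panel with the current buffer."""
    while True:
        deadline = time.monotonic() + MAX_HOLD_S
        while not frame_ready() and time.monotonic() < deadline:
            time.sleep(0.0005)     # poll for the "frame ready" signal
        # Either a new frame arrived, or we hit the panel's deadline and
        # refresh anyway so the pixels don't drift.
        scan_out()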

Variable refresh rate itself does not provide smooth gaming; frame pacing is the MUCH more complex side of this, and Nvidia/AMD can do it in hardware, and do it less well in software.

Nvidia has multiple patents, linked to in the original G-Sync threads where they were incorrectly described as patents on variable refresh; they aren't. They are patents on their method for monitoring the rate of change of the frame rate: not the current frame rate, but how big the change in frame times is. It's critical to smooth gaming and to G-Sync/FreeSync.

By the looks of things, and which GPUs AMD is doing gaming FreeSync for, it's coupled to which GPUs have hardware frame-pacing checking. On Nvidia this means some dedicated transistors on die that run hardware-accelerated algorithms; from the patents, they basically compare one frame to the last, pixel to pixel, and decide how fast the rate of change is by how much of the image has changed, then keep a buffer of the previous X number of frames and a running guess at the next frame time.

Anyone that thinks frame pacing for either company isn't massively and deeply involved in prediction of frame time is incredibly badly mistaken.
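For what it's worth, here's a minimal sketch of what prediction-based frame pacing could look like: keep a small history of recent frame times and guess the next one from the trend. The window size and the simple linear extrapolation are my own assumptions; neither vendor's hardware implementation is public.

Code:
from collections import deque

class FrameTimePredictor:
    def __init__(self, window=8):
        self.history = deque(maxlen=window)   # last N frame times in ms

    def record(self, frame_time_ms):
        self.history.append(frame_time_ms)

    def predict_next(self):
        h = list(self.history)
        if len(h) < 2:
            return h[-1] if h else 16.7       # nothing to go on yet
        # Rate of change: average step between consecutive frame times.
        trend = sum(b - a for a, b in zip(h, h[1:])) / (len(h) - 1)
        return h[-1] + trend

# Example: frame times creeping up from 16ms towards 19ms.
p = FrameTimePredictor()
for t in (16.0, 16.5, 17.2, 18.0, 19.1):
    p.record(t)
print(p.predict_next())   # a bit above 19.1, since the trend is upward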

It's a shame older-gen cards don't support it yet; it's possible in the future that they can improve the software enough to get the latency to where it needs to be, but software-run comparisons will NEVER be able to beat a piece of dedicated circuitry that doesn't need to go back to the software to be checked on. It may get low enough that it's fine, but hardware will always be better.
 