AMD FreeSync coming soon, no extra cost... shocker.

Somehow I think that won't be the case. I'd be pleasantly surprised if the new 1.2a standard has features that are backwards compatible, but I doubt it...

On support going back to the 5000 series, TechPowerUp says:
FreeSync taps into a lesser known feature that AMD Radeon GPUs have had for the past three generations (i.e. since Radeon HD 5000 series), called dynamic refresh rates. The feature allows GPUs to spool down refresh rates to save power, without entailing a display re-initialization (the flicker that happens when a digital display is sent a signal with a new resolution and refresh rate), on supported displays. Dynamic refresh is reportedly also a proposed addition to VESA specifications, and some (if not most) display makers have implemented it. On displays that do, AMD Catalyst drivers already run dynamic refresh rates. For display makers, supporting the technology won't require buying licenses, or integrating specialized hardware into the displays.

http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html
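
To make the dynamic refresh idea above a bit more concrete, here's a rough sketch of the per-frame logic as I understand it (my own illustration, not AMD's driver code; the panel limits and function names are made up): the source just has to keep the gap between scanouts inside the refresh window the panel advertises.

```python
# Rough sketch of per-frame dynamic refresh (illustrative only, not driver code).
# Hypothetical panel limits that a real sink would advertise via EDID/DPCD.
PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

MIN_INTERVAL_MS = 1000.0 / PANEL_MAX_HZ   # ~6.9 ms between refreshes at most
MAX_INTERVAL_MS = 1000.0 / PANEL_MIN_HZ   # 25 ms between refreshes at least

def next_scanout_delay(frame_time_ms: float) -> float:
    """Clamp the GPU's frame time to the panel's variable refresh window."""
    if frame_time_ms < MIN_INTERVAL_MS:
        # Frame finished faster than the panel's max refresh: wait for it.
        return MIN_INTERVAL_MS
    if frame_time_ms > MAX_INTERVAL_MS:
        # Frame took too long: the panel has to refresh anyway (repeating the
        # previous frame); the new frame goes out on the next cycle.
        return MAX_INTERVAL_MS
    # Inside the window: the refresh simply tracks the render rate.
    return frame_time_ms

if __name__ == "__main__":
    for ft in (5.0, 12.5, 20.0, 33.3):   # example frame times in ms
        print(f"frame time {ft:5.1f} ms -> scanout after {next_scanout_delay(ft):5.1f} ms")
```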
 
My guess is there will be a small boom in the 4K DP 1.2a monitor market. I wanted to change my monitor too, but thought I would wait and see if FreeSync gets adopted.
Just need the price to drop to around 300€ and it would be perfect :D

Why?

4K still requires exotic hardware for playable frame rates. Adaptive refresh rate technologies will only help alleviate low-FPS concerns, not eradicate them.

Nobody will be playing modern titles at 4K on a single GPU, G-Sync/Adaptive-Sync or not.
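
To put a number on that (a toy calculation of my own, with made-up figures): at 40 FPS on a fixed 60 Hz panel with v-sync, frames can only appear on 16.7 ms refresh boundaries, so the pacing alternates 16.7/33.3 ms; with adaptive refresh every frame appears after a steady 25 ms. Smoother, but still only 40 FPS either way.

```python
import math

# Toy comparison (my own illustration, made-up numbers): frame pacing at a
# 25 ms render time (40 FPS) on a fixed 60 Hz panel with v-sync versus an
# adaptive-refresh panel.

RENDER_MS = 25.0                    # 40 FPS
FIXED_REFRESH_MS = 1000.0 / 60.0    # ~16.7 ms

def fixed_vsync_display_times(n_frames: int) -> list[float]:
    """With v-sync, a frame can only be shown on the next 60 Hz boundary."""
    times = []
    for i in range(1, n_frames + 1):
        ready = i * RENDER_MS
        # Small tolerance so a frame ready exactly on a boundary makes it.
        boundary = math.ceil(ready / FIXED_REFRESH_MS - 1e-9)
        times.append(boundary * FIXED_REFRESH_MS)
    return times

def adaptive_display_times(n_frames: int) -> list[float]:
    """With adaptive refresh, the panel refreshes when each frame is ready."""
    return [i * RENDER_MS for i in range(1, n_frames + 1)]

def intervals(times: list[float]) -> list[float]:
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

if __name__ == "__main__":
    print("fixed 60 Hz + v-sync:", intervals(fixed_vsync_display_times(6)))
    print("adaptive refresh:    ", intervals(adaptive_display_times(6)))
    # Fixed: alternating ~16.7/33.3 ms (judder). Adaptive: steady 25 ms.
    # Both still average 40 FPS, which is the point being made above.
```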
 
Source?

The specification has only just been ratified, so until I see it running on existing hardware I'll reserve my judgement. Seems like too good a bone for all the hardware vendors to pass on when they can flog you new kit.

Awaiting tedious/backwards/stupid reason why 1.2a will not work on existing kit :D

AMD already demonstrated the technology on existing laptops
 
Source?

The specification has only just been ratified, so until I see it running on existing hardware I'll reserve my judgement. Seems like too good a bone for all the hardware vendors to pass on when they can flog you new kit.

Awaiting tedious/backwards/stupid reason why 1.2a will not work on existing kit :D

AMD already demonstrated the technology on existing laptops

Correct. :)
 
Using eDP, a standard that has existed for some time with the sole purpose of power saving. Is that a cast iron guarantee it will work on external monitors with discrete desktop GPUs?

Let's not jump to conclusions just yet.
 
Why?

4K still requires exotic hardware for playable frame rates. Adaptive refresh rate technologies will only help alleviate low-FPS concerns, not eradicate them.

Nobody will be playing modern titles at 4K on a single GPU, G-Sync/Adaptive-Sync or not.


Exotic at 3000€, yes, but not at 300-400€. Besides, having a 4K monitor doesn't force you to play at 4K with settings maxed, nor does it force you to play at the full resolution. You can play at 1080p on a 4K monitor until you have the horsepower, but at least you'll already have the monitor there :p
 
Using eDP, a standard that has existed for some time with the sole purpose of power saving. Is that a cast iron guarantee it will work on external monitors with discrete desktop GPUs?

Let's not jump to conclusions just yet.

 
Exotic at 3000€, yes, but not at 300-400€. Besides, having a 4K monitor doesn't force you to play at 4K with settings maxed, nor does it force you to play at the full resolution. You can play at 1080p on a 4K monitor until you have the horsepower, but at least you'll already have the monitor there :p

I was not talking about the monitor; you need exotic multi-GPU set-ups to make 4K worthwhile.

WHY would you buy a 4K monitor NOW to play at 1920x1080, only to be able to play at 4K years down the line when you can afford the GPU power? You are stuck with an outdated 4K monitor that would have cost you a fraction of the price if you had just waited. Or, if you had waited and spent the same, you'd have a superior 4K monitor.

This is nothing but false logic.

Also this from the hardware.fr post is interesting:

Summary
Extend the "MSA TIMING PARAMETER IGNORE" option to DisplayPort to enable source based control of the frame rate similar to embedded DisplayPort.

Intellectual property rights
N/A

Benefits as a result of changes
This enables external DisplayPort to take advantage of the option to ignore the MSA timing parameters and have the sink slave to the source's timing, realizing a per-frame dynamic refresh rate.

Assessment of the impact
The proposed change enables a per-frame dynamic refresh rate for single-stream devices that expose the dynamic refresh rate capability in EDID for the DisplayPort interface. The source will be able to enable this with an SST interface or an MST hub with physical ports. Logical MST port support of the feature is not included as part of this SCR. A generic framework to enable such a feature for logical ports is required, one that can accommodate other features where stream-related configuration is programmed in DPCD.

Analysis of the device software implication
SST devices which support the "MSA TIMING PARAMETER IGNORE" option will be able to expose the capability in EDID and DPCD to let the source enable dynamic refresh rate.
The source driver would have to be updated to parse EDID and enable the "MSA TIMING PARAMETER IGNORE" feature when the source wants the sink to be refreshed based on its update rate.

Analysis of the compliance test and interop implications
Currently this feature is tested as part of the eDP CTS. New tests would have to be added as part of the DP LL CTS and EDID CTS.
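
Reading that SCR, the driver-side flow it describes would presumably look something like this (a speculative sketch only; the capability fields, refresh range and helper functions below are hypothetical stand-ins for the real EDID/DPCD reads, not the actual register layout): check that the sink exposes the capability, and only then set the "MSA timing parameter ignore" option so the sink slaves its refresh to the source's per-frame timing.

```python
from dataclasses import dataclass

# Speculative sketch of the flow described in the SCR above. Everything here
# (field names, refresh range, helper functions) is a hypothetical stand-in;
# the real EDID/DPCD layout is defined by VESA, not by this snippet.

@dataclass
class SinkCaps:
    msa_timing_param_ignore: bool   # capability advertised via DPCD
    dynamic_refresh_in_edid: bool   # capability advertised via EDID
    min_hz: int
    max_hz: int

def read_sink_caps() -> SinkCaps:
    # Stand-in for real EDID/DPCD reads over the DisplayPort AUX channel.
    return SinkCaps(True, True, 40, 144)

def write_msa_timing_param_ignore(enabled: bool) -> None:
    # Stand-in for the hypothetical AUX write that would set the option.
    print(f"MSA timing parameter ignore set to {enabled}")

def enable_dynamic_refresh(caps: SinkCaps) -> bool:
    """Enable per-frame dynamic refresh only if the sink advertises it."""
    if not (caps.msa_timing_param_ignore and caps.dynamic_refresh_in_edid):
        # The feature is optional in DP 1.2a, so a compliant sink may
        # simply not implement it; the source then keeps a fixed refresh.
        return False
    write_msa_timing_param_ignore(True)
    return True

if __name__ == "__main__":
    caps = read_sink_caps()
    print("dynamic refresh enabled:", enable_dynamic_refresh(caps),
          f"({caps.min_hz}-{caps.max_hz} Hz window)")
```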
 
Again, no official source/confirmation.

Additionally, the whole thing is an OPTIONAL part of the standard; monitor vendors might not even bother and just wait for 1.3.

Anshel Sag (Bright Side Of News Editor) has seen it working first hand.
 
Again, no official source/confirmation.

Additionally, the whole thing is an OPTIONAL part of the standard; monitor vendors might not even bother and just wait for 1.3.

If it's added to 1.2a, wouldn't it just automatically carry over to 1.3 or whatever DP comes after? Or do they remove all previous features and add completely new ones?
As for it being optional, just lol. This is tremendous added value for monitors targeted at gamers; any manufacturer who doesn't add it would be stupid and have no vision. Some will include it to add value and increase the price, others will add it to sell more volume for bigger market share. Competition will do the work, prices will drop, and I can buy a 300€ 4K monitor... banzai
 
Anshel Sag (Bright Side Of News Editor) has seen it working first hand.

On the original eDP demos, or in its current 1.2a-ratified state on a discrete GPU and an external DP monitor?

Seriously LtMatt, I thought you did a bit more thinking than this. Ask the right questions, not the questions where you already know the answer you want to hear! :p

If it's added to 1.2a, wouldn't it just automatically carry over to 1.3 or whatever DP comes after?

Not necessarily. Adaptive-Sync is now an optional part of the 1.2a spec, which means monitors do NOT need to include support for it to be DisplayPort 1.2a compliant.
 
Well, full credit to nVidia for getting this out first, and good to see AMD not sitting back. nVidia took the bull by the horns and has the tech out now, and AMD will have it out at some stage.

The choice is: pay the extra for early smoothness in games now with G-Sync, or save a few quid and buy when Adaptive-Sync is available (possibly a year's time).

Good stuff.
 
G-Sync = you have to buy a new supported monitor.
FreeSync = you have to buy a new supported monitor.

Can someone explain where the 'free' comes in? Aside from AMD's marketing, that is.
 