
Samsung the First FreeSync Display?

AMD used its Future of Compute event in Singapore as a platform for announcements from its partners and clients in support of its own technology and ecosystem-building efforts. Samsung, which currently holds 60 percent of the global market for 4K UHD monitors, announced broad support of the FreeSync standard in all products starting in 2015. The Korean giant introduced five new UHD monitors: the UE 590 series will come in 23.6-inch and 28-inch models, while the UE 850 series will come in 23.6-inch, 27-inch and 31.5-inch models. AMD also announced that the world's three biggest manufacturers of UHD monitor scalers would be building in support for FreeSync.
The standard brings about smoother gameplay, without frame tearing and stuttering, by synchronizing a monitor's refresh rate with the stream of frames being fed into it. FreeSync has already been adopted by VESA and will be part of the DisplayPort 1.2a specification. Richard Huddy, Chief Gaming Scientist at AMD, referred to Nvidia's competing G-Sync standard, calling it inferior because of its dependence on proprietary hardware and the fact that Nvidia charges partners a license fee.

http://gadgets.ndtv.com/games/news/...re-enix-and-more-announce-partnerships-623546
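The "synchronizing a monitor's refresh rate with the stream of frames" bit can be sketched as a toy model; all the numbers and function names here are illustrative, nothing to do with the actual driver/hardware implementation:

```python
import math

# Toy model of fixed vs adaptive refresh (all numbers illustrative; the
# real FreeSync/G-Sync logic lives in the display controller and driver).

def display_times_fixed(frame_ready, refresh_interval):
    """Fixed refresh: a finished frame waits for the next scanout tick."""
    return [math.ceil(t / refresh_interval) * refresh_interval
            for t in frame_ready]

def display_times_adaptive(frame_ready, min_interval):
    """Adaptive sync: scanout starts as soon as a frame is ready, provided
    the panel's minimum refresh interval has elapsed since the last one."""
    shown, last = [], float("-inf")
    for t in frame_ready:
        start = max(t, last + min_interval)
        shown.append(start)
        last = start
    return shown

# Frames rendered at an uneven ~45fps (timestamps in ms)
ready = [0.0, 20.0, 45.0, 70.0, 88.0]
print(display_times_fixed(ready, 16.7))   # 60Hz panel quantises frames to a 16.7ms grid
print(display_times_adaptive(ready, 6.9)) # 144Hz-capable panel tracks the game's pacing
```

The point the toy model shows: with a fixed 60Hz refresh, uneven frame times get quantised to the panel's grid (that's the stutter), while adaptive sync lets the panel follow the game.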
 
Can't say I'd be keen on UHD at 24"; IMO it's pushing usability even at 28".

IMO for gaming, 28" 1440p at 120Hz combined with access to DSR is kind of the sweet spot, with G-Sync or FreeSync just being the icing on the cake; we just need a bit better image quality than current TNs.
 
What Nvidia users didn't get, and likely won't admit, when we all said it: standards take time to come up with/make, then get accepted, then get adopted, and when they are, they are, surprisingly, shockingly... standards.

You move the industry forward with standards; DX10 then 11 etc. became the standard and the industry moved forward for the most part. Standards drag the industry forward kicking and screaming, but they get there, while random exclusive locked-in things hold the industry back (including for those people who get the exclusives).

They take longer and are of less benefit to a single company, but better for all users. In this case FreeSync will help Nvidia users, while G-Sync only locked Nvidia customers in to overpriced versions of the same screens. At worst FreeSync will force Nvidia/G-Sync panel makers to drop their prices; at best they'll support the standard and Nvidia users will have a far larger range of products to choose from.

AMD could have had the idea, made a damn simple FPGA and had it to market a year ago as well, but there is a right and a wrong way forward for the industry as a whole, and it's absolutely no surprise at all which way AMD, or Nvidia, went with this.
 

I guess we can be thankful if/when AMD get there in the end.
 

Meh, I find 23-24" 1080p not sharp enough for me; I've wanted a significantly higher-res panel in a similar size for ages. Now 4K might be a little much at 24", but I think 1440p at 24" is a sensible step, and 4K screens in the 27-30" bracket make a lot of sense.

Personally I never wanted to go from a 23" 1080p to a 27-30" 1440p; I don't really want a massive monitor to start with.

I just hope these idiot monitor makers don't give up on 120Hz, with bozos like the guys at Ubisoft deciding that, because of FreeSync/G-Sync, more than 60Hz doesn't matter at all now.

120Hz, IPS, FreeSync, 1440p at 24" or 4K at 28" and I'll sell a kidney to buy one... for an OLED, I'd have a kid just to sell it to buy the screen :p

I don't keep hugely up to date with screens; since they finally made a standard for 4K/60Hz over a single cable (AMD again making it and giving it to VESA to get it done when no one else would), has there been anything announced with 2x DisplayPort cables and 120Hz/4K screens?

Considering most early 4K/60Hz screens had 2x DVI/HDMI cables, you'd think that the next step, once moving to 60Hz on a single cable, would be to also get 120Hz versions from newer two-cable versions of the monitors.

EDIT:- looks like Asus and others are waiting for DP 1.3 to be finalized, which will bring 120Hz/4K over a single cable. But as above, considering they were happy to bodge together two cables/two tiles to make 4K screens for the past few years till they could do that on one cable, why aren't they willing to do the same with 2 DisplayPort cables now, grrr.
 
Don't have any argument against 1440p at 24". Nothing on the horizon AFAIK for 4K 120Hz; theoretically it should be possible with MST - if they can get 2560x1440@144Hz, then you'd think two tiles at 1920x2160@120Hz each would be possible.
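FWIW, back-of-the-envelope pixel-rate arithmetic backs that up (24-bit colour, ignoring blanking overhead; Python just as a calculator):

```python
# Rough pixel-rate comparison (ignores blanking overhead, assumes 24bpp).
def pixel_rate(w, h, hz):
    """Pixels per second for a given mode."""
    return w * h * hz

proven = pixel_rate(2560, 1440, 144)    # existing 1440p/144Hz panels
per_tile = pixel_rate(1920, 2160, 120)  # one half of a two-tile 4K/120 setup

print(f"{proven / 1e6:.0f} Mpx/s vs {per_tile / 1e6:.0f} Mpx/s per tile")
# Each 1920x2160@120 tile needs fewer pixels per second than a single
# 2560x1440@144 stream, so the per-link rate shouldn't be the obstacle.
```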
 

Updated previous post, they are coming; last time I googled it there was nothing. As you say, considering they did 2 tiles/2 DVI cables to get 60Hz for 4K, it seems odd that when the cables got updated they couldn't still do 2 tiles, but at 60Hz per cable rather than 30Hz.

DP 1.3 is coming with a 120Hz/4K spec and 60Hz at 8K, some serious bandwidth coming with that cable. Shame though: so many people liked a news story I read where Asus said they were 1-2 years away (stated May 2014), saying something along the lines of "but you'd need serious GPU power to run 120fps at 4K anyway". Idiots: 120Hz is better if you're running anything from 1 to 500fps; 120Hz screens have faster response, noticeably less motion blur, and are massively better at any frame rate. Scrolling a webpage is vastly smoother at 120Hz than 60Hz; all usage is better with 120Hz over 60Hz as far as I'm concerned.
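Rough numbers on just how serious that bandwidth is (24 bits per pixel, ignoring blanking overhead; the 32.4 Gbit/s figure is DP 1.3's published HBR3 link rate):

```python
# Raw video bandwidth at 24 bits/pixel, ignoring blanking intervals.
def gbps(w, h, hz, bpp=24):
    """Gigabits per second of raw pixel data for a given mode."""
    return w * h * hz * bpp / 1e9

uhd_120 = gbps(3840, 2160, 120)   # ~23.9 Gbit/s
uhd8k_60 = gbps(7680, 4320, 60)   # ~47.8 Gbit/s

# DP 1.3's HBR3 link is 32.4 Gbit/s raw, roughly 25.9 Gbit/s of payload
# after 8b/10b encoding: enough for 4K/120 at full 24-bit colour, while
# 8K/60 needs chroma subsampling to fit.
print(f"4K/120: {uhd_120:.1f} Gbit/s, 8K/60: {uhd8k_60:.1f} Gbit/s")
```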

No chance in hell I'd drop back to 60hz for a 4k screen, just not happening.

OLED, I'm not even sure how they work tbh, if they have a given Hz or not; they supposedly have zero motion blur and insane response times. I think with OLED, Hz may actually not be an issue and motion blur/ghosting will be a thing of the past.
 
But would you go and buy a new monitor now, knowing it's only 6 weeks?!

No way, who knows, it could be crap or it will simply do what it's meant to do; I don't even know if I'll jump on FreeSync though.

From reports, G-Sync is problematic using dual or more GPUs; it works under specific driver revisions, then they break it again: 3rd driver in a row where it isn't working.:(

Depends on how it's implemented; if it is an all-encompassing profile it might fare better than Nvidia's game-specific profiles (if that's how it's employed), then it may be better game-support wise.

All speculation on my part, but big thanks goes out to Nvidia for debuting the tech and even bigger thanks to AMD for not milking the tech.:D
 

Yeah, same here, I have my questions and I'll be one of the first ones here reporting back!
My reason for buying a new display is that I really want to move away from 1080p.
 
http://www.blurbusters.com/faq/oled-motion-blur/

Quite an interesting piece on motion blur. OLED is not 100% the be-all and end-all of motion blur, though this was a comparison with one particular monitor. Effectively, LightBoost/strobing on an LCD screen is doing a good job, though it's not perfect yet and on my panel feels quite flickery/straining. It would be nice to have a good FreeSync panel with a decent and quick way to flick LightBoost and FreeSync on/off, and see which works on a particular game. Above probably 70-80fps you're likely to find LightBoost better than Xsync; anything Ubisoft, and Xsync will be preferable :p

That page effectively describes why we get motion blur and why 120Hz beats the living pants off a 60Hz screen under any kind of usage.

I still think it means an OLED in normal mode is significantly more responsive than an LCD not in LightBoost mode, so OLED + Xsync will offer a lot less blur than LCDs, and also much better contrast. LightBoost is a good LCD motion-blur-beating method but has downsides: worse contrast (which is already way lower than OLEDs' to start with), more flicker, and it doesn't work with G-Sync (and likely FreeSync; both are based on changing frame rates, while fundamentally LightBoost has to know the gap between frames and it has to be consistent).
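The persistence arithmetic behind all that, following the Blur Busters reasoning above (the object speed and strobe length are just illustrative numbers):

```python
# On a sample-and-hold display, perceived motion blur is roughly
# proportional to how long each frame stays lit (its persistence).
def blur_px(persistence_ms, speed_px_per_s):
    """Approximate blur trail width for an object moving at a given speed."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960  # object moving 960 px/s across the screen (illustrative)

print(blur_px(1000 / 60, speed))   # 60Hz hold: ~16 px of smear
print(blur_px(1000 / 120, speed))  # 120Hz hold: ~8 px, half the blur
print(blur_px(1.4, speed))         # ~1.4ms strobe (LightBoost-style): ~1.3 px
```

Which is why 120Hz halves the blur even with no strobing at all, and why strobing beats both once the frame rate is high enough to keep it consistent.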
 
I don't keep hugely up to date with screens, since they finally made a standard for 4k/60hz and single cable(AMD again making it and giving it to Vesa to get it done when no one else would), has there been anything announced with 2x display port cables and 120hz/4k screens?

But as above, considering they were happy to bodge together two cable/two tiles to make 4k screens for the past few years till they could do that on one cable, why aren't they willing to do the same with 2 display port cables now, grrr.

:rolleyes:

It wasn't 2 cables, it was 2 tiles using a single cable via MST; AMD had a hand in getting MST added to DP.

MST monitors are the ones that don't really work right; they have now moved over to SST, which is nothing to do with AMD. DP 1.2 always supported 4K/60Hz, but the scaler makers were slow to respond, and monitor makers lashed 2 scalers together via MST to get the early ones working.

The problems everyone has had with MST are probably why they aren't bothering with it for 4K/120; that, and no one has produced a panel that would support it - even the 4K/60 ones show signs of being overdriven.

IIRC there was a very early 4K monitor that supported 4K via 2 HDMI cables, but it was stupid expensive, like £15k or something.
 

Cheers Nvidia. 3 driver revisions and still two top proprietary features don't play nice together on SLI systems.

Getting complacent, need to pull their finger out big time atm.

Greg posted a thread and it contained the same: SLI is poor and G-Sync only works via one Titan.

If AMD can't get CrossFire working generally from the off, then it will miss a purchase from me.
 