
AMD freesync coming soon, no extra costs.... shocker

http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014

So AMD can use something they've supported in hardware for two generations; it's on at least Kaveri and Kabini, and likely Trinity onwards at a guess.

It uses a VESA standard to enable variable refresh rates; it's something some existing screens can already do, the standard just isn't completely finished.

Way to go Nvidia: taking a standard the industry was already moving towards with "freesync", jumping the gun early, adding hardware and cost, and then charging ONLY Nvidia customers the extra.

Nothing like taking something a monitor can already do (3D), calling it something new (3DVision), and charging only your own customers more to use a feature you advertise as a feature of your cards.

It's almost like that is exactly what I said was the case, that this could be done insanely easily. Little did I know that the actual feature required on monitors is part of the VESA standard, but I did say that in the future all screens would support this as essentially a normal feature.

So Nvidia fans, are you once again happy to be charged extra for something monitor makers already fully intended to support FOR FREE, where only Nvidia hardware buyers will be forced into the extra cost or locked out via drivers from a product you've already paid for?
 
As brilliant as this news possibly is (I'm going to assume/hope that this may work with my Samsung 700D with my R9 290 all kosher?)
AMD had to wait this long why?

It will work with the 290 for sure, a 7970 for sure, I think probably 5870 or maybe 6970 onwards.

As for time, VESA standards... I've made so many posts pointing out the hilarity of the awfulness of industry standards, in terms of how they get set and adopted. One of the key reasons we don't have screens above 1080p and higher-def smaller screens is a lack of a standard. Mobile devices don't have this issue as they design their own chips and can connect to the screens however the hell they want inside their own device. Samsung/Asus/whoever have to choose and use an industry standard, add particular ports to their monitors and then ship them so they work with everything, a very different situation. 4k screens come with all sorts of substandard stupid arrangements purely because no one could decide the best way forward.

AMD designed a standard, handed it off, one DisplayPort cable, sorted. It could have been done 5 years ago and saved people the insanity that is two dual-link DVI cables for one screen, with controllers that split the screen into two and have multiple issues.

When this feature got added to the VESA spec, who knows; when a screen out today was finalised in design, who knows.

Maybe we'll see that a huge number of existing screens support it; the much more likely scenario is some do, and potentially some can be firmware flashed, but realistically, as with most things, industries don't add extra features they don't think are required. I would hazard a guess it means new screens, but it will be a basic feature of the normal controller chips inside screens, no $150 cost increase for a stupid FPGA. Nvidia is doing it solely to lock in their own customers so they can disable this mode on screens they don't want it working on.

Obviously the new Asus ROG screen could have the free standard feature, but will cost more, have G-sync branding and use Nvidia's method.

If it is supported on older screens like my 700D as well, I'll be pretty surprised; I can hope, but I'm not expecting it.

As I've said in previous threads, I simply expect Nvidia's g-sync to be a lock-in and a way to get to market first, pay some monitor makers to push a g-sync version first and make both parties some extra cash at one group of users' expense. In the relatively near future I expect most screens to support it.

This is fundamentally useful tech for power saving. It's semi-used in mobile already, but in very different ways and for different reasons, and most of Nvidia's patents linked to in another thread were about mobile power-saving applications.

This is likely where the time to market goes: coming up with good algorithms to determine when you can drop the refresh rate and save power without affecting the end user. Get that right and you can save power on every screen produced, which for mobile is a killer feature, with the screen usually being the biggest power draw.
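Roughly the sort of heuristic I mean, as a made-up sketch (the function name, thresholds and decision logic here are all invented for illustration; this is not AMD's or anyone else's actual algorithm):

```python
# Hypothetical power-saving heuristic: drop the panel refresh rate when
# little or nothing on screen is changing. Illustrative only.

PANEL_MIN_HZ = 30    # assumed slowest refresh the panel can hold an image at
PANEL_MAX_HZ = 60    # assumed normal desktop refresh

def choose_refresh_hz(frames_changed_last_second, on_battery):
    """Pick a refresh rate from recent screen activity."""
    if frames_changed_last_second == 0:
        # Static desktop/idle: refresh as slowly as the panel allows.
        return PANEL_MIN_HZ
    if on_battery and frames_changed_last_second < PANEL_MAX_HZ:
        # Light activity (cursor blink, 24fps video): track the content rate,
        # but never go below what the panel can sustain.
        return max(frames_changed_last_second, PANEL_MIN_HZ)
    # Heavy activity or on mains power: full speed.
    return PANEL_MAX_HZ

print(choose_refresh_hz(0, on_battery=True))    # 30 - idle desktop
print(choose_refresh_hz(24, on_battery=True))   # 30 - 24fps video playback
print(choose_refresh_hz(60, on_battery=False))  # 60 - gaming / mains power
```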
 
Every post I read of yours is pure hatred towards Nvidia, isn't it?

You want to show us on the little green doll where they touched you? :D

Right or wrong about this issue, it gets tiresome.

I find it funny that when I post a fact that shows Nvidia in a bad light, it's just hate, it's never Nvidia doing something meh that people shouldn't actually support, it's purely me hating on Nvidia.

The entire g-sync situation, which I have seemingly called perfectly from the start, has been described as hate by Nvidia people on this forum, rather than simply technical information, which is all it was.

Go back and read my posts and the responses. I said Nvidia can't patent it, that gets called hate; I point out how simple the idea is, I get accused of posting rubbish because I hate Nvidia. I was actually just describing the situation and what would almost certainly happen, and I used examples (3dvision, sli, etc) to point out that this is generally how Nvidia has done things for donkey's years.

I dislike what Nvidia do in general to try and lock their own customers in, AMD out, and screw the industry as a by-product.... I've yet to see anyone explain why these are good things or why loving the idea would be "normal".

If Nvidia turn around and stop screwing everyone over as the default mode, I'll be happy; if they do something genuinely good for the industry you'll see me support it. If AMD intentionally screw over their customers, you'll see me complain about it. Maybe you could e-mail AMD and ask them to do more bad things for me to complain about so I can prove it to you.


Obviously there are now some laptops using DP to connect their internal screens and have PSR compatible panels, so AMD can show off FreeSync. As for it working on existing monitors, no chance at all I suspect. PSR requires additional hardware to work (thus increases manufacturing costs) and up till now it's been aimed at saving power more than anything else, so monitor manufacturers haven't had much motivation to support it.

In reality power saving is something monitor makers are VERY into and would support, but it would be heavily biased towards mobile, and trying to get manufacturers to put in any more effort than required when they don't need to becomes a blood-from-a-stone type of situation. I could see them happily do this on all their screens meant for mobile, but if there was a 1p cost per monitor in desktop to add the feature they'd hold back as long as possible.

The screen industry is a complete joke, 1080p still the best you can do at 24" (I think I've maybe seen like one higher-res smallish screen) while you can get a 4" mobile screen at 1080p......... it's insane, it's always been insane. At least, thank ****, we've seen some 4k 28" screens being done. It probably wouldn't have surprised me to see 4k screens starting at 40".

I've wanted a couple of super-high-def 22-27" screens for the past 5 years, and frankly there's no reason at all they haven't been standard for that long.

One of the issues is seriously stupid Vesa standards, no one agreeing on cabling and general standards for higher res. Nothing over 1080p at 120hz for so long is mostly down to the lack of choosing or creating a cable and standard that can run say a 1600p, or 4k screen at 120hz. Pretty much all it would take is some sensible thinking and agreeing on a choice between everyone....... asking grown men to agree on something in business is insane.

You have one company who probably has some stake in HDMI, would get 2p per screen and so is arguing for that, another who wants to save the 6p per screen that having three HDMI ports would mean and so wants DisplayPort, etc.

Essentially there hasn't been much reason, ever since LCDs were made, why "g-sync" couldn't have been a fundamental feature of the very first screens made. It's just that the industry was set in its fixed-refresh-rate thinking, and so that's what they continued to do, madness. It's not about AMD/Nvidia/the industry doing it first; it's, why didn't any of these idiots do this 10 years ago? It's pretty much mental.
 
lol is it actually called "Freesync" or is that just what it's been coined? Great troll if so.

If they have called it that, I shall laugh a lot. I think, but am not sure, that it's Anandtech trolling g-sync a bit. If they did call it that, it wouldn't be long before someone reviewing "freesync" would compare them as free-sync vs pay-sync.
 
Is this really that big a deal?
Didn't a lot of AMD owners (I'm sure there were a few Nvidia ones too)say how uninteresting and rubbish GSync looked?
Now all of a sudden this is something awesome?

Can you see anyone in here say g-sync or freesync is awesome? I didn't, I haven't spotted anyone else.

As before, from probably 90-120fps you will see essentially no difference; it may depend on the screen, some screens/games seem to tear a lot, others don't. If you've got 90fps+ with v-sync/triple buffering you'll get almost zero lag anyway, no tearing and be smooth as hell, and at 120fps it's just awesome anyway.

From 30-60fps, it should improve things; it WILL vary depending on the game and scene. Pendulum is still the absolute ideal way to demo it; as discussed, the increasing/decreasing framerate thing I'd need to see in real life. The demo purposefully changed framerate steadily, which is the perfect-world situation that won't ever happen. Frame pacing will be used to smooth quicker changes in frame rate, which will be interesting to see; it will induce some lag but hopefully smooth things out very well. I can see a situation in which framerate is jumping all over the place where g-sync could look iffy or offer very little benefit.
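To show what I mean by frame pacing smoothing out quick framerate changes, here's a made-up sketch (the smoothing factor, panel limits and function are invented; no driver is known to work exactly like this):

```python
# Illustrative frame-pacing idea: instead of presenting every frame the
# instant it is ready (which makes the refresh interval jump around when
# frame times spike), ease the presentation interval towards the new frame
# time over several refreshes, trading a little lag for smoothness.

PANEL_MIN_INTERVAL_MS = 1000 / 144   # fastest refresh the panel supports
PANEL_MAX_INTERVAL_MS = 1000 / 30    # slowest before the image must be resent
SMOOTHING = 0.3                      # how quickly pacing follows frame times

def paced_interval(prev_interval_ms, new_frame_time_ms):
    """Blend the latest frame time into the presentation interval."""
    target = prev_interval_ms + SMOOTHING * (new_frame_time_ms - prev_interval_ms)
    # Clamp to what the panel can actually do.
    return min(max(target, PANEL_MIN_INTERVAL_MS), PANEL_MAX_INTERVAL_MS)

# A sudden jump from ~16.7ms to ~33.3ms frame times is eased in over a few
# refreshes instead of landing as one jarring step (prints 16.7, 21.7, 25.2, 27.6):
interval = 16.7
for frame_time in [16.7, 33.3, 33.3, 33.3]:
    interval = paced_interval(interval, frame_time)
    print(round(interval, 1))
```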

I said before, at worst g-sync should never be worse, usually a small benefit, occasionally a huge benefit, but still mostly in a lower frame rate range. If you're gaming at about 90fps there will be very little benefit; above 90fps low persistence will be better than g-sync, but really OLEDs are the key to great low-blur/low-persistence gaming............ where the **** are they though :(
 
Wait, are you talking about the one that looks like a bin? I've heard it has heat issues - reaching 95°C under load apparently, don't have a source at the moment though.

It's just so different, yeah, a shiny dark metal rounded mini bin looking thing. I usually hate Apple stuff, hate white phones and the like, and don't really care for small form factors in general but it just looks so ridiculously small for a dual gpu high powered workstation. 95C probably isn't a problem as that is where the gpu's are designed to work, I can't remotely imagine a case like that running well and cool, it's freaking ridiculously small for a workstation, maybe too small... it's very apple in that sense.

EDIT:- might be getting ahead of myself, I keep thinking something new from Apple must have new AMD gpus but obviously it's the last-gen FirePros in it (probably). It's a bit form over function, which I never usually go for, but I'd actually not mind having a case like that sitting on my desk or by a tv.
 
I can see the appeal for some people and for those who are in the film industry and are used to Apple products, but I just don't really like it personally - plus it's majorly overpriced in my opinion ($3000 I think, and its specs aren't really that great - you can build a much better workstation PC for that price).

I'm thinking you know, if I win the lottery I'd buy one. I might buy a case like that though if I see one and make a little Kaveri computer.
 
Well Anand said it didn't look as good as what Nvidia have shown so far, and you can bet your ass this was an absolute current best case scenario from AMD.

Which is the better bet? A more expensive but better solution, or cheaper but worse? You pay your money and make your choice. Much like 3dvision mentioned in the OP. Was more expensive and locked down by Nvidia, most certainly.

The difference with 3dvision was software support and nothing to do with a technology difference: it's a 120hz screen and a pair of glasses that sync to the 120hz, nothing more or less, and being locked out of 3d mode was based on the screen device ID, there was nothing different in the hardware. The software stack doesn't change the fact the hardware has zero difference; either the screen paid the 3dvision branding charge, or it didn't.

Likewise Anandtech did NOT say g-sync was better; they said Nvidia had a better demo, nothing more or less. They also said specifically the biggest downside to g-sync was lack of monitor support, and freesync appears to be down the path to fixing that.

This was also two notebooks with Kabinis in, not a GTX 780 being used. It was existing retail-bought laptops that supported the feature.

I'm sure Nvidia and AMD will improve their drivers for it in the future, but they didn't say the feature was worse than g-sync only that the demo wasn't as good. It was run on vastly different hardware and very different screens.

Long term, Nvidia won't have an extra cost for hardware, or better hardware; they won't continue to add a $150 cost through an FPGA the monitor makers certainly don't want to use long term.

An FPGA is essentially a slower custom chip you can program to act (to a degree) in any way you like. It will use vastly more power and hugely more transistors than a fixed-function silicon solution. The difference is that using an existing FPGA and programming it is just stupidly faster than waiting for a full product tape-out and chips to be ready. My opinion then, and even more so now (because there is essentially no reason other than time to market to use an FPGA here and to swap out an existing chip in an existing monitor for it), is that they did this to get monitors out first and to be able to lock them in.

In the future, when given the choice between that final silicon being available at a fraction of the price and power usage vs the expensive FPGA, it's no contest.

Future screens will have this feature as standard in the normal controller that costs, say, $5 a screen, and won't use a $100 FPGA instead. In the future Nvidia will be using standard hardware for this feature. They could possibly not lock Nvidia users to g-sync-branded screens; there's more chance of that if their users kick up a fuss. As with 3dvision/sli mobos/other stuff, they will likely continue to charge the likes of Asus, say, $10 a screen to be branded compatible and not locked out via drivers.
 
So no chance they actually thought it was impressive? I think it is great that AMD are embracing this monitor refresh but don't for one minute think they would have jumped on this if nVidia hadn't already done it.

So they've had it in drivers waiting on monitor makers....... but were never going to do it till Nvidia launched it? Nvidia launched a way to do variable refresh rates using an industry-standard method which screens haven't adopted yet, but Nvidia was the only one that was ever going to use it, even though it's going to be absolutely bog standard in mobile devices for power saving, and AMD/Intel were going to utterly ignore this?

Tosh, it's fairly clear that this was coming. Let's say everyone in the industry knew this was coming in June as standard on all new monitors launched after June; Nvidia decided, let's do it in an FPGA (quicker, vastly more expensive) and launch early.

If they hadn't done this, AMD, Nvidia and Intel would all have supported it in June anyway. As also shown during the g-sync threads, AMD people, and many people in general, have talked about the need for something better than v-sync for seemingly years.
 

Yes they were, I wasn't; there was a standard coming, it was going to be supported by everyone on every monitor, so Nvidia came along, branded their own version and decided to massively increase the costs, only for their own users..... to get it a couple of months early, while also pretending they came up with it.

Vesa would be surprised that someone would take what was a standard feature and pretend it was their own; anyone who has ever read anything about Nvidia should absolutely not be surprised though, it's what they do.

Layte, it's okay, you seemingly purposefully misinterpreted what Anandtech said, then mentioned 3dvision hardware as a reason the software was different, and decided me responding directly to both these points wasn't relevant....... a sure sign that you have no argument, I'll miss you.
 
I just want a high-res 24" screen with 120hz, 4k maybe; if prices take too long I'd happily take 1600p or something. I just don't want one giant 30" screen. I game, but I also code, watch tv and browse. I have two 24" screens currently and very frequently game on one screen while having other crap going on with the other screen. It just doesn't really work doing that on two massive screens. I might go as far as a 27", though 25-26" would be nicer. 30" is just too big, it reduces the pixel density too far anyway and makes the dual-screen thing too insane.

oled + 4k + 27-28" screen is probably going to be my next major screen upgrade. I think some 1080p, maybe even a 1440p 120hz freesync screen will probably be an in between step as oled + 4k being anywhere near the £400 mark is still at least a couple years off if not more.

I wish one of the manufacturers would just get some balls, kit out all their fabs for OLEDs and start selling them for low prices; they'd corner the market in months and have a huge lead on the competition. If you told me today I could get an OLED for £1000 but the blue LEDs might die in 3 years, I wouldn't risk it. If it was £400-500 and might only last 3 years, then for the response time, colour quality, refresh rates and low persistence I'd easily spend that much and just buy again in a few years if the blue LED thing really is an issue.
 
They'll likely jump straight to pushing 4k as a standard because of content. The push towards, and then mostly getting stuck at, 1080p was down to the content available for it. The average user knows film/tv comes in 1080p so that is what they want. 4k content is the next standard, so that's what the industry is now pushing towards. Not a bad thing, just a shame. They could have pushed 1440/1600p into 24" screens at good pricing years ago. The industry seemed to have no drive to do so, because they are stupid.

High def pc gaming somewhat stuck at 1080p........ when mobile phones use the same definition, it's ridiculous.

Haven't seen well-priced 4k screens yet; there was some crappy 39" TV that went to like $900, but it could only do 4k at 30hz, so almost useless for gaming. A sub-£500, 60hz, really good 4k screen is some way off, and an OLED version even further off.
 
The NVIDIA "fans" have 120hz or 144hz monitors, because many use Nvision. And they are not affected by the issues that appear on 60hz monitors.

Do your research, before you troll again :)

Errm, maybe do your research before you troll again: g-sync/freesync has precisely nothing to do with issues on 60hz monitors. Every g-sync monitor listed so far is a 144hz monitor; most of the first ones are the most widely used current 144hz Nvidia-branded monitors (3dvision, whatever other branding their monitors get) that are available from Asus/BenQ.

So you can buy an Asus 144hz screen without g-sync, or pay $150, or was it $200, more for the g-sync version of the same 144hz screen, with g-sync working from, is it 30-144hz or was it 45-144hz, either way.... g-sync is not a feature purely for 60hz screens, so you're quite clearly talking rubbish.
 
So several of the dare I say usual people are making bold claims like, it's Toshiba's technology, not AMD's. :(

It's not Toshiba's technology; Toshiba have merely implemented a proposed Vesa standard which ANYONE can implement at any time. This feature works because it is ALREADY IMPLEMENTED IN THE DRIVER, and it will ALREADY WORK on any screen that implements the feature. It's not AMD technology either, not completely anyway; the driver/hardware support is, but a screen needs to support dynamic refresh rates, and there is a proposed industry-standard method to do this (non-final methods being used before they are finalised is fairly common); not many have implemented it.

AMD/Vesa are all surprised by it because, as I suggested might be the case months ago, Nvidia were simply aware of an up-and-coming technology and decided to jump the gun: rather than wait for the natural progression of Asus's new screens (and everyone else's) implementing this, they struck a deal to use an Nvidia chip, allowing Nvidia to lock their own users in.

From Tech Report:
Koduri explained that this particular laptop's display happened to support a feature that AMD has had in its graphics chips "for three generations": dynamic refresh rates. AMD built this capability into its GPUs primarily for power-saving reasons, since unnecessary vertical refresh cycles burn power to little benefit. There's even a proposed VESA specification for dynamic refresh, and the feature has been adopted by some panel makers, though not on a consistent or widespread basis. AMD's Catalyst drivers already support it where it's available, which is why an impromptu demo was possible.
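To make the dynamic refresh idea in that quote concrete, here's a toy sketch of the principle (the numbers and names are invented; it models only the "extend the blanking period until the next frame is ready" behaviour, nothing vendor-specific):

```python
# Toy model of dynamic refresh: instead of scanning out on a fixed schedule,
# the display controller sits in the vertical blanking (VBLANK) period until
# the next frame is ready, bounded by the panel's min/max refresh intervals.
# Purely illustrative.

PANEL_MIN_INTERVAL_MS = 1000 / 144   # can't refresh faster than 144Hz
PANEL_MAX_INTERVAL_MS = 1000 / 30    # must refresh at least every ~33ms

def scanout_intervals(frame_render_times_ms):
    """Return the interval between scanouts for each rendered frame."""
    intervals = []
    for render_ms in frame_render_times_ms:
        # Wait in VBLANK until the frame is done, within the panel's limits:
        # too fast and we hold it briefly, too slow and the old frame repeats.
        interval = min(max(render_ms, PANEL_MIN_INTERVAL_MS), PANEL_MAX_INTERVAL_MS)
        intervals.append(round(interval, 1))
    return intervals

# Frames taking 12.0, 20.5 and 29.0ms are each shown the moment they're ready,
# with no tearing and no waiting for the next fixed 60Hz tick:
print(scanout_intervals([12.0, 20.5, 29.0]))   # [12.0, 20.5, 29.0]
```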

There are few monitors that support it, like with every single new feature. Can it be added through firmware updates? Possibly, but let's think about who that benefits.... the end user, great, but the manufacturer? They can either sell you a new freesync-compatible screen, or enable a great option for free which will make them zero money...... I don't hold high hopes on that one ;)

I expect that when a decent gaming desktop screen is released that supports the Vesa feature, AMD will tell gamers about freesync. Till some desktop screens support it, what is the point of shouting from the rooftops about it?

For the record, my somewhat gloaty "lol Nvidia" opening post wasn't aimed remotely at Nvidia, but at the people on here who basically ganged up to tell me how much I hate Nvidia and how wrong I am because I dared suggest AMD, and Intel, and everyone else (because that is so pro-AMD) will have this for free, because it's a painfully basic idea that Nvidia can't possibly patent and that will end up as another lock-in Nvidia feature. I got pure rubbish constantly thrown back at me for nothing but realistic posts about the future of freesync and what it means for Nvidia users.
 
You're just twisting the truth, they're the first company to use this feature and implement it within their LCDs. Standards get proposed every day. You seem to be on the assumption that the two technologies are the same too with the little info, and you took the opportunity from a CES stand which shows a 30FPS demo with VBLANK alteration to slate G-Sync. That makes you a bit of a weapon frankly. And no walls of text will change that :D

I'm twisting what truth exactly? Firstly, where did I slate g-sync? Secondly, how does referring to AMD driver support and a proposed Vesa standard (which would make it industry-wide) make it Toshiba's technology?

Not strictly true as the Toshiba panels they're demonstrating are able to make VBLANK adjustments. However at this stage I would definitely tend to agree with it being a publicity stunt - especially given it's Toshiba's technology they're demonstrating. So it's free to people who haven't paid for it when buying the notebook in question LOL.

Yeah, your intention is not clear at all.


Here's a hint for those arguing over if freesync is free or not. When you have a monitor that normally costs $300, now costing $500 to support g-sync, for the SAME monitor, that is not free. When you get a future version that is $300 and supports freesync, but not g-sync, yes, that is free.

Someone even decided to point out freesync isn't free because monitor makers have to pay for membership to Vesa.......


Yes, I'm the one twisting the truth, and we know nothing about freesync at all, except that every article on it says it did what g-sync does and made the demo smooth, "fixing", for want of a better word, the issues with v-sync. We don't know for certain precisely how g-sync works internally, but that's okay; yet when AMD show the same thing in a different demo with the same improvement, we don't know enough about it?

Except we know it will be free, it gives the same effect as g-sync, it will be adopted in the future by more monitor makers, it won't add $200 cost to a monitor. What more do you need to know, precisely, about freesync that you know precisely about g-sync?

Lastly, do you have any information to back up your claim that Toshiba were the first company to implement it? I haven't seen that claimed anywhere, though it could be true, however to claim that it's Toshiba's technology is still absurd.
 
Twisting -

Toshiba's panels - not AMD's

VESA standard - implemented by Toshiba.

Where did I say they were AMD's panels? Where did I say it was AMD technology? A Vesa standard, implemented by Toshiba: how does that indicate Toshiba technology? It precisely indicates Toshiba using someone else's technology. Are you sure it's even a Toshiba panel?
 
Twisting -

Toshiba's panels - not AMD's

VESA standard - implemented by Toshiba.

Ok so Toshiba may not own the technology but if you want to get hung up on that then I guess we need to get back to how it's going to combat G-Sync.


So you've just said it gives the same effect as G-Sync, having not seen either - and not knowing enough about how G-Sync's scaler works. That's a lot of nonsense for an 'I told you so' thread.

I'll do some twisting of my own. You're saying that Nvidia have purchased this particular standard and then sold it as their own?


Really? Really DrunkenMaster?

I'll have to quote again as you edited.

Firstly, again with the twisting thing: I didn't twist anything, and claims of things I never said don't mean I'm twisting words. I did not suggest these were AMD panels, I did not claim it was AMD technology; YOU claimed it was Toshiba technology, even though I bet you don't know if Toshiba are even using their own panel. It's an AMD driver and an AMD chip that are part of the solution that makes freesync work, but I only said that AMD's driver and hardware implementation is AMD's technology. So once again, when asked what I was twisting, you made up something I didn't say.

As for the g-sync vs freesync thing: again, where did I claim to know how the g-sync scaler worked? Did I say I'd seen either? No, but all the articles about freesync are written by people who have seen both, and THEY say it gives the same effect. I said EFFECT; I did not say same technology, I did not say same method, I said nothing of the sort. I said same effect because people who have seen it in person have said it gives the same effect.

99.999999% of people on this forum have not seen g-sync in anything but video, I have seen g-sync AND freesync in video. I didn't see you having a go at Nvidia users who like the effect of g-sync who haven't actually seen it in person......... funny that.

Lastly, you implied I had twisted something, which apparently makes it okay for you to make crap up. No, I never said Nvidia sold a Vesa standard as their own; in fact you'll find I've said something vastly different in most g-sync threads.

Making up things I didn't say and then accusing me of (twisting) lying is actually just you lying; using that lie as an excuse to lie further makes you a bigger liar... nothing else.
 
You're clutching at straws with the Toshiba thing. Toshiba normally out source to LG, not always but in some cases. However the scalers are made by smaller firms in the middle east so you're just going around in circles there with the text walls.

I asked you whether or not you thought Nvidia are using a VESA standard as their own; if that is not the case, then it cannot be fundamentally the same technology as you're implying in your gloating OP.

Both maybe need to be seen alongside each other before you applaud your own bashing in that case buddy :)

I'm a liar etc. Keep digging.

You've failed to address any of the things you made up, the bold claims you've made that are incorrect or really anything else.

A very quick recap:
1/ You claim it's Toshiba technology, not AMD's.
2/ I call BS.
3/ You say I'm twisting things.
4/ You imply I was saying it was an AMD panel, but it's a Toshiba panel.
5/ I point out I never claimed it was AMD tech, nor an AMD panel, but that it's rather ridiculous to claim it's Toshiba tech when you don't know if it's a Toshiba screen.

I'm not clutching at straws, it was pointing out the weakness of YOUR argument. Your counter, as is usual for someone who has nothing to stand on, was to imply this was the basis of my argument... even though you are the one clutching to the idea it's Toshiba tech, not me.

Why would Nvidia patent a Vesa standard? Beyond the impossibility of it, there is no implication of this, nor accusation from anyone; besides, by your own words, that is you twisting what I've said to imply it.

Nvidia has 3dvision.... nothing in 3dvision isn't an industry standard. A 3dvision screen updates the screen 120 times a second. It's a 120hz screen, nothing more or less. 3dvision is a brand name. You can trademark brand names for pretty much any technology you want.

The same way almost every tv manufacturer has a different name for their 100/200hz implementations. Sony call it truevision, Panasonic call it "football that is less sucky to watch". Trademarks for almost identical technology from multiple companies are incredibly common, in fact pretty much industry standard. I have consistently said that this is what I believe g-sync will eventually end up as: identical hardware 6 months from now, Nvidia call it one name on their feature list, AMD call it something else, Intel call it something else.

EDIT:- disclaimer, I made up these brand names, they all have some stupid name for bumping up refresh rate and I don't care what they are.

Once again, I haven't claimed that g-sync works identically to freesync, I said it gives the same effect, which is all the end user remotely cares about.
 
We have neither, we will get mantle but only may get freesync.

Mantle is an unknown, we do not know real-world performance or how well it will be adopted by developers.

Freesync might be adopted by Vesa, which might be supported by AMD at some point. But then as an open standard it might be supported by NVIDIA too.

Just to be official: freesync already is available on those Toshiba laptops; it was using release drivers with retail-bought screens. You can buy those screens today and use freesync, so it's not a case of "may only get freesync". Today, it's available.
 