• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Workaround: FreeSync on nVidia GPUs

Regardless of being peeved or not, it's a lost sale.

It might be a lost sale for AMD in the short term, but it's also a lost sale for Nvidia: no G-Sync monitor sold. And one less G-Sync monitor sold means one less person tied into Nvidia's ecosystem.

And since AMD's market share is so small, the odds are stacked in AMD's favour. Just random figures to illustrate what I mean.

If 10% of AMD GPU owners currently with a FreeSync monitor decide to go Nvidia because of this, that will be a small loss. Now if 10% of Nvidia GPU owners decide to buy FreeSync monitors for the same reason and then change to an AMD card for their next purchase, that will be a massive net gain for AMD, and Nvidia will be losing twice because they haven't sold a G-Sync monitor either.
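Putting purely made-up numbers on it (hypothetical install bases, only to show why the asymmetry works in AMD's favour):

```python
# Purely hypothetical install bases, just to illustrate the asymmetry
# described above -- real market-share figures will differ.
amd_owners = 10_000_000       # assumed AMD GPU install base
nvidia_owners = 70_000_000    # assumed (much larger) Nvidia install base

defectors_from_amd = int(amd_owners * 0.10)   # 10% of AMD owners leave
defectors_to_amd = int(nvidia_owners * 0.10)  # 10% of Nvidia owners switch

net_gain_for_amd = defectors_to_amd - defectors_from_amd
print(net_gain_for_amd)  # prints 6000000
```

The same percentage cuts both ways, but against very different base sizes, which is the whole point.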
 
AMD make all the graphics cards in both consoles, so technically they ARE buying them in droves. They can now use FreeSync too, and Samsung have recently enabled it on their TVs.
 
AMD make all the graphics cards in both consoles, so technically they ARE buying them in droves. They can now use FreeSync too, and Samsung have recently enabled it on their TVs.
This is what I was getting at earlier when talking about getting it into the HDMI spec. At the moment it's optional in the 2.0b/c and 2.1 specs, but going forward (2.2, 2.4) we might see it become part of the standard, depending on what happens with HDR etc.
 
Just completed an Nvidia Advisory Panel survey which had questions about G-Sync, FreeSync and BFGD. Perhaps they are reassessing the market?
 
Just completed an Nvidia Advisory Panel survey which had questions about G-Sync, FreeSync and BFGD. Perhaps they are reassessing the market?
Well, AMD have overtaken them and beaten them to market with BFGD, so they'll have to: in a few years that end of the market will run higher/lower VRR natively without any extra hardware/cost, because people tend to buy bigger with TVs. Not to mention more people worldwide are gaming on AMD graphics than on any other vendor.
 
Just completed an Nvidia Advisory Panel survey which had questions about G-Sync, FreeSync and BFGD. Perhaps they are reassessing the market?

When the majority of TVs have HDMI 2.1 with VRR next year, Nvidia will be losing mindshare even among ordinary folks.
Already, if you visit the Xbox forums, you'll see that many Xbox (S or X) owners are not only switching to FreeSync TVs and monitors, but also replacing their GPUs with mid-range AMD cards (RX 580/570) because they bought a FreeSync TV/monitor for their Xbox.

It's worth paying a visit to console forums outside here, like the official ones. The biggest boon is going to come next year with the PS5.
20,000,000+ new AMD FreeSync customers....

That's on top of the 25 million Apple Macs with AMD GPUs in them sold in just 12 months (14 million this year alone).
 
Well, not to sound like a ****, but it's because I wasn't voicing an opinion, merely pointing out some facts. AMD have been trying to get other companies to adopt FreeSync for years (like with Mantle before it); they aren't going to randomly get upset about it becoming available on Nvidia via exploits.

Seriously, that's like saying the USA would be upset if North Korea gave up its nukes and surrendered to the South lol.

No they really haven't, and anyone using the term FreeSync without AMD's say-so is going to be in deep water, because FreeSync is proprietary to AMD; it is their baby. Adaptive Sync, the standard it is based upon, is a totally different kettle of fish.
 
Some need to remove the stick from their backside in regards to people loosely associating FreeSync and VRR. :p

Don't know how many times it's been pointed out now that FreeSync is AMD's end of the handshake, so I'll presume @ubersonic was talking about AMD pushing VESA to bake Adaptive Sync into DisplayPort 1.2a, so that their users didn't pay whatever extra cost G-Sync involves (be it $1 or $1000 extra) and it became an industry standard that everyone can use.

Adaptive Sync isn't AMD's tech, so why would they get upset if Nvidia use it, when they can spin the fact that Nvidia adopted it?

Sure, they'd lose some sales, but they can also gain sales from an otherwise closed-off VRR audience.
 
No they really haven't, and anyone using the term FreeSync without AMD's say-so is going to be in deep water, because FreeSync is proprietary to AMD; it is their baby. Adaptive Sync, the standard it is based upon, is a totally different kettle of fish.
Nope. G-Sync is proprietary, FreeSync is royalty-free and built on open standards.

From the FAQ:

AMD has undertaken efforts to encourage broad adoption for Radeon FreeSync technology, including:
  • Royalty-free licensing for monitor manufacturers;
  • Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);
  • Industry-standard implementation via the DisplayPort Adaptive-Sync amendment to the DisplayPort 1.2a specification; and
  • Interoperability with existing monitor technologies.
 
When the majority of TVs have HDMI 2.1 with VRR next year, Nvidia will be losing mindshare even among ordinary folks.
Already, if you visit the Xbox forums, you'll see that many Xbox (S or X) owners are not only switching to FreeSync TVs and monitors, but also replacing their GPUs with mid-range AMD cards (RX 580/570) because they bought a FreeSync TV/monitor for their Xbox. <snip>
Damn good points. If you are a console gamer then a FreeSync TV makes far more sense, especially as you could get an OLED set for the price of the BFGD. I guess a subset of these will also be using the same display for PC gaming, so it would make sense for them to buy AMD cards. Others may have a separate dedicated monitor, though.

The survey only focused on PC games. Which I suppose makes perfect sense if all the consoles are AMD. Perhaps they are considering including FreeSync support on the BFGDs to widen their appeal.
 
It's worth paying a visit to console forums outside here, like the official ones. The biggest boon is going to come next year with the PS5. 20,000,000+ new AMD FreeSync customers....

Once again, the PS4 Pro could have supported FreeSync if Sony wanted, but Sony wasn't interested in adding FreeSync support while they were developing HDMI 2.1 VRR at the time. When the PS5 launches, whether in 2019, 2020 or 2021, FreeSync and G-Sync will be dead because they are not industry standards; HDMI and DisplayPort both are. Sony is one of the HDMI founders. I don't think Xbox Scarlett will support FreeSync 2 either, when it only supports up to 4K 60Hz and doesn't work at 4K 120Hz on HDTVs; HDMI 2.1 VRR is the only way to handle 4K 120Hz. There has been no mention of FreeSync 3 so far, so FreeSync 2 will probably be the last before AMD dumps it and finally adopts HDMI 2.1 VRR.

That's on top of the 25 million Apple Macs with AMD GPUs in them sold in just 12 months (14 million this year alone).

FreeSync does not work on Apple Macs, iPads or iPhones.

https://egpu.io/forums/mac-setup/amd-freesync-on-macos/

It seems Apple does not support FreeSync, as they created their own proprietary VRR standard called ProMotion, just like AMD did with FreeSync and Nvidia did with G-Sync.

There are now 5 VRR standards:

AMD Freesync (proprietary standard)
Nvidia G-Sync (proprietary standard)
VESA Adaptive Sync (industry standard)
HDMI 2.1 VRR (industry standard)
Apple ProMotion (proprietary standard)

https://en.wikipedia.org/wiki/Variable_refresh_rate

There were rumours about Intel iGPUs supporting FreeSync, but Intel has now confirmed that future Intel iGPUs and GPUs will not support FreeSync but will support the industry-standard VESA Adaptive Sync instead, which is a very smart move. Intel will add industry-standard HDMI 2.1 VRR support once HDTVs and monitors widely adopt HDMI 2.1.

VRR is a mess, with 5 standards fighting each other to become the industry standard, just as HDR is a mess with 6 standards: many HDTV makers refuse to support all 6 HDR standards and pick only 1-3, while LG seems to be the only maker supporting them all to make life easier for watching HDR. It would be nice if LG supported all 5 VRR standards on HDTVs and monitors too.
 
It seems Apple does not support FreeSync, as they created their own proprietary VRR standard called ProMotion, just like AMD did with FreeSync and Nvidia did with G-Sync.

There are now 5 VRR standards:

AMD Freesync (proprietary standard)
Nvidia G-Sync (proprietary standard)
VESA Adaptive Sync (industry standard)
HDMI 2.1 VRR (industry standard)
Apple ProMotion (proprietary standard)

https://en.wikipedia.org/wiki/Variable_refresh_rate

Wow you just added your opinion onto a wiki but tried to pass it as written by Wikipedia!:eek::o

AMD's FreeSync runs via their proprietary driver on top of industry-standard VESA Adaptive Sync and HDMI 2.1 VRR tech, whereas you are trying to lump it into the same category as Nvidia's closed system; every vendor uses proprietary drivers.

Why the need to confuse people who don't have a clue about this tech, rather than enlighten them, by adding your slant to a wiki article?

AMD make graphics cards that handshake with an industry-standard VRR spec with zero licence/hardware fees.

Nvidia make graphics cards that don't run on the industry standard: you need their closed-ecosystem VRR spec, which comes with licence/hardware fees.

But most of us know this, yet every single time G-Sync/FreeSync is mentioned, posts pop up from under a stone implying or trying to convince everyone that both implementations are the same kind of closed ecosystem.
 
Wow you just added your opinion onto a wiki but tried to pass it as written by Wikipedia!:eek::o <snip>

Wow, AthlonXP1800 has sunk to a new low.
 
Some need to remove the stick from their backside in regards to people loosely associating FreeSync and VRR. :p

They only do it when it suits some point they are trying, unsuccessfully, to make. They know very well that FreeSync and Adaptive Sync have come to mean the same thing to a lot of people.
 
What card should I use for this, 7950 or RX480?
My initial thought was that the 7950 would be best as it's the older / worse card, but I think it has higher power consumption, so maybe the RX480 is best?

Also keen to understand the input lag situation; not convinced it's worth the hassle if it means marginally smoother but also more lag, which would be kind of akin to just using vsync to start with.
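For what it's worth, the worst case with plain vsync is easy to put rough numbers on; a back-of-envelope sketch (illustrative, idealised figures only, not measured lag):

```python
# Back-of-envelope comparison of added display latency: vsync vs VRR.
# Idealised numbers only -- real input lag depends on the whole chain.
REFRESH_HZ = 60
frame_ms = 1000 / REFRESH_HZ        # ~16.7 ms per refresh at 60 Hz

# With vsync, a frame that narrowly misses a refresh deadline waits for
# the next one, so up to a whole extra frame of latency can be added.
vsync_worst_extra_ms = frame_ms

# With VRR, the display refreshes when the frame arrives (within its
# supported range), so that whole-frame wait largely disappears.
vrr_extra_ms = 0.0

print(f"vsync worst-case added latency: {vsync_worst_extra_ms:.1f} ms")
print(f"VRR added latency (idealised):  {vrr_extra_ms:.1f} ms")
```

So in the worst case VRR removes up to a full frame of wait versus vsync, rather than adding lag on top of it.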
 
They only do it when it suits some point they are trying, unsuccessfully, to make. They know very well that FreeSync and Adaptive Sync have come to mean the same thing to a lot of people.

Exactly.

What card should I use for this, 7950 or RX480?
My initial thought was that the 7950 would be best as it's the older / worse card, but I think it has higher power consumption, so maybe the RX480 is best?

Also keen to understand the input lag situation; not convinced it's worth the hassle if it means marginally smoother but also more lag, which would be kind of akin to just using vsync to start with.
RX480, as I don't think the 7950 does FreeSync. I wouldn't go out and buy one of those just to try it, but if you do try it you'll find out whether lag is an issue or not.
 
Once again, the PS4 Pro could have supported FreeSync if Sony wanted, but Sony wasn't interested in adding FreeSync support while they were developing HDMI 2.1 VRR at the time. When the PS5 launches, whether in 2019, 2020 or 2021, FreeSync and G-Sync will be dead because they are not industry standards; HDMI and DisplayPort both are. Sony is one of the HDMI founders. I don't think Xbox Scarlett will support FreeSync 2 either, when it only supports up to 4K 60Hz and doesn't work at 4K 120Hz on HDTVs; HDMI 2.1 VRR is the only way to handle 4K 120Hz. There has been no mention of FreeSync 3 so far, so FreeSync 2 will probably be the last before AMD dumps it and finally adopts HDMI 2.1 VRR.

I am not talking about future products but CURRENT products.
I don't give a crap what the PS4 Pro could support; I know it doesn't support FS, hence I have relegated it to gathering dust while using my XboneX for console gaming, paired with a 55NU8000.
Which I also use from time to time to play games with the Vega 64.

FreeSync might not be an "industry standard", since it is the AMD drivers that implement the tech. However, it uses VESA Adaptive Sync, an industry standard, to do so.
Similarly, Nvidia uses VESA Adaptive Sync to deliver a pretty poor G-Sync on laptops, as those have no dedicated module. And I wrote "poor" because the Nvidia drivers are a pile of crap, still lacking basic features like FPS capping, forcing users to enable Vsync if they don't want their screens full of tearing and making the whole "gsync" just a pointless marketing sticker. (I have a Predator 15 and am talking from experience.)
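The FPS cap being asked for is conceptually simple. A minimal sketch in Python (hypothetical render loop, function names my own) of the idea: sleep out the remainder of each frame's time budget so frame delivery stays inside the VRR window instead of overshooting the panel's max refresh:

```python
import time

def run_capped(render_frame, fps_cap=141, frames=5):
    """Call render_frame repeatedly, sleeping out the rest of each
    frame budget. Capping just below the panel's max refresh (e.g.
    141 on a 144 Hz monitor) keeps frame delivery inside the VRR
    window, so the driver never falls back to vsync or tearing."""
    budget = 1.0 / fps_cap              # seconds allotted per frame
    frame_times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                  # stand-in for real frame work
        elapsed = time.perf_counter() - start
        if elapsed < budget:            # finished early: wait it out
            time.sleep(budget - elapsed)
        frame_times.append(time.perf_counter() - start)
    return frame_times

# Each capped frame should take at least the per-frame budget.
times = run_capped(lambda: None, fps_cap=100, frames=3)
```

A real game would do this on the render thread; driver-side limiters (e.g. AMD's Frame Rate Target Control) do the same thing closer to the present call.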


FreeSync does not work on Apple Macs, iPads or iPhones.

https://egpu.io/forums/mac-setup/amd-freesync-on-macos/

It seems Apple does not support FreeSync, as they created their own proprietary VRR standard called ProMotion, just like AMD did with FreeSync and Nvidia did with G-Sync.

The link you posted is about an eGPU, from 2017.

All Apple Macs use forced Vsync at 60Hz.
My comment was probably poorly worded, but you need to take it in the context of customers and sales.

The rest of the waffling I'll pass over, as all FreeSync monitors will be compatible with the Intel iGPU implementation of Adaptive Sync.
As for VRR, AMD announced back in January that by the end of the year all current (and future) RX cards (Polaris & Vega only) will get HDMI 2.1 VRR via a driver update.
 