HDMI 2.1 VRR AMD FreeSync nVidia G-SYNC 4K TV explained

So I recently got a 4K Blu-ray player, and the manual actually states that if you're using a regular High Speed HDMI cable, it should be shorter than 5 metres. They included a 2m regular HDMI cable, which was interesting.

Also, I'm using a 2080 Ti now and the 4K60 image and 5.1 sound decoding are running nice and stable on a 4K TV (using a Philips Premium/Ultra High Speed HDMI cable). The only thing that doesn't work in some games like Destiny is HDR; it completely messes up the colour contrast, and I think it's a software problem in the game, not the TV or cables I'm using. HDR works every time in other titles at 4K60, like Far Cry 5 for example.
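
If anyone's wondering why the cable class and length matter here, some back-of-the-envelope maths shows it. This is just a rough sketch in Python; it only counts raw pixel data and ignores blanking intervals and TMDS/FRL encoding overhead, so real on-the-wire rates are noticeably higher. The 10.2 / 18 / 48 Gbit/s figures are the nominal maximums for High Speed, Premium High Speed and Ultra High Speed cables.

```python
# Rough uncompressed video data-rate estimate.
# NOTE: raw pixel data only; blanking intervals and TMDS/FRL encoding
# overhead are ignored, so the real link rate is noticeably higher.

def pixel_rate_gbps(width, height, refresh_hz, bits_per_channel=8, channels=3):
    """Raw pixel data rate in Gbit/s for an uncompressed RGB / 4:4:4 signal."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

# Nominal maximum throughput of the common HDMI cable classes (Gbit/s).
CABLE_CLASSES = {
    "High Speed": 10.2,
    "Premium High Speed": 18.0,
    "Ultra High Speed / 48G": 48.0,
}

MODES = [
    ("4K60, 8-bit", 3840, 2160, 60, 8),
    ("4K60, 10-bit HDR", 3840, 2160, 60, 10),
    ("4K120, 10-bit HDR", 3840, 2160, 120, 10),
]

for name, w, h, hz, bpc in MODES:
    rate = pixel_rate_gbps(w, h, hz, bpc)
    fits = [label for label, cap in CABLE_CLASSES.items() if rate < cap]
    print(f"{name}: ~{rate:.1f} Gbit/s raw -> within: {', '.join(fits)}")
```

Once blanking and encoding overhead are added, 4K60 8-bit RGB needs close to the full 18 Gbit/s of HDMI 2.0, which is why a plain High Speed cable is technically out of spec for it and only tends to work when it's short, exactly what the Blu-ray player manual is hinting at.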
 
FreeSync is just as proprietary as G-Sync at this point in time; this may change in the future if another player in the market starts to use VRR technology.

That's wrong.
FreeSync is the name AMD gave to their way of using the variable refresh rate open standard when it's available on a monitor. Monitor makers use the FreeSync name in marketing, and they use it because no one else is using the tech the monitor ships with. When you buy a FreeSync-marketed monitor, the box states it has Adaptive-Sync support, not FreeSync; I've bought three to date and none said FreeSync on the box. It's only FreeSync in the marketing. The tech FreeSync uses is not proprietary, so it's not comparable to G-Sync: anyone could plug into an Adaptive-Sync monitor and use the variable refresh rate tech if they weren't locked out of it by their hardware supplier. Nvidia chooses not to let them, so you implying otherwise is wrong. G-Sync is proprietary; FreeSync is a marketing name for software that uses an open standard.
 
FreeSync is not an open standard; it is AMD's through and through. The open standard is Adaptive-Sync; otherwise, how was AMD allowed to trademark the term FreeSync?
 
Furthermore, Adaptive-Sync is not actually open source; it is an industry standard, part of the DisplayPort standard. It also isn't free, as you have to be a paying member of VESA to have access to the standard and to be able to have certified products.

And FreeSync isn't free; it just doesn't have a licensing cost.

FreeSync also uses proprietary hardware and software.
 
For years both of you have been on the same mantra without having a clue what you are talking about.

a) VESA Adaptive-Sync is an OPTIONAL standard drafted by the Video Electronics Standards Association (VESA). These guys oversee some very common, royalty-free standards (like those mounting brackets for displays). VESA Adaptive-Sync is an implementation over the DisplayPort standard beginning with DP 1.2a. It's not required, so DP 1.3/1.4 displays do NOT have to use this feature. In order to use it, the monitor must support it, the GPU must support it, and the display drivers must implement it.
And find me a monitor manufacturer who is not a VESA or HDMI member.

b) For a monitor to support VESA Adaptive-Sync, it needs a scaler that handles the signal. This hardware is incorporated into the existing port-handling hardware, and its cost is tiny.
There is more than a handful of manufacturers of this port hardware, and monitor/TV makers can pick freely from whichever supplier they want. It is not made by AMD, and no royalties are paid to VESA or AMD.
Contrast that with Nvidia G-Sync, where we all know the standard module is $250 paid to Nvidia and the HDR module is north of $700 ($500 for the FPGA alone).

To put it in perspective, the cost of the Adaptive-Sync hardware is so low (a couple of USD) that you won't find a new monitor these days without FreeSync support, even a bad one, assuming it doesn't have G-Sync.
I doubt there are any hardware manufacturers who make DP/HDMI electronics without it.

c) FreeSync/VRR/Adaptive-Sync support is AMD's software implementation that piggybacks on top of VESA Adaptive-Sync, and it is open source in all three forms. Anyone can go and download the code, which is also found in the Linux repositories, uploaded by AMD. So anyone, like Wine, can go and implement support for FreeSync in the games they run on top, and anyone who wants to create an AMD driver for any operating system can develop it with FreeSync support.
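
For the Linux folks, here's roughly how you'd check whether a connected display advertises VRR with the open-source stack. Caveat: this assumes the kernel exposes a per-connector vrr_capable attribute under /sys/class/drm (newer kernels with the amdgpu driver should); if yours doesn't, the same flag is exposed as a connector property you can query from your display server. Just a sketch, not gospel:

```python
# Quick check for VRR/Adaptive-Sync support on Linux.
# ASSUMPTION: the kernel exposes a per-connector 'vrr_capable' file under
# /sys/class/drm (recent kernels with the amdgpu driver should); if not,
# this simply reports that nothing was found.
import glob

found = False
for path in sorted(glob.glob("/sys/class/drm/card*-*/vrr_capable")):
    connector = path.split("/")[-2]    # e.g. "card0-DP-1"
    with open(path) as f:
        capable = f.read().strip()     # "1" if the display advertises VRR
    print(f"{connector}: vrr_capable = {capable}")
    found = True

if not found:
    print("No vrr_capable attributes found; your kernel/driver may not expose them.")
```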

On the other side, Nvidia's software implementation using VESA Adaptive-Sync is also called "G-Sync" and is used on laptops, which do not have a G-Sync scaler.
It is closed source and only Nvidia-made drivers support it (even on Linux).

d) Technically, Nvidia could support VESA Adaptive-Sync tomorrow morning (or actually could have since February 2016 on standalone GPUs); they have the drivers to do so for their Pascal and Turing products (used in laptops), and the hardware is already there in the port electronics on all Pascal and Turing cards.
However, it is better to fleece their customers.
 
AMD own FreeSync, or do you deny this simple fact?
It is their baby (and a stroke of marketing genius); whether they allow other companies to use it or not, well, we will just have to wait and see.

Will Intel want to use the term FreeSync or use their own term? At this point in time we have no idea.

As for not having a clue what I'm talking about, well, we are never going to agree on this, so have a roll-eyes for good measure. :rolleyes:
 
Except you keep mixing up FreeSync and Adaptive-Sync. When you are talking about ownership, AMD own FreeSync; it's their term for how they connect to an Adaptive-Sync monitor, but they don't own Adaptive-Sync monitors. Adaptive-Sync monitors have come to be called FreeSync monitors, but there is nothing stopping Intel or Nvidia from connecting to any FreeSync (Adaptive-Sync) monitor out there. And when that happens, the monitors won't just list FreeSync as a tech they support; they will also list iSync or nSync or whatever terms Nvidia/Intel use for their method of connecting to an Adaptive-Sync monitor.

Why would other companies want to use FreeSync? It's AMD's way of connecting to an Adaptive-Sync monitor. And AMD can't stop anyone connecting to an Adaptive-Sync monitor; it's a DisplayPort feature and is owned by VESA.
 
Exactly this. But that doesn't suit their argument, so they will ignore it and come back with the same pedantic nonsense. But, but, FreeSync is proprietary just like G-Sync. :rolleyes:
 
I have a 15m long High Speed HDMI cable I bought back in 2014 to hook up my PC to an FHD telly. This year the household telly became a 4K telly, so I thought I wouldn't be able to do UHD@60Hz through that monstrously long cable, as the general consensus from what I had read online is that HDMI cables over 3-5m in length just don't have the bandwidth.
I was very pleasantly surprised when I hooked the telly up to the PC and got UHD@60Hz! Granted, the PC won't do HDR10 (not that it matters, the telly's only an 8-bit+FRC panel), but it does HDR in 8-bit YCbCr 4:2:0.
If I turn on UHD Colour (or whatever the deep colour setting is called) on the telly, my PC throws a hissy fit at anything above 24Hz in 4K.

My wife's PC is also hooked up to the telly; she's using a 5m long High Speed HDMI cable and she can get HDR10 (8-bit w/ dithering, YCbCr 4:4:4) in UHD@60Hz with UHD Colour on.
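
That actually lines up with the arithmetic: 4:2:0 keeps only a quarter of the chroma samples, so the same UHD@60Hz picture needs roughly half the data of full 4:4:4, which is presumably why the 15m cable copes until UHD Colour (full chroma / deep colour) comes into play. A rough sketch, again ignoring blanking and link overhead:

```python
# Average samples per pixel for common chroma subsampling schemes:
# 4:4:4 keeps 3 per pixel, 4:2:2 keeps 2, 4:2:0 keeps 1.5 on average.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, refresh_hz, bits_per_sample, subsampling):
    """Raw (pre-overhead) picture data rate in Gbit/s."""
    samples = SAMPLES_PER_PIXEL[subsampling]
    return width * height * refresh_hz * bits_per_sample * samples / 1e9

for sub in ("4:4:4", "4:2:0"):
    rate = data_rate_gbps(3840, 2160, 60, 8, sub)
    print(f"UHD@60Hz, 8-bit {sub}: ~{rate:.1f} Gbit/s raw")
```

So in 4:2:0 mode the long cable only has to carry about 6 Gbit/s of picture data instead of roughly 12 Gbit/s (more once overhead is added), which fits the behaviour described above.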

After having learned more about 4K TVs, I regret our purchase as we've had the panel replaced twice due to excessive backlight bleed, and global dimming that can't be turned off is infuriating!
 
Er, I would love to know where I have mixed up FreeSync and Adaptive-Sync; I'm not saying I never have, but I would love to know where.
We seem to be in agreement that AMD own FreeSync. Just as I said above, it is their way of connecting to Adaptive-Sync, which is the open standard that forms part of the DisplayPort protocol.

I like the name iSync; I could see Intel going with something like that quite easily. :)
 
That'd actually be a good choice for the name. It wouldn't surprise me if Raja Koduri came up with it, given its simplicity, just as he did at that promotional show last year when he gave his "Vega will be called Vega" announcement. :D
 
Rounding all this up in one place for new TV & GPU buyers. It's amazing how confused people are across the web at the moment concerning refresh rates and GPU compatibility:

  • HDMI 2.1 is the upcoming connection standard for 4K TVs that allows Variable Refresh Rate (VRR). TVs with full HDMI 2.1 are expected to release anywhere between now and 2020!
  • HDMI 2.0 TVs that are already out can actually add VRR support with a software update (Samsung are already doing this for some sets).
  • HDMI cables - [EDIT] Your common 'High Speed' HDMI cable should allow 4K @ 60Hz, but for anything higher you actually need to buy a new higher-bandwidth cable (max 120Hz) called 'Ultra High Speed' or a '48G' cable. [DOUBLE EDIT] I can get 4K @ 60Hz on a ~1 metre cable, but not on a 3 metre cable; all sorts of GPU and display errors happen on the longer cable.
  • FreeSync is essentially just AMD's name for VRR, as it's the same refresh rate standard underneath. AMD's latest GPUs and APUs will give you VRR gameplay on an HDMI 2.1/2.0 VRR-enabled 4K TV.
  • VRR as a TV standard is also going to be supported by AMD GPUs alongside their own FreeSync (the set may say it supports VRR rather than FreeSync specifically, but it should still work).
  • G-SYNC is Nvidia capitalising on their dominant market position to lock customers into their own proprietary hardware ecosystem. G-SYNC is a competitor to VRR and is unlikely to appear in 4K TVs. GeForce GPUs currently don't support HDMI 2.0/2.1 VRR, but that could change in the future...
  • 4K VRR - Well, HDMI 2.0 VRR will get you 4K resolution at 48-60Hz VRR. A pretty slim window really; I'd like to see a demo of that. It will do 1080p between 20-120Hz, however, if the TV has a 120Hz native panel (I've sketched a quick frame-time check just below this list).
  • DisplayPort: 4K TVs don't have it!
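
To make the "slim window" point concrete, here's a tiny sketch that turns the ranges quoted above into frame times and checks whether a given framerate can be matched 1:1 by the panel. The ranges are just the figures from the list; real sets will vary.

```python
# Turn the VRR ranges quoted above into frame times and test some framerates.
# These ranges are just the figures from the list; actual panels vary.
VRR_RANGES_HZ = {
    "4K over HDMI 2.0 VRR": (48, 60),
    "1080p on a 120Hz native panel": (20, 120),
}

def in_vrr_window(fps, low, high):
    """True if the panel can refresh 1:1 with the GPU's framerate."""
    return low <= fps <= high

for label, (low, high) in VRR_RANGES_HZ.items():
    print(f"{label}: {low}-{high}Hz (frame times {1000/high:.1f}-{1000/low:.1f} ms)")
    for fps in (40, 55, 90):
        inside = "inside" if in_vrr_window(fps, low, high) else "outside"
        print(f"  {fps} fps -> {inside} the window")
```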

Hope that helps clarify what you need to game on a 4K TV without screen tearing, regardless of how well the next gen of GPUs performs. Could be a big swing factor for AMD?

Hi, you forgot to add that TVs have their own GAME MODE, I think from 2017. I have a Hisense 4K@60 TV and I don't have any of those (FreeSync, G-Sync, VRR), but I have Game Mode, connected my PC, and it runs perfectly.
 