
HDMI 2.1 VRR AMD FreeSync nVidia G-SYNC 4K TV explained

Rounding all this up in one place for new TV & GPU buyers. It's amazing how confused people are across the web at the moment concerning refresh rates and GPU compatibility:

  • HDMI 2.1 is the new connection standard for 4K TVs that allows Variable Refresh Rate (VRR) - TVs with full HDMI 2.1 are expected to release anywhere between now and 2020!
  • HDMI 2.0 TVs that are already out now can actually add VRR support with a software update (Samsung are already doing this for some sets).
  • HDMI cables - [EDIT] Your common 'High Speed' HDMI cable should allow 4K @ 60Hz, but for anything higher you actually need to buy a new higher-bandwidth cable (up to 120Hz) called 'Ultra High Speed' or a '48G Cable'. [DOUBLE EDIT] I can get 4K @ 60Hz on a ~1 metre cable, but not on a 3 metre one - all sorts of GPU and display errors happen on the longer cable.
  • FreeSync is essentially AMD's name for VRR, as it's much the same adaptive refresh idea underneath (strictly speaking, FreeSync over HDMI is AMD's own extension, while HDMI 2.1 VRR is the HDMI Forum's standard). AMD's latest GPUs and APUs will give you VRR gameplay on an HDMI 2.1/2.0 VRR-enabled 4K TV.
  • VRR as a TV standard is also going to be supported by AMD GPUs alongside their own FreeSync (the set may say it's VRR rather than specifically FreeSync, but it should still work).
  • G-SYNC is nVidia capitalising on their dominant market position to lock customers into their own proprietary hardware ecosystem. G-SYNC is a competitor to VRR and is unlikely to appear in 4K TVs. GeForce GPUs currently don't support HDMI 2.0/2.1 VRR - but that could change in the future...
  • 4K VRR - HDMI 2.0 VRR will get you 4K at 48-60Hz VRR. A pretty slim window really; I'd like to see a demo of that. It will do 1080p between 20-120Hz however, if the TV has a native 120Hz panel.
  • DisplayPort - 4K TVs don't have it!
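To put rough numbers on those bullet points, here's a quick sketch of the uncompressed data rates involved. It only counts active pixels (no blanking intervals, no link-encoding overhead), so real HDMI link rates are noticeably higher - the 18/48 Gbit/s figures are the usual cable-class limits:

```python
# Rough uncompressed video data rates for the modes discussed above.
# Ignores blanking and link-encoding overhead, so actual HDMI link
# rates are higher -- this is just a sanity check, not a spec quote.

def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw active-pixel data rate in Gbit/s (8-bit RGB = 24 bpp)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_2_0_GBPS = 18.0   # 'High Speed' / HDMI 2.0 maximum link rate
HDMI_2_1_GBPS = 48.0   # 'Ultra High Speed' / '48G' cable maximum

for name, w, h, hz in [("1080p @ 120Hz", 1920, 1080, 120),
                       ("4K @ 60Hz",     3840, 2160, 60),
                       ("4K @ 120Hz",    3840, 2160, 120)]:
    rate = raw_gbps(w, h, hz)
    needs = "HDMI 2.0" if rate <= HDMI_2_0_GBPS else "HDMI 2.1"
    print(f"{name}: ~{rate:.1f} Gbit/s raw -> {needs} class link")
```

Even this crude count shows why 4K @ 120Hz (roughly 24 Gbit/s of raw pixels alone) is out of reach for an 18 Gbit/s HDMI 2.0 link, while 4K @ 60Hz and 1080p @ 120Hz fit.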

Hope that helps clarify what you need to game on a 4K TV without screen tearing, regardless of how well the next gen of GPUs performs. Could be a big swing factor for AMD?
 
A bit of clarification on what is or isn't classed as HDMI 2.1 - and it's still confusing:

"Certification starts in 2018
The first phase of HDMI 2.1 certification starts in the second quarter, with full certification expected to begin in the third or fourth quarter of the year. Products with the official stamp of approval can be launched following successful certification.

During its press conference, the HDMI Forum said that HDMI 2.1 was delayed due to technical input from its many members.

Manufacturers can launch HDMI 2.1 enabled products before full certification has been carried out but these are not guaranteed to work 100% according to specification. You may recall that the first 48Gbps HDMI cable was made available late last year. It has not (yet) been certified."

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1516615699
 
Cable length matters a lot. I have an expensive 15m cable running round my front room and it will not display 4K @ 60Hz at all, whereas if I move the PC over to the TV, even a cheap cable will do 4K @ 60.

That's good to know - I hate the thought of needing new cables, and this is the first time in ages that it's been listed as a requirement for HDMI, but it may just be a lottery how well each one works, as others mentioned.
 
I use a 60 inch Sony Bravia (non-4K).
The dream is to have a 65 inch OLED that supports FreeSync 2 HDR, G-SYNC or some native VRR GPU compatibility. I'll basically buy whoever's GPU supports VRR on TV sets next year. I'm aware I'm possibly a niche's niche though!
 
Seems the cheapest 4K native 120Hz TV with VRR enabled over HDMI 2.0 is the Samsung NU8000 (NU8500 if you like a curved one). I would wait till the competition inevitably catches up with the HDMI 2.0 VRR updates though, as only Samsung have supported this so far! Problem is, only nVidia has GPUs that can really push 4K at 60Hz or more. So maybe we need to wait until either nVidia supports VRR on TV sets (it doesn't have to be AMD's own FreeSync) or AMD release a 1080 Ti beater next year!

Question: would we still get just as much screen tearing if the VRR window was only 48-60Hz at 4K, as it's reported to be over HDMI 2.0? https://www.rtings.com/tv/reviews/samsung/nu8500
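A rough way to reason about that question: VRR only helps while the frame rate sits inside the panel's window, and AMD-style Low Framerate Compensation (frame doubling) is generally said to need the window's maximum to be at least twice its minimum - which 48-60Hz fails. This little sketch (the min/max figures are just the ones reported for that set; the LFC rule is my assumption about how it behaves) shows what you'd expect at various frame rates:

```python
# Sketch of how a narrow VRR window behaves, assuming the reported
# 48-60Hz range over HDMI 2.0 and AMD-style Low Framerate
# Compensation (LFC), which only engages when max_hz / min_hz >= 2.

def vrr_status(fps, min_hz=48, max_hz=60):
    """Describe the expected behaviour at a given frame rate."""
    lfc_possible = max_hz / min_hz >= 2  # is frame doubling available?
    if min_hz <= fps <= max_hz:
        return "smooth (inside VRR window)"
    if fps < min_hz:
        if lfc_possible:
            return "smooth (LFC doubles frames into the window)"
        return "tearing or stutter (below window, no LFC)"
    return "capped or tearing (above window)"

for fps in (30, 45, 55, 75):
    print(fps, "fps ->", vrr_status(fps))
```

So on that reading: between 48 and 60fps you'd get tear-free gameplay, but the moment a 4K game dips below 48fps you're back to tearing or stutter, because 60/48 is nowhere near the 2x ratio frame doubling needs.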
 
Cable length matters a lot. I have an expensive 15m cable running round my front room and it will not display 4K @ 60Hz at all, whereas if I move the PC over to the TV, even a cheap cable will do 4K @ 60.

Following up with some tests I did on the same GPU and 4K display setup. I have a couple of relatively good quality 3m HDMI cables and they don't display 4K @ 60Hz - either the GPU or the display forces everything back to 30Hz after a few seconds of crazy fuzzy colours and blinking. I then used a 1m cable (I think it was from a Nintendo console) and it works no fuss. I'm wondering if 1-2m cables are the absolute limit for pushing 4K @ 60Hz?

I need an expert to explain how much bandwidth is being used, mind, because if 4K @ 60Hz is 'supposed' to be too much information for a High Speed HDMI cable, then why is it possible over a short cable and not a long one? Is the real statement actually that 4K @ 60Hz is within the bandwidth capability of these common cables, but the chance of the signal degrading rises sharply with length?

Would a more powerful GPU push HDMI signals over longer cable lengths? Or is that not how it works at all? It's all a bit smoke and mirrors to me right now.
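On the bandwidth question: the standard HDMI 2.0 figures put 4K60 8-bit RGB at a 594 MHz pixel clock, carried on three TMDS channels with 8b/10b encoding (10 wire bits per 8 data bits). Working that through gives 17.82 Gbit/s - right up against the 18 Gbit/s a High Speed cable is rated for. The margin interpretation below is my own reading, but it would explain the short-works/long-fails behaviour, since attenuation gets worse with length and there's almost no headroom to lose:

```python
# Why 4K @ 60Hz sits right at the edge of a 'High Speed' cable.
# HDMI 2.0 carries video on 3 TMDS channels with 8b/10b encoding,
# so every 8-bit value costs 10 bits on the wire.

PIXEL_CLOCK_4K60 = 594e6   # Hz, standard 4K60 timing incl. blanking
TMDS_CHANNELS = 3
BITS_ON_WIRE = 10          # 8b/10b: 10 wire bits per 8 data bits

link_rate = PIXEL_CLOCK_4K60 * TMDS_CHANNELS * BITS_ON_WIRE / 1e9
print(f"4K60 8-bit RGB link rate: {link_rate:.2f} Gbit/s")    # 17.82
print(f"Headroom under 18 Gbit/s: {18 - link_rate:.2f} Gbit/s")  # 0.18
```

That would also suggest the GPU isn't the variable - it drives the link at whatever rate the mode requires, and a beefier card can't "push harder" through a lossy cable.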
 
So I recently got a 4K Blu-ray player and the manual actually states that if you're using a regular High Speed HDMI cable it should be shorter than 5 metres. They included a 2m regular HDMI cable, which was interesting.

Also, I'm using a 2080 Ti now and the 4K60 image & 5.1 sound decoding is running nice and stable on a 4K TV (using a Philips premium/Ultra High Speed HDMI cable). The only thing that doesn't work in some games like Destiny is HDR - it completely messes up the colour contrast, and I think it's a software problem in the game, not the TV or the cables I'm using. HDR works every time in other titles at 4K60, like Far Cry 5 for example.
 