HDMI 2.1 VRR AMD FreeSync nVidia G-SYNC 4K TV explained

It seems the cheapest 4K native 120Hz TV with VRR enabled over HDMI 2.0 is the Samsung NU8000 (NU8500 if you like a curved one). I would wait until the competition inevitably catches up with HDMI 2.0 VRR updates, mind, as only Samsung has supported this so far! The problem is that only Nvidia has GPUs that can really push 4K at 60Hz or more. So maybe we need to wait until either Nvidia supports VRR on TV sets (it doesn't have to be AMD's own FreeSync) or AMD releases a 1080 Ti beater next year!

Question: would we still get just as much screen tearing if the VRR range was only 48-60Hz at 4K, as it's reported to be over HDMI 2.0? https://www.rtings.com/tv/reviews/samsung/nu8500

The screen is 4K @ 60Hz or 1080p @ 120Hz. If your GPU can produce 49+ fps you will be fine.
I will let you know how it goes today; I'm waiting on delivery of the 55NU8000 to put in the bedroom, where it replaces the 55KS7000.
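To put that 48-60Hz window into perspective, here is a minimal sketch (illustrative Python, made-up function name and thresholds, not any vendor's actual algorithm) of roughly how a VRR display handles different frame rates, and why low framerate compensation (LFC) generally can't help with a range this narrow; the usual rule of thumb is that LFC needs the top of the range to be at least ~2x the bottom:

```python
# Rough sketch only: assumes the 48-60Hz HDMI VRR window reported in the
# rtings review linked above. Thresholds and behaviour are illustrative.

VRR_MIN_HZ = 48.0  # reported lower edge of the VRR window over HDMI 2.0
VRR_MAX_HZ = 60.0  # panel maximum at 4K over HDMI 2.0

def vrr_behaviour(fps: float) -> str:
    """Describe roughly how a 48-60Hz VRR display handles a given frame rate."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        return "inside the window: refresh tracks the GPU, no tearing"
    if fps > VRR_MAX_HZ:
        return "above the window: capped at 60Hz (or tears with V-sync off)"
    # Below the window, LFC can repeat each frame to land back inside it,
    # but that needs max >= ~2x min; here 60/48 is only 1.25.
    if VRR_MAX_HZ >= 2 * VRR_MIN_HZ:
        return "below the window: LFC multiplies frames to stay in sync"
    return "below the window: no LFC possible, so it tears or stutters again"

for fps in (30, 45, 49, 58, 75):
    print(f"{fps:>2} fps -> {vrr_behaviour(fps)}")
```

So the 49+ fps advice above is really about staying inside that narrow window; drop below roughly 48fps and you are back to tearing or stutter, just as with a fixed-refresh screen.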

I can't see it being a big swing factor for AMD unless they come out with much more price/performance-competitive GPUs. Maybe a little more movement their way in the 580/1060 bracket, but that could be short-lived depending on what Nvidia does next.

First off, Apple desktop Macs all use AMD cards: 20 million sold in the last three quarters alone.

The Xbox One S and X support FreeSync. Visit their forums (e.g. Microsoft's) and you will find that many are ditching their Nvidia cards and buying AMD because of FreeSync TVs and monitors. That's around 20-25 million potential customers.

PS5: given that most owners will replace their PS4s, that's another 60-70 million potential customers (if the previous sales decline from PS2 to PS3 to PS4 holds).

And more AMD-optimised games will come out now, since older platforms like the Xbox 360 are no longer around for studios to reuse their codebase and strap GameWorks on.
And AMD-optimised games are also optimised for Nvidia. Look at FC5 for example.
Something that cannot be said for Nvidia GameWorks...
 
Those aren't markets that particularly demand VRR though, compared to PC gamers. There might be some crossover in console/PC users where having one display they can use with both devices is a factor, but the vast majority of console gamers seem to be perfectly happy with a locked 30 FPS/30Hz V-sync! (Though some of the more discerning are starting to wake up to the benefits of 60 FPS.)

People who use Macs and care about VRR are also a tiny minority of the users.

AMD optimisation at console level rarely has much impact on how a game runs on PC (other than badly if it is more or less a straight "port" without PC optimisations) due to the differences in threading models, memory implementations, GPU access, etc.
 
Most console ports, even recent ones like Tomb Raider, used the Xbox 360 codebase rather than the Xbox One one, got Nvidia money to strap GameWorks on, and were put on the market.
And that job wasn't even done by the makers of the title but by a third-party company.
On the other hand, look at Far Cry 5, an AMD title: it runs perfectly on all platforms.
 
I'm having a difficult time trying to find which TVs have FreeSync.

I don't suppose someone could tell me the cheapest 55"+ FreeSync TV?

Cheers. When I search, all I find is "Samsung are bringing out firmware updates for their 2018 range", and the one that comes up is no longer available to buy.

I can't even find a list of FreeSync TVs.
 
Are you searching for the term "variable refresh rate"?
 
The 55NU8000 is the one you're looking for; they are around £1000-£1300 I think.

The other models are the 2018 QLEDs: Q6FN, Q7FN, Q8FN, Q9FN.

Maybe someone else can confirm.
 
Thank you. Sadly that's a little out of my price range, but I'll take a closer look. Am I right in saying that it's only Samsung TVs that have FreeSync right now?

I was gonna buy a Hisense 4K for about £800 and a 1080 Ti, but now I'm swaying towards a FreeSync TV and a Vega 64, if I can find a TV that doesn't cost more than a grand, that is.
 
No, I had just been typing in "120Hz FreeSync TV".
FreeSync is a branded solution, so most TVs will use the term VRR, as that refers to the more universal bits of the standard :)

And right now it's very much a cutting-edge/new premium feature for TVs, so expect it only in their top offerings for this year at least.

By Samsung's letter code convention I'd have thought an N series TV was 2019. Are they out yet?
 
Does anyone know if you can do HDMI audio while also doing VRR?

The reason for asking is this:
The receiver repeats the signal and can't do VRR, so the VRR information should be lost, right?
FreeSync is not supported in multi-monitor mode.

The contrast between FreeSync's ~8ms input lag and regular TV input lag is huge. That shouldn't take long to get noticed by the console community. I expect VRR/FreeSync to be the next big thing for TVs at CES 2019. Since it is fairly easy to implement, there should be value models shortly thereafter.

@Bensky:
I spoke with a Samsung employee and he told me all 2018 QLEDs get FreeSync. Dunno if he was knowledgeable though.

Oh, and wait until the crazy console optimization gurus start optimizing for fp16 double throughput (rapid packed math) :)
 
I didn't even think about VRR not working when I run through my surround sound system.
I'm not entirely sure it won't. It was more an assumption than a statement.

I am fairly sure FreeSync won't work if you try to set it up in multi-monitor mode, though.

Too bad there is no simple sound device that outputs HDMI audio. Doing it via the GPU is driver hell :/
 
I didn't even think about VRR not working when I run through my surround sound system.

So I guess if running FreeSync from the graphics card to the TV, I would need to use HDMI ARC from the TV back to my surround system.

There will be eARC on new TVs along with HDMI 2.1. It will allow you to connect your devices to the TV and then pass the sound losslessly via eARC to your sound system.
 
Cable length matters a lot. I have an expensive 15m cable running round my front room and it will not display 4K @ 60Hz at all, whereas if I move the PC over to the TV, even a cheap cable will do 4K @ 60Hz.

Following up with some tests I did on this with the same GPU and 4K display setup: I have a couple of relatively good quality 3m HDMI cables and they don't display 4K @ 60Hz. It seems either the GPU or the display forces everything back to 30Hz after a few seconds of crazy fuzzy colours and blinking. I then used a 1m cable (I think it was from a Nintendo console) and it works, no fuss. I'm wondering if 1-2m cables are the absolute limit for pushing 4K @ 60Hz?

I need an expert to explain how much bandwidth is being used, mind, because if 4K @ 60Hz is 'supposed' to be too much information for a High Speed HDMI cable, then why is it possible over a short cable and not a long one? Is the real statement actually that 4K @ 60Hz is within the bandwidth capability of these common cables, but the chance of it degrading is exponentially higher with increased length?

Would a more powerful GPU push HDMI signals over longer cable lengths? Or is that not how it works at all? It's all a bit smoke and mirrors to me right now.
 
As far as I know the power of the GPU makes no difference, but the quality of the components on the card might make a bit of difference. HDMI cable quality makes a great deal of difference over long distances. I know you can get HDMI signal boosters, but with prices between £5 and £1000+ you just know the cheap ones won't be worth diddly; I haven't tried one though.
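To put some rough numbers on the "how much bandwidth" question above: the signal itself is identical whatever the cable; a longer run just degrades it more, which is why a marginal cable can work at 1m and fail at 3m. A back-of-envelope sketch (assuming the standard CTA-861 4K timing of 4400 x 2250 total pixels including blanking, and HDMI's 8b/10b TMDS encoding over three data lanes):

```python
# Back-of-envelope HDMI bandwidth estimate for 4K @ 60Hz, 8-bit RGB (4:4:4).
# Assumes standard CTA-861 timing and 8b/10b TMDS encoding (HDMI 1.4/2.0).

h_active, v_active, refresh_hz = 3840, 2160, 60
h_total, v_total = 4400, 2250      # active pixels plus blanking intervals
bits_per_pixel = 24                # 8 bits x 3 channels (RGB)

pixel_clock_hz = h_total * v_total * refresh_hz               # ~594 MHz
payload_gbps = h_active * v_active * refresh_hz * bits_per_pixel / 1e9
wire_gbps = pixel_clock_hz * 3 * 10 / 1e9                     # 3 lanes, 10-bit TMDS characters

print(f"pixel clock        : {pixel_clock_hz / 1e6:.0f} MHz")    # 594 MHz
print(f"active video       : {payload_gbps:.1f} Gbit/s")         # ~11.9 Gbit/s
print(f"on-the-wire (TMDS) : {wire_gbps:.1f} Gbit/s")            # ~17.8 Gbit/s
# 'High Speed' (HDMI 1.4) cables are only certified to ~10.2 Gbit/s (340 MHz),
# while 'Premium High Speed' (HDMI 2.0) cables are certified to 18 Gbit/s.
```

So 4K @ 60Hz 4:4:4 sits right up against the 18 Gbit/s ceiling, and a cable that was only ever tested to the old 10.2 Gbit/s "High Speed" level may or may not carry it depending on its build quality and length. It also answers the GPU question: a faster card still drives the same link rate, so it won't push the signal any further down a marginal cable.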
 
Dig around and the 55NU8000 can be had for £850 and the 65NU8000 for £1200.
 
I'm wondering if 1-2m cables are the absolute limit for pushing 4K @ 60Hz?
Nope, I have a 3m HDMI cable and have no issues with 4K @ 60Hz.

My setup = HTPC - 1m cable - AMP - 3m cable - TV

"CSL - 3m UHD HDMI 2.0b Cable | 4K @ 60Hz / 2160p / 4:4:4 (High Speed) with Ethernet | ARC and CEC | Deep Color | fully HDCP compliant/HD Ready / 3D TV/Playstation 4 Pro/Nintendo Switch ecc"

£6.49 delivered!
 
For me it had to be 2m to work perfectly (Roline 11.04.5681; can't link it). AFAIK RGB 4:4:4 8-bit at 4K 60Hz pushes about as close to the 18Gb/s limit as possible. You can use the standard cheapo cables that do around 10Gb/s for 4K 60Hz too, but likely only at YCbCr 4:2:2 8-bit, and you may end up with artifacting or display loss at times (I've seen it happen at someone else's place: they use a long standard HDMI cable and sometimes when switching from a game back to the desktop the screen gets artifacted or blacks out and doesn't recover without signing out or restarting). See here for bandwidth stuff.
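For reference, the raw sample arithmetic behind those chroma formats looks like this. This is active-pixel payload only, 8 bits per sample; the actual HDMI link rate also depends on blanking and on how the samples are packed on the wire, so treat the figures as ratios rather than cable requirements:

```python
# Illustrative comparison of active-video payload at 4K @ 60Hz, 8 bits per
# sample. Blanking and TMDS packing overhead are ignored.

pixels_per_second = 3840 * 2160 * 60

samples_per_pixel = {
    "RGB / YCbCr 4:4:4": 3.0,   # full-resolution colour
    "YCbCr 4:2:2":       2.0,   # chroma halved horizontally
    "YCbCr 4:2:0":       1.5,   # chroma halved horizontally and vertically
}

for fmt, samples in samples_per_pixel.items():
    gbps = pixels_per_second * samples * 8 / 1e9
    print(f"{fmt:<18}: ~{gbps:.1f} Gbit/s of active video")
```

Subsampled modes carry roughly a third to a half less colour data per frame, which is why they tend to be the fallback when a link or cable can't hold full 4:4:4.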
 
I'm wondering if 1-2m cables are the absolute limit for pushing 4K @ 60Hz?

Pretty much. With current DisplayPort and HDMI, for high refresh rates or 4K @ 60Hz, beyond ~1.8m cable length things become very pot luck. I know people who are running 5m cables going "what is all the fuss about? It works fine for me", and others who get constant artefacting and loss of signal even a few centimetres over 2m.
 
I'll admit to pushing my luck by running 4K60 YCbCr 4:2:2 8bit over a 7.5m "basics" high speed cable from a 1080Ti to a Sony TV and it is rock solid.

The cable is very thick though - it reminds me of the original ThickNet coax ethernet cable that you couldn't bend too tightly or the signal wouldn't make it round the bend :-)

Trying to go to YCbCr 4:4:4 is a step too far though, and I start to see strange colour artifacts.
 