
HDMI 2.1 VRR AMD FreeSync nVidia G-SYNC 4K TV explained

I have a Samsung Q7 with the update; I just wish AMD had a card to drive it. Sticking with my 1080 for now - had it for over 2 years and I'm so happy I bought it. However, the Xbox One X is tempting for my son.
 

I've been looking at the Q7 - it looks amazing.

Have you played much at 1440p 120Hz? I wondered how it played for games like Overwatch. I was planning on playing Overwatch at 1440p 120Hz and games like Street Fighter at 4K.

The Vega cards have dropped in price a bit lately and there are custom ones now - have you thought about sidegrading?
 
Rounding all this up in one place for new TV & GPU buyers. It's amazing how confused people are across the web at the moment about refresh rates and GPU compatibility:

  • HDMI 2.1 is the new connection standard for 4K TVs that allows Variable Refresh Rate (VRR) - these TVs are only just coming out now in 2018
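To illustrate what VRR actually buys you, here's a toy sketch - not any real display API, all numbers and function names are made up for illustration - of why a fixed-refresh panel stutters when the GPU's frame rate doesn't divide the refresh rate evenly, while a VRR panel doesn't:

```python
# Toy illustration (not a real API): with a fixed 60Hz refresh, a frame
# that misses a vsync deadline waits for the next one, so on-screen frame
# times are forced onto multiples of ~16.7ms; with VRR (within the panel's
# supported range) the panel refreshes when the frame is ready.
import math

REFRESH_INTERVAL_MS = 1000 / 60   # fixed 60Hz panel

def fixed_refresh_display_time(frame_ready_ms):
    """Frame is shown at the first vsync at or after it is ready."""
    return math.ceil(frame_ready_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def vrr_display_time(frame_ready_ms):
    """With VRR, the frame is shown as soon as it is ready."""
    return frame_ready_ms

# GPU renders at a steady 50fps (20ms per frame) - a rate the fixed
# 60Hz panel cannot divide evenly, so a 33ms hitch eventually appears.
ready = [20 * n for n in range(1, 8)]
fixed = [round(fixed_refresh_display_time(t), 1) for t in ready]
vrr = [round(vrr_display_time(t), 1) for t in ready]
print("fixed:", fixed)  # uneven cadence: a double-length gap sneaks in
print("vrr:  ", vrr)    # perfectly even 20ms cadence
```

The same mechanism is why VRR also removes tearing without the input-lag cost of traditional vsync: the display simply waits for the frame instead of the frame waiting for the display.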

What TVs are there in 2018 with HDMI 2.1 and VRR? I'm pretty sure the answer is none. They'll probably come out in 2019-2020.
 
I do think that once there are HDMI 2.1 TVs with 4K 120Hz, you'd be better off getting a TV than a PC monitor, because TVs generally look better and are cheaper than monitors.
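On the "why does 4K 120Hz need HDMI 2.1" point, a rough back-of-envelope sketch helps. The figures below ignore blanking intervals and link-encoding overhead, so real-world requirements are somewhat higher than these raw-pixel numbers - but the gap is already clear:

```python
# Back-of-envelope: uncompressed 8-bit RGB pixel data rates vs HDMI link
# rates. Blanking and link-encoding overhead are ignored, so these are
# lower bounds on what each mode actually needs.

HDMI_2_0_GBPS = 18.0   # total link rate, HDMI 2.0
HDMI_2_1_GBPS = 48.0   # total link rate, HDMI 2.1 (FRL)

def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbit/s (no blanking/overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for w, h, hz in [(2560, 1440, 120), (3840, 2160, 60), (3840, 2160, 120)]:
    rate = raw_gbps(w, h, hz)
    print(f"{w}x{h}@{hz}Hz: {rate:5.1f} Gbit/s "
          f"(fits HDMI 2.0: {rate < HDMI_2_0_GBPS}, 2.1: {rate < HDMI_2_1_GBPS})")
```

4K 120Hz comes out around 24 Gbit/s before overhead - comfortably beyond HDMI 2.0's 18 Gbit/s, which is why full-fat 4K 120Hz (and VRR over HDMI) had to wait for the 2.1 standard.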
 
Depends on input lag - some are getting better on that front. Motion blur is terrible on my Samsung, but mine's nothing special.
 
  • G-SYNC is nVidia capitalising on their dominant market position to lock customers into their own proprietary hardware ecosystem. G-SYNC is a competitor to VRR and is unlikely to appear in 4K TVs. GeForce GPUs currently don't support HDMI 2.1 VRR - but that could change in the future.
G-Sync was developed and released before VRR was available. It's from G-Sync that AMD got the idea of having VRR added to the DisplayPort standard. When it comes to TVs, I imagine Nvidia will use VRR, as they have been doing with the G-Sync laptops.
 
Not true.

Vertical blank interrupt has been around for decades.

VRR had been around for several years before either G-Sync or Adaptive Sync were announced. Of those two, G-Sync was announced first by a narrow margin, and as nVidia were part of the body that announced Adaptive Sync, they likely knew it was coming before they announced G-Sync.

Edit: OK looking back at dates it seems like it was only in 2012 that people began talking about laptops doing VRR to reduce power consumption so perhaps not years! Could have sworn I saw stuff earlier than that :shrug:

Edit2: Oh boy my edits got all jumbled so that turned into a mess. :/ computers 1 - 0 me.
 

This year's higher-end Samsungs have sorted both of those issues (input lag and motion blur) - input lag is really low, and you can enable the motion smoothing even in game mode.

Plus they have FreeSync.


Do you think Nvidia will let people use VRR with TVs?

I hope they do, but I don't think they will. They seem to be pushing those big-format G-Sync displays.
 

I can't see it - there's no real way to lock it down between TVs and monitors without demanding hardware manufacturers put in some kind of extra hardware, which defeats the point, so they'd have to enable VRR for any supported device or none at all.


eDP supported adaptive sync - used in laptops and professional displays - long before G-Sync. Despite what some claim, nVidia actually went away and made G-Sync in response to a lack of interest from other parties in implementing a standard for desktop use, and possibly a good bit of the reason they've persisted with keeping it locked down is that other parties only jumped on the bandwagon after nVidia made it popular.
 

I don't see why not, but at this point it's anyone's guess. The more casual gaming-TV market may not interest them - they're in a strong position right now, so they could say no - but on the other hand they may not want to miss out, and they could use VRR rather than work towards bringing G-Sync TVs to market, as they already did with laptops.
 

Whether the capability was there or not I don't know, but it was only after G-Sync released that AMD pushed for the upcoming DisplayPort update to include what was needed to take advantage of variable refresh rates in desktop monitors. Before then it wasn't possible.


Can you source some info on the claim that nVidia made G-Sync because of a lack of interest from other parties? It'll make an interesting read.
 
Can't wait to get a massive 4K FreeSync TV. That might even be enough to convince me to buy a Vega Nano.

If Nvidia keeps locking their cards into the G-Sync ecosystem, I'm pretty much going AMD all the way, since I do all my PC stuff on the big screen. It doesn't matter if AMD is slow and rubbish; if they can let me game with VRR, that's 99% of my motivation. I'd like to think that by the time I can buy a decent 55-60" 4K TV with VRR, AMD will have pulled their finger out and actually made a decent GPU for once. They really need to capitalise on what I think is going to be a very big mistake on Nvidia's part.
 
VRR is basically the TV equivalent of FreeSync. To see how much of an issue it is, you'd have to find out how many PC gamers actually game on a TV rather than a monitor - I suggest that number is very, very small indeed.

Right now, I don't think nVidia are concerned at all. I don't think anything will change this year or even next: there's no new Xbox, no new PlayStation, and the huge percentage of PC gamers who use monitors will continue to use monitors. nVidia won't have a G-Sync option in TVs, imho, but they may produce their own version of adaptive sync for their cards that would be compatible with TVs - though, as I say, there's zero rush for that at the moment.

I get being prepared, but this is way too soon.
 

Wouldn't even know where to go about finding information online about it now - this was like 8 years ago. I think Tom Petersen talked about it briefly in one of the PCPer videos as well.
 
Not many devices that primarily go with TVs have DisplayPort connectivity, so it's kind of a legacy thing, I guess - it just never took off there.

On PCs, meanwhile, DVI and dual-link set the pace, so HDMI without support for higher resolution/refresh combinations wasn't enough - hence DisplayPort got adopted.
 

Shame - I've tried Googling for info and checking sources like the wiki (questionable source, I know, but it's best to look everywhere) to back up what you said, but found nothing.


While light on content, these are worth a read, as they cover some basics on G-Sync and its predecessors.

https://www.avadirect.com/blog/nvidia-g-sync-a-true-gaming-industry-revolution/

https://www.pcper.com/reviews/Graph...-Death-Refresh-Rate/Introduction-LCD-Monitors

To flesh out what I said: "G-Sync was developed before VRR was available for use in the way it's now used for gaming. In response to Nvidia's G-Sync announcement, AMD put in motion a plan to get VRR supported in the upcoming DisplayPort 1.2a update that was soon to release. Since then, VRR has evolved to include HDMI in a limited way." I was under the impression this was general knowledge, easily confirmed with a bit of research.
 