I have a Samsung Q7 with the update; I just wish AMD had a card to drive it. Sticking with my 1080 for now. I've had it for over 2 years and am so happy I bought it. However, the Xbox One X is tempting for my son.
Rounding all this up in one place for new TV & GPU buyers. It's amazing how confused people across the web are at the moment about refresh rates and GPU compatibility:
- HDMI 2.1 is the new connection standard for 4K TVs that allows Variable Refresh Rate (VRR). These TVs are only just coming out now in 2018.
What TVs are there in 2018 with HDMI 2.1 and VRR? I'm pretty sure the answer is none? They will probably come out in 2019-2020.
Out now it seems.
I thought you were running CrossFire Vega 64s? Can't wait to get a massive 4K FreeSync TV. That might even be enough reason to convince me to buy a Vega Nano.
Depends on input lag. Some are getting better at that. Motion blur is terrible on my Samsung, but it's nothing special. I do think that once there are HDMI 2.1 TVs with 4K 120 Hz, you would be better off getting a TV than a PC monitor, because TVs generally look better and are cheaper than monitors.
- G-SYNC is NVIDIA capitalising on their dominant market position to lock their customers into their own proprietary hardware ecosystem. G-SYNC is a competitor to VRR and is unlikely to appear in 4K TVs. GeForce GPUs currently don't support HDMI 2.1 VRR, but that could change in the future.
Not true. G-Sync was developed and released before VRR was available; it's from G-Sync that AMD came up with the idea of having the VRR ability added to the DisplayPort standard. When it comes to TVs, I imagine NVIDIA will use VRR, as they have been doing with G-Sync laptops.
Depends on input lag. Some are getting better at that. Motion blur is terrible on my Samsung, but it's nothing special.
G-Sync was developed and released before VRR was available; it's from G-Sync that AMD came up with the idea of having the VRR ability added to the DisplayPort standard. When it comes to TVs, I imagine NVIDIA will use VRR, as they have been doing with G-Sync laptops.
This year's higher-end Samsungs have sorted both of those issues: the input lag is really low, and you can enable the motion smoothing even in game mode.
Plus they have FreeSync.
Do you think NVIDIA will let people use VRR with TVs?
I hope they do, but I don't think they will. They seem to be pushing these big-format G-Sync displays.
Not true.
Vertical blank interrupt has been around for decades.
VRR had been around for several years before either G-Sync or Adaptive Sync were announced. Of those two, G-Sync was announced first by a narrow margin, and as NVIDIA were part of the body that announced Adaptive Sync, they likely knew it was coming before they announced G-Sync.
Edit: OK, looking back at the dates, it seems it was only in 2012 that people began talking about laptops doing VRR to reduce power consumption, so perhaps not years! Could have sworn I saw stuff earlier than that. :shrug:
Edit2: Oh boy my edits got all jumbled so that turned into a mess. :/ computers 1 - 0 me.
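For anyone who hasn't followed the vblank discussion above, here's a rough sketch of the core VRR idea. This is my own illustration, not any vendor's actual implementation, and the 48-120 Hz panel range is a made-up example: the display stretches the vertical blanking interval so the next scan-out starts when the frame actually arrives, clamped to what the panel supports.

```python
# Illustrative sketch of VRR scheduling (not any vendor's real code).
# A VRR display holds the vertical blank until the next frame is ready,
# but only within the panel's supported refresh range.
def vrr_refresh_interval_ms(frame_time_ms, min_hz=48, max_hz=120):
    """Return the interval the panel waits before the next scan-out.

    min_hz/max_hz are hypothetical panel limits (e.g. a 48-120 Hz range).
    """
    shortest = 1000.0 / max_hz  # panel can't refresh faster than max_hz
    longest = 1000.0 / min_hz   # panel can't hold vblank longer than this
    # Clamp the frame delivery time into the panel's supported window.
    return max(shortest, min(frame_time_ms, longest))
```

So a game rendering at 100 fps (10 ms/frame) gets a refresh exactly when each frame lands, while a frame slower than the hypothetical 48 Hz floor hits the clamp, which is why real panels pair VRR with frame repetition below the range (AMD calls this Low Framerate Compensation).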
Despite what some claim, NVIDIA actually went away and made G-Sync in response to a lack of interest from other parties in implementing a standard for desktop use, and possibly a good part of the reason they've kept it locked down is that other parties only jumped on the bandwagon after NVIDIA made it popular.
If NVIDIA keeps locking their cards into the G-Sync ecosystem, I'm pretty much going AMD all the way, since I do all my PC stuff on the big screen. It doesn't matter if AMD is slow and rubbish; if they can let me game with VRR, that's 99% of my motivation. I would like to think that by the time I can buy a decent 55-60" 4K TV with VRR, AMD will have pulled their finger out of their arse and actually made a decent GPU for once. They really need to capitalise on what I think is going to be a very big mistake on NVIDIA's part. Can't wait to get a massive 4K FreeSync TV. That might even be enough reason to convince me to buy a Vega Nano.
Can you source me some info on this, please? It'll make an interesting read.
Wouldn't even know where to go about finding information online about it now - this was like 8 years ago. I think Tom Petersen talked about it briefly on one of the PCPer videos as well.