
HDMI 2.1 VRR AMD FreeSync nVidia G-SYNC 4K TV explained

VRR with the 2018 HDMI 2.0 TVs would only work at 4K60 or 1080p120 because of the limited bandwidth.

Excited about this, but I think it'll take a long time to get the new cards and TVs to do it properly.
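For a rough sense of the numbers, here's a back-of-the-envelope Python sketch. It assumes uncompressed 8-bit RGB and ignores blanking intervals, so real link requirements run a bit higher; the two data rates are the usual post-encoding figures for HDMI 2.0 and 2.1.

Code:
# Back-of-the-envelope bandwidth check (uncompressed 8-bit RGB, blanking ignored,
# so real HDMI timings need somewhat more than these figures).
HDMI_2_0_GBPS = 14.4  # ~18 Gbit/s TMDS minus 8b/10b encoding overhead
HDMI_2_1_GBPS = 42.7  # ~48 Gbit/s FRL minus 16b/18b encoding overhead
def data_rate_gbps(width, height, hz, bpp=24):
    """Raw pixel data rate in Gbit/s for an uncompressed video mode."""
    return width * height * hz * bpp / 1e9
for name, w, h, hz in [("4K60", 3840, 2160, 60), ("4K120", 3840, 2160, 120),
                       ("1440p120", 2560, 1440, 120), ("1080p120", 1920, 1080, 120)]:
    rate = data_rate_gbps(w, h, hz)
    verdict = "fits HDMI 2.0" if rate <= HDMI_2_0_GBPS else "needs HDMI 2.1"
    print(f"{name}: ~{rate:.1f} Gbit/s, {verdict}")

That puts 4K60 (~11.9 Gbit/s), 1080p120 (~6.0) and even 1440p120 (~10.6) inside HDMI 2.0's budget, while 4K120 (~23.9) needs the 2.1 headroom.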
 
VRR with the 2018 HDMI 2.0 TVs would only work at 4K60 or 1080p120 because of the limited bandwidth.

Excited about this, but I think it'll take a long time to get the new cards and TVs to do it properly.

They also do 1440p 120 Hz.

According to the reviews, FreeSync is working great on this year's TVs.
 
The upcoming Intel GPU is not made by AMD.
It also shows how narrow the Nvidia market actually is to run without fierce competition: on PC, the top £700+ cards only. Outside that bracket, AMD supports a broader range of devices.
After that it's just blind faith in a failed Nvidia religion.

And according to AnandTech:
Select Samsung QLED TVs to be launched this year are set to support a 120 Hz maximum refresh rate, HDMI 2.1’s VRR, as well as AMD’s FreeSync technologies, the company announced earlier this year. The technologies do essentially the same thing, but they are not the same method – AMD's Freesync-over-HDMI being a proprietary method – and as such are branded differently.
https://www.anandtech.com/show/12997/hdmi-forum-demonstrates-hdmi-21-vrr-capabilities-on-samsung-tv

HDMI 2.1 VRR and FreeSync are not the same thing.
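To illustrate: both ends of the link have to agree on one specific signalling method even when the end result is identical. A minimal sketch of that idea, with made-up names, as the real handshake goes through EDID/DisplayID capability blocks:

Code:
# Illustrative only: the capability names are made up, and the real negotiation
# happens via EDID/DisplayID capability blocks, not strings like these.
TV_SUPPORTS = {"hdmi_forum_vrr", "freesync_over_hdmi"}  # e.g. a 2018 Samsung QLED
GPU_SUPPORTS = {"freesync_over_hdmi"}  # e.g. an AMD card on an HDMI 2.0 port
def pick_vrr_method(gpu, tv):
    """Return the first mutually supported VRR signalling method, or None."""
    for method in ("hdmi_forum_vrr", "freesync_over_hdmi"):  # arbitrary preference
        if method in gpu and method in tv:
            return method
    return None
print(pick_vrr_method(GPU_SUPPORTS, TV_SUPPORTS))  # -> freesync_over_hdmi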

Nvidia objectively wins in the 1080 Ti category, yes. I don't know what you mean by "AMD has more broad supporting devices" below that, but I'd take a 1080 over a Vega 64 any day of the week. Vega is a bit of a mess. 580/570 vs 1060 is perhaps a different question.

The fact that the console designs were won by AMD doesn't really mean that much, does it? Just because MS and Sony made a decision, it doesn't mean they know best or that everyone should make the same decision. Nintendo made the opposite decision. Sony made the opposite decision last gen. MS made the opposite decision with the original Xbox.

AMD have a big advantage for consoles in that they have an x86 license and can produce some powerful APUs. I don't want an APU for my PC.

I'm actually hopeful that Nvidia will support HDMI 2.1 VRR eventually. In the meantime, FreeSync and G-Sync are functionally equivalent for most people.
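"Functionally equivalent" meaning both make the panel wait for the frame rather than the frame wait for the panel. A minimal sketch of the behaviour every one of these schemes aims for, using an example 48-144 Hz window:

Code:
# Minimal model of what FreeSync, G-Sync and HDMI VRR all achieve: the panel
# refreshes when a frame is ready instead of on a fixed tick. The 48-144 Hz
# window and the frame-doubling fallback (AMD brands this LFC) are examples.
VRR_MIN_HZ, VRR_MAX_HZ = 48.0, 144.0
def effective_refresh_hz(fps):
    """Refresh rate the panel actually runs at for a given frame rate."""
    if fps > VRR_MAX_HZ:  # GPU outruns the panel: frames queue (or tear)
        return VRR_MAX_HZ
    if fps >= VRR_MIN_HZ:  # sweet spot: exactly one refresh per frame
        return fps
    rate = fps  # below the window: repeat each frame until back inside it
    while rate < VRR_MIN_HZ:
        rate += fps
    return rate
for fps in (30, 55, 90, 160):
    print(f"{fps} fps -> panel at {effective_refresh_hz(fps):.0f} Hz")
# 30 fps -> 60 Hz (each frame shown twice), 55 -> 55, 90 -> 90, 160 -> 144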
 
I use a 60-inch Sony Bravia (non-4K).
The dream is to have a 65-inch OLED that supports FreeSync 2 HDR, G-SYNC, or some native VRR GPU compatibility. I'll basically buy whichever GPU next year supports VRR on TV sets. I'm aware I'm possibly a niche's niche, though!
 
VRR is basically the TV equivalent of FreeSync. To see how much of an issue it is, you would have to find out how many PC gamers actually game on a TV rather than a monitor. I suggest that number would be very, very small indeed.

Right now, I don't think Nvidia are concerned at all. I don't think anything at all will change this year or even next year: there's no new Xbox, no new PS, and the huge percentage of PC gamers that use monitors will continue to use monitors. Nvidia won't have a G-Sync option in TVs, IMHO, but they may produce their own version of adaptive sync for their cards that would be compatible with TVs. As I say, though, there's zero rush for that at the moment.

I get being prepared, but this is way too soon.

It's not just for PC gamers. The Xbox One supports VRR right now.
 
To flesh out what I said: "G-Sync was developed before VRR was available for use in the way it's now used for gaming. In response to Nvidia's G-Sync announcement, AMD put in motion a plan to get VRR supported by the upcoming DisplayPort 1.2a update that was soon to release. Since then, VRR has evolved to include HDMI in a limited way." I was under the impression this was general knowledge, easily proven with a bit of research.

Actually, AMD had been working on adaptive sync before G-Sync was released. I don't think AMD expected Nvidia to get G-Sync out so quickly.
 
If 4K and VRR are set to become an option on normal TVs for PC gamers, it would be pretty silly of the manufacturers of those sets to ignore a huge chunk of the market. However, I don't think manufacturers are really accounting for PC gamers.

I'm using a 4K TV instead of a monitor, and all I can say is I'm happy. I know the picture is a bit blurry and there's some input lag, but for the games I play I see no difference, and now I don't think I'll ever go back to owning a proper monitor for my PC.
 
Where did you read that?

Well you can work it out yourself.

AMD put hardware into their desktop cards that is only useful for connecting to an adaptive sync monitor: cards like the 290, 260X, etc. Some of these cards, the 260X for example, were available to buy before Nvidia even announced G-Sync. Why build cards with this capability if they weren't thinking about it?

They submitted the proposal to VESA to get adaptive sync put into the DisplayPort standard in early November 2013. Do you believe that AMD would be able to come up with something like this in less than a month?

And lastly, I asked. :) Simples: when AMD had the question-and-answer session on this forum about FreeSync a few years ago, the AMD rep confirmed that AMD had been working on FreeSync alongside their Bonaire and Hawaii cards, which is why the older GCN cards don't work with adaptive sync.
 
I'm using a 4K TV instead of a monitor, and all I can say is I'm happy. I know the picture is a bit blurry and there's some input lag, but for the games I play I see no difference, and now I don't think I'll ever go back to owning a proper monitor for my PC.

Well that's three of us at least! Can't say I've noticed lag either, but then I made a point of avoiding a G-Sync or FreeSync tie-in. Coming from either of those it probably would be noticeable.

It’s the way forward I tell thee. :D
 
Well you can work it out yourself.

AMD put hardware into their desktop cards that is only useful for connecting to an adaptive sync monitor: cards like the 290, 260X, etc. Some of these cards, the 260X for example, were available to buy before Nvidia even announced G-Sync. Why build cards with this capability if they weren't thinking about it?

They submitted the proposal to VESA to get adaptive sync put into the DisplayPort standard in early November 2013. Do you believe that AMD would be able to come up with something like this in less than a month?

And lastly, I asked. :) Simples: when AMD had the question-and-answer session on this forum about FreeSync a few years ago, the AMD rep confirmed that AMD had been working on FreeSync alongside their Bonaire and Hawaii cards, which is why the older GCN cards don't work with adaptive sync.

AMD had it before that too; it was used in laptops for power-saving purposes, using variable refresh rates to increase battery life.
 
Actually, AMD had been working on adaptive sync before G-Sync was released. I don't think AMD expected Nvidia to get G-Sync out so quickly.
Well you can work it out yourself.

AMD put hardware into their desktop cards that is only useful for connecting to an adaptive sync monitor: cards like the 290, 260X, etc. Some of these cards, the 260X for example, were available to buy before Nvidia even announced G-Sync. Why build cards with this capability if they weren't thinking about it?

They submitted the proposal to VESA to get adaptive sync put into the DisplayPort standard in early November 2013. Do you believe that AMD would be able to come up with something like this in less than a month?

And lastly, I asked. :) Simples: when AMD had the question-and-answer session on this forum about FreeSync a few years ago, the AMD rep confirmed that AMD had been working on FreeSync alongside their Bonaire and Hawaii cards, which is why the older GCN cards don't work with adaptive sync.

Well, on that basis Nvidia G-Sync works with cards from the 600 series upwards, launched in March 2012, which trumps the 290's November 2013. But of course both companies were probably working on these things way before that. ;)
 
Nvidia seemed to rush G-Sync out around the time everyone was talking about Mantle. IIRC you couldn't buy G-Sync monitors and had to buy a crappy Asus TN screen and fit the FPGA module yourself.
 
Yeah, they rushed to market to be first, very obviously. Still, it worked, as posts on here demonstrate: people think it was an actual product before anyone else thought to do anything similar.

Edit: Looking solely at public announcements and release dates. Who started developing what behind closed doors, and when, I obviously don't know. G-Sync was still cool; it's just a shame they didn't use the standard for so long.
 
Well, on that basis Nvidia G-Sync works with cards from the 600 series upwards, launched in March 2012, which trumps the 290's November 2013. But of course both companies were probably working on these things way before that. ;)

The 290s launched in October 2013. The R7 260X launched a week before the G-Sync announcement in October 2013.

Do you really have that limited an understanding of how G-Sync and FreeSync work? You don't need anything on the GPU side for G-Sync; the G-Sync module handles everything. It has the timing controller, memory buffer and scaler all built into the one FPGA. But to connect to an adaptive sync monitor, the GPU has to have a timing controller and memory buffer on the DisplayPort output. This extra hardware is not normally required on a desktop GPU. It's the reason why Nvidia's Kepler and Maxwell GPUs can't connect to adaptive sync monitors. I don't think Pascal has the ability either.

It's also the reason all GCN APUs support adaptive sync: they had to meet the specifications for the Embedded DisplayPort standard. Not all desktop GCN cards do.
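To make that split concrete, a toy model of the argument above (the sets and names are made up for illustration):

Code:
# Toy model of where the VRR hardware lives (illustrative, not driver logic).
# G-Sync: the monitor's FPGA module carries the timing controller, buffer and
# scaler, so the GPU needs nothing special. Adaptive sync: the monitor stays
# "dumb", so the GPU's DisplayPort output must provide the timing control.
GSYNC_MONITOR = {"tcon", "frame_buffer", "scaler"}  # all inside the FPGA module
ADAPTIVE_SYNC_MONITOR = set()  # a standard scaler, no VRR smarts of its own
def can_do_vrr(gpu_dp_hw, monitor_hw):
    """In this model, VRR works if either end of the link has a timing controller."""
    return "tcon" in gpu_dp_hw or "tcon" in monitor_hw
kepler = set()  # no timing controller on the DisplayPort output
hawaii = {"tcon", "frame_buffer"}  # eDP-style hardware on the DP output
print(can_do_vrr(kepler, GSYNC_MONITOR))          # True: the module does the work
print(can_do_vrr(kepler, ADAPTIVE_SYNC_MONITOR))  # False: nobody has a TCON
print(can_do_vrr(hawaii, ADAPTIVE_SYNC_MONITOR))  # True: the GPU does the work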

So again, if AMD hadn't been working on a VRR solution, why would they put the hardware needed to connect to adaptive sync monitors onto their cards?

My point is that AMD didn't suddenly start working on FreeSync just because Nvidia came out with G-Sync. I'm not even sure the release of G-Sync made adaptive sync monitors arrive any quicker, as AMD had decided to go the open route, which required certification by VESA, and that takes time.
 