
NVIDIA Opens Up Support for Adaptive-Sync/FreeSync for Future Native G-Sync Module Screens

Soldato
OP
Joined
30 Mar 2010
Posts
13,059
Location
Under The Stairs!
VRR over 2.1 is optional apparently. Plenty of wiggle room there!
I mean, in context, AMD/Nvidia can't lock you into their own AS implementation if Samsung/LG/whoever state HDMI 2.1 VRR support in their specifications - it would have to work on any GPU with 2.1 VRR support.

Bottom line, imo the confusion regarding AS compatibility is insane, and we know what it is.
 
Caporegime
Joined
18 Oct 2002
Posts
29,871
I mean, in context, AMD/Nvidia can't lock you into their own AS implementation if Samsung/LG/whoever state HDMI 2.1 VRR support in their specifications - it would have to work on any GPU with 2.1 VRR support.

I'm sure if nVidia can they certainly will ;) :D

I reckon they'll give it a good go!!
 
Man of Honour
Joined
13 Oct 2006
Posts
91,171
I mean, in context, AMD/Nvidia can't lock you into their own AS implementation if Samsung/LG/whoever state HDMI 2.1 VRR support in their specifications - it would have to work on any GPU with 2.1 VRR support.

Bottom line, imo the confusion regarding AS compatibility is insane, and we know what it is.

Not quite sure what you are saying there - nothing stops AMD or nVidia using proprietary extensions with HDMI 2.1 and not supporting VRR, while working with another company to produce a custom adaptive sync implementation for their device - but if they want to support 3rd party devices that are made to the HDMI 2.1 spec with VRR as per the standard, then their GPUs would have to support the full specification. VRR is optional in the spec because of the vast range of devices that will use it - portable devices, DVD players, etc. are unlikely to implement, or potentially even need, all the features of HDMI 2.1.
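Since VRR is optional in HDMI 2.1, a source works out whether a sink supports it by reading the HDMI Forum Vendor-Specific Data Block in the display's EDID. Here's a minimal Python sketch of that check, assuming the commonly documented HF-VSDB layout (VRRmin/VRRmax packed into bytes 9-10); the byte offsets and the demo block are illustrative, not taken from a real EDID:

```python
# Hypothetical sketch: reading the VRR range a sink advertises via its
# HDMI Forum VSDB (a CTA-861 vendor-specific data block in the EDID).
HF_OUI_LE = bytes([0xD8, 0x5D, 0xC4])  # HDMI Forum OUI, little-endian as stored

def parse_hf_vsdb_vrr(block: bytes):
    """Return (vrr_min, vrr_max) in Hz, or None if VRR isn't advertised.

    `block` is one vendor-specific data block, starting at its tag/length
    byte. Offsets follow the HF-VSDB layout as commonly parsed (VRRmin in
    bits 5:0 of byte 9, VRRmax in bits 7:6 of byte 9 plus byte 10) - treat
    them as assumptions, not gospel.
    """
    length = block[0] & 0x1F                        # low 5 bits = payload length
    if length < 10 or block[1:4] != HF_OUI_LE:
        return None                                 # too short / not HDMI Forum
    vrr_min = block[9] & 0x3F
    vrr_max = ((block[9] & 0xC0) << 2) | block[10]
    if vrr_min == 0 or vrr_max == 0:
        return None                                 # sink does not advertise VRR
    return vrr_min, vrr_max

# Made-up demo block advertising a 24-120 Hz VRR range
demo = bytes([0x6A, 0xD8, 0x5D, 0xC4, 0x01, 0x00, 0x80, 0x00, 0x00, 24, 120])
print(parse_hf_vsdb_vrr(demo))  # → (24, 120)
```

The point the post makes falls out of the code: the advertisement lives in the sink's EDID, per the standard, so any GPU reading it to spec gets the same answer regardless of vendor.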
 
Soldato
Joined
19 Dec 2010
Posts
12,031
Not quite sure what you are saying there - nothing stops AMD or nVidia using proprietary extensions with HDMI 2.1 and not supporting VRR, while working with another company to produce a custom adaptive sync implementation for their device - but if they want to support 3rd party devices that are made to the HDMI 2.1 spec with VRR as per the standard, then their GPUs would have to support the full specification. VRR is optional in the spec because of the vast range of devices that will use it - portable devices, DVD players, etc. are unlikely to implement, or potentially even need, all the features of HDMI 2.1.

Why are you making it so complicated? What he is stating is pretty simple.

If a TV has VRR support over HDMI 2.1, then any GPU that supports VRR over HDMI 2.1 will be able to connect to it and use that feature.
 
Soldato
Joined
6 Feb 2019
Posts
17,598
Why are you making it so complicated? What he is stating is pretty simple.

If a TV has VRR support over HDMI 2.1, then any GPU that supports VRR over HDMI 2.1 will be able to connect to it and use that feature.

Correct, so far Nvidia supports this while AMD does not.

The tricky part comes from pass-through - now put an AV receiver in the mix: will it take HDR, VRR, 4K, 120Hz and send it all on to the TV?
 
Soldato
OP
Joined
30 Mar 2010
Posts
13,059
Location
Under The Stairs!
I'm sure if nVidia can they certainly will ;) :D

I reckon they'll give it a good go!!

:eek:Hope not:p
if they want to support 3rd party devices that are made to the HDMI 2.1 spec as per the standard with VRR then their GPUs would have to support the full specification.

If a TV has VRR support over HDMI 2.1, then any GPU that supports VRR over HDMI 2.1 will be able to connect to it and use that feature.


Yes, that's my take on it, once some of us get past the brand venom, it'll be good.:)
 
Associate
Joined
14 Jan 2014
Posts
220
So, I'm pretty sure I already know the answer, but can someone please just clarify:

AS / VRR - generic terms for adaptive synchronisation of the displayed frame rate, based on the real-time graphical demand of the game being played, presented as a variable refresh rate?

GSync - Nvidia's branding of the technology that provides AS / VRR?

Freesync - AMD's branding of said tech?

The reason I ask is that I am considering an AMD GPU with a VRR monitor. I want to future-proof the monitor by buying a GSync Compatible FreeSync one. Is the GSync / FreeSync labelling still an exclusivity, or is it now, or ever likely to be, just universally labelled AS or VRR? Basically, am I still limited to buying one or the other, or the third option of buying from the relatively thin market of GSync Compatible FreeSync monitors?

This'll be a stop-gap until the next top-tier Nvidia 30XX series cards come out (for the sake of argument a 3080 or 3080 Ti, if not as ludicrously priced as the 2080 Ti), assuming HDMI 2.1 is supported. The intention being to game at 4K 120Hz on an LG C9 OLED, which is VRR-enabled up to 120Hz via HDMI 2.1, when the wife and kids aren't around, and still retain the monitor for the odd session when she's engrossed in something unwatchable.
 
Soldato
Joined
6 Feb 2019
Posts
17,598
The writing was on the wall when new & improved Gsync came with a propeller that made your monitor sound like a fan oven.

That has nothing to do with Gsync lol.

Those crappy LCD screens need fans or they overheat - the thickness of the monitor and its heat isn't from Gsync, it's because they wanted local dimming and HDR and needed to get several times brighter than previous gaming monitors.

One of the big reasons TVs are always ahead of monitors with technology is that it's often easier to implement new tech in a bigger device than to shrink it into a tiny monitor.

And if anyone wants to know why their latest smartphone can get super-bright HDR without overheating - it's because they're all OLED screens, not crappy LCD. OLED is the future, just try asking ASUS how well their BFGD screens are selling LMAO
 
Soldato
Joined
22 Nov 2003
Posts
2,933
Location
Cardiff
That has nothing to do with Gsync lol.

Those crappy LCD screens need fans or they overheat - the thickness of the monitor and its heat isn't from Gsync, it's because they wanted local dimming and HDR and needed to get several times brighter than previous gaming monitors.

One of the big reasons TVs are always ahead of monitors with technology is that it's often easier to implement new tech in a bigger device than to shrink it into a tiny monitor.

And if anyone wants to know why their latest smartphone can get super-bright HDR without overheating - it's because they're all OLED screens, not crappy LCD. OLED is the future, just try asking ASUS how well their BFGD screens are selling LMAO
You are wrong. The module itself, an Intel Altera Arria 10 GX 480 FPGA, required active cooling. Nothing to do with the screen.
 
Soldato
Joined
13 Aug 2012
Posts
4,277
Correct, so far Nvidia supports this while AMD does not.

The tricky part comes from pass-through - now put an AV receiver in the mix: will it take HDR, VRR, 4K, 120Hz and send it all on to the TV?

Easier to go straight from PC to the TV, then eARC back out to the sound system. HDMI 2.1 supports the new eARC, so it will output everything back.
 