RT existed for decades before it was called RTX.
Having said that, Nvidia were the ones to recognise its potential for real-time rendering in games and put a lot of good, complex work into making that a reality.
Are you saying that because Nvidia couldn't do HDR / 1000 Nits+ without a G-Sync module they needed AMD to step in?
G-Sync HDR (aka G-Sync Ultimate) came out around 2018, and back then a module was required. AMD didn't have a solution for 144 Hz HDR with a guaranteed 1000 nits; Nvidia did. That's why the module was necessary and why the whole Ultimate certification exists: to offer that guarantee. Even today, none of AMD's FreeSync tiers guarantee a specific peak luminance for HDR.
Obviously things have changed on modern displays, and 1000 nits is now possible without a module. But you're talking about the early days, and that's what it was like in those early days of VRR once you factor in HDR performance. At the baseline, ALL of these VRR displays follow the same adaptive sync standard; it's the additional features that were once only possible with the G-Sync module.
Yes, the module is basically defunct now, since monitors' onboard scalers have been able to do what the G-Sync module does for some time. But let's not gloss over the actual specs and timeframes in question.