
Workaround: FreeSync on nVidia GPUs

Yet here we are: HDMI 2.1 will be coming shortly on all displays (including TVs), and Nvidia, because it doesn't want to support it, is even making its own TVs!

And HDMI 2.1 with VRR (which is compulsory) costs manufacturers $0.05, while the HDR G-Sync module costs how much?

Why do you think they should support it when they're most likely making a huge packet out of their G-Sync?

I wonder whether the G-Sync TVs will play nice with an AMD GPU?

I'd bet that as far as Nvidia are concerned they'll have you captive once you invest in a G-Sync TV, as they're several grand each :eek:. The longer AMD remain outgunned at the top of the charts, the more Nvidia will push the limits to see how much PC gamers will tolerate.

I thought about grabbing an RTX but I can't stomach it, just like I couldn't stomach paying up to £800 for Vega. That's why I sold mine within a month or so to recoup most of the money. Today Vega is one of the best deals on the market, especially with AMD being so slow to replace their flagships (famous last words).
Let's just hope we don't see another mining boom anytime soon; with my luck it'll probably be when Vega gets replaced. :rolleyes:
 
I am not denying that Nvidia had it out first. What I am saying is that AMD were working on a VRR solution before Gsync was released. AMD were obviously caught on the hop by Gsync and rushed out a demo at CES in January 2014 using laptops, but they had been putting things in place for VRR before the Gsync demo. The first Gsync monitor was August 2014; the first Freesync monitor was April 2015. (I am not counting the BenQ monitor that released in March, as they jumped the gun before the drivers were ready and it needed a firmware update.)

They did have to get DisplayPort certification; you are confusing eDP with DisplayPort. Do you not remember? AMD had to put a proposal to VESA, and that didn't get accepted until May 2014, and even then it only became an optional part of the DisplayPort standard. So AMD had to wait until after that to start working with the manufacturers.

AMD had the hardware in place to connect to Adaptive-Sync monitors before the Gsync demo even took place. They submitted the proposal to VESA in early November 2013. Now, maybe AMD managed to come up with an alternative to Gsync, put a proposal together and send it to VESA in less than a month, but I don't believe that for a second. Combine that with having the necessary hardware to connect to a DP 1.2a monitor on desktop cards released before the Gsync demo, and it suggests that AMD were already working on a VRR solution.

Besides, I asked about this when they did the open Q&A session here on Freesync. They confirmed that they were working on Adaptive-Sync during the development of the Hawaii and Bonaire cards.

So yes, Nvidia did release first, and did put the skids under AMD. But AMD were working on a VRR solution before that.

I'm sure AMD would have known that Nvidia were working on it long before the reveal. Probably not the full details but enough to make it clear it was for manufacture and not just a concept (module makers, monitor partners etc etc).

The timeline is telling: AMD took a long time to get to manufacture, the initial monitors were nowhere near the quality (subpar refresh rate ranges) of the Nvidia offerings, and the problems took a fair time to rectify, all pointing to a rushed release and a case of major catch-up.
 

You missed that the Nvidia TV is but a monitor with a Shield strapped onto it? It's not a normal smart TV which you can plug in and watch broadcast TV on in the normal sense.

VRR on mainstream TVs is going to be the norm starting next year.

Already there are 20 TVs from Samsung alone, with 55" HDR1000 models with Freesync starting at 800.
 

That's a shame. I thought the upcoming G-Sync TVs were actual TVs with a 4K 144 Hz HDR panel and a G-Sync module added.
 
It seems using an APU may actually be the better way, as it leads the Nvidia driver to think you have an Optimus setup: https://www.reddit.com/r/Amd/comments/99yz9h/freesync_on_nvidia_1060_6gb_through_amd_apu/

This way you don't have to rely on Windows to select the GPU, but can use the Nvidia Control Panel itself to do the choosing.
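
To sanity-check a setup like that, it helps to see exactly which adapters Windows exposes and which monitor hangs off which GPU. Here's a minimal sketch (my own, not from the Reddit post) using the standard DXGI enumeration API; for the workaround you'd want the Freesync monitor listed under the AMD APU while the Nvidia card does the rendering:

```cpp
// List every GPU Windows exposes and the monitors attached to each one.
// Build on Windows with: cl /EHsc list_gpus.cpp
#include <dxgi.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 ad;
        adapter->GetDesc1(&ad);
        // VendorId 0x10DE = Nvidia, 0x1002 = AMD.
        wprintf(L"Adapter %u: %s (vendor 0x%04X)\n", i, ad.Description, ad.VendorId);

        // Outputs = monitors physically attached to this adapter.
        IDXGIOutput* output = nullptr;
        for (UINT j = 0; adapter->EnumOutputs(j, &output) != DXGI_ERROR_NOT_FOUND; ++j) {
            DXGI_OUTPUT_DESC od;
            output->GetDesc(&od);
            wprintf(L"  Output: %s\n", od.DeviceName);
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```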
 

It would have been the other way around too. Nvidia would have known what AMD were up to as well, as AMD had to work things out with VESA and Nvidia are on the board of VESA.

The only thing the timeline tells us is that the months waiting for certification from VESA were the major holdup. The final proposal was submitted in November 2013 and wasn't made part of the DisplayPort spec until May 2014. Then the first monitors were released in April 2015. If you take out the time taken to get approval from VESA, the timeline isn't that different from Nvidia's: Gsync demo in October 2013, first monitor released in August 2014. And the first Gsync monitors had their issues too.

Now, if you remember, the first monitor released with Freesync was the BenQ one, but they jumped the gun and released in March before AMD was ready, which resulted in those monitors requiring a firmware update.

Again, I'm not saying that Nvidia didn't release first, and it probably rushed AMD a bit, but AMD were on the road to their own VRR solution long before Gsync was released.
 
If Intel actually get their arse in gear and finally add adaptive sync support to their iGPUs, then it'd probably work with their mainstream CPUs too.
 
FreeSync on Nvidia GPUs Workaround Tested, It Actually WORKS!
Interesting video. A discrete GPU works but requires the game to have a GPU selector. If a hack is developed so that it appears as an Optimus / low-power GPU, then you could just slap an RX 550 into an existing Intel/Nvidia system.
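
For what it's worth, the "GPU selector" those games expose is nothing exotic: under Direct3D 11 it boils down to creating the device on an explicitly chosen adapter instead of the default one. A minimal sketch, assuming a Windows/D3D11 target (the function name is mine, purely illustrative):

```cpp
// Create a D3D11 device on a specific adapter (what a GPU selector does).
#include <d3d11.h>
#include <dxgi.h>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "dxgi.lib")

ID3D11Device* CreateDeviceOnAdapter(UINT adapterIndex) {
    IDXGIFactory1* factory = nullptr;
    IDXGIAdapter1* adapter = nullptr;
    ID3D11Device* device = nullptr;

    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return nullptr;
    if (factory->EnumAdapters1(adapterIndex, &adapter) == DXGI_ERROR_NOT_FOUND) {
        factory->Release();
        return nullptr;
    }
    // When passing an explicit adapter, the driver type must be UNKNOWN.
    D3D11CreateDevice(adapter, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);
    adapter->Release();
    factory->Release();
    return device;  // Render on this device regardless of which GPU the monitor is on.
}
```

Windows then handles copying the rendered frames across to whichever GPU the monitor is physically attached to, which, as I understand it, is the mechanism the whole workaround leans on.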
 
[QUOTE="Panos, post: 32079792, member: 90574"]You missed that the Nvidia TV is but a but a monitor with shield strapped on it?
I not normal smart TV which you can plug on and watch broadcasting in the normal sense.


VRR on mainstream TV going to be norm starting next year.

Already there are 20 TVs from Samsung alone. With 55" hdr1000 starting at 800 with Freesync.

That's a shame, I thought the upcoming G-sync TV's were actual TV's with a 4k 144hz HDR panel and g-sync module added.[/QUOTE]

Nah, they're not proper TVs. Also, I see it as pointless: by next year, when this comes out, we're going to have OLEDs with HDMI 2.1 VRR for much less!
 

It's a shame that someone from Nvidia will see this and put a stop to it in the next driver update.


Personally I think adaptive sync monitors are the way to go.

I was wondering: are companies like Intel allowed to use the Freesync name when they use adaptive sync? If they were, are they now not allowed to use the Freesync 2 name? I ask because if AMD did allow it with Freesync but don't with Freesync 2, Intel may be reluctant to use the original Freesync branding, as it'll be seen as the lesser version.

I'm asking as I thought Adaptive-Sync is an open standard and the Freesync branding is something anyone can use, not just monitor manufacturers.
 

That's only if the OLED TVs use the full HDMI 2.1 specification, and they don't have to. VRR is not a requirement of HDMI 2.1; it's just one of the features they can implement if they want to.
 
I wonder how they would stop it? It's not as simple as PhysX (detect AMD card, stop PhysX). Detect AMD card, stop display?

I hadn't thought of it like that. If a driver update can't do it, maybe a BIOS update will; they might even be rushing around like mad rabbits doing BIOS changes for the RTX cards as we speak. :D
Or they may not care; after all, it hurts performance, and it's not something everyone with a Freesync monitor and an Nvidia card can suddenly start doing.
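
Detecting the mixed setup itself would be trivial, for what that's worth: any process can read the PCI vendor ID of every adapter through the same DXGI enumeration. A speculative sketch of what such a check might look like (this is just my guess at the idea, not anything Nvidia has said they do):

```cpp
// Speculative: check whether both an Nvidia and an AMD adapter are present,
// the configuration the Freesync workaround depends on.
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

bool MixedVendorSetup() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return false;

    bool hasNvidia = false, hasAmd = false;
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        hasNvidia |= (desc.VendorId == 0x10DE);  // Nvidia's PCI vendor ID
        hasAmd    |= (desc.VendorId == 0x1002);  // AMD/ATI's PCI vendor ID
        adapter->Release();
    }
    factory->Release();
    return hasNvidia && hasAmd;
}
```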
 

And how many people will have a card from each company plus a Freesync monitor? It would be a small subset of people.
 

It will be a boon for TVs to have VRR, given the console support and sales.
 
One of the biggest problems is if you buy a large 4K G-Sync TV for playing PC games on and you also own an Xbox One X, which is of course Freesync... :(
 