
Is "Freesync" dead?

More on this...

There are also 'rumours' that Nvidia are forcing tech reviewers to feature Nvidia products prominently in Intel and AMD product reviews.

https://youtu.be/1PLU50veK6A?t=1222

Well, we definitely see the video card boxes in the background in most videos. As for the rest of it, it could be happening, but as Jim says, if they said it to him he wouldn't hesitate to blow the whistle.
 
Right, I did as Jim did and emphasised 'rumour'. Like Jim, it's something I wouldn't put past Nvidia; IMO they have gone beyond the arrogant stage to narcissistic.

Green GTX boxes in the background on shelves are just decoration, though in the past some have pointed out that certain YouTubers' box art shelves seem to be entirely green, and others have pulled up people like HUB for having too much Ryzen box art in the background. I think one can always pick at something that is entirely innocent. (I mean, they are good looking boxes :D)

The only thing that I've ever found odd in a review was Steve from Gamers Nexus, who, reviewing the first Threadripper CPUs, almost sneered with attitude: "you don't need this many cores, CUDA does everything, all you need is CUDA". Aside from being factually wrong, he said CUDA more times in that review than he said anything else.

That's the sort of crap I'll be looking out for.
 

Yeah. I am sure something is up. I never trust any one source.
 
I really enjoyed this conversation.
To see how emotionally involved some are just fascinates me.

Be that as it may, I, as well as others, do enjoy the added benefit of lower input lag that comes from disabling "sync".
:cool:
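
For anyone curious what 'disabling sync' boils down to in code: on Windows with a DXGI flip-model swap chain it is roughly a one-argument difference at present time. Here's a minimal sketch of the trade-off, assuming the swap chain was created with DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING; `PresentFrame` and `syncEnabled` are illustrative names, while `Present` and `DXGI_PRESENT_ALLOW_TEARING` are the real DXGI API:

```cpp
// Sketch only: the "sync on" vs "sync off" trade-off on Windows with DXGI.
// Assumes `swapChain` is a flip-model IDXGISwapChain1 that was created with
// DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING (and that the system supports tearing).
#include <dxgi1_5.h>

void PresentFrame(IDXGISwapChain1* swapChain, bool syncEnabled)
{
    if (syncEnabled) {
        // Wait for vblank: no tearing, but up to a frame of added input lag.
        swapChain->Present(1, 0);
    } else {
        // Present immediately: lowest input lag, but tearing is possible
        // unless a variable refresh mode (Freesync/Gsync) is active.
        swapChain->Present(0, DXGI_PRESENT_ALLOW_TEARING);
    }
}
```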

The problem is Nvidia have been ratcheting prices up to ridiculous levels, and as a result people have been buying fewer GPUs. Instead of recognising they are going too far, Nvidia actively work on suppressing whatever remnants of exposure the competition has left.

It's predatory capitalism at its worst. I'm not some Marxist Corbynista, a very long way from that in fact; I'm more of a Tory Liberal. But I, like many here I suspect, recognise that this benefits only Nvidia and their investors, and it's literally at the expense of all of us.

The competitive landscape in GPUs right now is very, very unhealthy, and yes, quite a lot of it is AMD's own fault: not just the past competitiveness of their GPUs, but that they haven't been aggressive enough themselves and still aren't. Doing all the work on some of this stuff, putting in all the R&D, and then open sourcing it without any protections, just to be nice and flowery about it, leaves them wide open for predators to steal it right from under their nose.

Time and time again AMD make these mistakes.

 

I hear you. And I agree with you.

However IMO:
Nvidia trying to do GPP on Freesync doesn't make any business sense in the long term. As AMD moves toward Freesync 2, it leaves high end PC gamers with old-tech syndrome.

I too don't like what Nvidia is doing, but the whole premise of Freesync is "free"; it's not proprietary to AMD only. Remember the echoes of "Gsync monitors are too expensive. Give (us) a monitor that's cost competitive with FS." "...if Nvidia wanted to use FS they could..." Well, they took them up on that offer...

AMD did put the work in to force the market to provide better HW for monitors marketed as "gaming". I don't see them losing much without FS labeling. AMD should start promoting FS2 (HDR) more, as AMD requires their API in order for developers to implement it.

But let's face it, FS at this point is old hat. Nvidia trying to steal a competitor's old "free" IP is moronic and emo to me, as AMD looks toward both the PC and next gen console markets, something Nvidia can't do in the same breath.

As for AMD being competitive again, that's a matter of perspective. We know AMD will release 5800/5900 series cards, with the latter being referred to as the GeForce killer by AMD's engineering dept. Then you have Intel, with Raja, coming out with some sort of mid range GPUs. Within 1 to 1.5 years the GPU landscape will drastically change, leaving Nvidia (and those who love them) wondering about the future.
 

I'm not sure Nvidia have stopped with 'Freesync' original, as it were; they certainly wouldn't stop there if they can get HDR working without the G-Sync module. AMD haven't put any protections on any of this, and while the intent may have been to 'spread the goodness', I very much doubt they ever envisaged their own logo being removed from Freesync compatible screens because Nvidia wants it so. This is the point where you are perusing Adaptive Sync screens and finding that apparently fewer of them are compatible with AMD GPUs, perhaps none at all, when in fact they contain the fruits of AMD's labour and previously would have had a logo on them telling you this is an Adaptive Sync screen for AMD too.

Which of course is the whole point of Nvidia's efforts here.

This is an example where AMD should have been more like Nvidia to protect themselves from Nvidia.
 

Sure, sure they should.
It will be interesting to see what AMD actually does about this.

However, I have a sneaking suspicion that AMD might be banking on the console market, a market that's effectively closed to Nvidia, so it can't self-correct there.
i.e. as a company that sells monitors, you can't sell Gsync monitors to PS5/next gen Xbox owners, nor will you attract AMD GPU owners. So they will have some "thinking" to do, among other self policing policies.

But yeah, let's see where the chips fall on that one. It's comical to me though.
 
The problem is that even though AMD was very clever with the whole Freesync naming for Adaptive Sync monitors, the fact that they pushed the idea that it was free to use, with no ties to any one manufacturer, left the door open for others to muscle in on it.
Doesn't mean that what NVidia is doing is right of course, but I do get the feeling that if NVidia hadn't done it Intel would have with their discrete cards when they arrive next year.
AMD come across as a little too easygoing, which sounds great and all that, but it does allow other companies to walk all over them half the time.
 
Doesn't mean that what NVidia is doing is right of course

I wonder if a little bit of it is driven by revenge - despite the denials from some posters here, who want to push another narrative, nVidia did originally try to push adaptive sync through VESA rather than their proprietary approach.
 
How did we get to the point where Nvidia required monitor manufacturers to use their HW to be Gsync compatible over the years?
This is where critical thinking comes into play. AMD's Freesync initiative helped foster monitors that no longer require Nvidia hardware.
Take a gander at their compatibility list:
https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
In the last few weeks they've added more compatible monitors than they ever had Gsync monitors at one time. The manufacturing of these monitors has greatly improved from years past. That's the point :).

The point still remains: the HW used in them today is far better than before Freesync/Gsync.

So again, is Free/G Sync dead? My answer is still yes, it is. With AMD pushing Freesync 2 (I'm not seeing Nvidia doing anything on this front) they will push manufacturers even further to make those monitors more robust.

Your response really doesn't show any objection to this, albeit you object to the topic being discussed in a thread labelled "Is Freesync Dead". Furthermore, if I and others no longer use "sync" to game, then your preface suggesting otherwise lacks any real rebuttal.

How can anyone answer this post? Your arguments come from a seriously flawed understanding of freesync, Gsync, Gsync compatible and monitors in general.

It's pretty simple how we got from Nvidia using their own hardware to also using adaptive sync monitors. At the time Gsync was developed, adaptive sync didn't exist, and even if it had, no Nvidia GPU could have connected to an adaptive sync monitor, because they didn't have the capability. Now they feel Gsync is a strong enough brand on its own that it's financially viable to allow their GPUs to connect to certain adaptive sync monitors and to call those monitors Gsync compatible.

Your whole point was that Freesync and Gsync weren't needed anymore because tearing was only a problem on crappy monitors, not good ones. If you don't believe me, go back and read your post no. 93; here, I will quote you:

Lets be honest here, Free/G Sync is only intended for monitors with very poor HW scalers (among other HW). When you have a decent, well ventilated HW for your monitor you will hardly notice, if ever, tearing.

If the HW in them didn't use Freesync and/or Gsync, then tearing would still exist. Tearing happens when a new frame is sent mid-scanout, so the top of the screen shows one frame and the bottom another; no amount of 'well ventilated' hardware changes that. The hardware has improved by adding scalers with the ability to use variable refresh rates, but if you connected a GPU without the ability to use adaptive sync to that monitor, there would still be tearing. Specific hardware is needed on the GPU side (except if you are using Gsync), specific hardware is needed on the monitor side, and software support is needed on top; all three have to be in place.
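
To make that "all three have to be in place" point concrete, here is a minimal sketch, assuming a Windows 10 SDK target, of the software-side check an application makes before it can even present without forced vsync. `SupportsTearing` is an illustrative name; `IDXGIFactory5::CheckFeatureSupport` and `DXGI_FEATURE_PRESENT_ALLOW_TEARING` are the actual DXGI API:

```cpp
// Sketch: asking DXGI whether the OS/driver will allow tearing control,
// which is the software prerequisite for variable refresh presentation.
// Build against the Windows 10 SDK and link dxgi.lib.
#include <dxgi1_5.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool SupportsTearing()
{
    ComPtr<IDXGIFactory4> factory4;
    ComPtr<IDXGIFactory5> factory5;
    BOOL allowTearing = FALSE;

    if (SUCCEEDED(CreateDXGIFactory1(IID_PPV_ARGS(&factory4))) &&
        SUCCEEDED(factory4.As(&factory5)))
    {
        factory5->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                      &allowTearing, sizeof(allowTearing));
    }

    // Even a TRUE here only covers the software/driver side: you still need
    // a monitor whose scaler speaks adaptive sync and a GPU wired up to use
    // it, exactly the three-way dependency described above.
    return allowTearing == TRUE;
}
```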


The last few lines of your post are nonsense.

My discussion is in regard to the topic of this thread and why I believe both Gsync and Freesync are dead.
I've also included Gsync in the topic as well, as Nvidia has (in my view) conceded.

You have shown throughout this thread that you have no understanding of what tearing is.

You have also shown that you have no understanding of what Freesync, Gsync, Gsync compatible do or how they work.

Lastly, with the second post I have quoted, you have no idea what the thread topic is actually about. It's not about the death of Freesync technology. It's about the death of the Freesync brand name and Nvidia using their financial clout to stop monitor manufacturers displaying "Freesync" on their monitors.
 
I wonder if a little bit of it is driven by revenge - despite the denials from some posters here, who want to push another narrative, nVidia did originally try to push adaptive sync through VESA rather than their proprietary approach.

You are going to have to show some proof of that.
 
You are going to have to show some proof of that.

Can't easily find the information any more :s the conference notes have long since disappeared from the internet, or at least from my ability to find them, as I can't remember specific dates now.
 