
However did we play games without G-Sync and FreeSync?

What downsides are going to change the truth of his statement?

None.

So what is your point?

Don't be an ignorant fool over this.

There is a very good reason we no longer use CRTs. For all the benefits they have over current monitor tech, the cons are obviously the reason they are no longer in use.

His statement is pointless, irrespective of its truth.
 
Simples, once you use it and see what a game changer it is, you realise how horrible gaming was without it.

Nope, playing my games just fine without G-Sync enabled. I've done so for a test period of almost a week now.

Tried it and can live without it.
I feel the same way. I have a G-Sync monitor; it's a nice add-on, but I regret buying it because I don't feel it has been worth the premium. At least I got a 1440p 144 Hz screen out of that deal as well, and I'm not tied to Nvidia's ecosystem either, so that's another good thing.

I could live without it as well. It's better at low refresh rates, but if you're not dropping frames it's not worth it. Couldn't play happily at 60 Hz though.

I find any kind of low-fps (sub-60) gaming obnoxious, and no amount of G-Sync/FreeSync/variable refresh rate technology can save it and make it a better experience for me personally. It turns into such a slideshow, and I don't like slideshows.
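For anyone who wants the numbers behind that, here's a rough back-of-the-envelope sketch in Python. It's my own simplified model, not anything official: it assumes a perfectly steady 50 fps render rate and idealised vsync behaviour, and just compares how long each frame sits on screen on a fixed 60 Hz panel versus an adaptive sync one.

```python
import math

RENDER_FPS = 50        # assumed steady render rate, below the panel's 60 Hz
FIXED_HZ = 60          # fixed-refresh display with vsync on
N = 10                 # frames to simulate

frame_time = 1000.0 / RENDER_FPS   # 20 ms between finished frames
refresh = 1000.0 / FIXED_HZ        # ~16.7 ms between scanouts

# On a fixed-refresh display with vsync, a finished frame has to wait
# for the next refresh boundary before it appears on screen.
ready = [n * frame_time for n in range(1, N + 2)]
shown = [math.ceil(t / refresh) * refresh for t in ready]

print("frame   fixed 60 Hz + vsync   adaptive sync")
for n in range(N):
    held_for = shown[n + 1] - shown[n]   # how long frame n stays on screen
    print(f"{n + 1:>5}   {held_for:>16.1f} ms   {frame_time:>10.1f} ms")
```

Run it and the fixed-refresh column is mostly 16.7 ms with every fifth frame held for 33.3 ms, while the adaptive sync column is a flat 20 ms; that periodically held frame is exactly the judder that variable refresh gets rid of.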
 
Admittedly I'm not up on the DisplayPort standards, but are we going to reach a point where Nvidia simply have no choice but to adopt FreeSync to stay compliant?
 
Nvidia will never adopt FreeSync, as FreeSync is AMD's own technology; Adaptive-Sync, on the other hand, is an open standard available to anyone.

I thought we had been through all this time and time again.

Marketing genius by AMD: everybody uses the term FreeSync when in fact they mean Adaptive-Sync.
 
Let's put it this way: if Intel's next APU (or a future one) is Adaptive-Sync capable, are they going to advertise it as AMD FreeSync capable or iSync capable (or some other Intel-specific name)?
What would be funny (even though it would never happen) is if Intel and Nvidia got together and agreed to call it the same thing, BananaSync or AdaptiveSync+ (or whatever); then monitor manufacturers would either have to rebrand to the new name, leaving AMD out in the cold, or add it alongside the FreeSync branding.

Here's the new super monitor from XYZ, featuring BananaSync technology and FreeSync, which is actually exactly the same thing, yours for only lots of pounds.
 
Assuming Nvidia add DisplayPort 1.3 to all their future GPUs, then when DisplayPort 1.3 becomes the standard for all new monitors they won't have any choice but to support it, and if they refuse to enable it via drivers, I wouldn't put it past some kid to write a program to enable adaptive sync.

And if they choose to stick with DP 1.2 on their next GPUs just to avoid the above, then they are cutting off a lot of people, since DP 1.3 will be needed for higher-refresh-rate and higher-resolution screens, i.e. 4K and 3440x1440 at 144 Hz+.

If Nvidia do support adaptive sync, it will certainly be interesting to see what happens with the whole naming scheme.
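For what it's worth, here are some ballpark numbers behind the "DP 1.3 will be needed" point, in a quick Python sketch. These are my own rough sums assuming uncompressed 8-bit RGB and roughly 5% blanking overhead, not figures from the spec itself; the 17.28 and 25.92 Gbit/s payload rates are the commonly quoted effective bandwidths for DP 1.2 (HBR2) and DP 1.3 (HBR3).

```python
# Rough DisplayPort bandwidth check: width x height x refresh x 24 bits
# for 8-bit RGB, plus an assumed ~5% allowance for blanking.
HBR2 = 17.28   # DP 1.2 effective payload, Gbit/s (after 8b/10b encoding)
HBR3 = 25.92   # DP 1.3 effective payload, Gbit/s

modes = [
    ("2560x1440 @ 144 Hz", 2560, 1440, 144),
    ("3440x1440 @ 144 Hz", 3440, 1440, 144),
    ("3840x2160 @ 120 Hz", 3840, 2160, 120),
]

for name, w, h, hz in modes:
    gbps = w * h * hz * 24 / 1e9 * 1.05   # active pixels + rough blanking
    fits_12 = "yes" if gbps < HBR2 else "no"
    fits_13 = "yes" if gbps < HBR3 else "no"
    print(f"{name}: ~{gbps:.1f} Gbit/s   DP 1.2: {fits_12}   DP 1.3: {fits_13}")
```

On these rough numbers, 1440p at 144 Hz is comfortable on DP 1.2, but 3440x1440 at 144 Hz and 4K at 120 Hz only really fit once DP 1.3's extra bandwidth turns up, which is the point above about DP 1.2 cutting people off from higher-res, higher-refresh screens.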
 
Assuming Nvidia add DisplayPort 1.3 to all their future GPUs, then when DisplayPort 1.3 becomes the standard for all new monitors they won't have any choice but to support it, and if they refuse to enable it via drivers, I wouldn't put it past some kid to write a program to enable adaptive sync.

And if they choose to stick with DP 1.2 on their next GPUs just to avoid the above, then they are cutting off a lot of people, since DP 1.3 will be needed for higher-refresh-rate and higher-resolution screens, i.e. 4K and 3440x1440 at 144 Hz+.

If Nvidia do support adaptive sync, it will certainly be interesting to see what happens with the whole naming scheme.

I think what we will see is not that different to now: G-Sync monitors having higher max refresh rates. Whether there is something in the G-Sync tech that makes higher refresh easier, or it's just some deal Nvidia made with the monitor makers, I have no idea, but it's a thing!

Nothing to stop them adding something new too, like Nvidia eye-care with awesome subliminal advertising or whatever :p
 
Yup indeed.

As I said above, pure marketing genius from AMD; they won't lose out either way, but the monitor manufacturers will have to decide just how they want to brand their monitors.

Of course it hasn't happened yet, so we will just have to wait and see.
 
Don't be an ignorant fool over this.

There is a very good reason we no longer use CRTs. For all the benefits they have over current monitor tech, the cons are obviously the reason they are no longer in use.

His statement is pointless, irrespective of its truth.

Don't be an arrogant idiot over this.

He would win his bet; for all the advances in technology, LCDs still lag behind CRTs, and games would feel smoother on a CRT. And that's all he was doing: pointing out how little LCD tech has improved for gaming.

There were several references made to CRTs throughout the thread. You bringing him to task over his comment was way more pointless.
 
Don't be an arrogant idiot over this.

He would win his bet; for all the advances in technology, LCDs still lag behind CRTs, and games would feel smoother on a CRT. And that's all he was doing: pointing out how little LCD tech has improved for gaming.

There were several references made to CRTs throughout the thread. You bringing him to task over his comment was way more pointless.

CRTs used to make my eyes stream if I used them for longer than a few hours. Glad we've moved on since CRT.
 
Nvidia will never adopt FreeSync, as FreeSync is AMD's own technology; Adaptive-Sync, on the other hand, is an open standard available to anyone.

I thought we had been through all this time and time again.

Marketing genius by AMD: everybody uses the term FreeSync when in fact they mean Adaptive-Sync.

In their "G-Sync" laptops they are using adaptive sync tech, not their module, and they are still calling it G-Sync. I would imagine they could do something similar on the desktop side, when their GPUs support adaptive sync of course :)

And surely, if their market power is as good as everyone says it is, they could just make monitor manufacturers put "G-Sync capable" on adaptive sync monitors? It will be interesting to see which way this all falls in a couple of years.

I hope an even better monitor tech comes out that makes G-Sync/FreeSync redundant.
 
What downsides are going to change the truth of his statement?

None.

So what is your point?

  • CRTs could cause excessive eye strain
  • Small screen sizes
  • Bulky and take up a large amount of desk space
  • Significantly higher heat output
  • Higher power requirements
  • Colour and display distortions from EMI
  • Generally lower sharpness compared to LCD (at native res)
  • Give off electromagnetic fields

The benefits are there of course, and in many areas LCD-type screens have never caught up with old CRTs, but let's not look at CRTs through rose-tinted glasses as if they were perfect and should never have been replaced.
 
Assuming Nvidia add DisplayPort 1.3 to all their future GPUs, then when DisplayPort 1.3 becomes the standard for all new monitors they won't have any choice but to support it, and if they refuse to enable it via drivers, I wouldn't put it past some kid to write a program to enable adaptive sync.

And if they choose to stick with DP 1.2 on their next GPUs just to avoid the above, then they are cutting off a lot of people, since DP 1.3 will be needed for higher-refresh-rate and higher-resolution screens, i.e. 4K and 3440x1440 at 144 Hz+.

If Nvidia do support adaptive sync, it will certainly be interesting to see what happens with the whole naming scheme.

Adaptive-Sync is optional in the DP 1.2a spec; you do not need to support it to remain 1.2a compliant.

I've not read anything which suggests it will be mandatory in 1.3 and above. Do you have a source? The only reading I have come across confirms it remains an optional feature.
 
It sure will, as long as you are willing to live with all the downsides CRTs bring with them.

What exactly is your point?

My point? No point; I just think that, apart from the slimness benefit of LED, they are nowhere near the quality of picture and smoothness that could be had on a CRT years ago.

Imagine what they could do with CRT monitors nowadays. And how often does anyone move their monitor? The size thing would not bother me in my man cave.
 
My point? No point; I just think that, apart from the slimness benefit of LED, they are nowhere near the quality of picture and smoothness that could be had on a CRT years ago.

Imagine what they could do with CRT monitors nowadays. And how often does anyone move their monitor? The size thing would not bother me in my man cave.

It's a shame SED never took off:

https://en.wikipedia.org/wiki/Surface-conduction_electron-emitter_display

FED is still experimental too:

https://en.wikipedia.org/wiki/Field_emission_display
 