
How ever did we play games without G-Sync and FreeSync?

My point? No point, really. I just think that, apart from the slimness benefit of LED panels, they are nowhere near the picture quality and smoothness that could be had on a CRT years ago.

Imagine what they could do with CRT monitors nowadays. And how often does anyone actually move their monitor? The size thing would not bother me in my man cave.

People keep telling you what the disadvantages of CRTs are, as if you didn't know what the problems were.

The disadvantages don't matter to what you said. It's amazing to think that technology from years ago was better for gaming than what we have now; that 15 years later we still can't match the colour, clarity, motion and response time. I too wonder what advances would have been made if they had kept up CRT development. Maybe none.
 

I agree, it's a shame we won't ever know.

With regard to current monitors, I'd like a 1440p 144Hz screen, but I would also like a 4K 144Hz screen. I won't buy either yet, as I have a 780; if I get a Pascal or Polaris card, then hopefully there will be a 4K 144Hz screen with NVIDIA or AMD sync by then.
 

Same with Freeview TV; what a downgrade that's been. Extra channels in exchange for a poor picture.
 
My point? No point, really. I just think that, apart from the slimness benefit of LED panels, they are nowhere near the picture quality and smoothness that could be had on a CRT years ago.

Imagine what they could do with CRT monitors nowadays. And how often does anyone actually move their monitor? The size thing would not bother me in my man cave.

To an extent I totally agree. I sold my iiyama Vision Master Pro 510 (21-inch, which used BNC connectors) because of its size and bulk.

It was a really nice screen, but it was limited to 85Hz at 1600x1200.

There is a niche market just waiting for a 144Hz CRT from iiyama with DisplayPort, but the components would be so expensive now that it would be a ridiculous price, even more than OLED.

I have to say the 27-inch Eizo IPS 144Hz screen I have now is, on balance, better than my old CRT, and I would not swap back; the input lag is still there, but I can't perceive it.

OLED is where we need to be, and there is not long left to wait.
 
Yeah, though the OLED TVs we've seen so far have a huge processing delay. I'm not lucky enough to have tried that Dell screen, but it looked to me like it was aimed more at graphics/specialist use anyway.

They need to make an OLED with gaming in mind before we know, I guess. :)
A lot of laptops are coming out with OLEDs now, so I don't see why we shouldn't see them. <3
 
Well yes, you have just listed the problems with low FPS, which has no relevance to G-Sync or FreeSync; low FPS is the same on any monitor, except worse without G-Sync due to stuttering or tearing. Most people with a G-Sync monitor would probably have been capped at 60fps on a non-G-Sync monitor, but with a G-Sync monitor they are getting 60-144fps, and if it drops under 60 they don't get any stuttering. I'd guess at least half the people saying G-Sync is pointless have not actually used it.

No, you could not be more wrong. It's not the same: low fps behaves differently on a G-Sync/FreeSync-enabled monitor than it does at a static refresh rate. At a static refresh rate you don't get more blur as fps drops (let's assume vsync is off in this example); sure, you get tearing, which is not that exciting, but the blur stays the same. With G-Sync/FreeSync, the blur increases dramatically as fps drops, since fps equals the effective Hz. This is not a flaw in G-Sync/FreeSync but a limitation of how the monitor works, holding each image on screen for a longer period of time, also known as image persistence. It still shapes the experience of using G-Sync/FreeSync at lower fps, and this is where you as a user have to choose: can I live with the blur and be happy there is no tearing, or are my preferences the other way around?
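The "fps equals the effective Hz" point can be illustrated with some back-of-envelope arithmetic (my own sketch, not from any display spec): on a sample-and-hold panel each frame stays on screen for the whole frame time, so eye-tracking blur grows as fps falls.

```python
# Rough illustration of image persistence on a sample-and-hold VRR panel.
# With G-Sync/FreeSync the panel refresh tracks the content fps, so each
# frame is held for the full frame time; blur scales with 1/fps.

def persistence_ms(fps):
    """Time each frame is held on screen, in milliseconds."""
    return 1000.0 / fps

def blur_px(fps, speed_px_per_s):
    """Approximate blur width in pixels when the eye tracks an object
    moving at speed_px_per_s while each frame is held."""
    return speed_px_per_s * persistence_ms(fps) / 1000.0

for fps in (144, 60, 40):
    print(f"{fps:>3} fps: frame held {persistence_ms(fps):.1f} ms, "
          f"blur ~{blur_px(fps, 1000):.0f} px at 1000 px/s")
```

At 40fps the frame is held 25ms, over three times longer than at 144fps, which is why the drop in fps is so much more visible as blur when fps drives the refresh rate.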

I know I hate blur more than I hate tearing, as it strains my eyes a good deal. That's why I feel it's very important to present variable refresh rate in the right light, listing both the positives (of which there certainly are some) and the negatives, because we users are not all the same and not everyone is bothered by the same things. That is also why I think simply recommending G-Sync/FreeSync as the holy grail is wrong.

Now, your second point about getting 60-144Hz with G-Sync/FreeSync: I have to wonder, have you tried that experience? One second completely smooth 144Hz, the next second 60Hz or 70Hz because your GPU can't keep up? That is certainly not an experience I like, and I'd rather lock the fps to 70-80 max to avoid that swing. This is especially true in shooters, where consistency is key. Again, you may be "immune", or perhaps the correct term is insensitive, to these unpleasant side effects, but some of us are not, and for us, myself included, G-Sync/FreeSync isn't always the answer. For example, I currently prefer to run a static 144Hz refresh and play BF4 uncapped, giving me around 250+ fps, which is such a treat for mouse response, and tearing is not visible (although it is of course logically there). Now, are you going to try to convince me that G-Sync is superior in every scenario?
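The "250+ fps with tearing that's logically there but not visible" observation has a simple arithmetic basis (my own illustration): with vsync off, each panel refresh is stitched from slivers of several frames, and the higher the fps, the smaller the time offset across each tear line.

```python
# Back-of-envelope: why tearing gets less visible at very high fps.
# With vsync off, a 144 Hz refresh shows slivers of fps/144 frames;
# each tear line is a discontinuity of one frame time (1000/fps ms).

def tearing(fps, refresh_hz):
    slivers = fps / refresh_hz   # frame slivers drawn per panel refresh
    step_ms = 1000.0 / fps       # time offset across each tear line
    return slivers, step_ms

for fps in (60, 250):
    s, ms = tearing(fps, 144)
    print(f"{fps} fps @ 144 Hz: ~{s:.1f} slivers/refresh, "
          f"{ms:.1f} ms offset per tear")
```

At 250fps each tear represents only a 4ms jump in scene time, so the discontinuity between slivers is far smaller than at 60fps, even though there are more tear lines per refresh.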

Let me end by giving an example of where I find G-Sync/FreeSync to work very well: the 50-70fps range, where you don't experience frame drops of more than 5-10fps and minimums stay above 50. The Division, for example, was a treat with G-Sync on for me after I had tuned a few settings to raise the minimums above 50. I want to stress that I'm not against variable refresh rate technology, but I do think it's very important to show it in the right light, with both positives and negatives, since there's still a nasty premium for most users if they want to consider the option.
 

Is this not potentially addressable by allowing per-user specification of the frame-doubling threshold?
So on a hypothetical 40-144Hz panel, rather than only activating below 40fps, the user could choose a threshold of 50/55fps or above if the monitor tech allows.
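The frame-doubling idea can be sketched in a few lines. This is purely my own illustration of the concept, not how any actual scaler firmware works (that logic is proprietary), and the threshold parameter is the hypothetical user-tunable knob suggested above.

```python
# Hypothetical sketch of low-framerate frame multiplication on a VRR
# panel: when fps drops below the activation threshold, each frame is
# repeated enough times that the panel refresh stays within its range.

VRR_MIN, VRR_MAX = 40, 144  # hypothetical 40-144 Hz panel

def refresh_for(fps, threshold=VRR_MIN):
    """Return (multiplier, panel_refresh_hz) for a given content fps.
    `threshold` is the hypothetical user-chosen activation point."""
    if fps >= threshold:
        return 1, fps                 # drive the panel 1:1
    mult = 2                          # below threshold: repeat frames
    while fps * mult < VRR_MIN:       # until the refresh is in range
        mult += 1
    return mult, fps * mult

print(refresh_for(30))                 # below the 40 Hz floor
print(refresh_for(55, threshold=60))   # user threshold doubles 55 fps
```

Note the trade-off this makes visible: doubling 55fps drives the panel at 110Hz, which would shorten image persistence (less blur), but every displayed frame is still 1/55s of scene time, so motion update rate, and any added latency from the repeat, is unchanged.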
 

In theory it should help with the blur, but my guess is you would add a frame of input latency as a result. Let me say that this is my guess and I could be wrong; someone like badass or PCM2 would be much better qualified to answer that question. In my own personal testing, the frame doubling in G-Sync mode never had a positive effect, even below 30fps; the experience just became more and more of a slideshow. This was tested across 3 different G-Sync screens, 2 different GPUs, and a ton of different drivers and OS installs.
 
I agree, it's a shame we won't ever know.

With regard to current monitors, I'd like a 1440p 144Hz screen, but I would also like a 4K 144Hz screen. I won't buy either yet, as I have a 780; if I get a Pascal or Polaris card, then hopefully there will be a 4K 144Hz screen with NVIDIA or AMD sync by then.

Well... we do know.

It's not hard to make an educated guess that CRT technology was reaching a peak it would never likely surpass (economically, and in the face of a technology that, for 99% of the market, was better).

The sheer physical implementation of a cathode ray tube tells you one thing: they were not getting much smaller or lighter. There is a limit to what you can package a CRT into at a reasonable cost.

Toxic materials, susceptibility to interference, potential for high-frequency noise, etc.

CRT was never going to continue as a technology that R&D money was spent on. Where was the market for it, gamers? Gamers represent such a tiny fraction of the display market.

It's a pipe dream; it's like arguing "Where would VHS be these days if we'd kept developing it!?" Moot; better technologies came around. Keep in mind, when I say better I do not mean better in all respects, merely better on the whole as a product for the biggest share of the target market.
 
Speaking of dead-end tech, I still wonder why MiniDiscs weren't popular enough to replace the CD.
They were capable of everything CD was, housed in a protective shell while being considerably smaller than a CD, meaning the disc was protected when out of the case, and easier to store (handy in a car).
I can understand stuff like Philips' LaserDisc machines, which played films on discs that looked like old LP records, failing; but sometimes I think we as consumers got it wrong. CRT wasn't one of those times, though.
 

Well, everyone is entitled to their opinion, but LCD was garbage for gaming when it first arrived, and marketing made everyone want one.
 
Speaking of dead-end tech, I still wonder why MiniDiscs weren't popular enough to replace the CD.
They were capable of everything CD was, housed in a protective shell while being considerably smaller than a CD, meaning the disc was protected when out of the case, and easier to store (handy in a car).

They were more expensive (both the discs and the players), lower quality, and everyone already had CD players.
 