
60"+ TV FreeSync Living Room PC Gaming Inbound???

With the imminent FreeSync 2 console support announcement, are any AMD users making the move from a monitor to a 60"+ Samsung FS2 TV?

Think Nvidia will bite the bullet and open up adaptive sync for living room gaming, rather than shoehorning you into one of the few G-Sync TVs?
 
The trouble AMD have is that they still don’t have a viable 4K card to make use of the living room space. More and more people have 4K TVs now, and while FreeSync is something I’d love to have in the room, I’d still take a 1080 Ti and V-Sync.

It’s a great addition for console users and existing AMD users though, but I don’t see it swinging the market or affecting any Nvidia decisions.
 
NVIDIA have already unveiled their 65” G-Sync TV/monitors.
As a living room gamer who games on a TV, I’m excited about this new competition. I’m still gaming at 1080p 60Hz, and if/when we see 4K-capable cards at a decent price point I’ll be sure to grab one of these displays, as long as they’re not insanely expensive, that is.
 
FreeSync 2 coming to the Xbox One S/X is going to make this space interesting. No doubt Sony will follow.
While I believe G-Sync to be the superior technology, I think it's going to have trouble as the mainstream TV producers introduce FreeSync.
Those 65” G-Sync TV/monitors could become a niche product.
 
Don't forget FreeSync 1 support on the XBone (the original, not the S/X). So in total around 40,000,000 devices got FreeSync overnight. That's a huge boon to FreeSync TV and monitor sales.


Also, the G-Sync 65" is a MONITOR, not a TV with a scaler (apparently it only supports internet streaming services from Amazon/Netflix), so it's going to be a niche market: those who own a GTX 1080 Ti/Titan XP/Titan Xp/Titan V, don't want a TV in the living room, and have space for a 65", ~£4,000 device (if not more).
 
Unless the TV is OLED (which it won't be), not interested :p I don't care for 4K at all; tried it and it isn't worth the hit in FPS (especially when sitting 8+ feet from the TV), plus 120Hz is much more noticeable than the higher PPI (rough numbers in the sketch below).

Hopefully some manufacturers will be able to update their older TVs to enable FreeSync, as some of Samsung's TVs already have it despite not having HDMI 2.1.
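For anyone curious about the 8+ feet point, here's a rough back-of-the-envelope sketch (assuming a 65" 16:9 screen and an 8 ft viewing distance purely as example numbers) of how many pixels per degree of visual angle 1080p and 4K work out to. The common 20/20-vision rule of thumb is roughly 60 px per degree, so at that sort of distance 1080p is already near the acuity limit, which is why the extra 4K detail is hard to see while a higher refresh rate is still obvious.

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=(16, 9)):
    """Horizontal pixels per degree of visual angle for a flat screen."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)  # screen width from the diagonal
    # Angle the whole screen width subtends at the viewing distance, in degrees.
    angle_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return horizontal_px / angle_deg

distance_in = 8 * 12  # 8 feet in inches (example viewing distance)
for name, px in (("1080p", 1920), ("4K", 3840)):
    print(f'{name}: {pixels_per_degree(65, px, distance_in):.0f} px/deg at 8 ft from a 65" screen')
```

With those example numbers it comes out at roughly 58 px/deg for 1080p and 117 px/deg for 4K; take the exact figures with a pinch of salt, it's only there to show why viewing distance matters so much.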
 
From what I read years ago, Nvidia lacks the hardware to provide Adaptive Sync support within its desktop GPUs.
 
Actually, all Pascal cards have the support at hardware level; the previous generation did not.
So for Pascal it would only take a driver update to switch it on...

Also, the Nvidia G-Sync laptops are using adaptive sync, and they do not have a G-Sync module inside.
 
That's assuming nVidia would switch it on; if they do, they will use it to market it as a new feature on later cards.
 
Unless the TV is OLED (which it won't be), not interested :p I don't care for 4K at all; tried it and it isn't worth the hit in FPS (especially when sitting 8+ feet from the TV), plus 120Hz is much more noticeable than the higher PPI.

Hopefully some manufacturers will be able to update their older TVs to enable FreeSync, as some of Samsung's TVs already have it despite not having HDMI 2.1.

You might want to think twice about buying OLED for gaming. They seem to suffer from burn-in :(

Here is a test that has been ongoing for 20 weeks, last updated 08/03/2018:
https://www.rtings.com/tv/learn/permanent-image-retention-burn-in-lcd-oled
 
Actually, all Pascal cards have the support at hardware level; the previous generation did not.
So for Pascal it would only take a driver update to switch it on...

That's good to hear, as I have a 1080 Ti myself. Do you have a source for that?
 
You might want to think twice about buying OLED for gaming. They seem to suffer from burn-in :(

Here is a test that has been ongoing for 20 weeks, last updated 08/03/2018:
https://www.rtings.com/tv/learn/permanent-image-retention-burn-in-lcd-oled

Yup, seen that, and I've also seen people who have had theirs for 3+ years and not had any burn-in. The only people/end users on forums that I ever see reporting "permanent" burn-in are ones that watch news for 6+ hours every day at 60%+ OLED light settings.

I am using sensible settings, i.e. the recommended brightness setting etc., as well as turning off unnecessary HUD elements (something I've always done). Had mine for 4+ months now and there are no signs of any burn-in or even temporary image retention; I probably game on mine for about 3 hours a night, if that.
 
Knowing my luck, I would buy one and it would get burn-in. That happened with an old plasma I owned, and I never bought another plasma again. I just think in this day and age there really isn't any excuse for this to happen; we have been there with plasma, which had much better image quality than LCD. I would rather buy the best LCD HDR screen even if it is sub-par. :D
 
As long as you are sensible you should be fine. It really just comes down to whether you are someone who likes eye-blinding brightness settings on your displays and plays games for 4+ hours every day (which would be problematic with, for example, a HUD that you can't hide, or choose not to, and that is 100% opaque and yellow/red in colour). There is also talk going around that the ones who got burn-in badly just got a panel that is more susceptible to it, or/and had sunlight on the display causing it (due to the heat).

And yup, I came from a plasma too; it was very poor for image retention, though mainly just temporary image retention for me.
 
They are equipped with DP 1.4 and HDMI 2.0b; that's more than enough. The hardware is there.
I think there was more to it than just the port support? The old AMD R9 290 series has DP 1.2, as does the Nvidia 970, yet the 290 can do adaptive sync while the 970 can't.
 
So what AMD had on chip is now a requirement of the port standard, not optional?
VESA Adaptive-Sync is an open standard. It doesn't need special electronics, assuming you use DP 1.2 or better, or HDMI 1.4a or better.

FreeSync is the software implementation.
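To make the "adaptive sync range" idea a bit more concrete, here's a minimal toy sketch (not any vendor's actual driver logic; the 48-75Hz window, the 60Hz fixed display and the frame times are all just assumed examples) of how the effective display rate behaves when the panel can refresh whenever a frame is ready, versus a fixed-refresh V-Sync display where a late frame has to wait for the next whole refresh interval. Real implementations also do things like low framerate compensation below the window, which this ignores.

```python
import math

VRR_MIN_HZ, VRR_MAX_HZ = 48, 75   # assumed example panel refresh window
FIXED_HZ = 60                     # fixed-refresh display used for comparison

def vrr_display_rate(frame_time_ms):
    """Toy adaptive sync panel: scan-out simply follows the frame rate inside the window."""
    fps = 1000.0 / frame_time_ms
    # Outside the window we just clamp here; real drivers repeat frames (LFC) instead.
    return min(max(fps, VRR_MIN_HZ), VRR_MAX_HZ)

def vsync_display_rate(frame_time_ms):
    """Fixed V-Sync: a frame that misses a refresh waits for the next one."""
    interval_ms = 1000.0 / FIXED_HZ
    refreshes_waited = math.ceil(frame_time_ms / interval_ms)
    return 1000.0 / (refreshes_waited * interval_ms)

for ft in (14.0, 18.0, 25.0):  # frame times in ms (~71, ~56 and 40 fps)
    print(f"{ft:4.1f} ms frame  ->  VRR {vrr_display_rate(ft):5.1f} Hz,  "
          f"V-Sync {vsync_display_rate(ft):5.1f} Hz")
```

The interesting case is the 18ms frame: it just misses the 16.7ms V-Sync interval, so the fixed display drops to an effective 30Hz, while the adaptive sync panel carries on at ~56Hz. That sub-60fps region is exactly where big-screen living room gaming tends to sit, which is why the feature matters there.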
 