From what I understand, this technology can make games look really smooth at low frame rates because it syncs the monitor's refresh rate to the GPU's frame output?
So for example, could it make games running at 30fps look as smooth as games running at 60fps do on regular monitors?
Am I right in thinking that a GPU running 4K at 30-40fps on ultra settings with a G-Sync/FreeSync monitor would be very playable and a good experience?
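The smoothness difference can be sketched with some quick frame-time math. This is just a back-of-the-envelope illustration; the 40-60Hz range below is a made-up example (real panels vary, and behavior outside the supported range depends on the monitor):

```python
import math

SUPPORTED_RANGE_HZ = (40, 60)  # hypothetical FreeSync range, not a real spec

def displayed_frame_time_fixed_60hz(render_ms: float) -> float:
    """Time a frame stays on screen with a fixed 60Hz refresh (v-sync on)."""
    refresh_ms = 1000 / 60
    # The finished frame has to wait for the next whole refresh interval,
    # so a frame slower than 16.7ms is held for two refreshes (judder).
    return math.ceil(render_ms / refresh_ms) * refresh_ms

def displayed_frame_time_adaptive(render_ms: float) -> float:
    """Time on screen with adaptive sync: the monitor waits for the GPU."""
    lo_hz, hi_hz = SUPPORTED_RANGE_HZ
    min_ms, max_ms = 1000 / hi_hz, 1000 / lo_hz
    # Inside the supported range the panel refreshes exactly when the
    # frame is ready; here we simply clamp to the range's limits.
    return min(max(render_ms, min_ms), max_ms)

# A 25ms frame (40fps) on a fixed 60Hz panel gets held to ~33.3ms
# (effective 30fps pacing); with adaptive sync it displays at 25ms.
print(displayed_frame_time_fixed_60hz(25))
print(displayed_frame_time_adaptive(25))
```

So 40fps on a fixed 60Hz monitor can pace like 30fps, while with adaptive sync it paces like actual 40fps, which is why sub-60 gaming tends to feel smoother with it.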
I'm buying components for a new build and wanted to look into a new monitor, but I'd forgotten about this technology. If that's the case, could I get away with a Sapphire Radeon R9 290X Tri-X OC 8192MB GDDR5 PCI-Express graphics card for 4K gaming for the next 1-2 years?
Here's a review of the card: http://www.tweaktown.com/reviews/7037/sapphire-radeon-r9-290x-8gb-tri-video-card-review/index9.html
As you can see, it seems to handle 4K on high/ultra settings at a steady 40-60fps.
So that, coupled with a FreeSync monitor, should suit me well. I think, anyway.

One thing is sure though: when it comes to sub-60fps gaming, it's better with G-Sync/FreeSync than without, IMHO, as long as you stay within the supported fps ranges of course.
I mean, they've been talking about FreeSync since Jan 2014 (or maybe before). I have a feeling this is the main reason for the massive delay and quietness around 4K and FreeSync. Maybe AMD's implementation isn't as good as it should be under all circumstances (at the moment)?