Up until a few weeks ago I was running an EVGA 980Ti SC+ ACX 2.0 with a
Philips BDM4065UC 4K 40" monitor and I was pretty happy. But there was always a little niggly feeling that I should have gone for G-Sync or Freesync.
A lot of research later, I decided that a minimum of 32", 16:9, 4K and G-Sync/Freesync was the only option I would consider. Unfortunately this ruled out G-Sync, as there are currently no 16:9 4K G-Sync monitors at 32" or above. So I shelved the idea, as the only monitor that currently meets these specifications is the Samsung U32E850R 32" IPS with Freesync, and it only supports a Freesync range of 40-60Hz, which I found pointless as most games would be running below 40 FPS at 4K. Thankfully I found there are hacked drivers that give this monitor a much more sensible Freesync range of 33-60Hz, so my adaptive sync plan was back on again.
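For anyone wondering why the stock range bothered me so much, here is a rough sketch (the function is my own toy, not anything from the drivers; only the Hz ranges come from the stock and hacked drivers) of what the VRR window means in practice:

```python
# Toy illustration: adaptive sync only works while the frame rate sits
# inside the monitor's variable refresh window. The function is my own
# sketch; the ranges are the stock and hacked driver values.

def in_freesync_range(fps, low_hz, high_hz):
    """True if the panel can match its refresh rate to this frame rate."""
    return low_hz <= fps <= high_hz

STOCK = (40, 60)   # Samsung U32E850R as shipped
HACKED = (33, 60)  # range reported for the modified drivers

for fps in (35, 38, 45):
    print(f"{fps} FPS -> stock: {in_freesync_range(fps, *STOCK)}, "
          f"hacked: {in_freesync_range(fps, *HACKED)}")

# 35 and 38 FPS fall outside the stock 40-60Hz window, so you are back
# to tearing or vsync judder; the hacked 33-60Hz window still covers them.
```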
So, despite my trepidation, I sold the 980Ti and the Philips 40" monitor and bought a Sapphire R9 Fury and the 32" Samsung. I went for the non-X Fury as I wanted to keep costs down in the very possible event that I had made a massive error in judgement. Obviously I expected a fair drop in performance, since at 4K the R9 Fury would be around 15%+ slower than my 980Ti at stock, and about 25%+ slower compared to the 1430-1450 OC on my 980Ti.
After getting everything installed and fearing the worst, I have to say I was pleasantly surprised that, max OC vs max OC, the R9 Fury was "only" around 15% slower than the 980Ti. Well, in The Witcher 3 at least; I have not really tested other games.
Previously, my experience of G-Sync tech was on a ROG Swift, and my overall thoughts were "nothing special". I think the problem was that I tested BioShock Infinite and it was running at 100+ FPS, so stutter and tearing were almost totally unnoticeable to me with G-Sync on or off. Despite never seeing how G-Sync coped with lower FPS, I was of the unshakable opinion that low FPS is low FPS. To be frank, I was wrong. With Freesync, low FPS feels much smoother than my previous experience. To say that Freesync and G-Sync make a big difference at lower FPS is an understatement.
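If, like me, you thought "low FPS is low FPS", this toy model (entirely my own simplification, not a measurement) shows why a steady 40 FPS feels rougher on a fixed 60Hz panel than with adaptive sync:

```python
import math

# Toy model: a fixed 60Hz panel can only show a new frame on a refresh
# tick, so a steady 25ms (40 FPS) render time turns into alternating
# ~17ms/~33ms on-screen times. With Freesync/G-Sync the panel refreshes
# when the frame is ready, so pacing stays even.

REFRESH_MS = 1000 / 60      # one scanout every ~16.7ms at 60Hz
frames_ms = [25.0] * 6      # a perfectly steady 40 FPS

def vsync_display_intervals(frames, refresh):
    """Each finished frame waits for the next fixed refresh tick."""
    done, shown = 0.0, []
    for ft in frames:
        done += ft
        shown.append(math.ceil(done / refresh - 1e-9) * refresh)
    return [round(b - a, 1) for a, b in zip([0.0] + shown, shown)]

print("fixed 60Hz:", vsync_display_intervals(frames_ms, REFRESH_MS))
# -> [33.3, 16.7, 33.3, 16.7, 33.3, 16.7]  (visible judder)
print("adaptive:  ", frames_ms)
# -> every frame shown for its real 25.0ms (even pacing)
```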
My Witcher 3 settings are: 4K resolution, HairWorks off, DoF off, blur off, shadows high, foliage visibility high, HBAO medium.
980Ti OC at 1450 core and +500 VRAM: FPS between 40 and 50+
R9 Fury at 1100 core and 550 VRAM: FPS between 35 and 47
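For the curious, those ranges work out as follows (quick arithmetic on the numbers above, not a proper benchmark):

```python
# Quick sanity check on the FPS ranges above - just arithmetic.
fury_fps = (35, 47)
ti_fps = (40, 50)

for f, t in zip(fury_fps, ti_fps):
    print(f"{f} vs {t} FPS: Fury is {100 * (1 - f / t):.0f}% slower")
# 35 vs 40 FPS: Fury is 12% slower
# 47 vs 50 FPS: Fury is 6% slower
# ...so in the same ballpark as the ~15% OC-vs-OC gap I saw overall.
```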
Despite this, the Freesync tech makes the game play and feel much smoother and stutter-free. So thankfully my fears and trepidation were unfounded, and I am delighted with my downgrade. Sorry for the long post folks, just wanted to share my newfound love for adaptive synchronization.

I did not say no overclocking, I said voltage control. Go back and read and understand what I said. Currently what you see is what you get, but like I said, and I'll say it once again: hopefully it will get voltage control. You're just being plain ignorant trying to state it as fact that there are no more OC improvements to come, when it's a possibility.