AMD FreeSync vs Nvidia G-Sync - the adaptive sync battle

AMD FreeSync vs Nvidia G-Sync - the adaptive sync battle

How does adaptive sync work, and which technology should you buy? We investigate


Screen tearing is one of the biggest irritations facing PC gamers today. It's a huge annoyance for players who want quick response times in fast-paced games such as FPS and RTS titles, but the problem affects games and gamers across the board, from budget PCs to high-end monsters. It's a crisis that graphics card and monitor makers have finally come together to fix.

Fear not, though: AMD and Nvidia have you covered with two differing solutions to this problem. Together they're called adaptive sync (alternatively known as dynamic refresh rates). The two firms market their technologies differently, but they solve the same problem in much the same way; it's the hardware implementations that differ. In this article, we'll talk about how the technology works and help you decide which to choose when you're next in the market for a monitor or graphics card.
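To make that concrete, here's a rough back-of-the-envelope sketch in Python (our own illustration, not anything from either vendor, and the frame times are made-up numbers): it compares when frames finish rendering against a fixed 60Hz refresh, and against a display whose refresh simply follows the frame rate.

# Rough illustrative sketch (made-up frame times): compare when frames finish
# rendering against a fixed 60Hz refresh, and against an adaptive-sync display
# that refreshes as soon as each frame is ready (within its supported range).

FIXED_HZ = 60
REFRESH_INTERVAL = 1000.0 / FIXED_HZ            # ~16.7 ms between scan-outs

# Made-up per-frame render times in milliseconds (an uneven ~45-70fps load)
frame_times = [14.0, 20.0, 17.0, 22.0, 15.0, 19.0]

t = 0.0
for i, ft in enumerate(frame_times):
    t += ft                                     # frame i finishes rendering at time t
    # Fixed refresh with v-sync: the frame waits for the next 16.7 ms boundary,
    # so a frame that just misses a refresh is held for almost a full extra one.
    next_refresh = (t // REFRESH_INTERVAL + 1) * REFRESH_INTERVAL
    wait = next_refresh - t
    # Adaptive sync: the monitor refreshes now, so the wait is effectively zero.
    print(f"frame {i}: ready at {t:6.1f} ms, v-sync holds it {wait:4.1f} ms, "
          f"adaptive sync shows it immediately")

The uneven waits in the v-sync column are what you perceive as stutter (or, with v-sync off, as tearing when the frame is swapped mid-scan); letting the refresh track the frame rate removes the mismatch entirely.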

Nvidia got the jump on AMD last year when it launched its first consumer monitors with built-in adaptive sync, which it calls G-Sync. AMD, meanwhile, has been talking about its own adaptive sync technology, called FreeSync, for a long time and finally looks set to get a flurry of FreeSync compatible monitors on to the market this year.

The most important thing to take away from this is that even fairly old mid-range cards from both AMD and Nvidia support adaptive sync. This means you don't need to buy a new card to reap the benefits of this technology. The lineup of monitors supporting FreeSync certainly looks stronger, but there are some great G-Sync monitors on the market from Acer, Asus (the SWIFT in particular is staggering) and AOC.

If you're buying a brand-new graphics card and have an adaptive sync monitor in mind, AMD looks more attractive right now simply because of the variety of FreeSync hardware set to be released this year. Adaptive sync particularly benefits modest hardware, and those with mid-range cards will appreciate the lower cost of a DisplayPort 1.2a monitor over a G-Sync one. With that in mind, it looks like AMD is in a better position.

Whichever you choose, the future is bright for adaptive sync technologies, and screen tearing in PC games should be a thing of the past very soon.

http://www.expertreviews.co.uk/acce...ync-vs-nvidia-g-sync-the-adaptive-sync-battle
 
Quite a nice article, fairly neutral and well thought out in my opinion.

Only one thing stood out to me: when they say that fairly old mid-range cards from both sides can support it.
Well, yes they can, but isn't the 260 still a current model, even though it has been around a while?

Now we just need some back to back reviews done to see how well each system works.
 
I think I'll sell my U2715H and get a freesync monitor when they arrive. Tearing and IPS glow aren't very nice.
 
Having played through the Assassin's Creed series over the past month, I think adaptive sync will benefit Ubi games massively. No multi-GPU support, shocking lack of tweaking even with patches, prone to tearing.
 
Started putting a few quid aside for a new monitor (small win on the lottery helped too), but for a G-Sync monitor it's looking like the Swift is the only viable option. Ain't got the GPU grunt for 4K, and 1920x1080 is a bit pointless in regards to resolution. Shame that the ROG mark-up on the Swift pushes the price so high.
 
Lmao, they show a list of seven FreeSync models, then say there are more than G-Sync, without mentioning how many G-Sync monitors will be available in the same time frame.

It's also not an actual vs, as there was no testing involved... Oh, and they say older cards from both sides, yet the 7*** series only supports it for video playback.

Neutral my arris, that is yet another hyperbolic AMD advert, not exactly an "expertreview"
 
Started putting a few quid aside for a new monitor (small win on the lottery helped too), but for a G-Sync monitor it's looking like the Swift is the only viable option. Ain't got the GPU grunt for 4K, and 1920x1080 is a bit pointless in regards to resolution. Shame that the ROG mark-up on the Swift pushes the price so high.

In the same position. Fancy an upgrade to my Hazro 1440 screen, but the Asus mark-up just seems so much.
 
If AMD made a chocolate teapot you'd buy it :D

Already got one. ;)

 
In the same position. Fancy an upgrade to my Hazro 1440 screen, but the Asus mark-up just seems so much.
If only the price of the TN Swift would drop a bit; it's not too far off the price of the Acer IPS model, albeit an IPS won't be as responsive as the TN panel in the Swift. I've been using the monitor in sig for just over seven years now, so an upgrade is kinda long overdue. 780 SLI is too much GPU for 1920x1200. :(
 
If you own one of the following you're beyond redemption.

AMD cap, AMD backpack, AMD Memory, AMD SSD. :D

According to my tests, the 480GB AMD R7 SSD performs similarly to the Samsung 850 256GB, which is a beast of an SSD. I own both so can compare performance directly.

Can't fault the memory either for the price, managed to tune it decently on stock volts. :)
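
For anyone who wants to sanity-check their own drives in a similar spirit, here's a very rough Python sketch (my own, not how the comparison above was done; the file path is just a placeholder): it times a large sequential write and read. OS caching and background activity skew this badly, so treat the numbers as ballpark only and use a proper benchmark tool with repeated runs for real comparisons.

import os, time

TEST_FILE = "ssd_test.bin"          # placeholder path; put it on the drive under test
SIZE_MB = 1024
CHUNK = b"\0" * (1024 * 1024)       # 1 MiB buffer

start = time.perf_counter()
with open(TEST_FILE, "wb") as f:
    for _ in range(SIZE_MB):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())            # push the data to the drive, not just the page cache
write_s = time.perf_counter() - start
print(f"sequential write: {SIZE_MB / write_s:.0f} MB/s")

start = time.perf_counter()
with open(TEST_FILE, "rb") as f:
    while f.read(1024 * 1024):
        pass
read_s = time.perf_counter() - start
print(f"sequential read:  {SIZE_MB / read_s:.0f} MB/s (likely cache-assisted)")

os.remove(TEST_FILE)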
 
If AMD wants some proper publicity, from a true expert... Send a GPU my way. Otherwise we'll have to put up with shallow rehashes of existing information like that ^^.

Does this tech make gaming at sub 60FPS feel any smoother?

In the sense that tearing and stuttering from the traditional frame rate and refresh rate mismatch is eliminated, yes. But low frame rates remain low frame rates. If you suddenly drop from ~100fps to 50fps you'll certainly feel it and it will still be painful.
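
To put rough numbers on that (my own arithmetic, nothing measured in the article): even a perfectly steady 50fps looks uneven on a fixed 60Hz screen, because frames can only appear on 16.7ms boundaries, whereas adaptive sync shows each one as it arrives.

# Back-of-envelope sketch with made-up numbers: a steady 50fps game on a fixed
# 60Hz v-synced screen vs. the same game on an adaptive-sync screen.
REFRESH = 1000.0 / 60           # 16.7 ms between refreshes
FRAME = 1000.0 / 50             # 20 ms to render each frame

shown = []
t = 0.0
for _ in range(6):
    t += FRAME                                  # a new frame is ready every 20 ms
    shown.append((t // REFRESH + 1) * REFRESH)  # v-sync: wait for the next refresh

gaps = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
print("v-sync gaps between displayed frames:", gaps)       # uneven mix of 16.7 and 33.3 ms
print("adaptive sync gaps:", [FRAME] * len(gaps))          # steady 20 ms, but still only 50fps

The pacing evens out, which is why sub-60fps feels smoother with adaptive sync, but you still only get 50 frames a second.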
 