AMD FreeSync vs Nvidia G-Sync - the adaptive sync battle
How does adaptive sync work, and which technology should you buy? We investigate
Screen tearing is one of the biggest irritations facing PC gamers today. It happens because a conventional monitor redraws at a fixed rate while the graphics card finishes frames whenever it can; if the card swaps in a new frame mid-redraw, the screen shows parts of two frames at once. It's a particular annoyance for players who want quick responses in fast-paced games such as FPS and RTS titles, but it affects games and gamers across the board, from budget PCs to high-end monsters, and it's a problem that graphics card and monitor makers have finally come together to fix.
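To make that concrete, here's a minimal Python sketch of the problem: it models a 60Hz panel redrawing on a fixed schedule while a graphics card finishes frames at irregular times, and counts how often a buffer swap lands mid-redraw. The refresh rate and frame times are assumptions chosen purely for illustration, not measurements from any real hardware.

```python
# Illustrative sketch only: a toy timing model of screen tearing on a
# fixed-refresh monitor. All numbers (60Hz panel, 40-90fps GPU) are
# assumptions chosen for the example.
import random

REFRESH_HZ = 60
SCANOUT = 1.0 / REFRESH_HZ           # the panel redraws every ~16.7 ms

random.seed(1)

# The GPU finishes frames at irregular intervals (roughly 40-90 fps here).
frame_times, t = [], 0.0
while t < 1.0:                       # simulate one second of rendering
    t += random.uniform(1 / 90, 1 / 40)
    frame_times.append(t)

tears = 0
for k in range(REFRESH_HZ):          # 60 refreshes in our simulated second
    start, end = k * SCANOUT, (k + 1) * SCANOUT
    # Any frame completed *during* scanout swaps the buffer mid-redraw, so
    # the panel shows the top of one frame and the bottom of another.
    if any(start < f < end for f in frame_times):
        tears += 1

print(f"{tears} of {REFRESH_HZ} refreshes showed a torn image")
```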
Fear not, though: AMD and Nvidia have you covered with two differing solutions to this problem, known collectively as adaptive sync (or dynamic refresh rates). The two firms market their technologies differently, but both attack the problem the same way: instead of redrawing on a fixed schedule, the monitor refreshes only when the graphics card has a complete frame ready, so every redraw shows exactly one frame. It's only the hardware implementations that vary. In this article, we'll talk about how the technology works and help you decide which to choose when you're next in the market for a monitor or graphics card.
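Here's the same toy model with adaptive sync applied: the panel starts each redraw only when a frame is ready, clamped to an assumed 30-144Hz supported range, so the effective refresh rate simply follows the frame rate and no redraw ever splits two frames. Again, every figure here is an assumption for illustration, not a real panel's specification.

```python
# Illustrative sketch only: with adaptive sync, the panel begins a new
# refresh when the GPU delivers a frame, clamped to the panel's supported
# range (30-144Hz here -- assumed figures, not real hardware specs).
import random

MIN_INTERVAL = 1 / 144   # fastest the panel can refresh
MAX_INTERVAL = 1 / 30    # slowest it may go before it must redraw anyway

random.seed(1)

# Same irregular GPU frame completion times as in the previous sketch.
frame_times, t = [], 0.0
while t < 1.0:
    t += random.uniform(1 / 90, 1 / 40)
    frame_times.append(t)

last_refresh = 0.0
for f in frame_times[:5]:            # show the first few frames
    interval = min(max(f - last_refresh, MIN_INTERVAL), MAX_INTERVAL)
    last_refresh += interval
    # Each redraw begins on a complete frame, so nothing tears, and the
    # effective refresh rate tracks the GPU's frame rate.
    print(f"frame ready at {f * 1000:6.1f} ms -> "
          f"refresh at {last_refresh * 1000:6.1f} ms ({1 / interval:5.1f} Hz)")
```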
Nvidia got the jump on AMD last year when the first consumer monitors with its built-in adaptive sync hardware, which it calls G-Sync, went on sale. AMD, meanwhile, has been talking about its own adaptive sync technology, called FreeSync, for a long time and finally looks set to get a flurry of FreeSync-compatible monitors onto the market this year.
The most important thing to take away from this is that even fairly old mid-range cards from both AMD and Nvidia support adaptive sync. This means you don't need to buy a new card to reap the benefits of this technology. The lineup of monitors supporting FreeSync certainly looks stronger, but there are some great G-Sync monitors on the market from Acer, Asus (the SWIFT in particular is staggering) and AOC.
If you're buying a brand-new graphics card and have an adaptive sync monitor in mind, AMD looks more attractive right now simply because of the variety of FreeSync hardware set to be released this year. Adaptive sync particularly benefits modest hardware, and those with mid-range cards will appreciate the lower cost of a DisplayPort 1.2a FreeSync monitor over a G-Sync one: FreeSync uses the royalty-free Adaptive-Sync extension to the DisplayPort 1.2a standard, while G-Sync monitors need a proprietary Nvidia module that adds to the price. With that in mind, AMD looks to be in the better position.
Whichever you choose, the future is bright for adaptive sync technologies, and screen tearing in PC games should be a thing of the past very soon.
http://www.expertreviews.co.uk/acce...ync-vs-nvidia-g-sync-the-adaptive-sync-battle