AMD FreeSync vs Nvidia G-Sync - the adaptive sync battle

Soldato · Joined 30 Mar 2010 · Posts: 13,407 · Location: Under The Stairs!

How does adaptive sync work, and which technology should you buy? We investigate


Screen tearing is one of the biggest irritations facing PC gamers today. It's a huge annoyance for players who want quick response times in fast-paced games such as FPS and RTS titles, but the problem affects games and gamers across the board, from budget PCs to high-end monsters. It's a problem that graphics card and monitor makers have finally come together to fix.

Fear not, though: AMD and Nvidia have you covered with two differing solutions to this problem. Together they're called adaptive sync (alternatively known as dynamic refresh rates). The two firms market their technologies differently, but they solve the same problem in much the same way; it's the hardware implementations that differ. In this article, we'll talk about how the technology works and help you decide which to choose when you're next in the market for a monitor or graphics card.
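The core idea behind both technologies is the same: instead of the monitor refreshing on a fixed clock and the GPU having to hit that deadline (or tear/stutter when it misses), the monitor refreshes when a frame is actually ready, within the panel's supported range. A minimal sketch of that timing difference, in Python; the refresh range, function names, and numbers here are illustrative assumptions, not vendor code:

```python
import math

# Illustrative panel figures, not taken from any specific monitor.
REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ                            # ~16.67 ms per scanout
ADAPTIVE_MIN_MS = 1000 / 144                              # fastest refresh (144 Hz)
ADAPTIVE_MAX_MS = 1000 / 30                               # slowest refresh (30 Hz)

def vsync_display_time(frame_ready_ms):
    """With vsync on a fixed 60 Hz panel, a finished frame waits for the
    next refresh boundary, so just missing a boundary adds a whole interval."""
    return math.ceil(frame_ready_ms / REFRESH_MS) * REFRESH_MS

def adaptive_display_time(frame_ready_ms, last_refresh_ms):
    """With adaptive sync, the panel starts scanning out when the frame is
    ready, clamped to the refresh window the panel supports."""
    elapsed = frame_ready_ms - last_refresh_ms
    if elapsed < ADAPTIVE_MIN_MS:                         # frame arrived too fast
        return last_refresh_ms + ADAPTIVE_MIN_MS
    if elapsed > ADAPTIVE_MAX_MS:                         # frame too slow: self-refresh
        return last_refresh_ms + ADAPTIVE_MAX_MS
    return frame_ready_ms                                 # in range: show immediately

# A frame finishing at 17 ms just misses the ~16.67 ms boundary:
print(vsync_display_time(17))        # waits until the next boundary, ~33.3 ms
print(adaptive_display_time(17, 0))  # shown at 17 ms
```

The clamping in `adaptive_display_time` is also why a panel's advertised adaptive range matters: frames slower than the panel's minimum refresh rate fall back to fixed-interval behaviour.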

Nvidia got the jump on AMD last year when it launched its first consumer monitors with built-in adaptive sync, which it calls G-Sync. AMD, meanwhile, has been talking about its own adaptive sync technology, called FreeSync, for a long time and finally looks set to get a flurry of FreeSync compatible monitors on to the market this year.

The most important thing to take away from this is that even fairly old mid-range cards from both AMD and Nvidia support adaptive sync. This means you don't need to buy a new card to reap the benefits of this technology. The lineup of monitors supporting FreeSync certainly looks stronger, but there are some great G-Sync monitors on the market from Acer, Asus (the SWIFT in particular is staggering) and AOC.

If you're buying a brand-new graphics card and have an adaptive sync monitor in mind, AMD looks more attractive right now simply because of the variety of FreeSync hardware set to be released this year. Adaptive sync greatly benefits modest hardware, and those with mid-range cards will appreciate the lower cost overhead of a DisplayPort 1.2a monitor over a G-Sync one. With that in mind, it looks like AMD is in a better position.

Whichever you choose, the future is bright for adaptive sync technologies, and screen tearing in PC games should be a thing of the past very soon.

http://www.expertreviews.co.uk/acce...ync-vs-nvidia-g-sync-the-adaptive-sync-battle
 
I have a feeling it won't be a major selling point, and will instead become a standard feature of middle to high end screens that people take for granted.

Exactly. G-Sync is the must-have experience that hardly anyone is taking the plunge on; it's been out a year and no one's tripping over themselves to shout about it, apart from the pedantic fanboys.

Overall graphics market share goes Intel>AMD>Nvidia. Intel is going to be all over A-Sync; add in AMD, and that leaves Nvidia on roughly 16% market share. That's a far higher potential captive audience, as is clearly indicated by the number of A-Sync panels inbound.

A-Sync monitors are aimed at 100% of the market. If you are in the market for a new monitor, next to no one is going to discount an A-Sync monitor at a specific targeted price point just because they run Nvidia.

Difference being you are paying a royalty above and beyond the monitor cost with Gsync.
 