
Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync

This month, the battle for GPU market share officially extends beyond the computer itself and into the display space, with AMD’s FreeSync technology launching to compete with Nvidia’s G-Sync. G-Sync is a proprietary solution invented by Nvidia in 2013 to enable a smoother gaming experience by eliminating screen tearing and seriously reducing annoying problems like ghosting and stutter. It requires the addition of Nvidia’s G-Sync module inside each monitor that is sold. FreeSync is AMD’s long-awaited answer, and is built on the back of Adaptive-Sync, an open standard which doesn’t require the addition of extra hardware.

On the surface it’s easy to write off these technologies as equals. Thus, it’s easy for consumers to simply choose the cheaper option. But Nvidia’s on a mission to make sure you understand that there are differences — even differences you may want to pay a premium for.

What I know for a fact based on testing is that regardless of their unique approaches, both technologies dramatically improve the PC gaming experience more than any other solution in existence. This is only the beginning of the G-Sync versus FreeSync exploration, but take solace in the fact that regardless of which technology you embrace, screen tearing and stutter will largely be a thing of the past.

I recently had an educational conversation with Tom Petersen, Distinguished Engineer at Nvidia, regarding those differences and a range of other topics related to the G-Sync versus FreeSync debate. It was so insightful that rather than use it as background information for future articles, I wanted to publish it verbatim. Please enjoy!

Cost Vs Price

Forbes: Tom, for anyone not yet invested in Nvidia’s ecosystem, what benefits are there to G-Sync, and what do you believe justifies the added price?

Tom Petersen: “That’s an interesting question. I differentiate cost from price. The price of a G-Sync monitor or the price of a GPU is what a user sees at a business. You can imagine that the cost — what Nvidia or an OEM incurs when building that product — is a totally orthogonal issue. Price is determined by the market, and by the consumers. If Asus tried to charge more for a G-Sync monitor like a Swift…they want to charge as much as they can, but if it’s too high the market doesn’t buy it, they don’t get to sell it, and the price goes down.”

Forbes: Speaking of Asus, the ROG Swift seems like a great example, because all reports indicate it’s constantly selling out at $749, which is obviously more expensive than an equivalent non-G-Sync monitor.

Tom Petersen: “That tells me the price is probably too low. Commercially, if the price is too high you’ve got heavy stock, and prices come down to move that product. So I think as a very first principle, it’s important to differentiate in your mind the difference between price and cost. Cost is what determines profitability. So from an OEM’s perspective, they want to build products where the cost of manufacturing is lower than the price. So when I hear people talk about pricing, what I think a lot of people don’t understand is…most of the time, Nvidia or Asus or Newegg doesn’t really set the price. You get to effectively accept or not accept what the market is doing about pricing. The market is saying ‘I love this monitor, reviews are awesome, I’m willing to pay this much money.’ If you look at a FreeSync monitor and say ‘oh, they’re cheaper,’ that’s likely because the OEMs believe that the market will only tolerate a lower price. They’re not setting it lower because they want to make less money.

Nvidia argues that the Radeon 295x2 isn't $700 because AMD's partners want to make less money. It's $700 because that's simply what the market is willing to pay.

Let’s use another example, AMD’s Radeon 295×2. It launched at $1499 last year. Since then, the price of the 295×2 has been cut in half. Do you think AMD’s partners lowered the price because they wanted to make less money? Of course not! The market gave them feedback and said ‘I know it seems reasonable that this would be a $1500 part and you’re delivering a lot of performance, but that’s not the price.’ So then AMD said ‘Maybe $999 is the price.’ And the market responded by saying ‘nope, not quite.’ The G-Sync enabled ROG Swift has only come down in price by $50 in the last year. That’s because demand is there.

We come back to one of AMD’s principal arguments about our cost being higher than their solution. That’s absolutely true. But you know what? That’s an Nvidia problem. It’s completely unrelated and irrelevant to the consumer. You don’t whine at Ford for having spark plugs that are platinum plated and cost 4 cents extra.”

Forbes: Ok, let’s talk about the cost to display manufacturers for G-Sync. I remember the original DIY kit was upwards of $240, but it can’t be that high for OEMs.

Tom Petersen: “No, it’s not. And the cost they’re incurring is something they’re factoring into their profitability. And that factors into the decision of whether or not they make them. The fact that OEMs like Asus are charging a significant premium for their monitors makes perfect sense to me. They’re building a monitor for gaming, and the market loves it. So everybody’s happy, right?”

Forbes: Credit where credit is due: I give Nvidia props for inventing a solution to screen tearing, stutter, and input lag well before an open standard existed. But will we ever see a day when Nvidia ditches their proprietary solution in favor of Adaptive-Sync?

Tom Petersen: “When we invented G-Sync, we determined very early on that in order to accomplish everything we wanted, we needed to be on both sides of the problem — at the front end where we’re controlling the GPU, and the back end inside of the monitor. As FreeSync comes to market, we’ll be able to compare the different strategies and see which one’s more effective. For us, having the module inside the panel allows us to deliver what we think is a very good experience across a full range of operating frequencies for refresh rate or framerate. We have some really significant technology inside that module dealing with the low end of refresh rates. So what happens as a game transitions from 45fps down to 25fps and back, when things get really intense? During that transition our tech kicks in and delivers a smooth experience.”


Forbes: Let’s talk about the minimum refresh rates that both G-Sync and Adaptive-Sync support.

Tom Petersen: “First of all, the spec ‘Adaptive Sync’ has no minimum. Both have the ability to communicate any range, so there’s nothing about the base specs that are different. What’s interesting though, is the reason there are panel-specific refresh limits. LCD images decay after a refresh, you kinda paint the screen and it slowly fades. That fade is just related to the panel. The reason there’s an Adaptive Sync spec and G-Sync module is because that lower limit is variable depending on the technology inside the panel. But games don’t know about that! So what do you do when a game has a lower FPS than the minimum rate you want to run your panel? Because when they run below that minimum rate things start to flicker, and that’s a horrible experience.”

Forbes: So what specifically does Nvidia do to combat that?

Tom Petersen: “I can’t go into too much detail because it’s still one of our secret sauces. But our technology allows a seamless transition above and below that minimum framerate that’s required by the panel. PC Perspective wrote an article guessing how we did that, and they’re not that far off…”

Forbes: You’ve said in the past that one of the crucial goals of G-Sync was to never introduce screen tearing, no matter what.

Tom Petersen: “You never want to introduce stutter, either. It’s a complex problem, which is why we think you need some of that secret sauce in both the driver and the module. In contrast, AMD’s not doing that. As you transition from the high frequencies to the low frequencies of FPS, they have some jarringly negative experiences coming out of their zone. If you take any of their panels and run it from whatever frequency is in the zone, to any frequency out of the zone at the low end, the experience is not good at all.”

“Now G-Sync addresses two problems at the low end, tearing and stutter. Stutter is caused by having a repeat of the exact same frame. So you show all these new frames, then suddenly if you’re not able to keep up with the refresh rate minimum, you see that frame twice. That’s what V-Sync does, repeating a frame. But when you repeat a frame, motion stops and that’s why you feel a stutter. G-Sync doesn’t do that. It adjusts the refresh rate to keep it above that minimum rate, and we have other techniques like shifting and centering to avoid stutters. It’s a tough problem, which again requires the module inside the monitor. If you download our new G-Sync demo, which purposefully goes in and out of that zone, on your AMD FreeSync monitor you’ll see that whether you’re tearing or stuttering, the experience is not great. You can set whatever framerate range you want, and it operates across all monitors to compare the differences.”
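
Note to readers: for anyone curious how that “seamless transition” below a panel’s minimum refresh rate might work in practice, here is a rough sketch of the frame-multiplication approach PC Perspective speculated about. It is purely illustrative: the numbers and function names are invented, and Nvidia has not disclosed its actual implementation.

[CODE]
# Hypothetical sketch of frame multiplication under variable refresh.
# Idea: when the game's frame rate falls below the panel's minimum refresh
# rate, show the same frame an integer number of times so the panel always
# refreshes inside its supported range. Values below are made up.

PANEL_MIN_HZ = 30.0   # lowest refresh this example panel can hold without flicker
PANEL_MAX_HZ = 144.0  # highest refresh this example panel supports

def panel_refresh_for(game_fps: float) -> tuple[float, int]:
    """Return (panel refresh rate, number of times each frame is shown)."""
    if game_fps >= PANEL_MIN_HZ:
        # Inside the variable-refresh window: one refresh per rendered frame,
        # capped at the panel maximum.
        return min(game_fps, PANEL_MAX_HZ), 1

    # Below the window: repeat each frame until the effective refresh rate
    # climbs back above the panel minimum, so the image never decays or flickers.
    repeats = 2
    while game_fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return game_fps * repeats, repeats

if __name__ == "__main__":
    for fps in (60, 45, 25, 12):
        hz, repeats = panel_refresh_for(fps)
        print(f"{fps:>3} fps -> panel refreshes at {hz:5.1f} Hz ({repeats}x per frame)")
[/CODE]

The point of the example is simply that each repeated refresh re-shows an already-rendered frame, so the panel stays inside its comfortable range without forcing a V-Sync-style wait on the game.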

AMD is counting on FreeSync’s more open nature for wider adoption by OEMs and consumers.

Ghost In The Machine

Tom Petersen: “There’s also a difference at the high frequency range. AMD really has 3 ranges of operation: in the zone, above the zone, and below the zone. When you’re above the zone they have a feature which I like (you can either leave V-Sync On or V-Sync Off), that we’re going to look at adding because some gamers may prefer that. The problem with high refresh rates is this thing called ghosting. You can actually see it with AMD’s own Windmill Demo or with our Pendulum Demo. Look at the trailing edge of those lines and you’ll see a secondary image following it.”

Note to readers: I have seen this firsthand on the Acer FreeSync monitor I’m reviewing, and PC Perspective noticed it with 2 additional FreeSync monitors, the BenQ XL2730Z and LG 34UM67. To illustrate the problem they recorded the aforementioned monitors running AMD’s Windmill demo, as well as the same demo running on a G-Sync enabled Asus ROG Swift. Ignore the stuttering you see (this is a result of recording at high speed) and pay attention to the trailing lines, or ghosting. I agree that it’s jarring by comparison.

Tom Petersen: “We don’t do that. We have anti-ghosting technology so that regardless of framerate, we have very little ghosting. See, variable refresh rates change the way you have to deal with it. Again, we need that module. With AMD, the driver is doing most of the work. Part of the reason they have such bad ghosting is because their driver has to specifically be tuned for each kind of panel. They won’t be able to keep up with the panel variations. We tune our G-Sync module for each monitor, based on its specs and voltage, which is exactly why you won’t see ghosting from us.

We also do support the majority of our GPUs going back to Kepler. The 650Ti Boost is the oldest GPU we support, and there’s a lot of gaps in their GPU support. It’s a tough problem and I’m not meaning to knock AMD, but having that module allows us to exercise more control over the GPU and consequently offer a deeper range of support.”
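
Note to readers: to unpack the “tuned for each kind of panel” comment above, ghosting on LCDs is usually fought with overdrive, which overshoots the pixel voltage so slow liquid crystals settle in time, and that overshoot is normally calibrated for one fixed refresh rate. With variable refresh, the compensation has to adapt to however long the current frame is held. The sketch below is only an illustration of that idea; the panel names and gain values are invented and are not taken from Nvidia’s or AMD’s drivers.

[CODE]
# Purely illustrative: overdrive strength chosen per panel and per frame interval.
# A hypothetical calibration table maps "how long this frame will be held" (ms)
# to an overdrive gain. All numbers below are invented for the example.

PANEL_OVERDRIVE_TABLES = {
    "example_tn_panel":  [(7, 1.30), (11, 1.20), (16, 1.12), (33, 1.05)],
    "example_ips_panel": [(7, 1.45), (11, 1.32), (16, 1.20), (33, 1.08)],
}

def overdrive_gain(panel: str, frame_time_ms: float) -> float:
    """Pick the calibrated gain whose frame time is closest to the current one."""
    table = PANEL_OVERDRIVE_TABLES[panel]
    return min(table, key=lambda entry: abs(entry[0] - frame_time_ms))[1]

# A controller tuned only for ~7 ms (144 Hz) would apply the wrong amount of
# overdrive when the refresh interval stretches to 16 ms or 33 ms, which shows
# up as ghosting (or inverse ghosting) on the trailing edge of moving objects.
print(overdrive_gain("example_ips_panel", 7))    # 1.45 at ~144 Hz
print(overdrive_gain("example_ips_panel", 33))   # 1.08 at ~30 Hz
[/CODE]

Whether that per-interval tuning lives in a module inside the monitor (Nvidia’s approach) or in the GPU driver (AMD’s) is exactly the trade-off being argued over in this interview.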

Stay tuned for more coverage of G-Sync and FreeSync. If you have questions about these technologies, reach out to me on Twitter and I’ll do my best to answer them for you.
http://www.forbes.com/sites/jasonev...nc-display-tech-is-superior-to-amds-freesync/

A very interesting read and Tom Petersen knows his stuff :)
 
Ghosting is non-existent, I can't see anything on my BenQ and I'm not using Freesync either. Panel is amazing, gaming is amazing. Maybe professional tests might show it, but when I game and use the PC for general use, I really have no idea what ppl are on about.

Some people can easily see, whilst others can't. On my Dell IPS for example, I could clearly see it but on my Swift, I can't.

Love or hate nVidia's proprietary tech, they at least give you a reason to spend the extra cash.
 
Why would you want it to work down to 9Hz?

That far down, you're not really having fun whether Freesync is working or not.

It doesn't work like that either. The Swift just runs nice at 30Hz or above.

Another point in the article which Tom mentioned was the way Freesync turns off above the refresh rate of the monitor; he sees that as a plus for AMD and will be looking to do the same with G-Sync. For me, I like it as it is but having the option would be sweet.
 
Lol
The guy knows nothing if he believes the ghosting is from Freesync..

Can't be that much better if we go by the benchmark results from PCPer. With Freesync you actually gain performance; only a little bit, but at least it's a gain.

The ghosting on my BenQ is the same ghosting I can make my older XL2420T do, and that monitor has no Freesync.

Read the article and you will see what he said :o
 
That doesn't say Freesync is ghosting because of the driver though....

I will explain it:

He is saying that each G-Sync module is tuned to the panel, which seriously helps with Ghosting. He then says "With AMD, the driver is doing most of the work" (which is spot on) but he doesn't say at all that the ghosting is because of the driver. :o
 
From this video, it is inherently clear that the BenQ and LG are suffering from Ghosting. Now it is down to the individual if they are happy with that or not and some will see it whilst others clearly won't. I am very susceptible to these things and personally would hate it but that's just me.

 
Pretty dramatic difference and it is clear why the SWIFT can command that kind of price, it is clearly in a league of its own.

Yer agreed, and no buyer's remorse for me; I was more than happy to pay that after I got it. Prior to purchasing, I felt it was OTT and too expensive, but after getting it my mind was changed and in terms of monitors, it blows away everything I have ever owned.

If anyone asked me what to get and they owned nVidia, I would have no problems advising the Swift.
 
Ok so I'm buying a new 27" 1440p 120/144Hz screen this year, more than likely to go with a 390X (depends on how the card is); if not a 390X then a 295x2. I hate the thought of lining Nvidia's pockets with cash, so basically, is Freesync worth the investment or is G-Sync and Nvidia the better option?

All I want is 1440p and 144Hz as much as possible; I play MMOs, FPS and D3 mainly :)

Si, it would be wiser to ask that in the monitors section, as I doubt you would get a reasoned response in the GPU section (might be wrong). :)
 
Good call. Tbh, as much as it really pains me, I'm tempted to hang out for the 980 Ti if it comes and just buy a damn G-Sync screen like the Swift. AMD are beginning to look amateur hour now :( lack of any kind of news, coupled with having to spend money before the wife discovers it, means I need to buy soon lol

I am massively biased as you well know, and my 290X has been a great experience in truth (ignoring my 144Hz issues), but I just can't help feeling that nVidia go that bit further (and at a cost for sure). I can't comment on Freesync, as I don't use one and have never seen one in motion, but the Swift has been a fantastic monitor. It isn't till you try it that you see what you were missing... I know that doesn't make sense, but tearing and stutter are gone and you notice how bad it was before. Having no G-Sync for the past 2 weeks has just made me not want to play games in truth, and that is how G-Sync has improved my gaming.

Not much else I can add really and only you know what your eyesight is like and what you accept in games.

The comment that the price of the SWIFT has barely dropped is a very valid point: the market has decided it is worth that price.

Yer, if they weren't selling, they would have dropped big time.
 
OK, Peter, now tell me... what did you do when the market said 'no' to the $2,000 Titan Z?

I don't think he can hear ya hunny :D

But spot on. The Z was a failure regardless of the price and I wasn't surprised to see it drop in price like it did, and even then people still didn't want it. Nice card, but throttling and not being as fast as a pair of Titans meant it was never going to sell well.
 