This month, the battle for GPU market share officially extends beyond the computer itself and into the display space, with AMD’s FreeSync technology launching to compete with Nvidia’s G-Sync. G-Sync is a proprietary solution Nvidia introduced in 2013 to enable a smoother gaming experience by eliminating screen tearing and sharply reducing annoying problems like ghosting and stutter. It requires Nvidia’s G-Sync module inside every monitor sold. FreeSync is AMD’s long-awaited answer, built on the back of Adaptive-Sync, an open standard that doesn’t require any extra hardware.
On the surface it’s easy to write off these technologies as equals, and therefore easy for consumers to simply choose the cheaper option. But Nvidia is on a mission to make sure you understand that there are differences, even differences you may want to pay a premium for.
What I know for a fact based on testing is that, regardless of their different approaches, both technologies improve the PC gaming experience more dramatically than any other solution in existence. This is only the beginning of the G-Sync versus FreeSync exploration, but take solace in the fact that whichever technology you embrace, screen tearing and stutter will largely be a thing of the past.
I recently had an educational conversation with Tom Petersen, Distinguished Engineer at Nvidia, regarding those differences and a range of other topics related to the G-Sync versus FreeSync debate. It was so insightful that rather than use it as background information for future articles, I wanted to publish it verbatim. Please enjoy!
Cost Vs Price
Forbes: Tom, for anyone not yet invested in Nvidia’s ecosystem, what benefits are there to G-Sync, and what do you believe justifies the added price?
Tom Petersen: “That’s an interesting question. I differentiate cost from price. The price of a G-Sync monitor or the price of a GPU is what a user sees at a business. You can imagine that the cost — what Nvidia or an OEM incurs when building that product — is a totally orthogonal issue. Price is determined by the market, and by the consumers. If Asus tried to charge more for a G-Sync monitor like a Swift…they want to charge as much as they can, but if it’s too high the market doesn’t buy it, they don’t get to sell it, and the price goes down.”
Forbes: Speaking of Asus, the ROG Swift seems like a great example, because all reports indicate it’s constantly selling out at $749, which is obviously more expensive than an equivalent non-G-Sync monitor.
Tom Petersen: “That tells me the price is probably too low. Commercially, if the price is too high you’ve got heavy stock, and prices come down to move that product. So I think as a very first principle, it’s important to differentiate in your mind the difference between price and cost. Cost is what determines profitability. So from an OEM’s perspective, they want to build products where the cost of manufacturing is lower than the price. So when I hear people talk about pricing, what I think a lot of people don’t understand is…most of the time, Nvidia or Asus or Newegg doesn’t really set the price. You get to effectively accept or not accept what the market is doing about pricing. The market is saying ‘I love this monitor, reviews are awesome, I’m willing to pay this much money.’ If you look at a FreeSync monitor and say ‘oh, they’re cheaper,’ that’s likely because the OEMs believe that the market will only tolerate a lower price. They’re not setting it lower because they want to make less money.
[Image caption: Nvidia argues that the Radeon R9 295X2 isn’t $700 because AMD’s partners want to make less money. It’s $700 because that’s simply what the market is willing to pay.]
Let’s use another example, AMD’s Radeon R9 295X2. It launched at $1499 last year. Since then, the price of the R9 295X2 has been cut in half. Do you think AMD’s partners lowered the price because they wanted to make less money? Of course not! The market gave them feedback and said ‘I know it seems reasonable that this would be a $1500 part and you’re delivering a lot of performance, but that’s not the price.’ So then AMD said ‘Maybe $999 is the price.’ And the market responded by saying ‘nope, not quite.’ The G-Sync enabled ROG Swift has only come down in price by $50 in the last year. That’s because the demand is there.
We come back to one of AMD’s principal arguments, that our cost is higher than their solution’s. That’s absolutely true. But you know what? That’s an Nvidia problem. It’s completely unrelated and irrelevant to the consumer. You don’t whine at Ford for having spark plugs that are platinum-plated and cost 4 cents extra.”
Forbes: Ok, let’s talk about the cost to display manufacturers for G-Sync. I remember the original DIY kit was upwards of $240, but it can’t be that high for OEMs.
Tom Petersen: “No, it’s not. And the cost they’re incurring is something they’re factoring into their profitability. And that factors into the decision of whether or not they make them. The fact that OEMs like Asus are charging a significant premium for their monitors makes perfect sense to me. They’re building a monitor for gaming, and the market loves it. So everybody’s happy, right?”
Forbes: Credit where credit is due: I give Nvidia props for inventing a solution to screen tearing, stutter, and input lag well before an open standard existed. But will we ever see a day when Nvidia ditches their proprietary solution in favor of Adaptive-Sync?
Tom Petersen: “When we invented G-Sync, we determined very early on that in order to accomplish everything we wanted, we needed to be on both sides of the problem — at the front end where we’re controlling the GPU, and the back end inside the monitor. As FreeSync comes to market, we’ll be able to compare the different strategies and see which one’s more effective. For us, having the module inside the panel allows us to deliver what we think is a very good experience across a full range of operating frequencies for refresh rate or framerate. We have some really significant technology inside that module dealing with the low end of refresh rates. So as a game gets really intense and transitions from 45fps down to 25fps and back, our tech kicks in and delivers a smooth experience over that transition.”
Forbes: Let’s talk about the minimum refresh rates that both G-Sync and Adaptive-Sync support.
Tom Petersen: “First of all, the spec ‘Adaptive-Sync’ has no minimum. Both have the ability to communicate any range, so there’s nothing about the base specs that is different. What’s interesting, though, is the reason there are panel-specific refresh limits. LCD images decay after a refresh; you kinda paint the screen and it slowly fades. That fade is just related to the panel. The reason there’s an Adaptive-Sync spec and a G-Sync module is that that lower limit varies depending on the technology inside the panel. But games don’t know about that! So what do you do when a game runs at a lower FPS than the minimum rate you want to run your panel at? Because when panels run below that minimum rate things start to flicker, and that’s a horrible experience.”
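To make Petersen’s point concrete, here is a minimal sketch of the arithmetic behind a panel’s minimum refresh rate. The numbers are my own illustrative examples, not any specific panel’s spec: a panel that must be refreshed at least every 1/30th of a second will visibly flicker if new frames arrive less often than that.

```python
# Illustrative sketch (hypothetical numbers): why a panel minimum matters.
# A panel's pixels fade after a refresh, so a frame can only be "held"
# for a limited time before the image must be painted again.

PANEL_MIN_HZ = 30                      # hypothetical panel minimum refresh
MAX_HOLD_MS = 1000.0 / PANEL_MIN_HZ    # longest a frame may be held: ~33.3 ms

for fps in (60, 40, 30, 25, 20):
    frame_interval_ms = 1000.0 / fps
    within_range = frame_interval_ms <= MAX_HOLD_MS
    print(f"{fps:>3} fps -> {frame_interval_ms:5.1f} ms/frame: "
          f"{'within panel range' if within_range else 'below minimum, would flicker'}")
```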
Forbes: So what specifically does Nvidia do to combat that?
Tom Petersen: “I can’t go into too much detail because it’s still one of our secret sauces. But our technology allows a seamless transition above and below that minimum framerate that’s required by the panel. PC Perspective wrote an article guessing how we did that, and they’re not that far off…”
Forbes: You’ve said in the past that one of the crucial goals of G-Sync was to never introduce screen tearing, no matter what.
Tom Petersen: “You never want to introduce stutter, either. It’s a complex problem, which is why we think you need some of that secret sauce in both the driver and the module. In contrast, AMD’s not doing that. As you transition from the high frequencies to the low frequencies of FPS, they have some jarringly negative experiences coming out of their zone. If you take any of their panels and run it from whatever frequency is in the zone to any frequency out of the zone at the low end, the experience is not good at all.
Now G-Sync addresses two problems at the low end, tearing and stutter. Stutter is caused by having a repeat of the exact same frame. So you show all these new frames, then suddenly, if you’re not able to keep up with the minimum refresh rate, you see that frame twice. That’s what V-Sync does, repeating a frame. But when you repeat a frame, motion stops, and that’s why you feel a stutter. G-Sync doesn’t do that. It adjusts the refresh rate to keep it above that minimum rate, and we have other techniques like shifting and centering to avoid stutters. It’s a tough problem, which again requires the module inside the monitor. If you download our new G-Sync demo, which purposefully goes in and out of that zone, and run it on your AMD FreeSync monitor, you’ll see that whether you’re tearing or stuttering, the experience is not great. You can set whatever framerate range you want, and it operates across all monitors to compare the differences.”
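Nvidia won’t detail the technique, but the PC Perspective speculation Petersen calls “not that far off” amounts to redrawing each frame multiple times when the game’s framerate falls below the panel’s minimum, so that the panel’s actual refresh rate stays inside its supported range. The sketch below illustrates that speculated idea only; the function name and panel limits are hypothetical, not Nvidia’s implementation.

```python
import math

# Sketch of the frame-multiplication strategy PC Perspective speculated the
# G-Sync module uses below the panel minimum (Nvidia calls the real method
# "secret sauce", so treat this purely as an illustration).

def refresh_plan(fps, panel_min_hz=30, panel_max_hz=144):
    """Return (redraws_per_frame, effective_refresh_hz) for a game framerate."""
    if fps >= panel_min_hz:
        return 1, fps                  # in the variable-refresh zone: 1:1
    # Redraw each frame enough times to lift the refresh rate above minimum.
    redraws = math.ceil(panel_min_hz / fps)
    effective_hz = fps * redraws       # should land back in [min, max]
    assert panel_min_hz <= effective_hz <= panel_max_hz
    return redraws, effective_hz

for fps in (45, 35, 25, 15):
    n, hz = refresh_plan(fps)
    print(f"{fps} fps: draw each frame {n}x -> panel refreshes at {hz} Hz")
```

In principle, redraws scheduled by a module can be timed so a new frame is still shown almost as soon as it’s ready, which is why repeating a frame this way need not produce the motion hitch that V-Sync’s fixed-grid repeats do.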
[Image caption: AMD is counting on FreeSync’s more open nature for wider adoption by OEMs and consumers.]
Ghost In The Machine
Tom Petersen: “There’s also a difference at the high frequency range. AMD really has 3 ranges of operation: in the zone, above the zone, and below the zone. When you’re above the zone they have a feature which I like (you can either leave V-Sync on or V-Sync off) that we’re going to look at adding, because some gamers may prefer that. The problem with high refresh rates is this thing called ghosting. You can actually see it with AMD’s own Windmill Demo or with our Pendulum Demo. Look at the trailing edge of those lines and you’ll see a secondary image following it.”
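Petersen’s three ranges of operation are easy to summarize in code. This is a plain model of the behavior he describes, with hypothetical panel limits; the vsync_above flag mirrors the above-the-zone choice AMD currently exposes and Nvidia says it may adopt.

```python
# Sketch of the three operating zones for a variable-refresh monitor
# (panel limits are hypothetical examples, not any specific panel's spec).

def zone_behavior(fps, panel_min_hz=40, panel_max_hz=144, vsync_above=True):
    if fps < panel_min_hz:
        return "below the zone: risk of flicker, stutter, or tearing"
    if fps <= panel_max_hz:
        return "in the zone: refresh tracks framerate, no tearing or stutter"
    # Above the zone, the user chooses the trade-off:
    if vsync_above:
        return "above the zone, V-Sync on: capped at panel max, added latency"
    return "above the zone, V-Sync off: lowest latency, tearing possible"

for fps in (30, 90, 200):
    print(f"{fps} fps -> {zone_behavior(fps)}")
```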
Note to readers: I have seen this firsthand on the Acer FreeSync monitor I’m reviewing, and PC Perspective noticed it with two additional FreeSync monitors, the BenQ XL2730Z and LG 34UM67. To illustrate the problem, they recorded the aforementioned monitors running AMD’s Windmill demo, as well as the same demo running on a G-Sync enabled Asus ROG Swift. Ignore the stuttering you see (it’s a result of recording at high speed) and pay attention to the trailing lines, or ghosting. I agree that it’s jarring by comparison.
Tom Petersen: “We don’t do that. We have anti-ghosting technology so that regardless of framerate, we have very little ghosting. See, variable refresh rates change the way you have to deal with it. Again, we need that module. With AMD, the driver is doing most of the work. Part of the reason they have such bad ghosting is because their driver has to specifically be tuned for each kind of panel. They won’t be able to keep up with the panel variations. We tune our G-Sync module for each monitor, based on its specs and voltage, which is exactly why you won’t see ghosting from us.
We also support the majority of our GPUs going back to Kepler. The GTX 650 Ti Boost is the oldest GPU we support, and there are a lot of gaps in their GPU support. It’s a tough problem and I don’t mean to knock AMD, but having that module allows us to exercise more control over the GPU and consequently offer a deeper range of support.”
Stay tuned for more coverage of G-Sync and FreeSync. If you have questions about these technologies, reach out to me on Twitter and I’ll do my best to answer them for you.
A very interesting read, and Tom Petersen knows his stuff.