Gsync Owners - Would you go AMD?

Reading early reports that Vega may provide better performance than a 1080, or possibly even the Titan P, at a price point of maybe £500. This is a question specifically for people with high-end G-Sync monitors... would you make the switch?

I've avoided the new Nvidia Pascal cards this year due to the paltry performance increase over the 980ti for what... £700-1100? If AMD brought out a card that could see a performance gain of 50% over a 980ti for around £500-600 then I'd like to think I could reward them with a purchase.

But my beloved Gsync... :(

I suppose I could disable G-Sync in NCP and see how I felt about V-Sync options, but would you think it worth considering, or do you think that after spending £900 on an X34A it's Nvidia or nothing?
 
I couldn't really do without G-Sync now and would happily pay a fair bit over the odds to get the performance I required plus G-Sync, so AMD would have to truly dominate on performance, without any chance of a response from nVidia, for me to even consider it.
 
G-Sync is dead, it just doesn't know it yet.

..but Nvidia & the monitor manufacturers do.

I can't see it going anywhere any time soon - people keep saying stuff like that but there is no shortage of people who will pay well over the odds for nVidia hardware and solutions.
 
Plenty of people with FreeSync have gone Nvidia, so the other way, and they say they don't miss it one bit. And I wouldn't go by reports of Vega outperforming the 1080, as that was only under Vulkan running Doom, the only game it's in (as far as I know), and what AMD are known to be good at anyway, seeing as Vulkan is derived from their Mantle API. I'd wait for the actual reviews and see how it fares in the vast majority of games, which are DX11.
 
G-sync is being priced into irrelevance.

Bit-Tech reviewed this yesterday:

http://www.bit-tech.net/hardware/monitors/2016/12/13/asus-mg28uq-review/1

28" Asus with 4K resolution & FreeSync for £400.

If that was G-Sync, you'd be looking at more than £600.

The price discrepancy is unsustainable.

Supply and demand doesn't just go one way and then just go pop.

If sales drop off the price will become keener - but often with nVidia stuff there are plenty of people who'll just open their wallet every time nVidia asks.
 
Supply and demand doesn't just go one way and then just go pop.

But the monitor manufacturers will look at the sales of their FreeSync & G-Sync ranges and see the difference in units shifted.

G-sync monitors have additional manufacturing costs over and above just putting in the Nvidia hardware.

..Nvidia will simply drop it in the same way they did with separate PhysX cards back in the day and switch to a FreeSync-style solution.
 
I would seriously miss G-Sync but I'm happy to use other sync variants where needed. I did run a Fury X and a 290X and really did notice the stutter. I'm mainly playing VR and The Division though, so nothing I couldn't cope with.
 
Supply and demand doesn't just go one way and then just go pop.

If sales drop off the price will become keener - but often with nVidia stuff there are plenty of people who'll just open their wallet every time nVidia asks.

Interesting article on all of this:

http://www.pcworld.com/article/3129...ia-g-sync-on-monitor-selection-and-price.html

Why AMD FreeSync is beating Nvidia G-Sync on monitor selection and price
Hint: It's not just the cost of the proprietary module.


Let’s say you’ve just bought one of Nvidia’s slick new Pascal-based GeForce graphics cards such as the GTX 1070, and now you’re seeking a G-Sync monitor to go with it.

Looking at what’s available, you’ll probably become envious of PC gamers on the Radeon side of the fence. Compared to G-Sync monitors, displays supporting AMD’s FreeSync adaptive sync tech are generally much cheaper, with a wider range of vendors and tech specs to choose from. The website 144HzMonitors lists 20 available G-Sync monitors, versus 85 FreeSync monitors, the latter showing more combinations of screen size, refresh rate, and resolution.

Why the disparity? The conventional wisdom is that Nvidia’s proprietary G-Sync hardware module raises the monitor price due to licensing fees, but that’s not a satisfying explanation. Nvidia is still far and away the market share leader in graphics cards, so you’d think that most monitor makers would create G-Sync variants of their FreeSync displays and at least give GeForce users the option of absorbing the module cost.

As I started talking to monitor makers, a more complicated picture emerged. The real reason for G-Sync’s limited availability is as much about design and development concerns as it is about the price of the module itself.


G-Sync vs. FreeSync refresher

PCWorld has already published a detailed primer on G-Sync and FreeSync, but the gist is that both technologies allow the graphics card to adjust the monitor’s refresh rate on the fly, matching it to the PC’s current framerate. This prevents the screen tearing effect that occurs when refresh rate and framerate fall out of sync, and (mostly) eliminates stutter, creating a buttery-smooth gameplay experience.
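
To make that concrete, here's a toy model of the difference (made-up frame times, nobody's actual display pipeline): on a fixed 60Hz panel, a frame that finishes mid-scanout tears, while an adaptive-sync panel simply starts the refresh when the frame is ready, so nothing tears as long as frame times stay inside the panel's supported range.

```python
# Toy model only: invented frame times, not any vendor's real pipeline.
frame_times_ms = [14.0, 22.0, 17.5, 30.0, 16.7]  # hypothetical GPU render times

def tears_at_fixed_refresh(frame_times, refresh_ms=1000 / 60):
    """Count frames whose completion lands mid-scanout on a fixed 60Hz panel.

    Without V-Sync the buffer flip happens immediately, so any frame that
    finishes partway through a refresh produces a visible tear line.
    """
    elapsed, tears = 0.0, 0
    for ft in frame_times:
        elapsed += ft
        phase = elapsed % refresh_ms   # offset into the current scanout
        if phase > 0.01:               # not aligned with a refresh boundary
            tears += 1
    return tears

def tears_with_adaptive_sync(frame_times, min_hz=30, max_hz=144):
    """With adaptive sync the panel waits for each frame and scans it out
    whole, so nothing tears while frame times stay inside the panel's range."""
    lo, hi = 1000 / max_hz, 1000 / min_hz
    return sum(1 for ft in frame_times if not (lo <= ft <= hi))

print(tears_at_fixed_refresh(frame_times_ms))    # -> 5: every frame can tear
print(tears_with_adaptive_sync(frame_times_ms))  # -> 0: all within 30-144Hz
```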

G-Sync accomplishes these variable refresh rates with a proprietary hardware module, which is built into every supported monitor. With FreeSync, no such module is required, because it uses the variable refresh rate tech that’s part of the DisplayPort standard (and, more recently, HDMI as well). But again, the lack of extra hardware is not the only reason FreeSync monitors are cheaper and more readily available.
Design costs

Some display makers say Nvidia’s module requires more room inside the monitor enclosure. While that may not seem like a big deal, creating a custom product design for one type of monitor raises development costs considerably, says Minhee Kim, a leader of LG’s PC and monitor marketing and communications. By comparison, Kim says, AMD’s approach is more open, in that monitor makers can include the technology in their existing designs.

“Set makers could adopt their technology at much cheaper cost with no need to change design,” Kim says. “This makes it easier to spread models not only for serious gaming monitors but also for mid-range models.”

LG’s FreeSync monitor selection bears this out: The company offers several 1080p monitors under 30 inches diagonal with an ultrawide 21:9 aspect ratio, priced as low as $279. With G-Sync, the only 1080p ultrawide monitor is a 35-inch curved panel from Acer with a much higher refresh rate. The cost? $900.

The cheapest ultrawide 1080p G-Sync monitor will set you back nearly $1000.

Even if monitor makers proceed with the necessary research and development, the resulting product will be more expensive, which inevitably means it will sell in lower volumes. That, in turn, means it’s harder for monitor makers to recoup those up-front development costs, says Jeffry Pettinga, the sales director for monitor maker Iiyama.

“You might think, oh 10,000 sales, that’s a nice number. But maybe as a manufacturer you need 100,000 units to pay back the development costs,” Pettinga says.
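
A quick hypothetical sum shows why that matters (both figures invented purely for illustration):

```python
# Hypothetical break-even check for Pettinga's point; both numbers invented.
dev_cost = 2_000_000        # one-off cost of a custom G-Sync design, in dollars
margin_per_unit = 20        # contribution each monitor sold makes toward it
units_to_break_even = dev_cost / margin_per_unit
print(units_to_break_even)  # 100000.0 -> 10,000 sales wouldn't come close
```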

Meanwhile, he says, monitors are constantly improving in other areas such as bezel size. As monitors shrink from wide bezels to slim bezels to edge-to-edge displays, the risk is that a slow-selling G-Sync monitor will become outdated long before the investment pays off.

“Let’s say you introduced, last year, your product with G-Sync. Six months of development, and you have to change the panel. You haven’t paid off your development cost,” Pettinga says. “There’s a lot of things going on on the panel side.”
Limited flexibility

Costs aside, some monitor makers feel restricted in how they can differentiate their G-Sync monitors.

Display maker Eizo, for instance, has a feature in its gaming monitors called Smart Insight that adjusts gamma and brightness on the fly, helping to improve visibility in light and dark areas. This feature wouldn’t be possible with G-Sync, says Keisuke Akiba, Eizo’s product & marketing manager, because Nvidia’s module handles all the color adjustments itself.

“The G-Sync module accepts color adjustment in the module, not an outside chip,” Akiba says. “Our color adjustment needs power and flexibility so we’ve gone for FreeSync.”

G-Sync doesn’t allow monitor makers to offer their own color adjustments, like Eizo does.

Monitor makers also have limits on what video inputs they can include. All G-Sync monitors have one DisplayPort input, and in some cases an HDMI input as well, but G-Sync doesn't support variable refresh rate over HDMI. That means every G-Sync monitor must include DisplayPort, again raising the cost to manufacture. You won't find any G-Sync monitors with more than two inputs (or with support for DVI).

“DisplayPort is relatively expensive on a monitor because of the cable—it’s a quite expensive cable if you include a cable—and the board design itself. So DisplayPort adds a lot more to the cost than HDMI,” Pettinga says.
Nvidia’s answer: It’s about value, not cost

In an interview, Tom Petersen, Nvidia’s director of technical marketing, doesn’t dispute any of these concerns, and acknowledges that the high cost to develop G-Sync monitors puts them into a pricier segment of the market.

But to Nvidia, that’s okay, because G-Sync is supposed to be a premium product. The company points to several ways in which G-Sync is superior to FreeSync, including its ability to handle any drop in refresh rate—FreeSync only works within a specified range—and Nvidia’s complete control over things like monitor color and motion blur, which Petersen argues are superior to what monitor makers are offering outside the module.
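
For context, the usual fix for framerates below a panel's variable-refresh floor is to repeat each frame until the refresh rate is back inside the supported window; AMD ships this idea as Low Framerate Compensation, and the G-Sync module handles it internally. The sketch below shows the general idea only, not either vendor's actual driver logic.

```python
# General frame-multiplication idea only, not AMD's or Nvidia's actual code.
def effective_refresh_hz(fps, panel_min_hz=40, panel_max_hz=144):
    """Repeat frames as needed to keep the refresh rate inside the panel's
    variable-refresh window when the framerate drops below its floor."""
    if fps >= panel_min_hz:
        return min(fps, panel_max_hz)  # in range: refresh tracks framerate
    multiplier = 2
    while fps * multiplier < panel_min_hz:
        multiplier += 1                # show each frame 2x, 3x, ... as needed
    return min(fps * multiplier, panel_max_hz)

for fps in (90, 35, 18):
    print(fps, "fps ->", effective_refresh_hz(fps), "Hz")
# 90 fps -> 90 Hz (in range), 35 fps -> 70 Hz (doubled), 18 fps -> 54 Hz (tripled)
```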

For those reasons, Petersen says any price disparity between comparable G-Sync and FreeSync monitors is not due to the module, whose cost he says is “relatively minor,” but due to monitor makers' decision to charge more.

“To me, when I look out and see G-Sync monitors priced higher, that’s more of an indication of value rather than cost,” he says. “Because at the end of the day, especially these monitors at the higher segments, the cost of the components don’t directly drive the price.”

Nvidia says the proprietary module is not a major contributor to the cost of G-Sync monitors, especially since it replaces some other standard components.

Perhaps that’s a fair point for higher-priced monitors, but as we’ve heard from monitor makers, the bigger issue is that the module is inherently harder to include in lower-priced options. With G-Sync, for instance, you can’t buy a 60Hz monitor in less than 4K resolution, whereas FreeSync offers plenty of options in 1440p and 1080p.

Nvidia's Petersen suggested that addressing these mid-tier markets isn’t a priority. “I think over time, you’ll see lower-priced monitors that are lower-featured, that include G-Sync, but it’s not our goal,” Petersen says. “Our goal is to provide a premium gaming experience, and the premium gaming experience requires a lot of hands-on work from Nvidia, and that’s where we’re going to continue to focus over time.”

Of course, some monitor makers would prefer that Nvidia supported DisplayPort’s adaptive sync standard, so users could at least enjoy some anti-tearing benefits even if they didn’t splurge for a G-Sync monitor. To that, Petersen says “never say never,” but right now he argues there’s no benefit to doing so.

“I’m worried that by just throwing it out there, we could be delivering the same less-than-awesome experience that FreeSync does today,” he says, “and that’s just not our strategy.”

For loyal Nvidia customers, the takeaway is clear: If you want G-Sync, be prepared to jump into the deep end of luxury gaming monitors, because the technology isn’t going downmarket anytime soon.

I think Nvidia should support both, and brand it like this:
1.) VESA adaptive sync = G-Sync Lite
2.) Nvidia adaptive sync = G-Sync Premium

Make it such that the G-Sync Premium monitors have the best features overall, etc.
 
Why are all the larger FreeSync screens a lower refresh rate than G-Sync?

The G-Sync module helps achieve those higher overclocks; however, not every monitor reaches what is advertised, i.e. on most of the 34" G-Sync 100Hz models, people have to settle for 95Hz or lower. The problem with these higher overclocks is that they introduce scanline issues and possibly coil whine, at least on the 34" 1440p 100Hz G-Sync screens:

https://forums.overclockers.co.uk/showpost.php?p=29472741&postcount=20

Samsung and the new Microboard 34" 1440p FreeSync screens are 100Hz (there are reports of the Microboard being able to go higher than 100Hz without frame skipping too).

I can't see it going anywhere any time soon - people keep saying stuff like that but there is no shortage of people who will pay well over the odds for nVidia hardware and solutions.

It won't go anywhere, I'm sure nvidia will keep it for the high end premium monitors but it will certainly become even more of a niche product in the future.

If G-Sync was doing so well and in high demand, I imagine we would be seeing a lot more than these figures, and G-Sync has been out for over a year longer:

2016 figures are Jan-Oct:

2015 new monitors in total: 268 IPS, 156 TN, 59 VA
2016 new monitors in total: 178 IPS, 116 TN, 57 VA

2015 G-Sync monitors: 8 IPS, 3 TN, 1 VA
2016 G-Sync monitors: 3 IPS, 5 TN, 6 VA

2015 FreeSync monitors: 13 IPS, 15 TN, 2 VA
2016 FreeSync monitors: 32 IPS, 17 TN, 13 VA
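
Summing those columns (IPS + TN + VA) makes the gap plainer:

```python
# Totals computed from the figures quoted above.
totals = {
    "2015 all new":  268 + 156 + 59,  # 483
    "2015 G-Sync":   8 + 3 + 1,       # 12, roughly 2.5% of the year's launches
    "2015 FreeSync": 13 + 15 + 2,     # 30, roughly 6.2%
    "2016 all new":  178 + 116 + 57,  # 351 (Jan-Oct only)
    "2016 G-Sync":   3 + 5 + 6,       # 14, roughly 4.0%
    "2016 FreeSync": 32 + 17 + 13,    # 62, roughly 17.7%
}
for label, count in totals.items():
    print(label, count)
```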

When Tom Petersen was asked about supporting adaptive sync, his answer:

“never say never”

IMO, that says it is only a matter of time until they support adaptive sync. They will still more than likely keep gsync alive for the "high end" premium gaming monitors though.

Unless Nvidia can manage to significantly reduce the size of the G-Sync module so that monitor manufacturers don't have to re-design the chassis/assembly of the monitor, it is going to become even more of a niche product than it already is (that's the other main reason it is hardly used at the moment).
 
No way would having FreeSync/G-Sync stop me switching sides; anyone who lets it is off their bloody rocker IMO. If the brand I'm on can't provide an upgrade, there's absolutely no bloody way I'd not switch. There's no way in hell I'd stick with the card I'm on until they can, struggling on with it and having to turn settings down or off. I'd switch straight away if the other brand can provide the upgrade and keep me enjoying games how they're supposed to be played, which is not at low settings, with stuff turned off, and at low fps.
 
If G-Sync was doing so well and in high demand, I imagine we would be seeing a lot more than these figures, and G-Sync has been out for over a year longer:

That is only half the picture mind (sales figures, etc.) - G-Sync doesn't lend itself to being slapped into a monitor almost as an afterthought in comparison - some of those FreeSync monitors are much better regarded than others.

I think one of the things that has made monitor manufacturers a little nervy as well has been the failure of some of the common panel types used with G-Sync i.e. the AUO high refresh ones could almost be regarded as "experimental" :S

IMO, that says it is only a matter of time until they support adaptive sync. They will still more than likely keep gsync alive for the "high end" premium gaming monitors though.

There seems to be a bit of a division over that at nVidia - some want to see adaptive sync supported by default regardless of what happens with G-Sync while others aren't exactly cooperative towards that end.
 
If nVidia can't provide the performance I need at a similar price level to AMD then I would have no problem switching. It would suck to be without G-Sync now, but I can't see this situation happening. If Vega produces better performance at a lower price then nVidia will just reduce prices to suit.
 
I wouldn't change in the next few years due to having a 34" ROG. If I didn't have this monitor then maybe I would jump, as the last AMD cards I owned, 5850s, were great for the money.
 