LG 34GK950G, 3440x1440, G-Sync, 120Hz

Yes, I'm not arguing with you about it, or saying you don't know what you're talking about, just saying that FreeSync, for example, will probably be able to run at 144Hz on this monitor, although obviously not completely free. But the new G-Sync module apparently costs $500 and doesn't support HDR400/600. As I've said a few times, there seems to be a large gap between the current 100Hz/120Hz OC DP 1.2 version and this new-fangled HDR1000 module. Overall, the current G-Sync situation isn't ideal, and there must be some easier solution, such as reducing the cost of the new module or replacing the DP 1.2 module, etc.
 


The new G-Sync module does support HDR400/600. Nvidia are not permitting partners to use the module with HDR400/600 panels/monitors.

HDR400/600/1000 capability is down to the panel, not DisplayPort or G-Sync. DisplayPort and G-Sync simply transfer or process data. The maximum brightness (i.e. HDR400/600/1000) is limited by the capability of the panel.
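To put rough numbers on that, the DisplayHDR tier names essentially map to minimum peak-luminance floors. A minimal sketch, simplified to brightness only (the real VESA certification also checks black level, gamut and local dimming):

```python
# Simplified sketch: which DisplayHDR tier a panel's peak brightness falls into.
# The real VESA certification also checks black level, colour gamut and local
# dimming, not just peak luminance -- this only illustrates the brightness floors.

DISPLAYHDR_TIERS = [
    (1000, "DisplayHDR 1000"),
    (600, "DisplayHDR 600"),
    (400, "DisplayHDR 400"),
]

def displayhdr_tier(peak_nits: float) -> str:
    """Return the highest tier whose peak-luminance floor the panel meets."""
    for floor, name in DISPLAYHDR_TIERS:
        if peak_nits >= floor:
            return name
    return "No DisplayHDR tier (SDR-class brightness)"

# An edge-lit ~550 nit panel only clears the DisplayHDR 400 floor, while the
# FALD panels Nvidia wants G-Sync HDR paired with push 1000+ nits.
print(displayhdr_tier(550))    # DisplayHDR 400
print(displayhdr_tier(1100))   # DisplayHDR 1000
```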
 

Yes, I know it CAN support it, but it doesn't actually support it, because Nvidia said HDR1000 only. Similar to how Nvidia COULD support FreeSync, but they don't, because Nvidia said no. Common theme here: Nvidia can do loads of stuff, but they don't, because they are annoying haha.
 
The new G-Sync module does support HDR400/600. Nvidia are not permitting partners to use the module with HDR400/600 panels/monitors.

Can I ask what the source is for this?

Not that I'd want to defend nVidia, but it could also just come down to cost.

The whole point of DisplayHDR 400 is to allow OEMs to slap an HDR badge on their more affordable monitors, thereby allowing them to market something as HDR without having to invest in improving image quality beyond what a good SDR monitor already achieves. DisplayHDR 600 does make a visual difference, but is specified in a way that lets OEMs achieve it without having to go all out. Only DisplayHDR 1000 requires the really expensive stuff like FALD.

DisplayHDR 600 and 400 exist only as cost-cutting measures. However, if you're going to incorporate a $500 G-SYNC module, then that already puts your monitor into the highest-end luxury segment, in which case it had better support DisplayHDR 1000.

If that is what nVidia is thinking, I'd say that's just common sense (ignoring that >=$500 for an adaptive sync technology is entirely nonsensical).
 
They only want G-Sync associated with HDR1000, and will only allow the module to be used in monitors that can achieve this HDR standard, to avoid poor HDR performance being associated with Nvidia.

If they don't work on bringing down that cost massively, then I can see a lot of medium/long-term switches to FreeSync (as long as AMD becomes competitive again in the higher end of the performance spectrum in the medium term).
 
Yes, I know it CAN support it, but it doesn't actually support it, because Nvidia said HDR1000 only. Similar to how Nvidia COULD support FreeSync, but they don't, because Nvidia said no. Common theme here: Nvidia can do loads of stuff, but they don't, because they are annoying haha.

nVidia's G-SYNC module does actually support DisplayHDR 400 and 600. If nVidia has restrictions on how their G-SYNC module can be used, then those restrictions must be enforced through legal means via the contracts they have with OEMs. I wouldn't say that is similar to FreeSync, because supporting FreeSync would require at least some engineering efforts (at the very least updated drivers, likely more for FreeSync 2).
 

Imho, even if someone like LG wanted to use the G-Sync HDR module, they are realistic about it adding an extra ~$800 to the price of the monitor, active fan included.

And I wrote $800 because the whole module doesn't cost the "$500" everybody quotes.
It seems many forgot to read the same article they are quoting: the "$500" is the estimated price for the FPGA alone, assuming Nvidia can buy it at an 80% discount off the street price. The rest of the board BOM costs another $250-$300 according to the same article. And that's without adding Nvidia's profit margin and the design and manufacturing costs.
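Just to make that arithmetic explicit (using only the estimates quoted from the article, nothing confirmed):

```python
# Rough module-cost estimate using only the figures quoted in the article above.
fpga_estimate = 500                        # estimated FPGA price after a large volume discount
board_bom_low, board_bom_high = 250, 300   # rest of the board BOM, per the same article

print(f"Module cost before Nvidia's margin and design/manufacturing overhead: "
      f"${fpga_estimate + board_bom_low}-${fpga_estimate + board_bom_high}")  # $750-$800
```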

So, what would be preferable from now on? Having a series of G-Sync HDR monitors at £2000+ while their FreeSync siblings are at half the price?
Or would you rather the prices be blended, with the G-Sync monitors sold at a loss (~£1500) and the FreeSync ones at inflated prices to cover the shortfall, effectively having the FreeSync users pay for the G-Sync modules?
 

Your first proposition implies G-SYNC (based on DP1.4) will never command more than a minuscule share of the market. However, without significant market penetration, G-SYNC becomes irrelevant to consumers (because cost), monitor OEMs (because revenue) and ultimately also to nVidia, which would spell the end of G-SYNC as a technology overall. It's unlikely nVidia thinks that's a good idea. Your latter proposition violates antitrust laws to a degree that seems comical, even by today's levels of corporate corruption. Obviously, neither is preferable to anything.

Answer me this:

Are you convinced that nVidia's latest G-SYNC module can NOT be delivered at a lower cost WITHOUT sacrificing user-facing features (HDR, refresh rates, DP1.4 compliance, etc)? It seems to me that is what some folks here assume. Based on your propositions, it seems you're assuming the same, namely that nVidia's current implementation is as cheap as G-SYNC (based on DP1.4) gets. I think that's BS.

You've asked me "what is preferable". The answer is simple: nVidia should just stop using such a ridiculously expensive FPGA. I know enough about hardware and software development to say there is absolutely no technical reason which forces nVidia to use an FPGA. I don't know what drove nVidia to make that choice, but suspect their reasons boil down to economics and/or risk aversion. Since I'm not privy to their thinking, there isn't much point to me speculating on what those economic or risk related issues may be, much less on how to solve them.

If forced to speculate about a solution based on my currently incomplete knowledge, I'd say nVidia should do the following:

Throw out the current FPGA based design. Replace it with a module that functions as a complete scaler package which is built as an ASIC. Sell the ASIC as a direct competitor to the scalers offered by Realtek, Novatek or MStar (it's from these three companies that monitor OEMs purchase the "modules" which receive the FreeSync signals from AMD GPUs for about $2). If it was up to me, the unique selling point of nVidia's scaler would be its ability to support both FreeSync and G-SYNC. G-SYNC support could be disabled in firmware. With G-SYNC disabled, the scaler would be sold to OEMs at the same price as the competitors' scalers. This would allow monitor OEMs to sell the exact same monitor in a G-SYNC or FreeSync variant. All the differences would be limited to firmware. This would bring the cost of the G-SYNC model down to a price much closer to that of the FreeSync model. Technically, this would even allow a single monitor to support FreeSync and G-SYNC simultaneously, which would be great for consumers, but that's probably not in line with nVidia's market strategy.
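Purely as an illustration of that "one ASIC, firmware-selected feature set" idea, here is a hypothetical sketch; none of the names below correspond to a real Nvidia product or API:

```python
# Hypothetical sketch of the "one scaler ASIC, two firmware SKUs" idea above.
# Nothing here corresponds to a real Nvidia product or API; all names are invented.

from dataclasses import dataclass

@dataclass
class ScalerFirmware:
    freesync_enabled: bool = True    # baseline adaptive sync, always available
    gsync_enabled: bool = False      # premium feature, unlocked per SKU

def build_firmware(sku: str) -> ScalerFirmware:
    """Same silicon either way; only the firmware image differs per monitor SKU."""
    return ScalerFirmware(gsync_enabled=(sku == "gsync"))

# The FreeSync variant could be priced like any Realtek/Novatek/MStar scaler,
# while the G-SYNC variant carries whatever premium Nvidia wants for the unlock.
print(build_firmware("freesync"))   # ScalerFirmware(freesync_enabled=True, gsync_enabled=False)
print(build_firmware("gsync"))      # ScalerFirmware(freesync_enabled=True, gsync_enabled=True)
```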

At least technically, that is realistic. I don't know if it's realistic economically.

Either way, it seems inconceivable to me that nVidia believes their G-SYNC module (based on DP1.4), at its current price, represents a product that is viable in the long term. However nVidia chooses to reduce the module's cost, which we all believe they must, eliminating HDR support isn't going to be beneficial to them in that regard. Assuming DP1.4 compliance is a given, reducing bandwidth also isn't an option. Your propositions aren't solutions either. As far as I can tell, nVidia's only hope is to eliminate the need for the FPGA and replace it with an ASIC. There are multiple ways to go about that, with the differences between them being related to where the G-SYNC features are integrated (into the GPU, the scaler, or both). What I've suggested above is just one way.
 
I read it somewhere on the internet, so it must be true! :p

Read it ages ago on a reputable site, but I can't find anything about it now... sorry

No worries. It's not that I doubt it. I was just hoping to get more details but couldn't find anything. Thanks anyway.
 
Your first proposition implies G-SYNC (based on DP1.4) will never command more than a minuscule market share.

If we assume that Nvidia goes back to the drawing board to create a new G-Sync HDR board, everything is open.
But why didn't Nvidia do that in the first place, instead of using an FPGA that requires active cooling?
 

How am I supposed to know that? I don't work for nVidia, so all I can do is take stabs in the dark.

  • Maybe nVidia felt the flexibility offered by an FPGA outweighs the additional cost, at least initially? Using an FPGA allows nVidia to essentially "re-design" their logic circuitry even after the FPGA has shipped to OEMs or consumers, which can dramatically lower the cost of corrections should show-stopping bugs be found (looking at nVidia's G-SYNC support forums, we know they are still struggling with plenty of issues on Windows 10).
  • Maybe nVidia always intended for their latest G-SYNC module (based on DP1.4) to be built using an ASIC, but due to the delays these monitors have already experienced, nVidia decided to skip that step and use the FPGA based development board as the final product (FPGAs are typically used during development, when changes are frequent)?
  • Maybe the sales volumes of G-SYNC based monitors are so low that manufacturing a custom ASIC isn't worth it, and for who knows what reason nVidia didn't want to build a general purpose scaler that can compete with Realtek, Novatek or MStar, which would allow nVidia to leverage economies of scale?
Plenty of other possibilities exist. Unfortunately, unless we get more information on what is going on behind the scenes, we'll probably remain in the dark about what the problem/s really is/are.

Whatever the cost cutting solution is, nVidia likely has no choice but to return to the drawing board. Most of the cost of their current design results from using very expensive 3rd party components, for which nVidia can't control costs. In addition to pricing concerns, eliminating the FPGA would also eliminate the need for a fan, as a custom built ASIC is a LOT more efficient than an FPGA.

If nVidia doesn't provide a cheaper alternative, then I'd start to wonder if nVidia is planning to retire G-SYNC altogether.
 
Do we know anything about the curvature on these screens? I couldn't find any info on the website.

Thanks!

Before the final monitor has been reviewed I'd remain somewhat skeptical of all specs. However, for what it's worth, according to TFTCentral's parts database, the curvatures of the UW4 and UW5 panels (and this monitor certainly uses one of those two) are both 1900R.
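For what 1900R actually means in practice, here's a quick back-of-the-envelope calculation (assuming roughly 800 mm of panel width along the curve for a 34" 21:9 screen, which is an approximation):

```python
# Back-of-the-envelope: what a 1900R curve means for a ~34" 21:9 panel.
# Assumes the panel is ~800 mm wide measured along the curved surface (approximate).
import math

R = 1900.0          # radius of curvature in mm (that is what the "R" number states)
arc_width = 800.0   # approximate panel width along the curve, mm

half_angle = arc_width / (2 * R)        # radians subtended by half the panel
chord = 2 * R * math.sin(half_angle)    # straight-line width, edge to edge
depth = R * (1 - math.cos(half_angle))  # how far the edges sit forward of the centre

print(f"Edge-to-edge straight width: {chord:.0f} mm")  # ~794 mm
print(f"Curve depth at the edges:    {depth:.0f} mm")  # ~42 mm
```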
 
Do you think we will know something about the 34GK950G at IFA 2018? Release date maybe?

BTW, I'm waiting to see if it really comes with 120Hz OC or 144Hz downclocked to 120. If it's the first scenario, maybe I'll go for the Alienware AW3418DW.
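For what it's worth, a rough bandwidth estimate (assuming typical reduced blanking totals and uncompressed 8-bit RGB, both approximations) suggests why DP 1.2 tops out around 120Hz at 3440x1440, while 144Hz would need DP 1.3/1.4:

```python
# Rough check of why 3440x1440 tops out around 120Hz over DP 1.2.
# Blanking totals are approximate (reduced-blanking style) and 8-bit RGB is assumed.

DP12_EFFECTIVE_GBPS = 17.28      # 4 lanes x 5.4 Gbit/s (HBR2) minus 8b/10b overhead
BITS_PER_PIXEL = 24              # 8-bit RGB, uncompressed

h_total, v_total = 3520, 1490    # approximate totals including blanking

def required_gbps(refresh_hz: float) -> float:
    """Raw video bandwidth needed at a given refresh rate, in Gbit/s."""
    return h_total * v_total * refresh_hz * BITS_PER_PIXEL / 1e9

for hz in (100, 120, 144):
    need = required_gbps(hz)
    verdict = "fits" if need <= DP12_EFFECTIVE_GBPS else "exceeds DP 1.2"
    print(f"{hz:>3} Hz: {need:5.1f} Gbit/s -> {verdict}")
```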

I really like the ASUS ROG PG35VQ and Acer Predator X35 with HDR and 200Hz OC, but their prices will be over $2000.
 