LG 34GK950G, 3440x1440, G-Sync, 120Hz

Associate
Joined
15 Jun 2018
Posts
99
As a gamer AND professional graphic artist: given that the F version does not compromise on color reproduction quality, and on top of that comes with a higher refresh rate and future-proof connectivity such as DP 1.4 and HDMI 2.0, I think it would be unwise to support Nvidia screwing people over with the super expensive G-Sync. To get 144Hz, true 8-bit color depth and the latest connectivity, the F version is all in all the better product, and cheaper as well, and my conclusion comes from someone who wants to use an Nvidia GPU with it.

Also, if you really want your Nvidia card to work with FreeSync, there are hacks that will enable that.
https://www.tomshardware.co.uk/use-amd-freesync-nvidia-gpu,news-59067.html

I have no interest in being political or taking a stand against Nvidia, but as you say, I'm not interested in overpaying either.

I think someone pointed out here that the 10-bit panel is nothing you will notice in games. So it's a motivation for you since you work with graphics, but for me as a pure gamer, G-Sync is the bigger lure.

I wouldn't buy a monitor based on a hack that will most likely be patched out as soon as Nvidia gets the chance.

But I do agree with you on a lot of points and appreciate your input.
 
Associate
Joined
29 May 2018
Posts
146
As a gamer AND professional graphic artist: given that the F version does not compromise on color reproduction quality, and on top of that comes with a higher refresh rate and future-proof connectivity such as DP 1.4 and HDMI 2.0, I think it would be unwise to support Nvidia screwing people over with the super expensive G-Sync. To get 144Hz, true 8-bit color depth and the latest connectivity, the F version is all in all the better product, and cheaper as well, and my conclusion comes from someone who wants to use an Nvidia GPU with it.
The idea that having a DP1.4 connector somehow future-proofs the monitor is slightly ridiculous. You buy the monitor with the capabilities it has at the time of purchase, and that's how it will stay until discarded. Having a more modern connector will not somehow magically improve the monitor down the road. There simply is no "future-proofing" by way of the connector.

Also, both the F and the G use the same 8-bit+FRC panel. The idea of one being "true 8-bit" is strange. For graphics work there are certainly better monitors.

In a vacuum, I agree that the F is the better product and the better value, but that is irrelevant because the product is not used in a vacuum. The important question is which model gives each person, in their specific setup, the best results. That is definitely not always the F model.
 
Associate
Joined
29 May 2018
Posts
146
I think someone pointed out here that the 10-bit panel is nothing you will notice in games. So it's a motivation for you since you work with graphics, but for me as a pure gamer, G-Sync is the bigger lure.
For HDR, a 10-bit panel (or 8-bit+FRC) is mandatory.

On a real HDR monitor (not the DisplayHDR 400 rated F version), you most definitely will notice a huge difference in gaming, so in that sense a 10-bit panel is something you'd want for gaming, but only in combination with HDR.

Without HDR, most AMD and Nvidia cards will refuse to even send a 10-bit signal, making any 10-bit panel entirely useless for gaming.
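To put some rough numbers on the bit depth talk (generic arithmetic, not measurements of these particular panels), here's a small sketch of how many levels each depth gives and why 8-bit+FRC can stand in for 10-bit:

```python
# Generic bit-depth arithmetic (not measured from any specific panel):
# levels per color channel and total color combinations.
for bits in (6, 8, 10):
    levels = 2 ** bits          # steps per channel
    colors = levels ** 3        # combinations across R, G and B
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors total")

# An 8-bit+FRC panel approximates the 1024 levels of true 10-bit by rapidly
# alternating between adjacent 8-bit values (temporal dithering).
```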
 
Associate
Joined
29 May 2018
Posts
146
Perhaps I misworded my point. Let me try to rephrase it. The G-Sync module that comes with this display is outdated in the sense that it can only supply 8-bit color depth at 100Hz, which seems to be the absolute limit it can handle while maintaining image quality. Should you wish to overclock it to 120Hz, you will likely be turning the image quality into something more in line with 6-bit color depth. For me that is unacceptable for such a high-priced monitor, especially now that the FreeSync version in comparison offers connectivity that allows 144Hz without any compromise to the 8-bit color depth image quality.
Yup, you misworded :) If a reduction to 6-bit is in fact the consequence of "overclocking" the G-Sync module (or any other similar trick), then I completely agree... that would be crap.

However, AFAIK that is currently merely an assumption with no supporting evidence, and certainly not enough to accuse the G of compromising color depth.

See post #782 for my view on the G's overclocking "feature".
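For what it's worth, here's a rough back-of-the-envelope sketch (my own numbers, assuming DP 1.2's ~17.28 Gbps effective payload and ignoring blanking intervals) of what the link itself can carry. It says nothing about what the scaler does internally, but raw bandwidth alone wouldn't force 6-bit at 120Hz:

```python
# Rough DisplayPort 1.2 payload check (blanking intervals ignored, so the
# real requirement for each mode is a few percent higher than shown).
DP12_PAYLOAD_GBPS = 17.28  # 4 lanes x 5.4 Gbps HBR2, minus 8b/10b overhead

def video_gbps(width, height, refresh_hz, bits_per_channel):
    """Uncompressed RGB data rate in Gbps, excluding blanking."""
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

for refresh_hz, bpc in ((100, 8), (120, 8), (120, 10), (144, 8)):
    need = video_gbps(3440, 1440, refresh_hz, bpc)
    print(f"3440x1440 @ {refresh_hz}Hz, {bpc}-bit: "
          f"{need:.2f} of {DP12_PAYLOAD_GBPS} Gbps")

# 120Hz at 8-bit leaves headroom even with blanking; 10-bit at 120Hz and
# 8-bit at 144Hz are at or beyond the limit, which is consistent with the G
# topping out around 120Hz on DP 1.2.
```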
 
Associate
Joined
15 Jun 2018
Posts
99
For HDR, a 10-bit panel (or 8-bit+FRC) is mandatory.

On a real HDR monitor (not the DisplayHDR 400 rated F version), you most definitely will notice a huge difference in gaming, so in that sense a 10-bit panel is something you'd want for gaming, but only in combination with HDR.

Without HDR, most AMD and Nvidia cards will refuse to even send a 10-bit signal, making any 10-bit panel entirely useless for gaming.

Ah, thank you for the clarification.
 
Associate
Joined
15 Jun 2018
Posts
99
I know that you talked about bandwidth earlier in this thread, and I'd like to ask a question about that.

If this is the correct calculation:
3840x2160 @ 60Hz (8-bit panel) is 14.93 Gbps, and 3440x1440 @ 144Hz (8-bit panel) is 21.40 Gbps.

And the aim for the new 2080 Ti is to keep 4K@60 consistently at high settings. Does that mean that if I bought the F version I would have to drop to, say, medium/low settings in games to be able to keep 144Hz?

Am I off track here? I have a slight fever, so I'm feeling a bit slow at the moment and could be missing something obvious.
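For anyone who wants to reproduce those figures: they appear to correspond to raw pixel data at 24 bits per pixel multiplied by 1.25 for DisplayPort's 8b/10b line coding. That's my own reconstruction, not the calculator's documented method:

```python
# Reconstruction of the quoted bandwidth figures: pixels/sec x 24 bits per
# pixel, times 1.25 for 8b/10b line coding (my assumption about how the
# Kramer calculator arrives at its numbers).
def signal_gbps(width, height, refresh_hz, bits_per_pixel=24):
    payload = width * height * refresh_hz * bits_per_pixel  # bits/sec of pixel data
    return payload * 1.25 / 1e9                             # add 8b/10b overhead

print(signal_gbps(3840, 2160, 60))    # ~14.93 Gbps
print(signal_gbps(3440, 1440, 144))   # ~21.40 Gbps
```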
 
Associate
Joined
12 Dec 2010
Posts
1,837
Location
Washington D.C.
4K/60 is a demand of roughly 498M pixels/sec. 3440x1440/144 is roughly 713M pixels/sec, i.e. 43% greater demand. At around 100Hz, the latter has the same requirement as 4K at 60Hz/FPS.

So assuming the 2080 Ti did only 60 FPS at 4K in triple-A titles on Ultra (I suspect it will be more like 75), then yes, you would need to turn some settings down. If you use 4K/75 as the base metric, you could run 3440x1440 at up to ~125 FPS on Ultra. I think this last scenario is realistic.

Now there will be some +/- to this, as there are some rendering differences between a 16:9 and a 21:9 field of view.
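Spelling the same arithmetic out, in case anyone wants to check it:

```python
# Pixel throughput (rasterization demand), using the figures from the post above.
uhd_60 = 3840 * 2160 * 60        # ~498M pixels/sec at 4K 60Hz
uwqhd_144 = 3440 * 1440 * 144    # ~713M pixels/sec at 3440x1440 144Hz

print(uwqhd_144 / uhd_60)        # ~1.43 -> about 43% more demanding
print(uhd_60 / (3440 * 1440))    # ~100 -> Hz at which 3440x1440 matches 4K/60

# A hypothetical 75 FPS at 4K works out to roughly the same throughput as:
print(75 * (3840 * 2160) / (3440 * 1440))   # ~125 FPS at 3440x1440
```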
 
Associate
Joined
15 Jun 2018
Posts
99
4K/60 is a demand of roughly 498M pixels/sec. 3440x1440/144 is roughly 713M pixels/sec, i.e. 43% greater demand. At around 100Hz, the latter has the same requirement as 4K at 60Hz/FPS.

So assuming the 2080 Ti did only 60 FPS at 4K in triple-A titles on Ultra (I suspect it will be more like 75), then yes, you would need to turn some settings down. If you use 4K/75 as the base metric, you could run 3440x1440 at up to ~125 FPS on Ultra. I think this last scenario is realistic.

Now there will be some +/- to this, as there are some rendering differences between a 16:9 and a 21:9 field of view.

Ah I see. Interesting.

I used this calculator.
https://k.kramerav.com/support/bwcalculator.asp

And it states the same numbers as you regarding pixels/sec. Is "total signal bandwidth" not relevant in this case? It seems to be three times the pixels/sec.
 
Associate
Joined
12 Dec 2010
Posts
1,837
Location
Washington D.C.
Pixel processing is different from connection bandwidth. Pixel processing/rasterization is the demand on the GPU; DisplayPort bandwidth is just the speed of the connection.

3440x1440 at 144Hz with the game at 1 FPS uses the same DisplayPort bandwidth as 3440x1440 at 144Hz with the game at 144 FPS; the former just refreshes the same frame 144 times per second.
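A small sketch of that distinction as I understand it (simplified, fixed-refresh case): the link always carries a full refresh worth of pixels, while the GPU only rasterizes as many frames as the game produces:

```python
# Simplified fixed-refresh model: link bandwidth is set by the display mode,
# GPU rasterization demand is set by the game's actual frame rate.
WIDTH, HEIGHT, REFRESH_HZ = 3440, 1440, 144

def link_pixels_per_sec():
    # The cable carries a full refresh every cycle, even if frames repeat.
    return WIDTH * HEIGHT * REFRESH_HZ

def gpu_pixels_per_sec(game_fps):
    # The GPU only renders new frames as fast as the game produces them.
    return WIDTH * HEIGHT * game_fps

print(link_pixels_per_sec())      # fixed ~713M pixels/sec on the wire
print(gpu_pixels_per_sec(1))      # ~5M pixels/sec rendered at 1 FPS
print(gpu_pixels_per_sec(144))    # ~713M pixels/sec rendered at 144 FPS
```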
 
Associate
Joined
20 Jan 2018
Posts
149
I think the image quality drop that you will experience with the G-Sync version is similar to the problem people will see with the high-end 4K HDR monitors from Asus and Acer.
https://m.hexus.net/tech/news/monitors/119354-g-sync-hdr-4k-monitor-images-softer-blurred-144hz/

Multiple users on Reddit and here have posted that there is no chroma subsampling at 3440x1440 120Hz on DP 1.2. A guy on Reddit I talked to about it used this test to check: https://www.geeks3d.com/20141203/ho...-chroma-subsampling-used-with-your-4k-uhd-tv/

It does suck though, and people who bought those monitors are experiencing it. That's too expensive a purchase to have issues with, lol.
 