LG 34GK950G, 3440x1440, G-Sync, 120Hz

Associate
Joined
29 May 2018
Posts
146
Yes, so if the G-Sync module can handle 2560x1440 @ 165 Hz, then it should also be able to handle 3440x1440 @ 120 Hz. Correct?
Can every single G-Sync module handle that? That's the concern right now. One G-Sync module might be able to, but another might not.
@Christopher G
That could be correct, but we understand next to nothing about how the G-SYNC module works internally, so we just can't be sure. :(
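
For what it's worth, the raw pixel-rate arithmetic behind that assumption is easy to check. A minimal sketch (active pixels only; it ignores blanking intervals, which add real overhead on the link):

[code]
# Rough pixel-rate comparison (active pixels only; ignores blanking overhead).
def pixel_rate(width, height, refresh_hz):
    """Active pixels pushed per second at a given resolution and refresh rate."""
    return width * height * refresh_hz

known_good = pixel_rate(2560, 1440, 165)  # what the module demonstrably handles
proposed = pixel_rate(3440, 1440, 120)    # what the 34GK950G asks of it

print(f"2560x1440 @ 165 Hz: {known_good / 1e6:.0f} Mpx/s")  # ~608 Mpx/s
print(f"3440x1440 @ 120 Hz: {proposed / 1e6:.0f} Mpx/s")    # ~594 Mpx/s
[/code]

So 3440x1440 @ 120 Hz actually needs slightly less raw throughput than 2560x1440 @ 165 Hz; whether the module's internals care about anything beyond pixel rate is exactly the part we can't be sure about.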

@ChrisPyzut
Graphics cards are often sold with a stock OC. To do so, the card OEMs test each GPU and bin them, so that only chips which can sustain the OC are used for the OC'd cards. If G-SYNC modules are in any way similar, it's virtually guaranteed that monitor OEMs also do some binning. The idea that some G-SYNC modules might reliably reach 120 Hz while others won't is therefore highly unlikely, just as it is with stock OC'd GPUs.
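
To make the binning argument concrete, here's a toy sketch (the numbers are entirely made up, purely to illustrate the selection step):

[code]
import random

# Toy model of binning: each unit has some maximum stable clock; only units
# that clear the OC target plus a guard band ship as the OC'd SKU.
random.seed(42)
units = [random.gauss(150, 10) for _ in range(1000)]  # hypothetical per-unit limits

OC_TARGET = 144
GUARD_BAND = 5  # margin so shipped units aren't on the ragged edge

binned = [u for u in units if u >= OC_TARGET + GUARD_BAND]
print(f"{len(binned)} of {len(units)} units pass the bin and ship as OC'd parts")
# Every shipped unit clears the target with headroom, so unit-to-unit failure
# at the advertised OC is, by construction, not expected.
[/code]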

I don't know why the 34GK950G allows overclocking to be enabled/disabled in the OSD menu, but one of the following scenarios is far more likely:
  • Overclockability is merely a psychological feature, meaning the monitor could just as well run at 120 Hz all the time with the option removed from the OSD menu (hey enthusiast dimwits, the 34GK950G checks the OC box, yay!).
  • Overclockability is a legal or risk-related feature. This would help LG mitigate risk if nVidia refused to make any guarantees for the G-SYNC module when run outside its specs. This amounts to LG passing that risk on to the consumer (hey enthusiast, we think this will work fine 100% of the time, but you're not getting any guarantees when running OC'ed regardless).
  • Overclocking the G-SYNC module downgrades some other metric (color depth, chroma subsampling) to allow for the higher refresh rate, which is why the ability to disable overclocking is useful; see the rough bandwidth sketch below.
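
To illustrate the third scenario: a back-of-envelope DisplayPort budget. This assumes the original G-SYNC module is limited to DP 1.2 (17.28 Gbit/s effective payload after 8b/10b encoding) and uses reduced-blanking totals of roughly 3520 x 1480; those timing figures are guesses, not confirmed specs:

[code]
# Back-of-envelope DP 1.2 bandwidth check for 3440x1440 @ 120 Hz.
# Assumptions (not confirmed specs): DP 1.2 effective payload of 17.28 Gbit/s,
# reduced-blanking totals of roughly 3520 x 1480.
DP12_GBPS = 17.28

def link_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Required video bandwidth in Gbit/s for the given timings and color depth."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

for bpp, label in [(24, "8-bit RGB"), (30, "10-bit RGB")]:
    need = link_gbps(3520, 1480, 120, bpp)
    verdict = "fits" if need <= DP12_GBPS else "does NOT fit"
    print(f"{label}: {need:.1f} Gbit/s -> {verdict} in DP 1.2")
# 8-bit RGB:  ~15.0 Gbit/s (fits)
# 10-bit RGB: ~18.8 Gbit/s (does not fit)
[/code]

If those assumptions hold, 10-bit color wouldn't fit at 120 Hz, which would make a trade-off of the third kind entirely plausible.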
 
Associate
Joined
28 Jul 2018
Posts
116
That's because you are clueless. The 1080 Ti averages 120 FPS @ 2560x1440 at ultra settings, so it's fair to assume the 2080 Ti, which is 30-40% more powerful, will be able to average 144 FPS or close to it @ 3440x1440. The 2080 Ti is already said to average 100 FPS @ 4K, and that's without DLSS enabled.
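
For the record, the back-of-envelope scaling behind that kind of claim looks like this. A rough sketch; note that linear pixel-count scaling is a pessimistic bound, since real games scale less than linearly with resolution:

[code]
# Rough FPS extrapolation from one resolution/card to another.
def scale_fps(base_fps, base_res, target_res, perf_uplift):
    """Naive estimate: FPS scales with GPU uplift and inversely with pixel count."""
    pixel_ratio = (target_res[0] * target_res[1]) / (base_res[0] * base_res[1])
    return base_fps * perf_uplift / pixel_ratio

# Premise: 1080 Ti does 120 FPS @ 2560x1440; 2080 Ti assumed 30-40% faster.
for uplift in (1.3, 1.4):
    est = scale_fps(120, (2560, 1440), (3440, 1440), uplift)
    print(f"2080 Ti @ 3440x1440 at {uplift:.0%} of a 1080 Ti: ~{est:.0f} FPS")
# Pessimistic bound: ~116-125 FPS; sub-linear real-world scaling lands higher.
[/code]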




Because all the tech review websites are lying and you're the one with the "truth", am I right? Lmfao get the **** outta here dude.



Now I'm afraid that's the biggest pile of ******** I've ever heard. The 1080 Ti is a 100+ FPS card @ 3440x1440. Hell, even my Vega 64 averaged 80 FPS @ 3440x1440. There's no way a 1080 Ti fails to get 60 FPS unless the game is still in alpha/beta.



It will still be fast enough to max out my 100 Hz ultrawide in all games. :)



Um, yes we do... It's called the 2080 Ti for 4K and the 2080 for 1440p.



Average performance benchmarks do matter, as they represent real-world performance across a broad range of games tested at the highest settings. Only a liar or an uninformed idiot would say they don't matter.



What a load of utter nonsense... I'm convinced you are clueless about how benchmarks are tested.

You know that you are supposed to read first and then reply? And read the whole post, not just pick a few words out of context? I mean, this kind of response is not overly surprising looking at all your posts here, but man, I didn't even get to any precise arguments and you are already all over the place, one step from a heart attack, before any real discussion even begins :p Better not reply anymore, because you will hurt yourself :D

I could debunk everything you have said just by giving you a list of, say, 10 or 15 varied games (only that few so you don't get too much work) and telling you to record yourself playing them on your Vega at 80 FPS at 3440x1440 maxed out. But since your Vega is clearly nowhere near capable of that, and given that you get frustrated within about 5 seconds and can't even manage to read a whole post and think for a moment before replying, I think it's best to just end here, for your own good :p
 

Stu

Soldato
OP
Joined
19 Oct 2002
Posts
2,737
Location
Wirral
@Daniel - LG

Would you be able to find out if internal testing has been done to make sure the 120 Hz is stable over time? With the Dell and Acer, it seems like flickering starts to develop after a month or so of use at 120 Hz.

Thanks for your responses!

Where are you hearing reports of the Dell monitor developing flickering after a month? Lots of owners in the Dell thread are very happy and not complaining... I've only seen one report (very recently) of an issue.
 

Stu

Soldato
OP
Joined
19 Oct 2002
Posts
2,737
Location
Wirral
If the G-Sync module has to be overclocked for 120 Hz operation at this resolution, then it may be the same as with any overclocking: slight degradation happens over time, and what was stable a few months ago may no longer be stable. This is true not only for overclocking but for normal operation too. It is one of the reasons why things like CPUs or GPUs have very conservative stock clocks and use a much higher voltage than the majority of them need: it is done to account not only for the unit-to-unit differences in quality and capability that come from production imperfections, but also for degradation and wear over time. So there is very little chance that something pushed to its limits is going to stay stable there through prolonged intensive use. It will give up relatively quickly, and you will have to either increase the voltage (which may not help) or reduce the clocks, or in this case the refresh rate.

This is true in theory, and I'm sure sometimes in practice, but there are hundreds of people on this forum who set an overclock in week one of owning their CPUs and GPUs and then never had a need to reduce it later in life.
 
Associate
Joined
20 Jan 2018
Posts
149
Where are you hearing reports of the Dell monitor developing flickering after a month? Lots of owners in the Dell thread are very happy and not complaining... I've only seen one report (very recently) of an issue.

Reddit. Every week I see someone post something like that.
 
Associate
Joined
28 Jul 2018
Posts
116
This is true in theory, and I'm sure sometimes in practice, but there are hundreds of people on this forum who set an overclock in week one of owning their CPUs and GPUs and then never had a need to reduce it later in life.

Certainly. I think I mentioned in my post that it doesn't have to be the case. If the overclock is "reasonable", meaning not pushing to the very last MHz and not setting the highest voltage considered safe, then there shouldn't be much of an issue, because significant headroom is left. I was referring more to the situation where you really push to the max, to the edge of instability, artifacts and so on. That edge will certainly move down over time.

And if something develops flickering after just two months of use, then it is very possible it was pushed to its limits. But if complaints about this issue are very few, then it was probably just a weaker module, or maybe a panel fault, who knows.


This HDMI 1.4...
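
(For anyone wondering why the HDMI 1.4 port is a sore point on a 3440x1440 panel, here's a rough sketch. It assumes HDMI 1.4's commonly cited 340 MHz TMDS clock ceiling and reduced-blanking totals of about 3600 x 1481; the timing figures are approximations.)

[code]
# Why HDMI 1.4 hurts at 3440x1440: max refresh under the TMDS clock cap.
# Assumed figures: 340 MHz TMDS ceiling (HDMI 1.4), CVT-RB totals ~3600 x 1481.
TMDS_LIMIT_HZ = 340e6
H_TOTAL, V_TOTAL = 3600, 1481

max_refresh = TMDS_LIMIT_HZ / (H_TOTAL * V_TOTAL)
print(f"Max refresh over HDMI 1.4 at 3440x1440: ~{max_refresh:.0f} Hz")  # ~64 Hz
[/code]

So the HDMI input tops out around 60 Hz, while DisplayPort carries the high-refresh modes.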

Ah, and the new Dell ultrawide that was rumored here a few times is likely the U3419W. Basically a USB-C version of the U3417W. One would think it would get a new panel with Nano IPS and HDR400 like the LG 34WK95C, but I don't see any info about that.
 
Soldato
Joined
31 Dec 2006
Posts
7,224
Certainly. I think I mentioned in my post that it doesn't have to be the case. If the overclock is "reasonable", meaning not pushing to the very last MHz and not setting the highest voltage considered safe, then there shouldn't be much of an issue, because significant headroom is left. I was referring more to the situation where you really push to the max, to the edge of instability, artifacts and so on. That edge will certainly move down over time.

And if something develops flickering after just two months of use, then it is very possible it was pushed to its limits. But if complaints about this issue are very few, then it was probably just a weaker module, or maybe a panel fault, who knows.



This HDMI 1.4...

Ah, and the new Dell ultrawide that was rumored here a few times is likely the U3419W. Basically a USB-C version of the U3417W. One would think it would get a new panel with Nano IPS and HDR400 like the LG 34WK95C, but I don't see any info about that.


Yeah, the U3419W is just a plain old professionally oriented ultrawide... not a gaming monitor and not aimed at the same market as the AW34. The only upgrade path from the AW34 is if Dell do something with the UW5 panel. It would be good if Dell were somehow able to utilise the DP 1.4 G-Sync module and make it competitive price-wise, if that's even possible.
 
Caporegime
Joined
18 Oct 2002
Posts
29,677
You'll have to wait for these to drop to a decent price, guys. LG always go in high, then drop a bit when they want some actual sales numbers (at least on certain models, and I think these would count). They do look like they'll be VERY nice though; I've always been happy with my LG monitors :)
 