LG 34GK950G, 3440x1440, G-Sync, 120Hz

So Acer is going to release the XB273K, a 27-inch 4K IPS monitor that hits 144Hz thanks to the new G-Sync HDR module, despite it only being DisplayHDR 400. But 1499€.

And the LG 34GK950G is 1399€. Bad LG, overpriced. DOA.
 
I mean that's fine as long as you can live with tearing or occasional heavy drops from 144Hz down to 70Hz or so (i.e. being back in the vsync world), because there will always be times when even the newest cards drop below your monitor's refresh rate when we're talking about 3440x1440 or 4K... at least for the next 5 years or so. I myself am unsure. I absolutely hate tearing, but I also hate when vsync suddenly drops me from 60Hz to 30 or 45Hz. But maybe a drop from 144Hz to 72Hz or 108Hz won't be as bad... hmm, geez, I dunno, such a hard decision, and I only buy monitors like once every 7-9 years...
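Rough intuition for where those step-downs come from: with classic double-buffered vsync, a frame that misses a refresh waits for the next one, so the displayed rate snaps to the refresh rate divided by a whole number (triple buffering or fast sync can land in between, which is presumably where figures like 108Hz come from). A throwaway sketch in Python, numbers purely illustrative:

```python
import math

def vsync_fps(refresh_hz: float, gpu_fps: float) -> float:
    """Frame rate delivered by classic double-buffered vsync: every frame
    has to occupy a whole number of refresh intervals."""
    intervals = math.ceil(refresh_hz / gpu_fps)   # refresh ticks each frame needs
    return refresh_hz / intervals

for gpu_fps in (50, 80):
    for refresh in (60, 144):
        print(f"GPU at ~{gpu_fps} fps on a {refresh} Hz panel -> "
              f"{vsync_fps(refresh, gpu_fps):.0f} fps shown")
```

So a GPU that can only manage ~50fps shows 30fps on a 60Hz panel but 48fps on a 144Hz one; the steps get a lot gentler at high refresh.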

I am in the 60Hz vsync world and I want out Out OUT!

You're missing the point, my friend. Some of us don't care about G-Sync if the refresh rate is high enough, since a high refresh rate minimizes the effect of tearing a good amount. We are talking vsync off here. I run G-Sync off in games where I can get above my panel's 165Hz refresh rate because there are benefits to this. I have even run the panel completely sync-free for a week, playing games at a static 144Hz, and wasn't bothered in the slightest no matter the frame rate (as long as minimums stayed above 65 or so). To me personally, variable refresh rate has its sweet spot around 60 to 99Hz. Below that is not a pleasant experience, and above it we get into diminishing returns on the positive effect it has, in my opinion. In the end there are of course many things that affect the end result and what you might want to go for: how good the game's frame pacing is, what average and minimum frame rates can be hit, and whether you are more sensitive to image persistence (like I am) or to tearing.

That said, the 60Hz vsync world is dreadful and I hope you escape it by whatever means :D.
 
I believe it's the FPGA chip that is overclocked, but some say it's DisplayPort 1.2. I guess it could be either or both.
The FPGA chip is the main hardware part that implements DP1.2.

Strictly speaking, DP1.2 is a standard. You can't overclock a standard. You can only overclock the hardware that implements that standard.

The FPGA and the companion components on the same board are what we call the g-sync module. This g-sync module is what receives and processes the GPU's video signal in accordance with the DP1.2 standard.

To "overclock the g-sync module" is the most accurate way of expressing this. To "overclock the FPGA chip" is more specific and still correct. To "overclock DP1.2" is a weird expression and technically not really a thing.
 


But with the new RTX cards you will not be getting anywhere near those kinds of FPS, even at 1080p... ultrawide is going to struggle with ray tracing, and that's where G-Sync will prove incredibly beneficial. Of course, if you have no intention of buying a 20xx card and playing games with ray tracing, then you have a point. Even outside of that, though, there always comes a time when most gamers find their GPU struggling with the latest titles... and they won't want to dial the graphics down to the point where they look like a potato... so again, that's where G-Sync comes into play when the frame rate dips. So yes, you are right in certain instances, assuming one can hit extremely high FPS, but as you also state, the average/minimum frame rates will dictate how possible that actually is.

For some Nvidia owners I'm sure the FreeSync version of this monitor may be preferable and suit their needs perfectly... depending on what games they play. But for others it really won't.
 
You can overclock the clock rate of the connection in the DisplayPort... well, port. It's 540MHz, and this was upped to 810MHz for DP 1.3. To increase the bandwidth of the DisplayPort link, you need to overclock it. This can be done by a separate controller chip, but it can probably also be built into the FPGA. Either way, you are raising the clock rate, and thus breaking (going beyond) the spec. Sounds funny, but that is literally what you are doing.
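For reference, those 540/810 figures line up with the per-lane link rates of HBR2 (5.4Gbit/s, DP 1.2) and HBR3 (8.1Gbit/s, DP 1.3+). A back-of-the-envelope sketch of why 3440x1440 at high refresh starts pressing against DP 1.2; the blanking values and bit depths here are assumptions for illustration, not a claim about what this monitor actually does:

```python
# Back-of-the-envelope DisplayPort bandwidth check (illustrative numbers only)
LANES = 4
EFFICIENCY = 8 / 10                      # 8b/10b line coding used by DP 1.x
LINK_RATES = {"HBR2 (DP 1.2)": 5.4, "HBR3 (DP 1.3+)": 8.1}   # Gbit/s per lane

def payload_gbps(rate_per_lane: float) -> float:
    return rate_per_lane * LANES * EFFICIENCY

def required_gbps(h, v, hz, bpp, h_blank=80, v_blank=60):
    # assumed reduced-blanking overhead, not exact CVT-R2 timings
    return (h + h_blank) * (v + v_blank) * hz * bpp / 1e9

for bpp, label in ((24, "8-bit"), (30, "10-bit")):
    for hz in (100, 120):
        need = required_gbps(3440, 1440, hz, bpp)
        caps = ", ".join(f"{k} ~{payload_gbps(v):.1f}" for k, v in LINK_RATES.items())
        print(f"3440x1440@{hz} {label}: ~{need:.1f} Gbit/s needed vs {caps} Gbit/s")
```

On these rough numbers 8-bit fits within HBR2 even at 120Hz, while 10-bit at 120Hz spills over it, which is exactly the kind of corner where going beyond the spec (or dropping bit depth) becomes the choice.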
 
I took that picture; it is the Dell UP2718Q. That is a TERRIBLE monitor and by far the worst FALD ever. My X27 is way better than that.
What a small world. I guess Google failed me. Good to know, and yeah, it looks like that panel has few zones. But I've heard others complain about bloom on the X27 too. After all, unless there are as many zones as pixels, you will always have some bloom. The question is whether you notice it in games, desktop work or movies.
 

Nope.

Again, you're NOT overclocking "the port". Nor are you overclocking "display port". Nor are you overclocking "the connection". All of those expressions are technically incorrect.

If anything, you're overclocking the controller (which receives and processes the DisplayPort signal). For a g-sync monitor, which is what we're discussing here, the g-sync module is that controller (which includes the FPGA).
 
And what happens if you overclock the controller? What happens to the connection over the DisplayPort port and the DisplayPort cable? Do they run at a higher frequency than the standard allows? If so, then yes, it's literally overclocked.
 
Is there anything in the rumour mill about any other brands releasing high end monitors?
The closest thing is a Reddit user who claimed to be at Gamescom and asked a Dell rep if there were any new monitors. The rep said they would be unveiling stuff the next day. The next day came and another rep said the first one was mistaken. No new monitor was shown.

Either the Reddit user was lying or the Dell rep slipped up. There have also been lots of sales on the Dell lately, and some sites have listed it as clearance. We can't really extrapolate anything substantial from that, so for now, who knows what's going on?
 
8-bit + 2-bit FRC is enough for even DisplayHDR 1000, so it's fine. This panel is 550 nits peak, so it's a lot closer to DisplayHDR 600 than 400. Like I've stated elsewhere, it's odd LG didn't push it to DisplayHDR 600 like the 5K ultrawide.
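(For anyone wondering what that means: 8-bit + 2-bit FRC fakes the extra two bits by flickering each pixel between the two nearest 8-bit levels across frames, so the temporal average lands on the intended 10-bit value. A toy sketch of the idea, not how any particular panel's controller actually implements it:)

```python
import random

def frc_sequence(target_10bit: int, frames: int = 4):
    """Toy 8-bit + 2-bit FRC: flicker between the two nearest 8-bit levels
    so the average over a few frames lands on the intended 10-bit value."""
    base, remainder = divmod(target_10bit, 4)   # ~4 ten-bit steps per 8-bit step
    hi = min(base + 1, 255)                     # clamp at the 8-bit ceiling
    seq = [hi] * remainder + [base] * (frames - remainder)
    random.shuffle(seq)                         # real controllers use fixed spatio-temporal patterns
    return seq

seq = frc_sequence(514)                         # 10-bit level 514 sits between 8-bit 128 and 129
print(seq, "-> average", sum(seq) / len(seq))   # 128.5, i.e. the in-between level
```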

I have no interest in FALD. It creates way too much bloom/light blobs, which does a lot more damage to image fidelity than whatever the added iris-destruction capability of 1000 nits at less than a meter's distance adds:

[image: blooming on a FALD G-Sync HDR monitor]

I mean, look at the bloom here. It looks like city lights in fog. It's not supposed to be there. And this is on the stupidly expensive G-Sync DisplayHDR 1000 monitors. No thanks.

You have no real information on that 550 nits peak brightness. And like I said before, DisplayHDR 600 requires at least some kind of local dimming, which the 34GK950 does not have.

As for FALD... What you have shown is the worst possible scene, with a very small bright object on a totally black background. But for a normal scene, FALD can adjust the brightness of many zones independently, which is exactly what HDR is about. It has its limitations, and I am also not really interested in anything other than OLED when it comes to HDR, but FALD is still far better than the alternative, which is edge-lit, where you get an entire vertical or horizontal trail of light visible instead of the blooming you have shown. Something like this:

[image: edge-lit blooming on an LG 49UB850V]
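To make the per-zone point concrete: the crudest form of local dimming just drives each backlight zone from the brightest pixel it sits behind, which is also exactly why a tiny bright object on black lights up its whole zone as bloom. A toy sketch (a hypothetical 8x16 zone grid, not any vendor's actual algorithm):

```python
import numpy as np

def zone_levels(luma: np.ndarray, zones=(8, 16)) -> np.ndarray:
    """Crude local dimming: one backlight level per zone, driven by the
    brightest pixel behind it. luma is an HxW array of 0..1 values."""
    h, w = luma.shape
    zh, zw = zones
    levels = np.zeros(zones)
    for i in range(zh):
        for j in range(zw):
            block = luma[i * h // zh:(i + 1) * h // zh,
                         j * w // zw:(j + 1) * w // zw]
            levels[i, j] = block.max()          # one bright pixel lifts its whole zone
    return levels

# A single bright dot on an otherwise black 3440x1440 frame lights up exactly one zone:
frame = np.zeros((1440, 3440))
frame[700, 1700] = 1.0
print(int(zone_levels(frame).sum()), "zone(s) lit")   # that one lit zone is the bloom
```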


And again, the idea of 1000-nit eye-burning brightness is completely incorrect and points to a lack of knowledge or experience with HDR. This kind of brightness is reserved for small details only; the average scene brightness in HDR is usually around 150 nits, which is quite dim, and there are a lot of complaints about HDR being too dim for bright-room viewing. Especially since display manufacturers usually prefer to dim the picture to preserve highlight detail (highlights brighter than the TV can reach, so they dim the rest of the picture as well to stay as close as possible to the original proportions between bright objects and the rest of the scene), you end up with an average brightness that is even lower than the original. Of course, if you have a poor display with poor backlight precision and a poor dimming algorithm, there is a good chance of the picture getting blown out, but with quality solutions an HDR picture is never too bright, no matter whether it is 1000, 4000 or 10000 nits. That is just not what this brightness is used for; it is for the details, not for blowing the entire picture out to peak brightness.
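The "dim the rest to keep the highlights" behaviour described above is just tone mapping with a roll-off knee. A toy version of the idea (the 600-nit peak, 4000-nit master and 75% knee are made-up illustration values, and real curves usually pull the mid-tones down a bit as well, which is the extra dimming described above):

```python
def tone_map(scene_nits: float, display_peak: float = 600.0,
             knee: float = 0.75, scene_max: float = 4000.0) -> float:
    """Toy highlight roll-off: leave everything below the knee alone, then
    compress the rest of the scene range into the remaining headroom so
    highlight detail is preserved instead of clipping at the panel's peak."""
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits                        # a ~150-nit average scene stays ~150 nits
    t = (scene_nits - knee_nits) / (scene_max - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)

for nits in (150, 450, 1000, 4000):
    print(f"{nits:>5} nits in the master -> {tone_map(nits):6.1f} nits on a 600-nit panel")
```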
 