Active cooling? Is there a fan blowing on the module to keep it cool?
Yup so far all monitors released with this module have active cooling because it runs hot...
Yup, definitely not going with first-gen monitors then. Gut feeling that it's going to be a disaster. This LG might just be the best monitor until the 2nd-gen G-Sync module with DP 1.4 shows up.
I mean, that's fine as long as you can live with tearing or occasional heavy drops from 144Hz down to 70Hz or so (i.e. being back in the V-Sync world), because there will always be times when even the newest cards drop below your monitor's refresh rate at 3440x1440 or 4K, at least for the next 5 years or so. I myself am unsure. I absolutely hate tearing, but I also hate it when V-Sync suddenly drops me from 60Hz to 30 or 45Hz. But maybe a drop from 144Hz to 72Hz or 108Hz won't be as bad... geez, I dunno, such a hard decision, and I only buy a monitor about once every 7-9 years...
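For what it's worth, the reason a 144Hz panel degrades more gracefully is that classic double-buffered V-Sync quantizes the displayed framerate to refresh_rate / N. Here's a minimal sketch of that behaviour (an idealized model assuming strict double buffering and perfect frame pacing; triple buffering and adaptive V-Sync soften this, which is where in-between figures like 45 or 108 come from):

```python
import math

# Displayed framerate under classic double-buffered V-Sync: every frame is held
# until the next vertical refresh, so the output rate snaps to refresh / N.
# Idealized model; real drivers, triple buffering and frame-time spikes all
# muddy the picture.
def vsync_effective_fps(refresh_hz: float, render_fps: float) -> float:
    if render_fps >= refresh_hz:
        return refresh_hz
    intervals = math.ceil(refresh_hz / render_fps)  # refreshes each frame occupies
    return refresh_hz / intervals

for refresh, render in [(60, 55), (60, 45), (144, 130), (144, 100), (144, 70)]:
    shown = vsync_effective_fps(refresh, render)
    print(f"{refresh}Hz panel, rendering {render} fps -> {shown:.0f} fps on screen")
```

So on a 60Hz panel a small dip throws you straight to 30 fps, while on a 144Hz panel the same dip lands at 72 fps, which is the trade-off being weighed above.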
I am in the 60hz vsync world and I want out Out OUT!
I believe it's the FPGA chip that is overclocked, but some say it's the DisplayPort 1.2. I guess it could be either or both.
The FPGA chip is the main hardware part that implements DP1.2.
Probably the best article yet: https://www.computerbase.de/2018-09/lg-ultragear-monitor-34gk950g-34gk950f-ifa/
I read it. I see no explanation of the overclock. Where do you see that?
You're missing the point, my friend. Some of us don't care about G-Sync if the refresh rate is high enough, since a high refresh rate minimizes the effect of tearing a good amount. We are talking V-Sync off here. I run G-Sync off in games where I can get higher than my panel's 165Hz refresh rate because there are benefits to this. I have even run the panel completely sync-free for a week, playing games at a static 144Hz, and wasn't bothered in the slightest no matter the framerate (as long as minimums stayed above 65-ish, of course). To me personally, variable refresh rate has its sweet spot around 60 to 99-ish Hz. Below that is not a pleasant experience, and above that you get into diminishing returns on its positive effect, in my opinion. In the end there are of course many things that affect the end result and what you might want to go for: how good the game's frame pacing is, what average and minimum framerates can be hit, and whether you are more sensitive to image persistence (like I am) or to tearing.
That said, the 60Hz V-Sync world is dreadful and I hope you escape it by whatever means.
You can overclock the MHz of the connection in the DisplayPort... well, port. It's 540MHz, which was upped to 810MHz for DP 1.3. To increase the bandwidth of the DisplayPort link, you need to overclock it. This can be a separate controller chip, but it can probably also be built into the FPGA chip. Either way, you are overclocking the clock rate, and thus breaking (going beyond) the spec. Sounds funny, but that is literally what you are doing.
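To put rough numbers on the bandwidth side of this: the 540/810 figures are the HBR2/HBR3 link symbol clocks, and whether the link itself has to run past spec depends on how much video data you actually push. A back-of-the-envelope sketch (the total raster/blanking and the 8-bit figure are my assumptions, not LG's published timings):

```python
# Rough DisplayPort bandwidth check for 3440x1440. Payload figures are the
# standard 4-lane rates after 8b/10b overhead; the total raster (active video
# plus blanking) is an assumption in the ballpark of CVT-RB2, not LG's timing.
DP12_PAYLOAD_GBPS = 17.28   # 4 x 5.4 Gbit/s (HBR2), 80% usable after 8b/10b
DP13_PAYLOAD_GBPS = 25.92   # 4 x 8.1 Gbit/s (HBR3), 80% usable after 8b/10b

H_TOTAL, V_TOTAL = 3520, 1500   # assumed total raster for 3440x1440 active
BITS_PER_PIXEL = 24             # 8 bits per channel, RGB

for refresh_hz in (100, 120, 144):
    data_rate_gbps = H_TOTAL * V_TOTAL * refresh_hz * BITS_PER_PIXEL / 1e9
    if data_rate_gbps <= DP12_PAYLOAD_GBPS:
        verdict = "fits within DP 1.2"
    elif data_rate_gbps <= DP13_PAYLOAD_GBPS:
        verdict = "needs DP 1.3/1.4 (or pushing the DP 1.2 link past spec)"
    else:
        verdict = "exceeds even DP 1.3/1.4"
    print(f"{refresh_hz:>3} Hz, 8-bit: ~{data_rate_gbps:.1f} Gbit/s -> {verdict}")
```

By this rough math, 3440x1440 at 120Hz and 8-bit can still squeeze into DP 1.2's payload while 144Hz cannot, so the interesting question is what the module's own processing is rated for.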
Strictly speaking, DP1.2 is a standard. You can't overclock a standard. You can only overclock the hardware that implements that standard.
The FPGA and the companion components on the same board are what we call the g-sync module. This g-sync module is what receives and processes the GPU's video signal in accordance with the DP1.2 standard.
To "overclock the g-sync module" is the most accurate way of expressing this. To "overclock the FPGA chip" is more specific and still correct. To "overclock DP1.2" is a weird expression and technically not really a thing.
I took that picture; it is the Dell UP2718Q. That is a TERRIBLE monitor and by far the worst FALD ever. My X27 is way better than that.
What a small world. I guess Google failed me. Good to know, and yeah, it looks like that panel has few zones. But I've heard others complain about bloom on the X27 too. After all, unless there are as many zones as pixels, you will always have some bloom. The question is whether you notice it in games, desktop work, or movies.
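To put a number on the zones-vs-pixels point: even the densest consumer FALD backlights have a few hundred zones against millions of pixels, so every zone necessarily covers a big patch of the image. A quick sketch (the zone counts are illustrative assumptions, 384 being the figure commonly cited for the 4K G-Sync HDR panels):

```python
import math

# How many pixels each local-dimming zone has to cover. Zone counts here are
# illustrative assumptions, not confirmed specs for any particular monitor.
def pixels_per_zone(width: int, height: int, zones: int) -> int:
    return math.ceil(width * height / zones)

examples = [
    ("4K FALD, 384 zones",        3840, 2160, 384),
    ("4K edge-lit, 8 zones",      3840, 2160, 8),
    ("3440x1440, global dimming", 3440, 1440, 1),
]

for name, w, h, zones in examples:
    print(f"{name}: ~{pixels_per_zone(w, h, zones):,} pixels per zone")
```

So even on a 384-zone panel, a single bright element lights up a zone covering roughly 21,600 pixels, which is exactly the halo visible in photos like the one discussed above.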
And what happens if you overclock the controller? What happens to the connection over the DisplayPort port and the DisplayPort cables? Do they run at a higher frequency than the standard allows? If so, then yes, it's literally overclocked.
Nope. Again, you're NOT overclocking "the port". Nor are you overclocking "DisplayPort". Nor are you overclocking "the connection". All of those expressions are technically incorrect.
If anything, you're overclocking the controller (which receives and processes the DisplayPort signal). For a g-sync monitor, which is what we're discussing here, the g-sync module is that controller (which includes the FPGA).
Is there anything in the rumour mill about any other brands releasing high-end monitors?
Closest thing is a Reddit user who claimed to be at Gamescom and asked a Dell rep if there were any new monitors. The guy said they would be unveiling stuff the next day. The next day came and another rep said the first rep was mistaken. No new monitor was shown.
8+2 bit FRC is enough for even DisplayHDR1000, so it's fine. This panel is 550 nits peak, so it's a lot closer to DisplayHDR600 than 400. Like I've stated elsewhere, it's odd LG didn't push it to DisplayHDR600 like the 5K ultrawide.
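For anyone wondering what "8+2 bit FRC" actually means in practice: the panel approximates each 10-bit level by alternating between two adjacent 8-bit levels over a short frame cycle. A toy sketch of the idea (idealized, assuming the eye simply averages successive frames, and ignoring the clamping a real panel needs at the very top of the range):

```python
# Toy model of 8-bit + FRC temporal dithering: a 10-bit target level is split
# into a base 8-bit level plus a remainder, and the remainder decides how many
# frames in a 4-frame cycle show the next 8-bit level up.
def frc_frames(level_10bit: int) -> list[int]:
    base, remainder = divmod(level_10bit, 4)
    return [base + 1] * remainder + [base] * (4 - remainder)

for target in (512, 513, 514, 515):  # four adjacent 10-bit levels
    frames = frc_frames(target)
    perceived_10bit = sum(frames)    # mean of the 4 frames, rescaled to 10-bit units
    print(f"10-bit {target} -> 8-bit frames {frames}, perceived ~{perceived_10bit}")
```

Whether a given panel's FRC shows up as visible flicker or noise is a separate question, but arithmetically the extra two bits are there.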
I have no interest in FALD. It creates way too much bloom/light blobs, which hurts image fidelity far more than the added iris-destroying capability of 1000 nits at less than 1 meter's distance helps:
I mean look at the bloom here. Looks like city lights in fog. It's not supposed to be there. And this is on the stupidly expensive Gsync DisplayHDR1000 monitors. No thanks.