LG 48CX OLED - 4K 120 Hz

I have set my desktop to black with no icons, and the taskbar to auto-hide, though for some reason it sometimes hides completely and other times leaves a couple of lines of pixels along the bottom edge. Not sure what influences it?
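For what it's worth, the usual workaround when the auto-hide taskbar leaves a strip of pixels behind is restarting Explorer, which forces the taskbar to redraw. A rough sketch in Python (Windows-only; it just wraps `taskkill` and relaunches `explorer.exe` - treat it as an illustration, not a guaranteed fix):

```python
import subprocess
import sys

def restart_explorer_commands():
    """Return the two commands used to restart the Windows shell:
    force-kill explorer.exe, then relaunch it (redraws the taskbar)."""
    return [["taskkill", "/f", "/im", "explorer.exe"], ["explorer.exe"]]

if __name__ == "__main__":
    if sys.platform == "win32":
        kill_cmd, relaunch_cmd = restart_explorer_commands()
        subprocess.run(kill_cmd)        # kill the shell (taskbar disappears)
        subprocess.Popen(relaunch_cmd)  # relaunch it (taskbar redraws)
    else:
        print("Windows-only: nothing to do on this platform")
```

The same two commands can be run straight from a Command Prompt if you'd rather not script it.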
 
It's in the owner's manual, page 15.

https://www.lg.com/uk/support/manuals?csSalesCode=OLED48CX5LC.AEK

Examples of Images that may Cause Image Retention
• Still images or fixed images containing certain information that are displayed uninterrupted on the screen, such as channel numbers, station logos, program titles, news or movie subtitles, and headlines.
• Fixed menu or icons for video game consoles or broadcasting set-top boxes.
• Black bars shown on the left, right, top, or bottom of the screen, such as in images with a 4:3 or 21:9 ratio.
 
Just got my CX 48" today, really impressed. A few things to ask: what's the consensus on Windows HDR? I know it used to be poor; is that still the case? I play a lot of Warzone, and I'm guessing I need it ON to use HDR in the game? I also noticed that while it was on and Nvidia GeForce Experience was recording clips (in HDR, I presume), it wouldn't let me play the kills back. Is there a specific HDR setting for recordings in GeForce Experience?

Any settings advice would be great!

I've also made it so the taskbar isn't visible and removed the desktop icons to help minimise burn-in. Any other suggestions would be welcome!

I leave it on.
 
@Spikey
Got the wrong end of the stick - I thought it was OK for the EU from reading your post. Now, having reread it, you say it isn't, so I'll remove it again.
How are you finding the size, and are you using it as a desktop monitor?
 
I see this random rumour hasn't punched you in the face yet. :)

I had pretty much decided to move to the 3080 Ti because by the time it released I would have had around a year's use of my 3080, and resale would have made it feasible. However, having used the 3080 with the 48CX, I'm not even sure an upgrade is required until the 40 series. The 3080 is a perfect match for this screen and perfectly pitched in general, which is why it's constantly sold out. I actually wouldn't be surprised if we never see a 3080 Ti.

I agree regarding the 3090 also; I just can't see the attraction for gaming other than the fact it's actually available, which probably tells its own story.

Hehe I wasn't completely wrong! Nvidia just announced that the 3080Ti has been indefinitely postponed. I'm sort of bummed actually because I was planning to try getting one. 3080 is plenty good though until games start to hit the 10GB limit (IIRC even cyberpunk at max settings only uses 7-8GB).
 
Watch Dogs: Legion uses 10GB of VRAM with RTX on; in two years you might need more, which is why the 3090 was the better future-proof deal.

Godfall will use 12GB of VRAM at 4K Ultra settings

Interesting to know. Yeah, the 10GB is really annoying. Feels like I'm gonna have to upgrade at the next cycle, whereas I usually like to hold onto my GPU for at least 3-4 years. A 3080 Ti would be perfect, of course, but it's looking like we won't see it until late 2021.
 
I'm always prepared to upgrade every gen if the conditions are right, but there are a couple of things that rarely get discussed when this Vram issue gets raised:

  • Did the particular reference game actually need the reported level of VRAM, or did it just use what it could?
  • What was required to lower the VRAM usage, and did lowering it actually have a significant visual impact?

The second point was highlighted to me when I had my 1080 Ti. It wasn't VRAM that was the issue; it was raw grunt at 4K. Surprisingly (to me), turning some settings down was barely noticeable, and when I got my 48CX I actually played a couple of titles at 1440p quite happily.

The way this industry is shaping up, I'm more and more inclined to hold onto what I have and tweak some settings to keep things relevant. It's not a route every gamer or enthusiast will want to take, but it's definitely for me.
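On that first bullet (allocated vs actually needed), one way to watch the number yourself is to poll `nvidia-smi`, which reports per-GPU memory in MiB. A minimal sketch, assuming `nvidia-smi` is on the PATH - and bearing in mind it shows what's *allocated*, which is exactly why headline figures can overstate what a game strictly needs:

```python
import subprocess

def parse_mib(field: str) -> int:
    """Parse an nvidia-smi memory field like '7423 MiB' into an int."""
    return int(field.strip().split()[0])

def gpu_memory_used_mib() -> int:
    # Query only the used-memory counter for each GPU, CSV with no header
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
        text=True,
    )
    return parse_mib(out.splitlines()[0])

if __name__ == "__main__":
    try:
        print(f"{gpu_memory_used_mib()} MiB currently allocated on GPU 0")
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("nvidia-smi not found; run this on a machine with an NVIDIA driver")
```

Run it in a loop while playing and you get a rough allocation trace; comparing that against what happens when you drop one texture notch is more informative than any single headline number.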
 
If and when the 3080 Ti with 20GB of VRAM comes out! But for the price of a 3080 plus a 3080 Ti, you could have just bought the 3090 with 4GB more VRAM. Nvidia always does this; it will be the same when the 4080 hits the market: gamers will rush out and buy it, then the Ti will come out within five months with more VRAM and speed. That's why I always wait for the Ti version.

Maybe the 3090 Ti might even come out this year! IMHO :D
 
So Godfall actually only uses about 8GB with everything maxed out at 4K. Similarly with Watch Dogs: Legion: it might request more, but it never uses more than 8-9GB. Cyberpunk maxed out at 4K is around 7GB. Every game out right now sits at 6-9GB at 4K. https://www.youtube.com/watch?v=UecujMBJW7Y shows how little actual difference there is; you're talking about a 5fps difference, 10fps at most in some games (like 72FPS vs 80FPS). Wish I could convince myself the 3090 is worth it, but it's just such a bad value proposition no matter how you look at it. The 3090 FE is actually in stock right now - the head says no but the heart says yes lol.
 
1399 for a card with 24GB of VRAM and you say it's bad value? It's a bargain as far as I'm concerned - I thought it would be 2400.
 