LG 48CX OLED - 4K 120 Hz

1,399 for a card with 24GB of VRAM and you say it's bad value? It's a bargain as far as I'm concerned; I thought it would be 2,400.

I meant specifically for gaming, of course, on this LG CX. If you're a professional doing Blender or whatever, then the 24GB might make sense, since it sounds like you're comparing it to the Titan RTX.
 
If and when the 3080 Ti with 20GB of VRAM comes out! But for the price you'd pay for a 3080 plus a 3080 Ti, you could have just bought the 3090 with 4GB more VRAM. Nvidia always does this; it will be the same when the 4080 hits the market. Gamers will rush out and buy it, then the Ti will come out within five months with more VRAM and speed. That's why I always wait for the Ti version.

Just maybe the 3090 Ti might come out this year! IMHO :D

More likely, based on current trends, I'll be able to resell my 3080 for what I paid for it, so I'm effectively paying nothing to rent it. :D Versus the 3090, which even now is only selling at break-even on eBay. I was so tempted to grab a 3090 FE on Scan, you have no idea, haha, as I was always planning to buy the top-tier card this gen. I know it's completely irrational, but this is my hobby and I want "the best". If it had a 20-30% uplift I would have bought it without even thinking. But I think I'll be 100% buying the 3080 Ti whenever it comes out, assuming I can grab one; I'm expecting it to equal the 3090 in performance, or come very close.
 
So Godfall actually only uses about 8GB with everything maxed out at 4K. Similarly with Watch Dogs: Legion: it might request more, but it never uses more than 8-9GB. Cyberpunk maxed out at 4K is around 7GB. Every game out right now sits at 6-9GB at 4K. https://www.youtube.com/watch?v=UecujMBJW7Y shows how little actual difference there is; you're talking about a 5fps difference, 10fps at most in some games (like 72fps vs 80fps). I wish I could convince myself the 3090 is worth it, but it's just such a bad value proposition no matter how you look at it. The 3090 FE is actually in stock right now; the head says no but the heart says yes, lol.

Only with ray tracing disabled can you play it with 8GB at 4K, same as Watch Dogs: Legion. If you turn RTX on, it uses more VRAM, around 10GB at high resolution with maxed-out settings.

Godfall requires 12GB of VRAM for its 4K by 4K textures
https://hexus.net/gaming/news/pc/146626-godfall-requires-12gb-vram-4k-4k-textures/


Watch Dogs: Legion specs
https://news.ubisoft.com/en-us/article/WCiLJPAN9QHWwb9JBc1Wj/null
 
Quick question, guys: is there a fix for the CX not detecting a PC HDMI input on startup? I have disabled fast start on the panel in the settings, but I still have the issue where HDMI isn't being detected.

Hitting reset on the PC will then have the panel detect HDMI. It gets a bit annoying having to reset every time I switch the PC on to get the input detected.
 
Quick question, guys: is there a fix for the CX not detecting a PC HDMI input on startup? I have disabled fast start on the panel in the settings, but I still have the issue where HDMI isn't being detected.

Hitting reset on the PC will then have the panel detect HDMI. It gets a bit annoying having to reset every time I switch the PC on to get the input detected.

I'd be interested in that too as it annoys me.
 
Only with ray tracing disabled can you play it with 8GB at 4K, same as Watch Dogs: Legion. If you turn RTX on, it uses more VRAM, around 10GB at high resolution with maxed-out settings.

Godfall requires 12GB of VRAM for its 4K by 4K textures
https://hexus.net/gaming/news/pc/146626-godfall-requires-12gb-vram-4k-4k-textures/


Watch Dogs: Legion specs
https://news.ubisoft.com/en-us/article/WCiLJPAN9QHWwb9JBc1Wj/null

I know what the developer said before release. But they vastly exaggerated the actual requirements, or were misquoted. Actual usage is around 6GB with everything fully maxed out. https://www.reddit.com/r/nvidia/comments/jspy61/godfall_requires_only_6gb8gb_vram_at_4k_maxed_out/

As you said, you want the best. Well, the RTX 3080 isn't the best card; it's the RTX 3090 :) built for the future ;)

Yeah, I know. I wanted the 3090 and was prepared to pay a lot for 20-30% better performance, like the 2080 Ti before it. But this generation's 3090 is such a joke that I can't bring myself to spend that much on a 5-10% uplift. What was Jensen smoking when he approved this? :D
 
Quick question, guys: is there a fix for the CX not detecting a PC HDMI input on startup? I have disabled fast start on the panel in the settings, but I still have the issue where HDMI isn't being detected.

Hitting reset on the PC will then have the panel detect HDMI. It gets a bit annoying having to reset every time I switch the PC on to get the input detected.

My 3090 picks up my 48CX, Alienware AW2721D and Eizo 32" 4K display on startup every time, so I don't know what your problem is. Even if I don't turn on the 48CX while the computer is booting, as soon as I do, it's detected.
 
Quick question, guys: is there a fix for the CX not detecting a PC HDMI input on startup? I have disabled fast start on the panel in the settings, but I still have the issue where HDMI isn't being detected.

Hitting reset on the PC will then have the panel detect HDMI. It gets a bit annoying having to reset every time I switch the PC on to get the input detected.

The problem is that the TV doesn't detect the PC, not the other way around.

That's a classic case of the HDMI cable either not being good enough (not hitting its specs properly) or the TV firmware not being compatible with the cable. I'd try different HDMI cables, any you have lying around; the shorter/thicker cables tend to have more luck.

If none work, try getting a Zeskit HDMI 2.1 cable shipped directly from their US store (they have the fixed 2.1-certified cables) or another HDMI 2.1-certified cable. It's a trial-and-error case.

I had the same issue with HDMI cables when 4K/60Hz TVs first came out. With some luck I found an IBRA HDMI cable that worked, and that was the ninth cable. HDMI 2.1 is much trickier to hit the specs on, though.
 
Just with regards to HDR in Windows: do you guys have black level set to Auto or Low?

Auto is the default, but if your source (i.e. the PC, in the Nvidia GPU settings) is set to Full dynamic range, your LG CX's black level should be set to High; otherwise you crush the blacks and shadow detail, or get washed-out whites/greyish screens.

This guy here does an excellent job of tweaking the Nvidia or AMD GPU for the CX, but it's really involved and takes some effort.
https://www.reddit.com/r/OLED_Gaming/comments/ixhy39/lg_cx_gamingpc_monitor_recommended_settings/
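To see why the mismatch crushes blacks, here's a minimal sketch of the level maths. It assumes standard 8-bit video levels (limited range uses 16-235, full range uses 0-255); the function name is just illustrative, not anything from LG's or Nvidia's software:

```python
# If the PC outputs full range (0-255) but the TV is set to expect
# limited range, the TV expands 16-235 back out to 0-255. Anything the
# PC sent below 16 clips to pure black (crushed shadows) and anything
# above 235 clips to pure white (lost highlight detail).

def tv_interprets_as_limited(signal: int) -> int:
    """Display level when the TV treats an 8-bit signal as limited range."""
    expanded = round((signal - 16) * 255 / 219)  # map 16..235 -> 0..255
    return max(0, min(255, expanded))            # out-of-range values clip

# Full-range PC output: shadow detail at levels 1-15 exists in the signal,
# but every value from 0 to 16 displays as the same black:
print([tv_interprets_as_limited(v) for v in range(0, 17)])
print(tv_interprets_as_limited(240))  # near-white detail clips to 255 too
```

The same logic in reverse explains the washed-out greys: a limited-range signal shown on a display expecting full range never reaches true black or peak white.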
 
Tip: if you guys are having issues with the GPU, or say 120Hz or 10-bit or anything not being detected or working:

Reduce your AMD/Nvidia settings (i.e. to 60Hz, 8-bit, or 1080p resolution), then reboot a few times and see if the issue is fixed, i.e. the screen gets detected fine, or the resolution or refresh rate holds, etc.

It's a quick test to figure out whether the HDMI cable can "hit" its intended specs.

Sometimes it's the TV firmware support that needs tweaking (on the manufacturer's side), but for a quicker fix it's the HDMI cable manufacturer and their product that needs to work or be updated. That's why companies like Zeskit had to revise their HDMI 2.1 cables after they failed; their latest batch is now fixed :D
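A rough back-of-envelope calculation shows why those particular drops are a good cable test: uncompressed 4K 120Hz 10-bit RGB only fits on a link that genuinely delivers HDMI 2.1 bandwidth, while 4K 60Hz 8-bit just squeezes into HDMI 2.0. The 4400x2250 totals are the standard CTA-861 4K timings including blanking; the link rates are nominal spec figures, not measurements:

```python
# Uncompressed RGB data rate for common 4K modes vs HDMI link capacity.

def raw_gbps(h_total: int, v_total: int, hz: int, bits_per_component: int) -> float:
    """Uncompressed RGB video data rate in Gbit/s, including blanking."""
    return h_total * v_total * hz * 3 * bits_per_component / 1e9

HDMI_2_0_EFFECTIVE = 14.4   # Gbit/s after 8b/10b encoding (18 Gbit/s line rate)
HDMI_2_1_EFFECTIVE = 42.67  # Gbit/s after 16b/18b FRL encoding (48 Gbit/s line rate)

for label, hz, bpc in [("4K120 10-bit", 120, 10), ("4K120 8-bit", 120, 8),
                       ("4K60 10-bit", 60, 10), ("4K60 8-bit", 60, 8)]:
    rate = raw_gbps(4400, 2250, hz, bpc)
    print(f"{label}: {rate:.2f} Gbit/s, needs HDMI 2.1: {rate > HDMI_2_0_EFFECTIVE}")
```

So a marginal cable that manages HDMI 2.0-class bandwidth but not full FRL will hold a signal at 60Hz/8-bit (about 14.3 Gbit/s) and fail at 120Hz/10-bit (about 35.6 Gbit/s), which is exactly the pattern the test above looks for.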
 
That's a classic case of the HDMI cable either not being good enough (not hitting its specs properly) or the TV firmware not being compatible with the cable. I'd try different HDMI cables, any you have lying around; the shorter/thicker cables tend to have more luck.

If none work, try getting a Zeskit HDMI 2.1 cable shipped directly from their US store (they have the fixed 2.1-certified cables) or another HDMI 2.1-certified cable. It's a trial-and-error case.

I had the same issue with HDMI cables when 4K/60Hz TVs first came out. With some luck I found an IBRA HDMI cable that worked, and that was the ninth cable. HDMI 2.1 is much trickier to hit the specs on, though.

I had thought this initially too... but having now bought the IBRA HDMI cable, a Tru HQ cable, and finally the Maxonar cable (the last one being HDMI 2.1-certified according to the Amazon listing), I'm thinking this is another issue apart from the cable.

To be fair, each cable has performed admirably, although there was a bit of artifacting with the Tru HQ cable, but it settled down. I switched it out for the Maxonar certified cable, though, so I haven't tested it at length.
 
I'll give this a try, just to be doubly sure.

Hmm, Tru HQ and IBRA cables tend to be pretty good, but I have read that some had issues. It's hard to be 100% sure, since LG's firmware updates sometimes make HDMI cables more compatible, as in they fix the problem, but sometimes the manufacturer openly says it won't work and you need the new revised edition.

Yeah, try it: switch from 10-bit to 8-bit, and if it still doesn't work, drop from 120Hz to 60Hz. If it holds the signal or fixes whatever the issue is, then you know it's the cable not hitting its spec. It's a bit like overclocking: if you try 5GHz and it crashes, you fiddle with settings to make it stable or drop the CPU speed to 4.8GHz, etc.
 