LG 48CX OLED - 4K 120 Hz

Just wanted to mention that I've been using my 3080 on my CX48 for over a month and I'm extremely pleased with the combination. It's rock solid at 4K, 120 Hz, 4:4:4 10-bit HDR, smooth in all games with G-Sync and an absolute joy to use.

I do not recommend pairing a CX48 with an AMD GPU, as it's absolutely not the same experience driver-wise. Nvidia have some 'special sauce' fix in the drivers, I'm sure, as black levels and seemingly other settings are pre-calibrated in the Nvidia drivers to make everything work properly out of the box.

The 3080 is adequate for smooth 4K gameplay (between 60-120 FPS depending on the game) with RTX either off or on low. I still see RTX as a gimmick, as the performance hit is huge and not worth it compared to having a much faster/smoother experience with it off.

If the 3080 Ti is a good 20% faster than the 3080 I'll be upgrading to it, otherwise I'll wait for the 4080.
 
If the 3080 Ti is a good 20% faster than the 3080 I'll be upgrading to it, otherwise I'll wait for the 4080.

3080 to 3080 Ti and the only difference would be 20% more? Wow, not worth the price. I'll wait for the 4090 before I pay out any money to Nvidia again.

I do not recommend pairing a CX48 with an AMD GPU, as it's absolutely not the same experience driver-wise. Nvidia have some 'special sauce' fix in the drivers, I'm sure, as black levels and seemingly other settings are pre-calibrated in the Nvidia drivers to make everything work properly out of the box.
So now you're an expert on AMD and Nvidia GPUs and claim to know the CX only works with Nvidia! That's why the CX has FreeSync too.
 
What, you mean it hasn't released yet? What exactly does "coming soon" mean... next month, next year...? ;)

I'm not playing your game anymore; if you can't accept the 3080 Ti is coming and feel the need to quote me every time and debate it, you can join my special list on this forum.

I'm assuming your concern/denial is based on you not wanting your 3080 to be quickly eclipsed by a superior card with suitable VRAM to better match the new consoles? It's the nature of the game in the GPU market and nothing special. Either way, I'm done going through this with you each time.
 
I'm not playing your game anymore; if you can't accept the 3080 Ti is coming and feel the need to quote me every time and debate it, you can join my special list on this forum.

I'm assuming your concern/denial is based on you not wanting your 3080 to be quickly eclipsed by a superior card with suitable VRAM to better match the new consoles? It's the nature of the game in the GPU market and nothing special. Either way, I'm done going through this with you each time.

I haven't quoted you relentlessly regarding the 3080 Ti; could you even point to the last time I discussed it with you? I can recall you randomly tagging me in this thread regarding this, so I decided to post that recent comment in a facetious context (hence the smiley).

I had no idea you were this precious about it, so rest assured I won't discuss it with you again. And if you now have a list of people you can't converse with on this forum, then I think you need to review the conversations you're having and consider why they often reach a point where you can't speak to someone going forward.
 
I haven't quoted you relentlessly regarding the 3080 Ti; could you even point to the last time I discussed it with you? I can recall you randomly tagging me in this thread regarding this, so I decided to post that recent comment in a facetious context (hence the smiley).

I had no idea you were this precious about it, so rest assured I won't discuss it with you again. And if you now have a list of people you can't converse with on this forum, then I think you need to review the conversations you're having and consider why they often reach a point where you can't speak to someone going forward.

Don't worry, Dave has called half the guys in this thread a troll and even added me to his ignore list :p I think he has got his messages mixed up a bit; I haven't seen anyone saying the CX series isn't one of, if not the, best OLEDs going, otherwise why else would we all have spent stupid amounts of money on it?

Either that or he really loves his hardware too much!
 
Saw one on eBay as a refurbished unit for £1,149. He got five of them a few days ago but they sold out very quickly. Maybe keep an eye on his shop if you're OK with this price, but it only comes with a 1-year warranty.

https://www.ebay.co.uk/itm/LG-OLED4...83.l10137.c10&nordt=true&rt=nc&orig_cvip=true


Awesome really for the price and value, but I'd guess many were returned due to bad pixels and banding issues! Maybe a tad risky, but for the value I won't complain.

Another way to look at it though: if you are spending £1k+ you might as well spend the extra few hundred on a new panel; something like this would last you a good few years if not more.
 
I don't know if CX owners are aware of this, but if you are a fan of movies, some interesting stuff has happened over the past year, and it's only coming more into the open lately.

Dolby Vision HDR has been made unofficially available; currently the Nvidia Shield Pro supports it (somewhat buggy) and so does Plex. So basically you could rip your Blu-ray copy of, say, Lord of the Rings in 4K with the Dolby Vision track and just run it from your hard drive or NAS/server, and it would be a direct copy too.
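Since it's a straight remux, one quick sanity check before pointing Plex at the file is confirming the Dolby Vision metadata actually survived the rip. Here's a minimal Python sketch, assuming you have ffprobe from a reasonably recent ffmpeg build on your PATH and that it reports the DOVI configuration record in its JSON side data (the file name is just an example):

```python
# Minimal sketch: check whether a ripped MKV still carries its Dolby Vision
# metadata before serving it via Plex to the Shield. Assumes a reasonably
# recent ffprobe that exposes the DOVI configuration record as stream side
# data; the file path below is hypothetical.
import json
import subprocess
import sys

def has_dolby_vision(path: str) -> bool:
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    for stream in json.loads(out).get("streams", []):
        for side_data in stream.get("side_data_list", []):
            # Recent ffprobe builds label Dolby Vision metadata on the video
            # stream as a "DOVI configuration record".
            if "DOVI" in side_data.get("side_data_type", ""):
                return True
    return False

if __name__ == "__main__":
    movie = sys.argv[1] if len(sys.argv) > 1 else "LOTR_4K_remux.mkv"
    print("Dolby Vision metadata found" if has_dolby_vision(movie)
          else "No Dolby Vision metadata detected")
```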
 
Awesome really for the price and value, but I'd guess many were returned due to bad pixels and banding issues! Maybe a tad risky, but for the value I won't complain.

Another way to look at it though: if you are spending £1k+ you might as well spend the extra few hundred on a new panel; something like this would last you a good few years if not more.

It did say on the listing that '1 Year Warranty: customer return item which has been fully checked and refurbished by LG to full working order without screen burn or dead pixels', so I think it's not so bad.
 
It did say on the listing that '1 Year Warranty: customer return item which has been fully checked and refurbished by LG to full working order without screen burn or dead pixels', so I think it's not so bad.

Very impressive bargain then! I see it now under the 'read more' button; that thing would have sold like hot cakes in minutes though :eek:
 
It did say on the listing that '1 Year Warranty: customer return item which has been fully checked and refurbished by LG to full working order without screen burn or dead pixels', so I think it's not so bad.
I wouldn't trust anything eBay sellers say, so you'd be risking £1,149 on a refurbished unit vs £1,489 for a new LG CX OLED with a 5-year warranty, saving only 340 quid. Wow :eek:
 
It did say on the listing that '1 Year Warranty: customer return item which has been fully checked and refurbished by LG to full working order without screen burn or dead pixels', so I think it's not so bad.

Not a manufacturer warranty - it's the eBay seller offering a 1-year warranty - so possibly LG didn't even want to sell these as refurbished. They probably sold them at auction for ~500 quid and this retailer bought them and put them on eBay.
 
If the 3080 Ti is a good 20% faster than the 3080 I'll be upgrading to it, otherwise I'll wait for the 4080.

There's no way it will be 20% faster. The 3090 is already maxing out Ampere in a consumer card, and that is between 0 and 10% faster than the 3080 at 4K. The 3080 Ti is a direct response to the 6900 XT and will probably perform very similarly to the 3080/3090, with 16/20GB of GDDR6X. The main reason to get the 3080 Ti is the extra RAM in my opinion, which may become relevant in a couple of years, especially at 4K. But right now it probably doesn't make much sense from a value perspective.
 
3080 to 3080 Ti and the only difference would be 20% more? Wow, not worth the price. I'll wait for the 4090 before I pay out any money to Nvidia again.


So now you're an expert on AMD and Nvidia GPUs and claim to know the CX only works with Nvidia! That's why the CX has FreeSync too.


I've used both AMD and Nvidia GPUs on my CX 48. Have you?

I'm no expert; I'm just sharing my personal experience after almost six months of using the thing. My experience was far superior on my 3080, for several reasons I've already listed.
 
There's no way it will be 20% faster. The 3090 is already maxing out Ampere in a consumer card, and that is between 0 and 10% faster than the 3080 at 4K. The 3080 Ti is a direct response to the 6900 XT and will probably perform very similarly to the 3080/3090, with 16/20GB of GDDR6X. The main reason to get the 3080 Ti is the extra RAM in my opinion, which may become relevant in a couple of years, especially at 4K. But right now it probably doesn't make much sense from a value perspective.

Common sense and logic are telling me that there's no room for another performance tier between the 3080 and the 3090; the gap between them is already too small. I think Nvidia are keen to have the 3080 Ti sit a tier above all current cards, including the 3090 and 6900 XT, to cement Nvidia's dominance of the market once again.

We know from the card specifications that a fully enabled GA102, with perhaps faster memory, a binned higher-clocked/more efficient die, or even a TSMC 7nm version of GA102, could easily be 20%+ faster than the 3080, perhaps even 10% faster than the 3090.
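To put rough numbers on that, here's a back-of-the-envelope sketch in Python, assuming shader throughput scales simply with SM count times boost clock and that a hypothetical full-GA102 card runs at the 3090's clock. This is theoretical throughput only; real 4K results show a much smaller gap once memory bandwidth and power limits bite.

```python
# Back-of-the-envelope scaling, assuming throughput ~ SM count * boost clock.
# SM counts and reference boost clocks are the published figures; the
# "full GA102" entry is a hypothetical card clocked like the 3090.
cards = {
    "RTX 3080":          {"sms": 68, "boost_mhz": 1710},
    "RTX 3090":          {"sms": 82, "boost_mhz": 1695},
    "full GA102 (hyp.)": {"sms": 84, "boost_mhz": 1695},
}

baseline = cards["RTX 3080"]
base_throughput = baseline["sms"] * baseline["boost_mhz"]

for name, spec in cards.items():
    throughput = spec["sms"] * spec["boost_mhz"]
    gain = (throughput / base_throughput - 1) * 100
    print(f"{name:18s} ~{gain:+5.1f}% theoretical shader throughput vs 3080")
```

On those assumptions a full GA102 lands a bit over 20% above the 3080 on paper, which is where the "20%+" figure comes from, even though measured 4K gains are smaller in practice.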

As you mention though, it's the VRAM increase that excites many. 10GB just doesn't cut it for a high-resolution VR headset such as the Index, or for the 4K games releasing next year.
 
Just updated to the 3.11.30 firmware via the normal TV update method. Seems to have fixed the VRR stutter at higher refresh rates - excellent.

In a few hours I'll know if it's fixed the "your TV will turn off in 5 minutes" thing that was broken in the 3.11.25 fw...
 
This is not correct; GA102-300-A1 (the die used in the 3090) is not a fully enabled chip.

The 3090 is reaching its power/thermal limits already. The amount of power it requires to get that additional 10% performance is immense. I doubt enabling anything else in the die will get you anywhere unless you put another couple of hundred watts into the card, which means you need some insane cooling. I doubt we'll see much more performance improvement until a re-architecture of some sort, or they use a new process to reduce power consumption. I suppose anything is possible...
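A rough illustration of why that last bit of performance is so expensive, assuming dynamic power scales roughly as frequency times voltage squared and voltage has to rise roughly in step with clock speed, so power grows roughly with the cube of the frequency increase. The baseline wattage below is a ballpark figure, not a measurement.

```python
# Rough illustration: if voltage must scale ~linearly with clock, dynamic
# power (~ f * V^2) grows roughly with the cube of the frequency increase.
# The baseline board power is a ballpark figure for a 3090-class card.
baseline_power_w = 350.0

for perf_gain_pct in (5, 10, 15):
    freq_scale = 1 + perf_gain_pct / 100      # clock (and rough perf) scaling
    power_scale = freq_scale ** 3             # f * V^2 with V ~ f
    extra_w = baseline_power_w * (power_scale - 1)
    print(f"+{perf_gain_pct:2d}% clocks -> ~{power_scale:.2f}x power "
          f"(~{extra_w:.0f} W extra on a {baseline_power_w:.0f} W card)")
```

Under those assumptions even a 10% clock bump costs on the order of a hundred extra watts, which is why squeezing more out of the same die gets ugly quickly.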
 
The 3090 is reaching its power/thermal limits already. The amount of power it requires to get that additional 10% performance is immense. I doubt enabling anything else in the die will get you anywhere unless you put another couple of hundred watts into the card, which means you need some insane cooling. I doubt we'll see much more performance improvement until a re-architecture of some sort, or they use a new process to reduce power consumption. I suppose anything is possible...

Read my post, where I talk about them using either faster memory, a binned die, or manufacturing it on TSMC's 7nm vs Samsung's 8nm. It's just a few posts up...
 