LG 48CX OLED - 4K 120 Hz

Sign up to be notified and you'll be able to get them. They seem to get them in stock every week or two.

Currys always has them. YMMV there; I believe Currys has a direct LG support contract, so warranty issues should get sorted. Dealing with Currys CS is in general a huge pain (long wait times, etc.), but as long as you get it fixed it's okay. I have no idea how Currys handles burn-in on OLEDs.
 
Found an MSI 3080 Gaming X Trio at a local store today (Wales came out of lockdown today) at MSRP, and jumped on it. I'll use it with my CX48 while I wait for the 3080ti to launch (not convinced the 10GB will hold up well over the next year or so at 4K). I was originally going to get a 3090, though I dismissed that idea when the 3080ti news broke, as I believe the 3080ti will be faster than the 3090 for gaming. Nvidia are under pressure to release a 3080ti that's a league ahead of the RX 6900, so I think we'll see one sooner than most expect.

Before, I was running 4K 60 Hz 10-bit 4:2:2 on a Radeon VII. Now I'm of course running 4K 120 Hz 10-bit 4:4:4, and the difference is just astounding. OLED at 120 Hz is unbelievably smooth; it's the absolute fastest, smoothest gaming panel I've ever used. I'd gotten used to chroma subsampling on my Radeon VII, as I'd been using it since my CX48 arrived in June; the move back to 4:4:4 makes all the colours pop and text beautifully crisp to read.
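For anyone wondering why 4:2:2 was needed on the older card, here's a rough back-of-the-envelope sketch of the pixel data rates involved (it ignores blanking intervals and link encoding overhead, so real link requirements are a bit higher):

```python
# Rough HDMI bandwidth check for 4K 120 Hz. Ignores blanking
# intervals and link-level encoding overhead, so real figures
# are somewhat higher than these.
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Bits per pixel: 10-bit 4:4:4 carries 3 full samples per pixel,
# 10-bit 4:2:2 averages 2 samples per pixel.
full_chroma = data_rate_gbps(3840, 2160, 120, 10 * 3)
subsampled  = data_rate_gbps(3840, 2160, 120, 10 * 2)

print(f"4:4:4: {full_chroma:.1f} Gbps, 4:2:2: {subsampled:.1f} Gbps")
# HDMI 2.0 tops out at 18 Gbps while HDMI 2.1 allows up to 48 Gbps,
# which is why 4K 120 Hz 10-bit 4:4:4 needs an HDMI 2.1 card.
```

Even before overhead, 10-bit 4:4:4 at 4K 120 Hz blows well past what HDMI 2.0 cards like the Radeon VII can carry, hence the subsampling.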

I think there's also some secret sauce going on between LG/Nvidia. Although I had Freesync Premium and all other settings optimally configured, some video files and pictures looked a little too dark. I played about with the black level to no avail: auto was too dark for some content, high was too bright and negated the OLED contrast advantage. With the 3080, the black level seems changed, despite the settings on the panel remaining identical. Everything just looks better. I turned chroma subsampling back on with the 3080 just to see if that was the only difference, though found it was still an improvement over my Radeon VII in media. Strange.

Games are obviously fantastic. I've got many titles I was saving until I got an HDMI 2.1 card, so I'm like a kid on Christmas Day at the moment!
 
When was this announced? I haven't seen any details for a new model :confused:

Just a random rumor. I doubt we'll see a 3080Ti until next summer/fall though. The 3090 is such a colossal waste of money. Nvidia messed up big time on it. They should have put only 16GB on it and sold it for $999 to $1,099. I have no idea why they put so much GDDR on it as a gaming card, which raised the price substantially. Now the best top end is going to be the 6900XT, which will cost $999 and by all accounts performs better than the 3090.
 

The 3080ti 'random rumour' will be punching you in the face when it releases in a couple of months. It's extremely obvious and logical that it's coming, as Nvidia will want a mainstream 'flagship' card that's a clear cut above AMD's RX 6900 performance-wise.

What isn't obvious yet is the performance of the 3080ti. I'm expecting it to outperform the 3090, though they may go for something practically identical performance-wise, with more than 10GB of GDDR6X (12GB or 20GB) and a smaller price tag.

Read up on the implications of GPU memory bus width and the GDDR6X module sizes that were available when the 3080 released, and you'll understand why the 3080 only has 10GB. Spoiler: it was because only 1GB modules were available for purchase. The 3080 has 10 modules, the 3090 has 24. There's public information out there showing 2GB modules are in the works/now shipping. Apply a little logic and you'll see what can quickly happen here.
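To spell out that logic: each GDDR6X module has a 32-bit interface, so the bus width fixes the module count (doubled in clamshell mode, where two modules share a channel, as on the 3090), and module count times module density gives the VRAM. A toy sketch of the arithmetic, with the 2GB-module rows being the speculative part:

```python
# How bus width and module density determine a card's VRAM,
# assuming standard 32-bit-wide GDDR6X modules.
def vram_gb(bus_width_bits, module_gb, clamshell=False):
    channels = bus_width_bits // 32          # one module per 32-bit channel
    modules = channels * (2 if clamshell else 1)  # clamshell doubles modules
    return modules * module_gb

print(vram_gb(320, 1))                  # 3080: 10 x 1 GB = 10 GB
print(vram_gb(384, 1, clamshell=True))  # 3090: 24 x 1 GB = 24 GB
# Speculative 3080ti configs once 2 GB modules ship:
print(vram_gb(320, 2))                  # 320-bit bus: 20 GB
print(vram_gb(384, 1))                  # 384-bit bus, 1 GB modules: 12 GB
```

So both rumoured 3080ti capacities (12GB and 20GB) fall straight out of the bus widths Nvidia already uses.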
 

Well, not quite. To be accurate, only on a Zen3/X570 platform with Rage Mode and Smart Memory Access enabled does it trade blows with a stock 3090 (except for RT performance); I wouldn't say it performs better. Of course, at $500 cheaper it's a no-brainer better 'value'...

Regarding 3080Ti vs 6900XT... availability will be king for me, but assuming equal availability this is the way I see it:

3080Ti will be slightly slower in raster than 6900XT
3080Ti will be much faster in RT
3080Ti will have 4 more gigs of ram (who cares)
3080Ti will have a better software stack (DLSS, video encoding, RTX Voice).

Assuming equal availability I would lean back towards the 3080Ti, but I'm betting the 6900XT will be orders of magnitude more available (relatively speaking) than the 3080Ti. I'm certainly not going to 'wait' for the 3080Ti if there are 6900XTs widely available.
 
Assuming equal availability, the 3080Ti would be a very good buy in January. That is a huge assumption though! I'd be pleasantly surprised if Nvidia can pull it off. I would buy that over a 6900XT.
 

@Dirk Diggler and others are in complete denial that the 3080ti is coming soon, despite it being extremely obvious!

I'll be getting a 3080ti on release; I knew full well it was coming and still picked up a 3080 on Monday, as I needed something with HDMI 2.1 while the 3080ti is being baked!
 
Got hold of a 3070 to replace my 2080Ti (just for HDMI 2.1) and it's glorious! The full RGB 4:4:4 is a shed load better than chroma subsampling! Loving it!
Now to wait until the 3080's are more easily available.
 

Yeah, it's hard to overstate how beautiful 4K 10-bit 4:4:4 looks! I'd completely gotten used to 4:2:2 10-bit 4K on my Radeon VII; it was an eye opener with my 3080 for sure!
 
Hoping you guys can answer some questions for me. I've been looking at the G9, but now that I've found out the PS5 won't support 1440p, I'm thinking this might be an alternative. I'm not 100% sure though, as it will have to sit on my desk. Does anyone have it set up on their desk? What distance do you sit at, and is that workable? It'll be on for around 15 hours a day for a bit of light work, internet and gaming. Will it hold up?
 

This has been answered a fair few times in this thread recently.
 

Curious about you saying your 3080 renders video files with better black levels; it still sounds like a setting not being enabled or disabled.

At first, I thought maybe the difference in your dark video content was due to you not enabling Full output dynamic range in the Nvidia control panel; on Limited you get washed-out colours and no true blacks. Or maybe you haven't increased the Windows HDR/SDR contrast slider; I find it often needs to be topped up to around 45-50 to bring out details in dark scenes during video playback.

But Vincent said you need to set the black level on the CX to 'Low', and PC mode must also be enabled in the dashboard to get full RGB support; obviously HDMI Ultra HD Deep Colour needs to be enabled as well to get correct blacks and colours.

He covers it @ 13 minute mark here:

https://youtu.be/5Y9qcXMQg_s?t=808
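For anyone unclear on why a Full/Limited range mismatch crushes or washes out blacks, here's a tiny illustration of the standard video-levels convention (limited-range maps black to code 16 and white to 235; the exact GPU/TV behaviour varies, this is just the general idea):

```python
# Limited-range ("video levels") RGB maps black to code 16 and
# white to code 235. If the GPU outputs limited but the display
# expects full range (0-255), "black" arrives as code 16 and is
# shown as a dark grey -- the washed-out look described above.
def full_to_limited(v):  # v is a full-range value in 0..255
    return round(16 + v * (235 - 16) / 255)

print(full_to_limited(0))    # black becomes code 16
print(full_to_limited(255))  # white becomes code 235
```

That's why the GPU output range and the TV's black level setting have to agree: either Full + Low/PC-style full range, or Limited + the matching black level, but never mixed.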
 