NVIDIA 4000 Series

You're just wasting performance. Downsampling via DSR to that small a resolution is a waste of money.

HDR is available on the 30 and 20 series GPUs.
Motion clarity is just an FPS number. You can turn settings down to reach a decent motion resolution, and DLSS 3.0 motion interpolation won't help with that.

An OLED is a display type, independent of the 4090.

At your resolution, it's a waste of money. You would have been better off getting a bigger, higher pixel count display rather than kidding yourself that DLDSR is doing anything magical. You'll have to pixel peep to find the difference. DSR is never going to make a low-resolution display a higher-resolution one. It's not making your 1440p ultrawide monitor a 4K or 8K one.

A 4090 is best suited to triple-4K gamers, 8K gamers, or high refresh rate 4K panel owners who NEED that FPS number as high as possible at max settings.
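To put some rough numbers on that, here's a quick back-of-the-envelope pixel count comparison (a sketch, nothing more):

```python
# Rough pixel counts per frame for the setups being discussed; the
# ratios are why a 4090 only really earns its keep at the top end.

setups = {
    "3440x1440 ultrawide": 3440 * 1440,
    "4K (3840x2160)":      3840 * 2160,
    "Triple 4K":           3 * 3840 * 2160,
    "8K (7680x4320)":      7680 * 4320,
}

baseline = setups["3440x1440 ultrawide"]
for name, pixels in setups.items():
    print(f"{name:>20}: {pixels / 1e6:5.1f} MP "
          f"({pixels / baseline:.1f}x the ultrawide)")
# -> 5.0 MP (1.0x), 8.3 MP (1.7x), 24.9 MP (5.0x), 33.2 MP (6.7x)
```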

For the price of a 4090, shopping wisely, you can get very close to triple 4K OLED panels.
I use my PC with a Samsung Odyssey Neo G, which is a 4K HDR MiniLED, for games with a very bright presentation like Horizon Zero Dawn, Forza etc., an AW3423DW for contrasty games like Cyberpunk and Dying Light 2, and an LG CX 4K OLED when I just want to kick back with a controller. I use the AW3423DW simply because I like the ultrawide resolution, as it feels more immersive whilst still offering the best HDR performance available on the market: QD-OLED has a brighter color volume which simply cannot be matched by any other display currently available. Motion clarity on the OLED is a league above the LCDs on the market. The AW3423DW at 175Hz feels like a typical LCD at 240Hz. These are not advantages I would give up for a sharper image and haloing in dark games on a MiniLED monitor.

I tried out 4K monitors and ultrawide, and I prefer the ultrawide with DLDSR because the extra FOV is very immersive. DLDSR does add a lot to the sharpness of the image if you are using DLSS. Cyberpunk at 3440x1440 with DLSS Quality looks extremely blurry because it's upscaling from a sub-1080p resolution. Using DLDSR, I can basically get DLSS to render at native 1440p, acting as a form of DLAA, for an incredible boost in image clarity.
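For anyone wondering how the numbers work out, here's the rough resolution arithmetic. The 2/3-per-axis scale for DLSS Quality and the 1.5x-per-axis factor for DLDSR 2.25x are the commonly quoted figures, so treat this as an approximation rather than exact driver behaviour:

```python
# Rough render-resolution arithmetic for DLSS Quality (~2/3 scale per
# axis) and DLDSR 2.25x (1.5x per axis) on a 3440x1440 ultrawide.

def dlss_quality_input(out_w, out_h, scale=2 / 3):
    """Internal resolution DLSS Quality upscales from."""
    return round(out_w * scale), round(out_h * scale)

native = (3440, 1440)

# Plain DLSS Quality renders well below 1080p, hence the blur.
print(dlss_quality_input(*native))   # -> (2293, 960)

# DLDSR 2.25x presents a 1.5x-per-axis target to the game...
dldsr = (int(native[0] * 1.5), int(native[1] * 1.5))
print(dldsr)                         # -> (5160, 2160)

# ...so DLSS Quality on top renders at native 1440p, like DLAA.
print(dlss_quality_input(*dldsr))    # -> (3440, 1440)
```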

Also, I am not sure where people get the idea that the 4090 is only suited to 4K. My OLED has a refresh rate of 175Hz. Although I don't really notice the difference past 80-90 fps, you need a 4090 if you want to drive the full 175Hz of this monitor. My 3080 Ti was barely cracking 60 FPS in Cyberpunk at this resolution, and I am sure a 4080 or a 7900 XTX wouldn't be much better. Only the 4090 can handle games like these.
 
So I upgraded my CPU to the 5800X3D. 4090 GPU utilisation is now usually above 90%, whereas it was often in the 70% range when I had my 3900X.
It makes a noticeable difference in F1 22: no more occasional jerkiness in replay mode with all settings maxed at 4K.
Managed to get it for £328 during the amazon.de offer.
So I very much recommend a CPU upgrade to all you lucky 4090 owners on older CPUs.
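A rough way to see what that utilisation jump is worth, with made-up round numbers purely to illustrate the bottleneck (the 120 fps figure is hypothetical, not a measurement):

```python
# Toy CPU-bottleneck model: if the 4090 could deliver gpu_bound_fps
# when fully fed, sustained utilisation below 100% caps the frame
# rate at roughly that fraction. Illustrative numbers only.

gpu_bound_fps = 120  # hypothetical fully-GPU-bound frame rate

for cpu, util in (("3900X", 0.70), ("5800X3D", 0.90)):
    print(f"{cpu}: ~{util:.0%} utilisation -> roughly "
          f"{gpu_bound_fps * util:.0f} fps")
# 3900X: ~70% utilisation -> roughly 84 fps
# 5800X3D: ~90% utilisation -> roughly 108 fps
```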
 
Did the same from a 3800X
 
I'm not debating MiniLED vs OLED. I own triple OLEDs. I only like OLEDs and JVC projectors for my displays, having owned MiniLED 32:9 monitors, 21:9 monitors, 16:9 high-end FALD TVs, etc. However, I do like FALDs for static content, like in my gym.


The 4090 is going to increase the FPS number, but even with a 3090 Ti you will already be above 60fps by a decent margin in most games (bar Cyberpunk and FS2020, maybe). Spending £1,600-£2,000 on a single GPU just to get closer to an FPS target is, to many, a bit nonsensical given the crazy price increase this generation.

However, for triple 4K and 8K OLED, you're truly on the edge at 40-80fps depending on the title, and that's where a 4090 is almost a necessary purchase unless you upscale from 1440p.
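As a very crude sanity check on that, assuming a GPU-bound game whose frame rate scales inversely with pixel count (a simplification, and the 80 fps baseline is hypothetical):

```python
# Crude GPU-bound scaling model: frame rate falls roughly in
# proportion to pixels rendered. Real games don't scale perfectly
# linearly, so these are ballpark figures only.

fps_at_4k = 80                  # hypothetical 4K result
pixels_4k = 3840 * 2160

for name, pixels in (("Triple 4K", 3 * pixels_4k),
                     ("8K", 7680 * 4320)):
    print(f"{name}: ~{fps_at_4k * pixels_4k / pixels:.0f} fps "
          f"(from {fps_at_4k} fps at 4K)")
# Triple 4K: ~27 fps (from 80 fps at 4K)
# 8K: ~20 fps (from 80 fps at 4K)
```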

You can prefer whatever you like, TBH. I've owned 32:9. It's nice, but 1440p is 1440p and your resolution is your resolution. DLDSR is not going to dramatically increase PQ over a native panel of the higher resolution. People are just kidding themselves that the difference is dramatic; this has been quantified and proven, and there are plenty of videos and tests showing it. That's why we have higher-resolution panels. Otherwise, why not just DSR 8K onto a 1080p panel?
 
Got my 4080 FE this morning. I haven't had time to do much testing, but it's noticeably more power efficient than my 3080: 60-80W less on the same settings in a quick Hunt: Showdown test. Shame it's so overpriced, as it's a great product.
You only have yourself to blame for the pricing as you've just gone and validated it.
 
Nice to hear you've seen a noticeable improvement in power draw. I do love the fact that the FE uses the same cooler as the 4090, so it's damn efficient at cooling. Agree on price; it's a bit over the top, which is a shame.
 
Out of interest, what made you go for the card instead of waiting for the AMD cards?
 
I don't really like playing at 60 fps. I want my FPS to be as high as possible. On an OLED display, it's very easy to tell the difference between a rock solid 80-90 fps and 60 fps because of the instant pixel response times. Going from 1440p to 4K makes the image sharper, but the drop in performance is noticeable and sometimes more detrimental to my enjoyment than the sharpness gained. There are plenty of games which only a 4090 can comfortably run at 1440p at high frame rates: Cyberpunk, Dying Light 2, A Plague Tale, Ghostwire: Tokyo, Watch Dogs: Legion, Metro Exodus and Red Dead Redemption 2, off the top of my head. If you want to run these games at native 3440x1440 (or DLSS Quality at worst) and get near 100 fps maxed out, you need a 4090. The 3090 Ti cannot run any of these games at those settings, as my 3080 Ti sure could not. Cyberpunk with RT Overdrive will likely need DLSS Performance mode at 4K even on a 4090; I would get by with DLSS Quality because of the headroom.

I bought the 4090 for high frame rate gaming with ray tracing. If I wanted to remain at 60-70 fps, I would not have upgraded from the 3080 Ti. There is also the issue that there is no 4K OLED monitor on the market. I cannot tolerate IPS glow, or black smear and terrible viewing angles on VA, and I won't give up the true HDR performance of OLED. HDR is a much more noticeable upgrade than 4K.
 
Yup, that's totally fair enough. All preferences. For me, I wouldn't feel comfortable chasing FPS for £1,500-£2,000. I don't mind paying £500-£800 for a nice increase in FPS, but above that price range I want something transformative to the experience, which ultrawide, VR and triple-monitor setups all offer in my experience. 3D gaming also did the same, but it's a pain to get working with Geo-11 drivers.

High FPS is nice. It's noticeable, but it's not transformative. As I said, at 3440x1440, for 99% of games, a 3090/3090 Ti would have definitely done the job and got you to a decent number FPS-wise. It's simply not a demanding resolution.

I'm not sure why you keep talking about OLEDs. As I said before, I only game on OLEDs and I'm well aware of their positives (and negatives). I hate VA & IPS panels.


Where the 4090 comes into play is at much higher resolutions IMO.

An upgrade path from the Alienware ultrawide, IMO, is triple 42/48/55-inch C2 OLEDs. They have a few drawbacks, but in the right space they're going to look downright cinematic and ridiculous compared to the ultrawide you have. For me, the 21:9 Alienware felt like a monitor. The big OLEDs feel like an insane experience - nearly as cool as my projector - but each to their own.
 
They did eventually sell out though from the official place… albeit much more slowly than usual.

Just had another fly in the ointment of my hopes to go AMD this time around though - adding to the issue of certain high end VR headsets not supporting AMD cards, I see mbucchia from Microsoft has managed to use the optical flow hardware in the 4000 series to make a much improved version of motion reprojection in VR with substantially reduced artefacts. He also seems to be working with them on getting DLSS 3.0 working specifically in VR.

Those are both pretty significant wins for VR-specific use cases, when Nvidia was already the better-performing choice for VR anyway.

I understand that VR is a small part of the market, but so is 4K (the Steam survey numbers for the two were actually pretty similar). I do wish AMD would put more effort into it, personally.

At the moment it's looking like, with VR being by far my main use case, I'm yet again pretty much locked to Nvidia.
 
I was using the 48-inch LG CX as a monitor prior to the Alienware, and it required a lot of adjustments to be usable for PC gaming. I had to push my desk back quite a bit to get the whole thing in my FOV, but at that point you lose the advantage of the 4K resolution and are stuck with the bad performance. That's when I set up a custom 3840x1600 resolution so I could sit close and still keep games in my FOV, but the black bars bothered me, and since the TV isn't curved it didn't feel that immersive to me.
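For reference, the arithmetic behind those bars, assuming the CX's native 3840x2160 panel with the custom mode centred:

```python
# Why a custom 3840x1600 mode letterboxes on a 16:9 4K panel: the
# mode keeps the panel's full 3840 width, so the unused height
# becomes black bars split between top and bottom.

panel_w, panel_h = 3840, 2160   # LG CX native resolution
mode_w, mode_h = 3840, 1600     # custom ultrawide-style mode

bars = panel_h - mode_h
print(f"Mode aspect {mode_w / mode_h:.2f}:1 vs panel {panel_w / panel_h:.2f}:1")
print(f"{bars} px of black bars in total, {bars // 2} px top and bottom")
# Mode aspect 2.40:1 vs panel 1.78:1
# 560 px of black bars in total, 280 px top and bottom
```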

The reason I bring up OLED is that it has a tendency to make good content look great but bad content look horrible. If you are using a 3090 at 4K, you will be hovering around 60 fps in many games without resorting to the lower DLSS modes, and OLED has a juddering effect at that frame rate which makes it feel lower than it actually is, which I detest. 60 fps on my Neo feels like 70 fps does on the OLED, but conversely 90-100 fps on the OLED feels like 120 fps on the Samsung. Maybe it's because I am more sensitive to this sort of effect.
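The judder point maps onto simple sample-and-hold maths. This is a simplification that ignores LCD response times (which smear frames together and can mask judder at low fps), but it shows why the frame rate is so visible on an OLED:

```python
# Sample-and-hold persistence: with near-instant OLED pixel response,
# each frame sits on screen for the whole frame time, so perceived
# judder/blur tracks 1/fps directly.

for fps in (60, 70, 90, 100, 120):
    print(f"{fps:3d} fps -> each frame held for ~{1000 / fps:.1f} ms")
#  60 fps -> each frame held for ~16.7 ms
#  70 fps -> each frame held for ~14.3 ms
#  90 fps -> each frame held for ~11.1 ms
# 100 fps -> each frame held for ~10.0 ms
# 120 fps -> each frame held for ~8.3 ms
```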

For me, the perfect monitor would have the HDR peak brightness of the PG32UQX with the black levels of OLED. The problem with OLEDs is they just don't get as bright as a MiniLED in high-APL games, and a bright sunny day legit looks like SDR on the OLED compared to the MiniLED. That's why I kept both my Samsung and my AW, so I can switch based on the type of game. Instead of triple OLEDs and obscenely high resolutions, I would like a monitor which displays true HDR all of the time. I just don't find the sharpness upgrade of 4K a big deal compared to HDR, RT and a higher refresh rate. The latency drop is noticeable to me.
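To make the high-APL point concrete, the shape of it looks roughly like this. The figures below are ballpark numbers recalled from panel reviews, not measurements, so treat them as purely illustrative:

```python
# Illustrative peak brightness vs window size for the two panels
# being compared. Ballpark figures only, NOT measurements; they're
# here to show why OLED falls behind MiniLED at high APL.

approx_peak_nits = {
    #                      2% window, 10% window, full field
    "PG32UQX (MiniLED)":  (1400, 1400, 1200),
    "AW3423DW (QD-OLED)": (1000,  450,  250),
}

for panel, (w2, w10, full) in approx_peak_nits.items():
    print(f"{panel}: ~{w2} nits @ 2%, ~{w10} @ 10%, ~{full} full field")
```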
 
At the moment it’s looking like due to VR being by far my main use case yet again I’m pretty much locked to Nvidia.
I don't know what your exact criteria are, but I play VR on a Quest 2 with a physical Link cable on an RX 6800 & it's fantastic. I don't have anything else to compare it to, but everything seems to run great & look good. I'm playing Moss 2, HL Alyx, Myst VR, SW Squadrons, Lone Echo, etc. Maybe for sims it's less ideal; I play MSFS 2020 sometimes but have never tried it in VR.
 

Yeah, I'm sure those games are great on the Quest with a 6800… it's not like AMD cards are unusable.

Unfortunately I'm running a higher-resolution headset (a G2 currently, with a Crystal preorder), and my most commonly played games are also possibly the worst for performance in VR: sims like ACC and MSFS.

The biggest barrier, however, is that both Varjo and now Pimax simply don't support AMD with their newest headsets, due to the frameworks Nvidia provides that underpin how they work… you have to wonder if that's going to become more and more common as these more advanced headsets with foveated rendering become the norm. As one example, according to mbucchia (developer of the OpenXR Toolkit and a WMR dev), AMD doesn't support foveated rendering at all in DX11, only in DX12, which obviously isn't much use when the majority of games are still DX11. Nvidia has provided the tools to get it working in both.
 
Fair enough, your use case is fairly niche. The headsets & features you mention are a niche of a niche. Even PCVR is a small fraction of a market that's probably been entirely loss-making for its whole life. The only mass-market VR platforms so far, afaik, are Quest 2 standalone & PSVR 1. I think there's more cause to worry that PCVR disappears entirely than that AMD are left out of the party. To give an analogy: I don't think it's comparable to CUDA (which I use at work). But we'll see, I guess.
 
Recession is for the poor, not the ones who can afford a new PC, and the poor wouldn't buy a 4080 or 4090 anyway.
How did you get to be not poor?

January:
Throw in poverty, actual real poverty… and you have a pretty terrible existence.

I buy the odd cheap game to keep me sane. My computer's terribly old (7 years) but still keeps up somehow, so at least I have that as my only luxury.

I only get £88 per week, so I struggle to survive with the bills even though I only have basic services.

btw, before someone mentions that I post in the stock market trading thread: I literally have like £600-700 in the stock market, and that was birthday/Christmas money from my parents over the years that I'm saving for when my ancient PC finally dies.

July:
Amazon delivered my 34-inch ultrawide :D
Zero dead/stuck pixels :D

glad DPD were so crap now tbh

How the hell do people get a curved monitor level? Even the bottom of the monitor surround seems to curve down at the corners, unless it's an optical illusion.

I just built a 12700K system for £1,500 and stuck my old 980 Ti inside it :D
Who wants to pay current prices for a last-gen GPU? NVDA are laughing all the way to the bank with their 40-50% net profit margin, and I bet AMD aren't much lower...

I was tempted to get a 3060 Ti FE to tide me over, but meh, why bother? There are no games coming out that need a new GPU anyway.

Once the recession hits and they need to fight for sales so their stocks don't crash, the good prices might come.

October:
 