
NVIDIA 4000 Series

Soldato
Joined
31 Oct 2002
Posts
9,953
Like I've said, 120 isn't my target, not every game has DLSS, and I don't give a toss about OLED until burn-in is a non-issue after years of heavy use.

Have you been living under a rock? LG OLEDs last years upon years of heavy use with no burn-in. I've had my CX48 for over 3 years of very heavy use, no burn-in. Professional tests have also been done by Rtings.com; after thousands of hours of unrealistically punishing testing, they conclude burn-in is not an issue: https://www.rtings.com/tv/learn/real-life-oled-burn-in-test
 
Soldato
Joined
31 Oct 2002
Posts
9,953
I'm not too sure the 4090 can consistently hold 60fps (without a single drop) during gameplay in CP77 with path tracing at NATIVE 1080p. 4K is a no-go.

Where did I say it could? I played Cyberpunk through at 4K with RT ultra, everything else ultra, and DLSS 3.0 at a steady 120FPS; a beautiful experience. I haven't tried path tracing, though I imagine 4K path tracing at ~120FPS will have to wait for the 5090.
 
Soldato
Joined
6 Feb 2019
Posts
17,921
I'm not impressed by the temps; Asus says this 4090 stays at or under 60°C at 450W with fans at 1000RPM. I get 5 to 10°C better temps at 480W with fans at 1300RPM on my Gigabyte AIO 4090, and it doesn't have fancy liquid metal TIM.

 
Soldato
Joined
18 Feb 2015
Posts
6,492
Can someone explain to me why the 4090 scales poorly at lower resolutions? Is it because of significant driver overhead, as HardwareUnboxed suggests, or because, if I understood correctly, it lacks a certain scheduler?
It's neither of those. Driver overhead limits your maximum fps, but strictly speaking that doesn't explain how performance scales across resolutions.

The reason it scales worse at lower resolutions is that the easiest way (and the one universal across all games) to put all the shader cores to work is simply to increase the number of pixels they have to work on. With a GPU that has as many cores as a 4090, the only way to get close to utilising all (or nearly all) of them is to increase the resolution; otherwise each game would individually need specific settings, or to be coded in a way, that properly distributes work across all those cores. Because every game engine is unique, and even how each game uses a particular engine differs, you generally won't see high utilisation of the card except by upping the resolution. Plus, some rendering work in games has to be done in a certain order or at a certain time, so you'll never get the full parallelisation you might get in, for example, rendering a scene with Blender, or supercomputer simulations with tailor-made software.
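To put rough numbers on that, here's a quick back-of-envelope sketch in Python (my illustration, not from the post; it assumes the 4090's published count of 16,384 CUDA cores and deliberately oversimplifies by treating one pixel as one unit of parallel work):

```python
# Back-of-envelope: per-frame pixel work available to the 4090's
# shader cores at common resolutions. Treating one pixel as one
# unit of work is a deliberate oversimplification; real per-pixel
# shading cost varies enormously between games and settings.

CUDA_CORES = 16_384  # RTX 4090's published shader core count

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:>9,} pixels, "
          f"~{pixels / CUDA_CORES:.0f} pixels of work per core per frame")
```

Under that crude model, 1080p offers only ~127 pixels of work per core per frame versus ~506 at 4K, so the higher the resolution, the bigger the pool of independent work there is to keep all those cores fed.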

There's a lot more to it, but that's the gist. If you want some interesting reading, look into the changes AMD made going from GCN to RDNA with respect to how work is bundled. Vega, for example, was a monster on paper, with lots of cores, but had trouble putting them all to work, so in practice it disappointed in games relative to its on-paper specs. Actually utilising the hardware is no small task; a lot of effort has to go into how the hardware is designed, as well as into the software it's meant for and what that software needs or can use. A Lamborghini going off-road is not going to be very speedy.
 
Soldato
Joined
14 Aug 2009
Posts
2,930
Where did I say it could? I played Cyberpunk through at 4K with RT ultra, everything else ultra, and DLSS 3.0 at a steady 120FPS; a beautiful experience. I haven't tried path tracing, though I imagine 4K path tracing at ~120FPS will have to wait for the 5090.
You said 1080p and 1440p are old and that only 4K truly matters.
Well, if you want native or as close to it as possible, 4K is too much, as my example shows. Ergo those resolutions are still quite relevant.
 
Permabanned
Joined
31 May 2023
Posts
56
Location
Europe
Where did I say it could? I played Cyberpunk through at 4K with RT ultra, everything else ultra, and DLSS 3.0 at a steady 120FPS; a beautiful experience. I haven't tried path tracing, though I imagine 4K path tracing at ~120FPS will have to wait for the 5090.
I'm aware that you're trolling; you do it on every forum, using the same pattern. You present yourself as a high-end consumer and then talk about how things are outdated and how everyone should migrate to something better, etc. However, I'm quite certain that you don't actually own any of what you write about, and that it's just part of your trolling. We all know how much trust can be placed in people on the internet, especially confirmed trolls. But let's get back to the topic. I chose 1440p not because I can't afford 4K, but because the current hardware isn't good enough for what I consider an optimal gaming experience at 4K resolution. Future games will further expose the weaknesses of the 4090, which already struggles in many games. That's why I opted for a middle ground with very high frame rates. When the 5090 is released, I'll consider 4K resolution, and hopefully I'll be able to tell a different story.

edit: I forgot to mention that I don't count DLSS 3.0's generated frames as actual frames, because they aren't. It's a partial, semi-solution that I don't take into account; the exception is DLSS 2.0 on the quality setting.
 
Soldato
Joined
22 May 2010
Posts
12,363
Location
Minibotpc
I'm not impressed by the temps; Asus says this 4090 stays at or under 60°C at 450W with fans at 1000RPM. I get 5 to 10°C better temps at 480W with fans at 1300RPM on my Gigabyte AIO 4090, and it doesn't have fancy liquid metal TIM.


That's underwhelming; something seems off with that Asus block.
 

RSR

Soldato
Joined
17 Aug 2006
Posts
9,632
The 4090 is the only smart choice this gen; disregard all others.

Poor people not allowed.
 