> I assume there is going to be a 4070 16GB one day?

Admins are busy today!

Seems unlikely unless there ends up being a cut down RTX4080 die used for 4070 Super or something.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
> Admins are busy today!

Seems unlikely at this point. The AD104 chip on the 4070 doesn't support a bus configuration that would work with 16GB without massively tanking the memory bandwidth by going to 128-bit.
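For anyone wondering why the bus width matters so much, the back-of-the-envelope arithmetic looks like this. A rough sketch in Python; the 21 Gbps GDDR6X data rate is the 4070's published spec, the rest is just the standard formula:

```python
# Peak GDDR bandwidth: bus width (bits) / 8 * per-pin data rate (Gbps) = GB/s
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4070 as shipped: 192-bit bus, six 2GB chips = 12GB
print(bandwidth_gb_s(192, 21))  # 504.0 GB/s

# A hypothetical 16GB 4070 on a 128-bit bus (eight 2GB chips in clamshell,
# like the 4060 Ti 16GB) would give up a third of that bandwidth:
print(bandwidth_gb_s(128, 21))  # 336.0 GB/s
```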
I assume there is going to be a 4070 16GB one day? I'd be tempted by one of those, but only when it's cheaper.
> Lower resolutions, such as 1080P and 1440P, should be consigned to the history books already, especially when considering a 4090.
> There's only one extremely niche use case, that's professional esport gamers (Counter-Strike, PUBG, etc.) where lower latency and 360Hz monitors actually matter, though that's such a small subset of the population it's barely worth mentioning.
> The 4090 needs 4K to shine. 4K is also now mainstream, thanks to the three-year-old current gen of consoles.

I disagree with you for one reason: ray tracing. Ray tracing is a feature that can bog down even the 4090, let alone weaker cards. Now that we are in the era of ray tracing, generational performance leaps need to be much larger, but unfortunately they aren't. We saw in the case of Cyberpunk and its path tracing, or whatever it's called, that it can reduce performance by up to 75%. DLSS 2.0 helps a bit, but DLSS 3.0 is not worth mentioning, because those aren't true frames and they don't provide the feeling of actually playing at that many frames per second. So without DLSS the 4090 runs at 16 FPS; with DLSS, how many frames would it reach, 30 or 40? We also saw it in the Lord of the Rings game, if I remember correctly, where at native resolution the 4090 only manages 48.1 FPS. What about future UE5 games? If 40 FPS is good enough for you, feel free to continue playing, but it's not sufficient for me.
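To put numbers on that argument (the 75% hit and the 16 FPS are the figures from the post above; the DLSS scaling factor is my own rough assumption, and real gains vary by game and quality mode):

```python
# Illustrative frame-rate arithmetic for the path-tracing argument.
pt_cost = 0.75          # "up to 75%" performance hit from path tracing (post's figure)
pt_fps = 16.0           # 4090 native with path tracing (post's figure)

raster_fps = pt_fps / (1 - pt_cost)
print(raster_fps)       # 64.0 -> implied frame rate before the path-tracing hit

dlss_scale = 2.2        # assumed upscaling gain; not a measured number
print(pt_fps * dlss_scale)  # 35.2 -> right in the "30 or 40?" range
```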
Thanks. I suppose 12GB is just about enough for 1440p. It's such a weird combo to have no 16GB option on the 4070 while the 4060 Ti gets 16GB. No idea what Nvidia was thinking there; it's a messy lineup.
This might be OT, but the CPU/motherboard/RAM price for Zen 5 vs Zen 4 has doubled, albeit with 32GB of RAM instead of 16GB. So it's not just Nvidia that is charging more.
> no it doesnt at all, i use a 4090 at 1440p because i like my games to be at least 140+ not the "mainstream" 30fps console peasant edition 4k

Most triple-A games can't get to 140+ fps due to CPU limitations.
> CPU limitations in the way of poor engine optimisation, not the CPU itself being the issue ^^
> Take The Last of Us as a prime example. At launch it was sub-85fps a lot of the time; now it never drops below 85 lol on the same card. The 4090 is over 100fps but still can't hit 144fps, with or without DLSS, because the engine is still not properly optimised for multi-core CPU utilisation.

That's right, people have gotten the impression that processors are too weak, but the truth is far from that: the real issue is lazy developers and insufficient optimization. Unfortunately, the trend is that we are receiving half-baked games. In the past, game development studios used to pay beta testers, but now we are paying them to test their games.
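The "engine, not the CPU" point is basically Amdahl's law: if part of each frame's CPU work is stuck on one thread, extra cores (and a faster GPU) stop helping. A minimal sketch with invented frame-time numbers, just to show the shape of the ceiling:

```python
# Why a poorly threaded engine caps FPS no matter the GPU (illustrative numbers).
def cpu_limited_fps(total_ms: float, serial_ms: float, cores: int) -> float:
    # serial_ms of each frame cannot be parallelised; the rest scales with cores
    frame_ms = serial_ms + (total_ms - serial_ms) / cores
    return 1000.0 / frame_ms

for cores in (2, 4, 8, 16):
    print(cores, "cores:", round(cpu_limited_fps(12.0, 7.0, cores), 1), "fps")
# 2 cores: 105.3, 4 cores: 121.2, 8 cores: 131.1, 16 cores: 136.8.
# With 7 ms of serial work per frame the ceiling is 1000/7 ≈ 143 fps,
# which is roughly why a 144 fps target stays out of reach here.
```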
I think the "lazy developers" thing is a ridiculous trope. Game devs are one of the hardest working groups of software developers in general (and the pay is generally **** for the skillset required as well). In software development in general, game dev is notorious for long hours, putting in weekends and "death marches".That's right, people have gotten the impression that processors are too weak due to lazy developers and insufficient optimizations, but the truth is far from that. Unfortunately, the trend is such that we are receiving half-baked games. In the past, game development studios used to pay beta testers, but now we are paying them to test their games.
> Thanks. I suppose 12GB is just about enough for 1440p.

Only until Far Cry 7 is released with the HD texture pack. Released in Oct 2021 I think, Far Cry 6 was a struggle on a 10GB 3080 using the HD texture pack.
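A quick ballpark of why an HD texture pack eats VRAM (the compression format and texture count below are assumptions for illustration, not Far Cry 6's actual asset data):

```python
# Rough VRAM cost of high-resolution textures (illustrative only).
def texture_mb(size_px: int, bytes_per_texel: float = 1.0) -> float:
    base_mb = size_px * size_px * bytes_per_texel / 2**20  # BC7 ~ 1 byte/texel
    return base_mb * 4 / 3                                 # full mip chain adds ~1/3

print(texture_mb(4096))               # ~21.3 MB per 4K texture
print(500 * texture_mb(4096) / 1024)  # ~10.4 GB if 500 of them are resident
```

A few hundred 4K textures resident at once is already past a 10GB card's budget, at which point the game has to start streaming textures in and out instead of keeping them on the GPU.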
> 4090 is powerful enough to get 120fps at 4k on my LG OLED in many games. Looks absolutely beautiful. In the ones it can't, DLSS comes to the rescue and bang, 120hz.
> 1440P is horribly pixelated, has a weird aspect ratio. I hated my time with a 1440P monitor many years ago. 1440P monitors are also garbage compared to 4k OLED in general.

like ive said, 120 isnt my target, not every game has dlss, and i dont give a toss about oled until burn-in is not an issue for years of heavy use
> 4090 is powerful enough to get 120fps at 4k on my LG OLED in many games. Looks absolutely beautiful. In the ones it can't, DLSS comes to the rescue and bang, 120hz.

I like your trolling style.
> Lower resolutions, such as 1080P and 1440P, should be consigned to the history books already, especially when considering a 4090.

I'm not too sure the 4090 can consistently hold 60fps (without a single drop) during gameplay in CP77 with path tracing at native resolution, even at 1080p. 4K is a no-go.