• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA 4000 Series

Soldato
Joined
11 Sep 2007
Posts
5,740
Location
from the internet
Admins are busy today!

I assume there is going to be a 4070 16GB one day? I'd be tempted by one of those, but only when it's cheaper.
Seems unlikely at this point. The AD104 chip on the 4070 doesn't support a bus configuration that would work with 16GB without massively tanking the memory bandwidth by going to 128-bit.

I'd say a cut-down variant of the 4080 that still supports 16GB is more plausible, maybe as part of a refresh of the 4070 Ti? But it's hard to say, to be honest. Sometimes later in a GPU generation's lifespan they do weird things with the higher-end dies, like the 3080 12GB, which was actually a cut-down 3090.

Presumably this happens once they have a large enough collection of defective parts.
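
For anyone wondering why the bus width pins the capacity down, here's a quick back-of-envelope (my own sketch, assuming standard 32-bit-wide GDDR6/6X chips in 2GB densities; "clamshell" means two chips per channel, which doubles capacity but not bandwidth):

    # Sketch of GDDR capacity vs bus width. Assumes 32-bit-wide chips;
    # "clamshell" = two chips sharing one 32-bit channel (double capacity,
    # same bandwidth).
    def vram_gb(bus_width_bits: int, chip_gb: int, clamshell: bool = False) -> int:
        channels = bus_width_bits // 32
        return channels * chip_gb * (2 if clamshell else 1)

    print(vram_gb(192, 2))                  # 12 -> the 4070's stock 192-bit config
    print(vram_gb(128, 2, clamshell=True))  # 16 -> doable, but only by cutting to 128-bit
    print(vram_gb(160, 2, clamshell=True))  # 20 -> the 160-bit alternative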
 
Soldato
Joined
6 Feb 2019
Posts
17,921
On a side note, Chinese GPU maker MooreThreads has released its new S70 GPU. It offers 22% less performance than the S80.

The S70 comes with 7GB of memory on a 224-bit bus, a very odd combo.

These GPUs are still very limited in their capabilities; they still cannot run even a single DX11 or DX12 game, and most of the games they can run are DX9. The issue seems to be a mix of drivers needing a lot of work and the GPUs lacking onboard hardware support for newer APIs.
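
The maths behind that odd combo, for what it's worth (my own sketch, assuming standard 32-bit-wide GDDR chips):

    # A 224-bit bus is 224 / 32 = 7 memory channels; with 1 GB chips that's
    # 7 x 1 GB = 7 GB. Odd because seven-channel buses and 1 GB densities
    # are both unusual these days.
    channels = 224 // 32   # 7
    print(channels * 1)    # 7 (GB)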

 
Last edited:
Permabanned
Joined
31 May 2023
Posts
56
Location
Europe
Lower resolutions such as 1080p and 1440p should be consigned to the history books already, especially when considering a 4090.

There's only one extremely niche use case: professional esports gamers (Counter-Strike, PUBG, etc.), where lower latency and 360Hz monitors actually matter, though that's such a small subset of the population it's barely worth mentioning.

The 4090 needs 4K to shine. 4K is also now mainstream, thanks to the three-year-old current gen of consoles.
I disagree with you for one reason: ray tracing. Ray tracing is a feature that can bog down even the 4090, let alone weaker cards. Now that we are in the era of ray tracing, generational performance leaps need to be much larger, but unfortunately they aren't. We saw in the case of Cyberpunk and path tracing (or whatever it's called) that it can reduce performance by up to 75%. DLSS 2.0 helps a bit, but DLSS 3.0 isn't worth mentioning, because those aren't true frames and they don't give the feeling of playing at that many frames per second.

So without DLSS, the 4090 runs at 16 FPS. With DLSS, how many frames would it reach, 30 or 40? We also saw in the Lord of the Rings game, if I remember correctly, that at native resolution the 4090 only manages 48.1 FPS. What about future UE5 games? If 40 FPS is good enough for you, feel free to keep playing, but it's not sufficient for me.
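
For a rough sense of those DLSS numbers, purely illustrative arithmetic (assuming fps scales with rendered pixel count at about 80% efficiency; both the starting figure and the efficiency are my assumptions, not benchmarks):

    # Estimate upscaled fps from each DLSS mode's internal render resolution,
    # starting from ~16 fps native 4K with path tracing.
    native_fps = 16.0
    modes = {"Quality (67%)": 0.67, "Balanced (58%)": 0.58, "Performance (50%)": 0.50}
    for name, axis_scale in modes.items():
        pixel_ratio = axis_scale ** 2          # pixel count scales with the square
        est = native_fps / pixel_ratio * 0.8   # assumed ~80% scaling efficiency
        print(f"{name}: ~{est:.0f} fps")       # roughly 29 / 38 / 51 fps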
 
Soldato
Joined
17 Jul 2005
Posts
9,707
I assume there is going to be a 4070 16GB one day? I'd be tempted by one of those, but only when it's cheaper.

Functionally not possible unless they use harvested AD103 dies (4080) with a fully functioning back end (or drop to a lower bus width: 128-bit for 16GB or 160-bit for 20GB).

A 4070 Super is possible, although with the relative positioning of the 4070 and 4070 Ti it's difficult to see the point.

Honestly, this just highlights, if nothing else, that the (current) 4070 Ti should be the 4070 and the 4070 should be the 4060 Ti (ignoring any pricing for the sake of this comparison).

That would align the "4070" with the 3090 Ti, in line with how the 70 series has generally matched the top single GPU from the previous gen.
 
Associate
Joined
27 Jan 2022
Posts
710
Location
UK
Thanks. I suppose 12GB is just about enough for 1440p. It's such a weird combo to have the 4070 with no 16GB option and the 4060 Ti with 16GB. No idea what Nvidia was thinking there; it's a messy lineup.

This might be OT, but the Zen 4 vs 5 CPU/motherboard/RAM price has doubled, albeit 16GB vs 32GB of RAM. So it's not just Nvidia that is charging more.
 
Soldato
Joined
22 May 2010
Posts
12,362
Location
Minibotpc
Thanks. I suppose 12GB is just about enough for 1440p. It's such a weird combo to have the 4070 with no 16GB option and the 4060 Ti with 16GB. No idea what Nvidia was thinking there; it's a messy lineup.

This might be OT, but the Zen 4 vs 5 CPU/motherboard/RAM price has doubled, albeit 16GB vs 32GB of RAM. So it's not just Nvidia that is charging more.

Almost everyone is on the charge-more bandwagon tbh. Everyone mostly blames inflation, but whether prices will come down again is anyone's guess. Usually prices stay up, especially if they know they can get away with it.

As a person in business myself, given the opportunity to keep prices high, I probably would, unless I saw a huge slump in sales figures; then I would consider dropping them again to regain some of that bulk order volume and potentially keep my sales nice and high.
 

mrk

Man of Honour
Joined
18 Oct 2002
Posts
101,336
Location
South Coast
CPU limitations are down to poor engine optimisation, not the CPU itself being the issue ^^

Take The Last of Us as a prime example. At launch it was sub-85fps a lot of the time; now it never drops below 85 lol, on the same card. The 4090 is over 100fps but still can't hit 144fps with or without DLSS, because the engine still isn't properly optimised for multi-core CPU utilisation.
 
Permabanned
Joined
31 May 2023
Posts
56
Location
Europe
CPU limitations are down to poor engine optimisation, not the CPU itself being the issue ^^

Take The Last of Us as a prime example. At launch it was sub-85fps a lot of the time; now it never drops below 85 lol, on the same card. The 4090 is over 100fps but still can't hit 144fps with or without DLSS, because the engine still isn't properly optimised for multi-core CPU utilisation.
That's right, people have gotten the impression that processors are too weak due to lazy developers and insufficient optimizations, but the truth is far from that. Unfortunately, the trend is such that we are receiving half-baked games. In the past, game development studios used to pay beta testers, but now we are paying them to test their games.
 
Soldato
Joined
6 Feb 2019
Posts
17,921
CPU limitations are down to poor engine optimisation, not the CPU itself being the issue ^^

Take The Last of Us as a prime example. At launch it was sub-85fps a lot of the time; now it never drops below 85 lol, on the same card. The 4090 is over 100fps but still can't hit 144fps with or without DLSS, because the engine still isn't properly optimised for multi-core CPU utilisation.


I can excuse this game at least; it's a 2013 game with updated graphics, and it's probably still on the same engine.
 
Last edited:
Permabanned
Joined
31 May 2023
Posts
56
Location
Europe
What about DirectX 12 and all those Ultimate features? Are developers utilizing any of the things that Microsoft has been highlighting so much? Even Microsoft itself, as far as I know, doesn't use DirectX 12 for MSFS, lol. In theory, if all the DirectX 12 Ultimate features were properly utilized, processors and graphics cards would age much slower.
 
Soldato
Joined
11 Sep 2007
Posts
5,740
Location
from the internet
That's right, people have gotten the impression that processors are too weak due to lazy developers and insufficient optimizations, but the truth is far from that. Unfortunately, the trend is such that we are receiving half-baked games. In the past, game development studios used to pay beta testers, but now we are paying them to test their games.
I think the "lazy developers" thing is a ridiculous trope. Game devs are one of the hardest working groups of software developers in general (and the pay is generally **** for the skillset required as well). In software development in general, game dev is notorious for long hours, putting in weekends and "death marches".

It's a matter of prioritisation and organisational power. Devs can only work on what project managers want them to work on. If the boss wants more features then that's what will be worked on rather than optimisations. It's a far cry from the 80s idea of a lone programmer locking themselves in a closet and doing whatever they feel like for several months. It's also true that developers who are good at optimising code bases are generally more senior and harder to come by.

On a massive code base like a modern game or game engine, you really have to focus those people on organisational priorities. They're often leading entire teams as well which will take away from their ability to individually contribute to the code base.
 
Last edited:
Soldato
Joined
31 Oct 2002
Posts
9,953
No it doesn't at all. I use a 4090 at 1440p because I like my games to be at least 140fps+, not the "mainstream" 30fps console peasant edition at 4K.

The 4090 is powerful enough to get 120fps at 4K on my LG OLED in many games. Looks absolutely beautiful. In the ones where it can't, DLSS comes to the rescue and bang, 120Hz.

1440p is horribly pixelated, and ultra-wides have an odd aspect ratio. I hated my time with a 1440p monitor many years ago. 1440p monitors are also garbage compared to 4K OLED in general.
 
Last edited:
Soldato
Joined
19 Oct 2008
Posts
5,954
Thanks. I suppose 12GB is just about enough for 1440p. It's such a weird combo to have the 4070 no 16GB and the 4060 Ti 16GB. No idea what Nvidia was thinking there, it's a messy lineup.

This might be OT but the price of Zen 4 vs 5 cpu/motherboard/ram price has doubled. Albeit 16GB vs 32GB. So it's not just Nvidia that is charging more.
Only until FarCry 7 is released with the HD texture pack :D. Released in Oct 2021 I think, FarCry 6 was a struggle on a 10GB 3080 using the HD texture pack.

I don't get the 4060 Ti 16GB. Nvidia, in my opinion, have in the past been quite good at offering an optimal package targeted at a particular market or resolution. The 4060 Ti to me is mid-to-low end; you're going to be dialling back settings anyway, so why on earth they offer that in a 16GB format rather than on the 4070/Ti is beyond me, unless I'm totally missing something.

I'm convinced Nvidia are trying their best not to sell anything other than 4090s this generation :D
 
Last edited:
Associate
Joined
11 Jan 2021
Posts
1,111
The 4090 is powerful enough to get 120fps at 4K on my LG OLED in many games. Looks absolutely beautiful. In the ones where it can't, DLSS comes to the rescue and bang, 120Hz.

1440p is horribly pixelated and has a weird aspect ratio. I hated my time with a 1440p monitor many years ago. 1440p monitors are also garbage compared to 4K OLED in general.
Like I've said, 120 isn't my target, not every game has DLSS, and I don't give a toss about OLED until burn-in is not an issue for years of heavy use.
 
Permabanned
Joined
31 May 2023
Posts
56
Location
Europe
The 4090 is powerful enough to get 120fps at 4K on my LG OLED in many games. Looks absolutely beautiful. In the ones where it can't, DLSS comes to the rescue and bang, 120Hz.

1440p is horribly pixelated and has a weird aspect ratio. I hated my time with a 1440p monitor many years ago. 1440p monitors are also garbage compared to 4K OLED in general.
I like your trolling style.
 
Soldato
Joined
14 Aug 2009
Posts
2,930
Lower resolutions such as 1080p and 1440p should be consigned to the history books already, especially when considering a 4090.

There's only one extremely niche use case: professional esports gamers (Counter-Strike, PUBG, etc.), where lower latency and 360Hz monitors actually matter, though that's such a small subset of the population it's barely worth mentioning.

The 4090 needs 4K to shine. 4K is also now mainstream, thanks to the three-year-old current gen of consoles.
I'm not too sure the 4090 can consistently hold 60fps (without a single drop) during gameplay in CP77 with path tracing NATIVE at 1080p. 4K is a no-go.
 