
NVIDIA 4000 Series

How many romps do the 40 series have compared to the 30?

According to Gibbo at least 2000 romps :cry:


Can't find many below £500 on a nationwide search.

If so, looks like @TNA has just overpaid :p

Haha. It’s cool. One will have approximately two and a half years of warranty, has been used by trusted members, is unlikely to have been mined on and is an LHR card. The other would have no warranty, will have been mined on its whole life, and if it goes kaput so does your £500. I think I know which one I would go for :D
 
Everyone's saying the 4080 12GB is the 4070, but when you look at the CUDA difference, the 4080 16GB has 40.6% fewer cores than a 4090, which in itself puts it in xx70-class territory, while the 4080 12GB lines up perfectly with the 3060 Ti from last gen.

3090 > 3080 = 17% fewer CUDA cores
3090 > 3070 = 43% fewer CUDA cores
3090 > 3060 Ti = 53% fewer CUDA cores
4090 > 4080 16GB = 40% fewer CUDA cores
4090 > 4080 12GB = 53% fewer CUDA cores

Even looking at the weaksauce 2080, it's still closer to a 2080 Ti than a 4080 16GB is to a 4090:

2080 Ti > 2080 = 29% fewer CUDA cores
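
For anyone who wants to check those figures, here's a quick sketch. The CUDA core counts are the publicly listed ones, so treat them as assumptions; the percentages quoted above look truncated rather than rounded, but they land within a point of these:

```python
# Quick check of the "% fewer CUDA cores vs the flagship" figures above.
# Core counts are the publicly listed values: treat them as assumptions.
cores = {
    "3090": 10496, "3080": 8704, "3070": 5888, "3060 Ti": 4864,
    "4090": 16384, "4080 16GB": 9728, "4080 12GB": 7680,
}

def pct_fewer(card: str, flagship: str) -> float:
    """Percentage fewer CUDA cores than the flagship of the same generation."""
    return (1 - cores[card] / cores[flagship]) * 100

for card in ("3080", "3070", "3060 Ti"):
    print(f"3090 > {card}: {pct_fewer(card, '3090'):.1f}% fewer CUDA cores")
for card in ("4080 16GB", "4080 12GB"):
    print(f"4090 > {card}: {pct_fewer(card, '4090'):.1f}% fewer CUDA cores")
```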

This cognitive dissonance and departure from the obvious should have been expected as it's often a known symptom of being a billionaire; :)

You have Trump with his "I won the election but it was stolen".
You have Putin with his "it's not a war but a special operation"
and now you have Jensen with his 'it's not a 4070 it's a 4080 12Gb'. ;)
 
Yeah there's a *lot* more to it than that - https://www.club386.com/nvidia-geforce-rtx-4080-12gb-loses-out-on-more-than-just-memory/

tl;dr - 22% fewer SMs, meaning 22% fewer CUDA cores, Tensor cores, RT cores, etc., impacting throughput by about a quarter (we obviously won't know how much it affects gaming until it's tested by third parties).
The 12GB is a different chip (AD104 vs AD103), with a narrower memory bus (192-bit vs 256-bit). Basically, it's the xx70 card given an xx80 name. Also there's no FE for this one.
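
To see what that narrower bus does to bandwidth, here's a back-of-envelope sketch; the 22.4 Gbps and 21 Gbps GDDR6X data rates are the announced launch figures for the two 4080s, so treat them as assumptions:

```python
# Back-of-envelope memory bandwidth from bus width and effective data rate.
# The GDDR6X speeds are the announced launch figures: treat them as assumptions.
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps  # bits -> bytes, times effective transfer rate

print(bandwidth_gb_s(256, 22.4))  # 4080 16GB: 716.8 GB/s
print(bandwidth_gb_s(192, 21.0))  # 4080 12GB: 504.0 GB/s
```
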
Yeah, watched Jay's video now. WTF are Nvidia doing giving it the same model number? Trading regulators need to step in on that.
 
Raster is on the path of diminishing returns, unless you are OK with 5-slot graphics cards. It's pretty straightforward to see what DLSS brings in terms of efficiency.
Reviewers seem to be stuck in the past if they can't see an obvious technological revolution; they are going to eat those words 5 years down the line and will have lost all credibility.
I am only looking at the 4090 when I make these statements, but I still believe this is the right time to consider DLSS as a core offering instead of a shiny new add-on.
Personally I don't think there are many credible reviewers right now.

They don't seem to want to adapt their review methods, and they are fast becoming out of date: there's a lack of proper VRAM analysis, and they also tend to only bench games that are well optimised. So pretty much what you said, not properly adapting to a changing landscape.

The problem with DLSS is that it's still only supported by an absolutely tiny number of games, partially remedied by DLDSR.
 
Yeah, watched Jay's video now. WTF are Nvidia doing giving it the same model number? Trading regulators need to step in on that.

lol, do you think a company the size of nVidia hasn't run this strategy past their lawyers first? It was likely one of the first discussions they had.
 
I'm sure they worked all through the 30xx series life-cycle without ever once getting it running on 30xx series cards ;)

It's worth noting that DLSS 2.0 was released publicly in April 2020 (5 months before the launch of the 30 series) given that the only major new component in DLSS 3.0 *is* the frame doubler, it's safe to assume that feature's been in development for at least 2.5 years (using 20 and 30 series cards to prototype it). It's *possible* that hardware changes were added to Ada to facilitate/improve it, but it would have been up and running to some degree on 30 series cards in nVidia's labs long before Ada was fabbed.

This reeks of a marketing decision to me.
Who wants the feature though?

I have seen loads hating on it.

Apparently it will hurt competitive shooters. Because the frames are unpredictable.
Not needed for low frame rate games, because the cards are now powerful enough to run everything at 60fps at least, either with DLSS or with RT off.
I also disable Nvidia Reflex personally.
 
I won't be buying these at current prices. My benchmark is that the 3070 cost me £370, so the 4070 is worth £400ish to me; when it's that price I'll entertain the idea. There isn't even anything my PC can't run at max settings, so maybe if a must-have game comes out, but they usually get delayed anyway.
That's the issue for me.

We have this 4090 which has way too much grunt lol, who needs all that horsepower now, unless you want to play games at 800fps or something. :)

The 4080 12 gig I expect will still be faster than the current 3080 so again enough grunt, but not enough VRAM.

The bits I am interested in are 16 gig VRAM and the new encoder. But now the encoder is part of the spec sheet I expect the 4070 and below to get a gimped version, and I will be surprised if there is a 16 gig variant of the 4070.

Really wish GPUs were modular. :(
 
We have this 4090 which has way too much grunt lol, who needs all that horsepower now, unless you want to play games at 800fps or something. :)

Literally anybody who is playing games above 1080p with a 144Hz monitor.

The 4090 stock apparently gets 59fps in Cyberpunk.... at 1440p - you can never have too much horsepower in a world where game optimisation is the last priority for most game developers.
 
The 4090 is 67% faster than the 3090Ti in Cyberpunk 2077.

It has 16384 shaders at 2520MHz.

The 4080 16GB has 9728 shaders at 2505MHz.
The 4080 12GB has 7680 shaders at 2610MHz.

For 67% higher performance than the 3090Ti, the 4090 has 68% more shaders than the 4080 16GB at the same clock, and 113% more shaders than the 4080 12GB, which has a 3.6% higher clock.

I just found that interesting. The 4080 16GB is very much cut down from the 4090, and with much less memory bandwidth than the 3090 it's probably around that level of performance. The 4080 12GB has less than half the shaders of the 4090 (which itself is only 67% faster than the 3090Ti) and nearly half the memory bandwidth; that's going to be more like a £950 3080.
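
As a crude sanity check, shader count times boost clock gives a raw throughput proxy (it ignores bandwidth, cache and everything else, so take it as a sketch using the spec-sheet numbers above):

```python
# Crude FP32 throughput proxy: shader count x boost clock (MHz).
# Ignores memory bandwidth, cache and architectural differences entirely.
cards = {
    "4090":      (16384, 2520),
    "4080 16GB": (9728,  2505),
    "4080 12GB": (7680,  2610),
}

base = cards["4090"][0] * cards["4090"][1]
for name, (shaders, clock) in cards.items():
    print(f"{name}: {shaders * clock / base:.2f}x the 4090's raw shader throughput")
```

That puts the 4080 16GB at roughly 0.6x and the 4080 12GB at roughly 0.5x of the 4090's raw shader grunt, which is the point being made above.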

Have fun with that...

PS: who's buying a 4080 of any sort?

 
This cognitive dissonance and departure from the obvious should have been expected as it's often a known symptom of being a billionaire; :)

You have Trump with his "I won the election but it was stolen".
You have Putin with his "it's not a war but a special operation"
and now you have Jensen with his 'it's not a 4070 it's a 4080 12Gb'. ;)
The sad truth is the 12GB model is not even really worthy of being called a 4070: relative to the 4090, it's closer in spec to what the 3060 Ti was to the 3090 than what the 3070 was. The 16GB version, on the other hand, lines up nicely with the 3070 Ti, so it doesn't even look like we're getting the real 4080 this time at all.

Both cards are further away in spec from the 4090 than the 2080 was from the 2080 Ti, and that card was regarded as pretty poor, yet this time we have a similar situation but with almost double the prices.
 
I'll probably get the 16gb, if I don't get a 4090

It's entirely your choice, but IMO I feel like I have to tell you: you're buying a GPU for, what, £1300? That's not going to be any better than a 3090 that you can buy right now for £1000 :)
 
Literally anybody who is playing games above 1080p with a 144Hz monitor.

The 4090 stock apparently gets 59fps in Cyberpunk.... at 1440p - you can never have too much horsepower in a world where game optimisation is the last priority for most game developers.
So what's happened there? That's lower than the 3080 FE got in June 2021 on the game at 1440p.

Has the game code been changed to sell newer GPUs or something?
 