
NVIDIA 4000 Series

What a world we live in...

$399 for a 1.15x improvement over the previous generation

So 15% faster than the 3060 Ti confirmed, so only about 3070 performance.

Wow.
 
And here's Nvidia's response to the whole VRAM thing: https://www.nvidia.com/en-us/geforce/news/rtx-40-series-vram-video-memory-explained/

TL;DR version: "4000 series has more cache than previous gens, and we offer tools to lower VRAM usage".
:cry:

Great article that, TBF! Certainly beats the armchair experts' insight we often get on here :p Good to see them referring to game patches fixing issues too ;)

"but even that isn’t always accurate."

With reference to in-game VRAM usage, so true. Currently playing Resident Evil 4 Remake and, according to the in-game meter, it's 14GB, yet in game MSI Afterburner shows dedicated usage isn't even breaking 8GB, and that's with max settings + ray tracing at native 3440x1440. Looks great and running beautifully too.
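
If anyone wants to cross-check their own numbers the same way, here's a minimal sketch (assuming an NVIDIA card with nvidia-smi on the PATH) that samples dedicated VRAM usage while the game runs; it reads more or less the same dedicated-memory counter Afterburner reports:

import subprocess, time

# Sample dedicated VRAM usage once a second via nvidia-smi, as a
# cross-check against in-game meters (which often report what the game
# *allocates*, not what it actually needs).
for _ in range(10):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "7890 MiB, 24576 MiB"
    time.sleep(1)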


Hey, great article... Nvidia even pitches 16GB cards at 1080p now... the 4060 Ti....

The article is nothing more than them explaining how they're ripping people off on VRAM, pretending software tricks will help, and leaning on on-die cache, something their competitor has been doing for ages and which they basically copied. They found out they can make weaker cards perform better with it, and it gives them an excuse for why their cards don't need more VRAM than their competitor's, which have more of both VRAM and cache... :cry:


:rolleyes:

Oh, also don't forget DLSS 3, AKA the fake frames generator... just to make those FPS tools read high fake numbers. :cry:
 
So 15% faster than the 3060 Ti confirmed, so only about 3070 performance.

Wow.

This was always going to be the case based on the initially rumored, but now confirmed, specs for the 4060 Ti. The GPU with the higher CU count, albeit lower clocks, is already in use in the RTX 4070 mobile GPU. It performs generally around the level of the 3070 Ti Mobile (faster at 1080p, about the same at 1440p due to the bandwidth limitations). The latter, in its higher-TGP iteration, gets pretty close to the desktop 3070 (same core, same memory, slightly lower sustained boost clocks, etc.). Thus a bit of back-of-the-envelope maths puts the 4060 Ti at desktop 3070 performance, but with DLSS 3.0 being heavily leaned on from a marketing-push PoV.
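
To make that back-of-the-envelope chain explicit, here's a trivial sketch; every number in it is one of the rough assumptions above, not a benchmark:

# Rough chain: 4060 Ti ~= 4070 Mobile ~= 3070 Ti Mobile (high TGP) ~= 3070 Desktop.
# Treat the 3060 Ti as the 1.0x baseline; all figures are assumptions.
baseline_3060ti = 1.00
uplift_4060ti   = 1.15   # the ~15% gen-on-gen figure quoted above
est_4060ti      = baseline_3060ti * uplift_4060ti
est_3070        = 1.15   # a 3070 sits roughly ~15% ahead of a 3060 Ti (assumed)
print(f"4060 Ti estimate: {est_4060ti:.2f}x vs {est_3070:.2f}x for a desktop 3070")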
 
They'll keep doing it as long as people keep buying it. The only thing I keep hearing about this gen is "lower power usage"; it seems to be the key selling point :cry:.

This gen's high end uses 100W more power so far: the 3090 was 350W and the 4090 is 450W... no power savings there. Yes, I know the 3090 Ti was 450W too, but when the 4090 Ti comes out at 550W, as it will, we'll see the 100W increase again, apples to apples. This gen is a joke all round, really, a huge insult.
 
This gen's high end uses 100W more power so far: the 3090 was 350W and the 4090 is 450W... no power savings there. Yes, I know the 3090 Ti was 450W too, but when the 4090 Ti comes out at 550W, as it will, we'll see the 100W increase again, apples to apples. This gen is a joke all round, really, a huge insult.

It's just one of those things: when there's nothing left to talk about, you find the one thing that's worth mentioning and keep going on about it.
 
Shows what a lack of real competition does.

Nvidia have an effective semi-monopoly, with an 80/20 (or worse) market share split.

And they're totally happy screwing us with it.
Kinda. Sadly, AMD will not show anything better; at best people can grab a cheap 6800 if they can find one, but that's it (and that also means giving up on RT and FG). It's hard to believe that they end up in a worse situation with every new generation (since Vega), but somehow they've managed it.
 
Kinda. Sadly, AMD will not show anything better; at best people can grab a cheap 6800 if they can find one, but that's it (and that also means giving up on RT and FG). It's hard to believe that they end up in a worse situation with every new generation (since Vega), but somehow they've managed it.

Uncle Jensen said so: you can't overtake me, but you are allowed to be on par, though mostly subpar.

Pretty much the yearly family gathering :cry:
 
More melty connectors :S


Hey, it's the famous power-saving feature on the 4000 series... cook the connectors so you can't use them... and save power...

Why didn't the 3090 Ti, which had the same 450W power use and the same connector on the GPU as the 4090, have the same problem? :rolleyes:

Also, the card in his hand is a 4090 FE, funnily enough...

The whole "user error" thing still smells like BS to me, as we have never seen this sort of thing before. Yes, there will always be user error, but not at this rate, from users who buy high end and hopefully understand how to build a PC. Honestly, I wish Nvidia would go back to the old-style connectors and drop these fragile fire-hazard ones. I have 2x 3090s in my setup, each with 3x 8-pin (six in total), and I'm much happier with them than with two 16-pin connectors.
 
This gen's high end uses 100W more power so far: the 3090 was 350W and the 4090 is 450W... no power savings there. Yes, I know the 3090 Ti was 450W too, but when the 4090 Ti comes out at 550W, as it will, we'll see the 100W increase again, apples to apples. This gen is a joke all round, really, a huge insult.
I have both, and I assure you the 4090 is way way way more power efficient than a 3090. A 4090 at 320 watts is around 50+% faster than a 3090 at 550 watts.
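
Taking those two figures at face value, the perf-per-watt gap is easy to put a rough number on (illustrative arithmetic only):

# "50+% faster" at 320W vs a 3090 at 550W:
perf_ratio  = 1.5
power_ratio = 320 / 550            # ~0.58
print(perf_ratio / power_ratio)    # ~2.6x the performance per watt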
 
I have both, and I assure you the 4090 is way way way more power efficient than a 3090. A 4090 at 320 watts is around 50+% faster than a 3090 at 550 watts.

Yup, you just have to watch plenty of 3090 vs 4090 and 3080 vs 4080 comparisons in actual gaming scenarios; Ada is stupidly efficient, hence why they run so cool and quiet too (obviously the big coolers help, but the 3090 also had big coolers). But you're wasting your time on purgatory :cry:
 
Yeah, it's "up to" 450 watts; that doesn't mean it's using 450 watts even at 100% utilisation, depending on the game.

Just double checked in Cyberpunk with PT enabled and went into HWINFO:

[HWINFO screenshot of GPU power readings]

A 400-watt max peak; obviously the average is much lower, 237 watts. Now the 3080 Ti, on the other hand, would be over 300 watts average.

Edit:
The 3080 Ti with the same settings: 314 watts, but at 31 fps, whereas the 4090 is at over 100 fps :cry:
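
For what it's worth, turning those readings into perf-per-watt (one scene, average power, so treat it as illustrative only):

# Cyberpunk w/ PT, numbers from the post above (averages, one scene).
fps_4090,   w_4090   = 100, 237    # "over 100 fps" at 237W average
fps_3080ti, w_3080ti = 31,  314
print(fps_4090 / w_4090)                                # ~0.42 fps/W
print(fps_3080ti / w_3080ti)                            # ~0.10 fps/W
print((fps_4090 / w_4090) / (fps_3080ti / w_3080ti))    # ~4.3x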
 
I have both, and I assure you the 4090 is way way way more power efficient than a 3090. A 4090 at 320 watts is around 50+% faster than a 3090 at 550 watts.

Not talking about efficiency... rated power use: 3090 350W TGP vs 4090 450W TGP.


Of course it's more efficient, with a two-node advantage and higher clocks compared to the 3090's node. BUT it has a 100W higher TGP, 3090 FE vs 4090 FE. More power use should equal more performance even without any other changes.
 
Not talking about efficiency... rated power use: 3090 350W TGP vs 4090 450W TGP.


Of course it's more efficient, with a two-node advantage and higher clocks compared to the 3090's node. BUT it has a 100W higher TGP, 3090 FE vs 4090 FE. More power use should equal more performance even without any other changes.
The TGP is kinda meaningless. A 3090 at 350W performs like a headless chicken; it's constantly hitting power limits and clocking down. A 4090 with a 450W limit is hitting voltage/clock limits 99% of the time; it rarely hits the 450W power limit. They are not really comparable. If you had a 3x 8-pin 3090 you'd know: even with a custom 550W BIOS, I am still power limited in most games with the 3090. The 4090 is absolutely nothing like that.
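
If anyone wants to check which limit their own card is hitting, nvidia-smi lists the active throttle reasons. A minimal sketch, assuming nvidia-smi is on the PATH (the exact field wording varies between driver versions):

import subprocess

# Dump the performance section, which lists the active throttle reasons
# (e.g. "SW Power Cap : Active" when the card is power limited, vs
# voltage/clock limits the rest of the time).
out = subprocess.run(
    ["nvidia-smi", "-q", "-d", "PERFORMANCE"],
    capture_output=True, text=True, check=True,
).stdout
print(out)  # run it while the game is loaded and see which reason is Active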
 
Yup, you just have to watch plenty of 3090 vs 4090 and 3080 vs 4080 comparisons in actual gaming scenarios; Ada is stupidly efficient, hence why they run so cool and quiet too (obviously the big coolers help, but the 3090 also had big coolers). But you're wasting your time on purgatory :cry:

Only wasting his time, or your time, when you don't read what was said correctly... or should I put up Nvidia's specs for you too?


Efficiency and rated power use are two different things.

Also, it seems I waste your time a lot too, Nexus, because you don't like hearing the truth. Anyway, no time for trolls.
 
The TGP is kinda meaningless. A 3090 at 350W performs like a headless chicken; it's constantly hitting power limits and clocking down. A 4090 with a 450W limit is hitting voltage/clock limits 99% of the time; it rarely hits the 450W power limit. They are not really comparable. If you had a 3x 8-pin 3090 you'd know: even with a custom 550W BIOS, I am still power limited in most games with the 3090. The 4090 is absolutely nothing like that.


Again, rated card power use from Nvidia: 3090 350W and 4090 450W. Not talking about anything else here.
 