NVIDIA 4000 Series

For anybody interested

I messaged Superflower a few hours ago asking if they will be releasing a 12+4 pin cable that runs directly from the PSU to the GPU. This was the response:

Dear Customer,


Thank you for your email and your question.

We are planning to sell a 2*8 (or 3*8) to 12+4 direct cable. However, since we don't have a 40-series card on hand, we need to do more research and testing on compatibility.

From our point of view, the current adapters provided by Nvidia should be okay to use with existing PSUs.

Thank you for your support to us.

Best Regards,

Customer Service
 
According to IgorsLab, the reason for the 600 W rumours and for these huge 4090 coolers is that Nvidia was considering using Samsung, and the Samsung version of the 4090 would have needed a 600 W TDP because that node is much less efficient than TSMC's.

Because these coolers were designed for a 600 W+ GPU, you'll see that all 4090s run very cool; the 55 °C 4090 leak a few weeks ago was not fake.

 
Doubt that... Igor's going into conspiracy-nut territory.

I admit I have some doubts, but where it kind of makes sense is that the AIB 4090 coolers are massive compared to the 3090 Ti coolers, even though both cards ship with a 450 W TDP. Why would the AIBs all increase their cooler sizes by 30-40% just to dissipate the same heat as last generation, unless Nvidia asked them to prepare for much more heat? Having already sunk the development costs, the AIBs then stuck with the larger coolers anyway to recover them.

We know Nvidia hides information from the AIBs, and it may not have told them the 4090 would no longer be 600 W+ until they had already finished developing their next-gen coolers.
 
The price hasn't doubled. I do wish people would stop repeating that. It's +70% or so. You could argue that's as good as double, but 'doubled' just isn't accurate.

Not going to defend the pricing of the 4080s. It's ludicrous, with no excuse. The only reason they're priced that way is the millions of 30-series cards NVIDIA still has to sell. So people buying a 4080 are going to keep paying for NVIDIA's failure to predict the end of crypto-mining, and that's totally unfair.

If the 4080 is what you want, I would strongly advise waiting until next year, after the 30-series inventory has hopefully cleared. With a bit of luck, pricing will come way down along with it.
The price has gone from £650 to £1,269? So a few quid off 100%? Where are you seeing them at only 70% more?
 
I admit I have some doubts, but where it kind of makes sense is that the AIB 4090 coolers are massive compared to the 3090 Ti coolers, even though both cards ship with a 450 W TDP. Why would the AIBs all increase their cooler sizes by 30-40% just to dissipate the same heat as last generation, unless Nvidia asked them to prepare for much more heat? Having already sunk the development costs, the AIBs then stuck with the larger coolers anyway to recover them.

We know Nvidia hides information from the AIBs, and it may not have told them the 4090 would no longer be 600 W+ until they had already finished developing their next-gen coolers.

But still, wouldn't they have just reused the 3090 Ti designs? I'm guessing the bigger cards are actually pushing 500-600 W and will be overclocked a fair bit; I'd seen a 3.4 GHz Time Spy leak somewhere.
 
Considering these cards can draw up to 600 W and one PCIe output from the PSU is only rated for up to 150 W, this could cause problems, no?

Additional PCIe cables for specific power supplies are harder to find than you might imagine. So one thing I would do is ensure I have four separate PCIe cables before attempting to install the 4090.

If I got something wrong here, please let me know, guys. Just thinking out loud.
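To put some rough numbers on that, here's a quick back-of-the-envelope sketch. The 150 W figure is the usual rating for one 8-pin PCIe connector; the 75 W slot contribution and the draw figures are my assumptions based on the numbers mentioned in this thread, not measurements:

```python
import math

# Assumed ratings: 150 W per 8-pin PCIe connector, 75 W from the slot itself.
PCIE_8PIN_RATING_W = 150
PCIE_SLOT_POWER_W = 75

def cables_needed(card_draw_w: float) -> int:
    """Number of separate 8-pin PSU cables needed to cover the card's draw
    once the slot's contribution is subtracted."""
    remaining = max(card_draw_w - PCIE_SLOT_POWER_W, 0)
    return math.ceil(remaining / PCIE_8PIN_RATING_W)

# 450 W is the 4090 FE's stated draw; 600 W is the rumoured upper limit.
for draw_w in (450, 600):
    print(f"{draw_w} W card -> {cables_needed(draw_w)} separate 8-pin cables")
```

That comes out at three separate cables for 450 W and four for 600 W, which is roughly why I'd want four separate PCIe cables on hand for the worst case rather than relying on splitters.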
I've bought an 8-pin PCIe connector that splits into two 8-pin connectors for my PSU/GPU, which I'm going to use for a 4090. There's already one there with a 2x 8-pin connector on one end (the GPU end). Yeah, I'm worried it might not work because of the 150 W per PCIe cable power limit. I want a 4090 FE; power draw is 450 W. I see elsewhere someone is wondering whether to buy a 12-pin to 2x 8-pin GPU-to-PSU connector, so they'll be using just one 8-pin PCIe cable to connect to the GPU's 12-pin connector. That seems even worse than what I'm thinking of attempting.
 
The price has gone from £650 to £1,269? So a few quid off 100%? Where are you seeing them at only 70% more?
$700 to $1,200 is 70%; that's one way to look at it. Also, what you're doing there is comparing the lowest-spec 3080 with the highest-spec 4080. The right comparisons are the 3080 10GB versus the 4080 12GB (£649 versus £849, a roughly 30% increase) and the 3080 12GB versus the 4080 16GB (£749 versus £1,199, a roughly 60% increase). There are lots of ways of slicing it, but the most dishonest way is to claim the 4080 is 'twice as expensive' as the 3080. It really isn't, so repeating that it is isn't helpful; it's just rhetoric designed to make people angry.

I'm not going to say the pricing of the 4080s isn't ridiculous. It absolutely is, and in an ideal world no one would buy one! But 'twice the price' is a stretch too far, and unhelpful to the conversation.
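For anyone who wants to check the maths themselves, here's a quick sketch using the launch prices quoted in this thread (nothing here is official beyond those figures):

```python
def pct_increase(old_price: float, new_price: float) -> float:
    """Percentage increase from old_price to new_price."""
    return (new_price - old_price) / old_price * 100

# Launch prices as quoted in this thread.
comparisons = [
    ("3080 10GB -> 4080 12GB", 649, 849),    # GBP
    ("3080 12GB -> 4080 16GB", 749, 1199),   # GBP
    ("3080      -> 4080 16GB", 650, 1269),   # GBP, the 'nearly double' comparison
    ("3080      -> 4080 16GB", 700, 1200),   # USD MSRP
]

for label, old_price, new_price in comparisons:
    print(f"{label}: +{pct_increase(old_price, new_price):.0f}%")
```

That prints roughly +31%, +60%, +95% and +71%, so which figure you quote depends entirely on which two cards you compare.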
 
I've bought an 8-pin PCIe connector that splits into two 8-pin connectors for my PSU/GPU, which I'm going to use for a 4090. There's already one there with a 2x 8-pin connector on one end (the GPU end). Yeah, I'm worried it might not work because of the 150 W per PCIe cable power limit. I want a 4090 FE; power draw is 450 W. I see elsewhere someone is wondering whether to buy a 12-pin to 2x 8-pin GPU-to-PSU connector, so they'll be using just one 8-pin PCIe cable to connect to the GPU's 12-pin connector. That seems even worse than what I'm thinking of attempting.
Yeah, I would be concerned. The email from Superflower that another poster shared a few messages back seems to confirm those doubts too: they're basically unwilling to say it's safe because they haven't tested it yet.
 
Are Zotac cards bad?
People on forums keep mentioning them, but it seems more like the '80s, when people used to say 'get a Skoda' as a joke.
I had a Zotac 3090 Trinity. Going by benchmarks it was one of the slowest 3090s; the 3080 Ti FE that replaced it was faster (both stock). Also, despite my best efforts it just would not flash a new BIOS (I've had no problems doing this before with other cards). I will not buy a Zotac 4090 Trinity, personally.
 
No, they are not. I've had two with ZERO problems, and a 5-year warranty.
Good to hear. I'm thinking of getting a Zotac this time around, one of the overclocked ones, although from what I've seen I don't think the OC is actually that much.

Although what tyke has said above me is interesting.
 