• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA 4000 Series

Caporegime
Joined
18 Oct 2002
Posts
30,142
Welcome to the Nvidia Omniverse Matrix... Pick your pill...





:D
 
Associate
Joined
28 Mar 2018
Posts
1,430
4090 owners, have you overclocked your cards, and are you seeing a massive difference in framerates? I have seen some reviews and the overclocked scores are a bit underwhelming.
 
Soldato
Joined
7 Dec 2010
Posts
8,299
Location
Leeds
4090 owners, have you overclocked your cards, and are you seeing a massive difference in framerates? I have seen some reviews and the overclocked scores are a bit underwhelming.

It's worse than Ampere for overclocking, and that's saying something, as Ampere wasn't impressive at all for overclocking; they set Ada to 11 out of the box. But let's be honest, Ada is nothing more than Ampere with more cores and a die shrink: 60% more cores and 35% faster clocks for a basically 60% faster card. So not really earth-shattering, considering there are also supposedly 3x the transistors in the Ada die compared to Ampere (as Nvidia tells us, but I'm still calling BS on the transistor count).
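Taking the figures quoted in the post at face value (these are the poster's numbers, not official specs), a quick sanity check shows why the observed speedup feels underwhelming next to naive core × clock scaling:

```python
# Rough scaling check using the figures quoted above (assumed, not
# official specs): ~60% more cores, ~35% higher clocks vs Ampere.
core_ratio = 1.60        # assumed core-count increase
clock_ratio = 1.35       # assumed clock-speed increase
observed_speedup = 1.60  # "basically 60% faster card" per the post

# Perfect scaling would multiply both factors together.
theoretical_upper_bound = core_ratio * clock_ratio
scaling_efficiency = observed_speedup / theoretical_upper_bound

print(f"Theoretical upper bound: {theoretical_upper_bound:.2f}x")
print(f"Observed: {observed_speedup:.2f}x "
      f"({scaling_efficiency:.0%} of perfect scaling)")
```

On these assumed numbers the theoretical ceiling is 2.16x, so a 1.60x real-world gain is roughly 74% scaling efficiency, which is in line with how wide GPUs typically fail to scale linearly with core count.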


 
Last edited:
Soldato
Joined
30 Dec 2010
Posts
14,652
Location
Over here
Would anyone be kind enough to list the CPU and PSU they have? I am starting from nothing and going back and forth on whether I need a 1000W gold/platinum unit or whether 850W is fine.
It will be paired with a 7700X (later, whatever the 3D chip is).
Not interested in waiting for ATX 3.0.
 
Associate
Joined
28 Mar 2018
Posts
1,430
It's worse than Ampere for overclocking, and that's saying something, as Ampere wasn't impressive at all for overclocking; they set Ada to 11 out of the box. But let's be honest, Ada is nothing more than Ampere with more cores and a die shrink: 60% more cores and 35% faster clocks for a basically 60% faster card. So not really earth-shattering, considering there are also supposedly 3x the transistors in the Ada die compared to Ampere (as Nvidia tells us, but I'm still calling BS on the transistor count).
Nvidia screwed over their AIB partners; all the cards are basically the same, so buy whichever one is cheapest.
 
Caporegime
Joined
17 Mar 2012
Posts
48,361
Location
ARC-L1, Stanton System
My CableMod cable turned up today :D I'll still need an adapter to get it perfectly straight, but this will have to do for now. With the side panel on it starts to compress the cable a little bit, but no worse than the Nvidia one, and that hasn't melted in 2 weeks of use :cry: It's so much nicer having a single cable to deal with and getting a nice clean run to the PSU in the back, where it terminates to 3 x 8-pin (600W if needed).

[photos of the build and cable run]

I like your loop loops.... nice job, the whole thing looks really good :)
 
Associate
Joined
8 Sep 2020
Posts
1,460
I like your loop loops.... nice job, the whole thing looks really good :)
Thanks mate, much appreciated :) It will be much better when I get the EK waterblock for this card, as it currently isn't ideal for temps at all, with the GPU being fed preheated air by the bottom and side rads. The top rad is like a furnace at the minute, as it's getting rid of the heat from the other two rads plus an overclocked 4090 on air :cry:
 
Associate
Joined
17 Mar 2017
Posts
866
Location
Manchester
In some respects I can't blame AIBs for trying to cut costs when the FE is literally as fast and almost as quiet as their higher-priced alternatives (I can completely understand EVGA leaving), but this little oversight could end up coming back to burn them long term.
Yeah, I can see the cost-cutting scenario and why, especially if they had to keep them at MSRP. If they end up having to do a recall or send out new, better adapters, it will be from Nvidia's pocket and their fault.

We'll know when more people get them in their hands I guess.
 
Associate
Joined
24 Sep 2020
Posts
90
You know what would be nice for the 5090 series. An AIB (or more than one) that was marketed as:

- Basically an FE for people who lost to the bots
- Same power requirement as the FE
- No bigger than the FE
- Runs as cool as we could reasonably get it to
- No unicorn-vomit lighting

I've managed to snag AIBs that basically do that for a couple of generations running, but they should just advertise it - put it on the box.
 
Last edited:
Soldato
Joined
27 Nov 2005
Posts
24,806
Location
Guernsey
Am thinking what's happening to cause these connectors to melt is that the female side of the connector is opening up from the pressure of the cable being bent.
Fixing the symptom, not the cause.

The issue of excessive heat, leading to melting, is due to poor electrical contact. The solution is to fix the contact, not just dissipate the additional heat.

Everything points to a bend in the cable causing some of the pins to be partially retracted, reducing contact area, increasing resistance and thus producing more heat.

Solution for now is to keep the cable as straight as possible, even if it means running without a side panel and having it poking out the side, until you can get hold of a better cable or something like CableMod's forthcoming right-angle adapter.

I'm willing to bet that, after a suitable period of claiming there's no issue, NVidia will be forced to provide updated cables and/or adapters to solve this.

This is what may be happening with the pin power connectors when someone bends the cable:
[image of the connector pins]
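The heating mechanism described above (reduced contact area → higher resistance → more heat) can be sketched numerically. The current split assumes 600W at 12V shared evenly across six current-carrying pin pairs; the contact resistances are assumed illustrative values, not measurements:

```python
# I²R heating at a single 12VHPWR pin contact (illustrative numbers).
# Assumed: 600 W at 12 V shared evenly across 6 current-carrying pins.
total_w, volts, pins = 600.0, 12.0, 6
amps_per_pin = total_w / volts / pins  # ≈ 8.3 A per pin

good_contact_ohms = 0.005  # assumed healthy contact (5 mΩ)
bad_contact_ohms = 0.040   # assumed partially retracted pin (40 mΩ)

for label, r in [("good", good_contact_ohms), ("bad", bad_contact_ohms)]:
    heat_w = amps_per_pin ** 2 * r  # P = I²R, dissipated in the contact
    print(f"{label} contact: {heat_w:.2f} W at the pin")
```

On these assumed values, heat at the contact jumps from roughly 0.35W to nearly 2.8W, all concentrated in a tiny plastic-housed junction, which is consistent with the melting described in the thread.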
 
Last edited:
Associate
Joined
10 Jan 2013
Posts
235
Location
London
That's just... I don't even know what to say on that it's just so stupid...

The worst thing about all the power socket issues (and the 4080 that's really a 4060/4070) is that we know Nvidia are basically going to do sod all about it, and they'll still have PC gamers queuing up to get their new shiny GPU no matter what....
In fairness it isn't something new; Gamers Nexus did a video, and apparently the same is officially true of other PCIe connectors.
However, I've never had one break, and never had a manufacturer come this close to stressing the limits!
 