
NVIDIA 4000 Series

Soldato
Joined
30 Dec 2010
Posts
14,755
Location
Over here
First CPU comparison


I'm sure someone will correct me, but to me that seems a very strange comparison. Why would you choose the top-of-the-range Intel vs. whatever a 5800X is?

Unless it's to highlight the value, because the 5800X costs a lot less, I think.
 
Associate
Joined
31 Jan 2012
Posts
2,034
Location
Droitwich, UK
I'm sure someone will correct me, but to me that seems a very strange comparison. Why would you choose the top-of-the-range Intel vs. whatever a 5800X is?

Unless it's to highlight the value, because the 5800X costs a lot less, I think.
Maybe due to the popularity of the 5000 series, to highlight what users would get now vs what they'd experience with a more recent DDR5 equipped system should they buy a 4090. I don't think it's intended as an AMD Vs Intel piece, particularly due to the difference in system cost as you say.
 
Soldato
Joined
21 Apr 2007
Posts
2,511
First CPU comparison

Anything below 4K is for the birds IMHO, but yes, a small Intel advantage (not worth an upgrade though).
 
Associate
Joined
11 Oct 2022
Posts
58
Location
UK
4090 scalped cards are not holding up on FB. Prices dropping like crazy back to RRP.

Yep give it a few weeks or so and hopefully things will start to stabilise.

Some of the prices these greedy scalpers are charging are obscene - can’t see why anybody would pay it. No warranty either.

Ironically you can buy a prebuilt system with a 4090 for a similar price to what they are being advertised at: madness really.
 
Soldato
Joined
30 Dec 2010
Posts
14,755
Location
Over here
Maybe due to the popularity of the 5000 series, to highlight what users would get now vs what they'd experience with a more recent DDR5 equipped system should they buy a 4090. I don't think it's intended as an AMD Vs Intel piece, particularly due to the difference in system cost as you say.

Yes, I suppose the 15% or 7% performance difference may make an upgrade decision clear, especially if you know how your CPU compares to the 5800X.
Hopefully they do more of these.
 
Soldato
Joined
12 Jan 2009
Posts
2,607
Maybe due to the popularity of the 5000 series, to highlight what users would get now vs what they'd experience with a more recent DDR5 equipped system should they buy a 4090. I don't think it's intended as an AMD Vs Intel piece, particularly due to the difference in system cost as you say.
Then they should add the X3D as a third chip, IMO, as that's a simple drop-in upgrade vs. a full platform refit.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
First CPU comparison

Hmmm, something is particularly off with those results on the 5800X, and I suspect it's the RAM. Ryzen can be heavily penalised by higher-latency RAM, and 4000MHz at 20-23-23 is the culprit.
Most people who go Ryzen know they need B-die. Something like 3200MHz CL14-14-14, 3600MHz CL14-14-14 or 3800MHz CL14-14-14 would have vastly changed the landscape of those results in games.

And I do wonder why he didn't use better timings. A quick check in the AIDA64 Cache and Memory Benchmark would show that latency greater than 55ns is a red flag. I am sure his results would be in the sub-60ns range, at a guess.

Furthermore, he used DDR5-6000 36-36-36-76 2T / Gear 2 memory for the 12900K, making it a rigged result. He pitted the worst DDR4 against DDR5, all for the sake of throughput. If anything, the same DDR4 4000 20-23-23 kit should have been used for both CPUs.
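To put some rough numbers on why those timings matter: first-word latency can be estimated from the transfer rate and CAS latency. This is a purely illustrative back-of-envelope calculation (the kit names are taken from the post above, not from any measured AIDA64 results):

```python
def cas_latency_ns(transfer_rate_mts: int, cl: int) -> float:
    """Estimate first-word CAS latency in nanoseconds.

    DDR memory transfers twice per clock, so one clock cycle lasts
    2000 / transfer_rate_mts nanoseconds, and the CAS latency is CL cycles.
    """
    return cl * 2000 / transfer_rate_mts

kits = {
    "DDR4-3600 CL14 (B-die)": (3600, 14),
    "DDR4-4000 CL20 (kit criticised above)": (4000, 20),
    "DDR5-6000 CL36 (12900K kit)": (6000, 36),
}
for name, (rate, cl) in kits.items():
    print(f"{name}: {cas_latency_ns(rate, cl):.2f} ns")
```

The B-die kit comes out around 7.8 ns against 10 ns for the 4000 CL20 kit, which is the gap the post is complaining about; note this is only the CAS component, not the full round-trip latency AIDA64 reports.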

Wow, and I thought I could trust this guy. Live and learn...live and learn...
 
Soldato
Joined
10 Nov 2006
Posts
8,552
Location
Lincolnshire
I wouldn’t use two. Three cables into the PSU should be fine with one of them daisy-chained into the fourth connector of the 4090. Hope that makes sense.

I think 2 should be good because that's what Corsair are recommending with their recently released 12VHPWR cable.

It's not the connectors at the PSU end that are the issue, it's the cables. They are good for 150W each. The new cables they are sending out have higher-rated wiring.

I wouldn't use less than 3.

Yeah, it would be best to ask Corsair tech support, but aren't these rated for 25 amps?

I used 3 cables and just doubled up on one; I think I ran out of PCI-e cables.

So in reply to all of the above I found this post, which sounds right to me. I shall use 2 cables to connect to the 4-way plug.


The PCIe 8-pin standard is essentially implicit communication. It says that if all 8 pins are connected correctly, the device (GPU) can draw 150W. Even if you make a cable and PSU capable of supplying 1.21 jigawatts, the device (GPU) has no way to know that, so will only ever draw up to 150W sustained per 8-pin connector.

Corsair's cables can sustain 300W. But that will only be utilised if provided over two 8-pin PCIe connections, so their cables terminate with those two 8-pin PCIe connections.

At the other end, there's an 8-pin connection to the PSU. That connector IS NOT an 8-pin PCIe connector. It's a connector bespoke to the PSU manufacturer. It can output up to 300W. It only works correctly on the manufacturer's PSU, as the pinout is often not the same. (NEVER use a modular cable from a different manufacturer, unless you fancy risking connecting a ground pin on your GPU to a 12V pin on your PSU.)

So, each connector on the Corsair PSU absolutely CAN deliver up to 300W, and using the twin 8-pin PCIe connectors at the other end CAN allow 300W to be drawn.

The nVidia adaptor needs four 8-pin PCIe connections as each such CONNECTOR only guarantees 150W. You could be attaching four separate 150W PSUs. You could be attaching a nuclear power station over solid copper bars. nVidia can't tell, so they will only ever draw 150W per 8-pin PCIe connector.

Corsair have made a cable for THEIR PSUs. They wire the 12+4 pin connector to tell the GPU that 600W is available. They make the gauge of the cabling capable of delivering that. They make the PSU end of the cable connect to TWO of their 300W PSU outputs. Thus, their single cable CAN sustain 600W from their PSUs.

Note that if you plugged in only one connection at the PSU end, only one of the two sense wires would be grounded (the other would be left "open"), and the GPU would be able to tell. Thus, the GPU would only be "allowed" to draw 300W from that connector.

An 850W Corsair PSU absolutely CAN deliver 600W over either two of Corsair's twin-headed PCIe cables, or one of their 12+4 pin HPWR cables. As that leaves only 250W for everything else, including transients, I'm not sure I'd want to. I'd rather ensure the GPU be limited to 450W to leave enough capacity for your other components.
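The budget logic in that quoted post can be sketched in a few lines. This is an illustrative sketch only: the sense-pin table is my reading of the 12VHPWR encoding in the PCIe CEM 5.0 spec (which single-grounded combination a given cable advertises depends on its wiring) and should be checked against the standard before relying on it:

```python
# Each correctly wired 8-pin PCIe connector only ever guarantees 150 W to the
# device, no matter what the cable or PSU behind it could actually supply.
def max_power_8pin(num_connectors: int) -> int:
    return 150 * num_connectors

# 12VHPWR: the two sense wires (grounded or left open) tell the GPU how much
# sustained power the cable advertises. Assumed PCIe CEM 5.0 encoding:
SENSE_TABLE = {
    # (sense0 grounded, sense1 grounded) -> advertised watts
    (True, True): 600,
    (True, False): 450,
    (False, True): 300,
    (False, False): 150,
}

def max_power_12vhpwr(sense0_grounded: bool, sense1_grounded: bool) -> int:
    return SENSE_TABLE[(sense0_grounded, sense1_grounded)]

# nVidia's 4-way adapter: four 8-pin inputs guarantee 4 x 150 W = 600 W.
print(max_power_8pin(4))
# Fully connected 12VHPWR cable, both sense wires grounded: 600 W.
print(max_power_12vhpwr(True, True))
# Headroom left on an 850 W PSU after a 600 W GPU budget: 250 W.
print(850 - max_power_12vhpwr(True, True))
```

With only one PSU-side plug attached, only one sense wire is grounded, so the table drops the advertised limit, which is the post's point about the GPU being able to tell.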
 
Associate
Joined
17 Oct 2007
Posts
1,778
Location
Some where in England
So have you installed your 4090 into your i7 8700K system? What do you think of it? Do you think you're CPU bottlenecked?
From what I have seen, if you play at 4K native then yes it is, but not by a lot! But if you are using the new DLSS 3 then 100% no. Just played A Plague Tale: Requiem at 4K Ultra settings with Quality DLSS and I was getting on average 160-185 FPS.
 
Associate
Joined
17 Oct 2007
Posts
1,778
Location
Some where in England
First CPU comparison

So 100% stay away from AMD if you have a 4090 then! Massive difference. I'll keep my eye on their 13th gen reviews over the coming days and work out what CPU to get.
 
Associate
Joined
28 Mar 2018
Posts
1,430
Yep give it a few weeks or so and hopefully things will start to stabilise.

Some of the prices these greedy scalpers are charging are obscene - can’t see why anybody would pay it. No warranty either.

Ironically you can buy a prebuilt system with a 4090 for a similar price to what they are being advertised at: madness really.
Spending that kind of money, you need a warranty. I only had my 3090 for a month and had to return it because the fans were faulty; imagine if I had no warranty, 1500 quid down the drain.
 
Associate
Joined
28 Mar 2018
Posts
1,430
First CPU comparison

Blimey, not only do you have to fork out for a 2000 pound graphics card, you also need to spend over a thousand more on a new CPU, motherboard and memory just to play games. Not worth it in my opinion.
 