NVIDIA ‘Ampere’ 8nm Graphics Cards

I own a 1000W PSU but it doesn't have the 12-pin plug - have I skimped? ;)

Could someone explain what 12 pins provide that the current 16 pins (2x8) don't?
The rumor is that the cable gauge is heavier, allowing up to 600W per 12-pin.

6-pin connectors are rated at 75W each and 8-pins at 150W, as I recall
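For anyone tallying these up, here is a minimal sketch of the commonly cited per-connector ratings; the 600W 12-pin figure is only the rumour from this thread, and the helper function is purely illustrative:

```python
# Commonly cited PCIe power ratings (CEM figures); the 12-pin entry is
# the 600W rumour from this thread, not a published specification.
CONNECTOR_RATING_W = {
    "slot": 75,     # PCIe x16 motherboard slot
    "6-pin": 75,
    "8-pin": 150,
    "12-pin": 600,  # rumoured figure only
}

def rated_board_power(*cables: str) -> int:
    """Rated total: the slot's 75W plus each external cable's rating."""
    return CONNECTOR_RATING_W["slot"] + sum(CONNECTOR_RATING_W[c] for c in cables)

print(rated_board_power("8-pin", "8-pin"))  # 375 -- a typical 2x 8-pin card
print(rated_board_power("12-pin"))          # 675 under the rumoured rating
```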

I got a Seasonic 1300W 80+ Titanium. I didn't skimp either ;)
 
I have a 1000W. I remember it coming in handy when I bought a 295x2 off @RanxZy :)

The first game I remember enjoying on it was Crysis 3, which ran beautifully at 4K. No stutter or anything. If only all games had CrossFire support like that game did.
 
1000W is enough for everyone :p

I'm running two 1080s in SLI and an overclocked 6700K, and my PSU still feels like complete overkill. The most I've managed to draw from the wall is 550W. Of course, if I had two 2080 Tis that might push it up by 200W, so no regrets.
 
All this 'it needs a new PSU' stuff is probably some bad translation.

Assuming these support PCIe 4.0, it probably comes down to this:

PCIe 3.0 cards were limited to a total power draw of 300 watts (75 watts from the motherboard slot, and 225 watts from external PCIe power cables).

The PCIe 4.0 specification will raise this limit above 300 watts, allowing expansion cards to draw more than 225 watts from external cables.
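As a quick arithmetic check of the budget described above (assuming the usual 75W slot plus, say, one 8-pin and one 6-pin cable):

```python
# PCIe 3.0 budget as described in the post: slot plus external cables.
SLOT_W = 75
EXTERNAL_W = 150 + 75  # e.g. one 8-pin (150W) plus one 6-pin (75W)
assert SLOT_W + EXTERNAL_W == 300  # the 300W ceiling the post refers to
```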
 
People need to get with the program (figuratively, or an actual finance program): a 3000-series GPU for £3000, an Nvidia-sponsored 3000W PSU, and a dildo3000 leather edition!!
 
All this 'it needs a new PSU' stuff is probably some bad translation. […]
I don't think this is the case. Plus, only a small percentage of the market will have PCIe 4.0 anyway. I have it :D
 
All this 'it needs a new PSU' stuff is probably some bad translation. […]

Loads of cards already came with 2x 8-pin connectors, so 150W + 150W + 75W = 375W rated, and some even had BIOSes that allowed more than that anyway. It's really just a recommendation rather than a rule that was actually followed.

The PCIe specification really has nothing to do with it; if card makers want the cards to use more power, they just keep adding more 6/8-pin connectors, which is what they already do.

Even if the new cards do come with a new connector, there will be an adaptor with instructions on how to use it.

Of course, some numpty is bound to use too many adaptors/splitters and burn up their PSU, but that already happens now.
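On the gauge and splitter points, a rough sanity check of the current involved (assuming the power is carried on the 12V wires; the wattages are illustrative):

```python
# Current on the 12V wires at a few power levels -- why heavier gauge
# matters, and why daisy-chained splitters can overload a single cable.
def amps(watts: float, volts: float = 12.0) -> float:
    return watts / volts

for label, watts in [
    ("one 8-pin at its 150W rating", 150),
    ("two 8-pins fed from one splitter", 300),
    ("rumoured 600W 12-pin", 600),
]:
    print(f"{label}: {amps(watts):.1f} A")  # 12.5 A, 25.0 A, 50.0 A
```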
 
I have a 1000W. I remember it coming in handy when I bought a 295x2 off @RanxZy :) […]
If only! I loved my 295x2 quadfire setup... gaming in my undercrackers in the middle of winter without the heating on in the room!
 