
amps for graphics

Associate · Joined 12 Sep 2011 · Posts: 261
Hey guys, I was wondering: there is an amp requirement on the +12V rail for graphics cards. Would the card underperform, or not work at all, if the amp rating from the PSU was less than what's required, even though there were enough watts?
 
A = Watt x Voltage, so +12 V is fixed within the ATX spec your current (Ampere) will be 12 V x W.

You can't have a PSU that outputs the correct wattage, but not enough current, because it's a fixed voltage.

So your scenario isn't possible in our universe. :)
 
The question is a little murky, but I believe the OP is wondering if too low a current on the 12V rail could cause problems even if the PSU has a fairly high overall wattage; the answer is yes.

Most PSUs will split the overall wattage over multiple rails. If this is the case then you have to check the amps available on the rail you will connect the graphics card to. Use the above formula to work out the available wattage on each rail.
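The per-rail arithmetic described above can be sketched as follows. The rail labels and amp ratings here are illustrative example values, not figures from any specific PSU datasheet:

```python
# Compute the wattage available on each rail from its amp rating, using P = V * I.
# Rail amp figures below are made-up examples, not from a real PSU label.
RAILS = {"+12V": 40.0, "+5V": 20.0, "+3.3V": 20.0}  # amps per rail (assumed values)

def rail_watts(volts: float, amps: float) -> float:
    """Power available on one rail in watts: P = V * I."""
    return volts * amps

for label, amps in RAILS.items():
    volts = float(label.strip("+V"))
    print(f"{label}: {amps} A -> {rail_watts(volts, amps):.0f} W")
```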
 
I think I understand the question.

Cheap generic PSUs often add up their rails and sell the sum as a peak power rating, where maybe only 50% of the unit's power can be taken from the 12V rail, with the remainder on the 5V and 3.3V rails.

If the graphics card requires 30 amperes by itself, then you'd need considerably more amperes on the PSU, as you also have to cover the CPU and motherboard's 12V requirements.

What PSU is it? What graphics card is it?

Graphics card makers tend to inflate their current requirements, so even though a card may ask for 30A, it may never draw anything like that by itself.
 
It was just a general question about graphics cards in general. I was wondering because I plan on getting another graphics card soon and was concerned about the amps. After checking my PSU, it had enough amps on the +12V rail, but then that got me thinking about what would happen if it wasn't enough. Sorry for the confusing question, and thanks for the replies.

So amps don't really matter, just the rated wattage, because if it's a decent PSU it will have enough amps for the card regardless? I have a Cooler Master 600 watt PSU.
 
Although the above equation may be true for a single-rail PSU, I'm not so sure about multi-rail PSUs, as the amperage has a ceiling which acts as a cut-off.

The “BeQuiet Pure Power L8 530W '80 Plus Bronze' Modular Power Supply” has two 12V rails: one 28 amp rail (+12V1) and one 20 amp rail (+12V2).

But going by the amperage calculation, 530W / 12V = 44 amps... ??
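This mismatch can be sketched numerically. The two rail ratings are taken from the post above; the point is that they can't both be maxed out at once, because part of the unit's total wattage budget is reserved for the 3.3V and 5V rails:

```python
# Why per-rail amp ratings on a multi-rail PSU don't simply add up to the
# unit's total wattage. Rail figures are from the BeQuiet L8 530W example above.
rail_amps = [28.0, 20.0]   # +12V1 and +12V2 ratings
total_watts = 530.0        # label wattage for the whole unit

sum_of_rails = sum(rail_amps)        # 48 A if both rails could max out together
implied_amps = total_watts / 12.0    # ~44.2 A if ALL 530 W were on 12 V
print(f"sum of rail ratings: {sum_of_rails:.1f} A")
print(f"total watts / 12 V:  {implied_amps:.1f} A")
# The real simultaneous 12 V limit is lower still, since some of the 530 W
# budget belongs to the 3.3 V and 5 V rails.
```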


To be honest, I've never really looked into this much and never had any issues. Buying an SLI- or CrossFire-branded PSU will give you the peace of mind that it supplies adequate amperage to the rails needed for a dual-card configuration. But providing you make sure your PSU is at least 80% efficient (efficiency being how much of the power it draws from the mains actually reaches the system) and meets all your system power requirements with at least 20% capacity to spare, you shouldn't run into too many issues.
 

Current (I) is measured in amperes; power (P) in watts.

The equation is actually P = VI.
(Note: that's for DC. For AC you get P = VI cos φ, where φ is the current-voltage phase angle.)


So in terms of I you get I = P/V, which in your symbols is A = Watts / Voltage, not A = Watts x Voltage.

And in reply to the OP: the total PSU power in watts is shared across all components. If a graphics card requires X amperes of current and that current is not available, then it won't have enough power.
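The rearranged relationship above can be sketched with a couple of worked numbers. The 530 W figure echoes the earlier example; the 240 W load is illustrative:

```python
# P = V * I rearranged to I = P / V, as derived in the post above.
def amps_needed(watts: float, volts: float = 12.0) -> float:
    """Current in amperes drawn at a given voltage: I = P / V."""
    return watts / volts

print(amps_needed(240))   # a 240 W load on the 12 V rail draws 20 A
print(amps_needed(530))   # ~44.2 A if a whole 530 W were drawn at 12 V
```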
 

You were more correct initially than people have given you credit for!

Look at it this way: watts are like horsepower for engines. Really, it's bull! It tells you very little (just impresses the boys).
Put a 500 HP Ferrari motor in a truck that had a 150 HP motor and see how it performs. :D

Watts are a similar measurement to horsepower in as much as they are the sum of multiple components.
Torque is what gets the job done for motors, and amps for electrical power.

Your new graphics card will need sufficient current (amps) on the 12V rail. Let's say that's 20A; in watts that's simply 12 volts x 20 amps = 240 watts.

The confusion always comes because manufacturers add up the potential watts from all the different voltage rails, so you talking about available amps at the relevant voltage output is spot on. ;)
If you have insufficient amps, it does not mean your card will not work, but it will lead to unreliable performance under high workload. Also consider that other components in the system can reduce the available power if they are pulling excess amps.

Basically, good, clean (electrically low-noise) power supplies are the fundamental requirement for all electronics.

Hope I helped a little ;)
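The point about other components eating into the rail budget can be sketched as a simple subtraction. All three amp figures here are illustrative assumptions:

```python
# Does the 12 V rail's rating cover the card's stated requirement after the
# CPU/motherboard draw is subtracted? All amp figures are example values.
rail_rating_a = 40.0     # PSU's 12 V rail rating (assumed)
gpu_a = 20.0             # card's stated 12 V requirement (often inflated)
cpu_and_board_a = 12.0   # other 12 V loads on the same rail (assumed)

available = rail_rating_a - cpu_and_board_a
print(f"left for the GPU: {available:.0f} A -> "
      f"{'OK' if available >= gpu_a else 'short'}")
```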
 
Thanks for the reply, that clears things up a bit. Confusing manufacturers! :S
 