When a large corporation decides to build a new computer model, how do the engineers determine the CORRECT power supply rating, in WATTS, needed for the new computer?
Under the National Electrical Code (USA only), a power supply rated at 500 watts means it can output 500 watts continuously without any problems such as overheating. Brand name makes no difference: 500 watts of output from brand A is the same as 500 watts of output from brand Z.
Watts in a direct current circuit is the product of VOLTS times AMPS, so watts = volts x amps.
Volts = the pressure, or potential, applied to the + and - power supply leads.
Amps (or milliamps) = the amount of current drawn by each electrical circuit.
Hard drives and optical drives inside a computer are designed to require 12 volts applied to the unit to function properly, and the amount of current drawn depends on the design of the electrical circuits in the unit.
Typical example (see the sketch below):
Optical drive: 12 volts x 3 amps = 36 watts of power for that unit.
Hard drive: 12 volts x 4 amps = 48 watts of power for that unit.
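A minimal sketch in Python of the watts = volts x amps calculation, using the example values above (the function name and variables are just illustrative):

```python
def watts(volts: float, amps: float) -> float:
    """Power in watts for a DC load: P = V x I."""
    return volts * amps

# Values taken from the example above.
optical_drive = watts(12, 3)  # 36 W
hard_drive = watts(12, 4)     # 48 W
print(optical_drive, hard_drive)  # 36.0 48.0
```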
Adding up the watts of all of these units inside your computer gives the MINIMUM power, in watts, that the power supply must provide.
So in practice, if the summation of all of these units equals 300 watts, a correctly sized power supply is 500 watts. Always have a MINIMUM of 200 spare watts for the computer to function properly.
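A sketch of that sizing step, assuming a hypothetical set of per-unit wattages that happens to sum to the 300 watts used in the example; the component list and values are illustrative, not real measurements:

```python
# Sum the per-unit wattages, then add the 200 W spare margin described above.
loads_watts = {
    "optical drive": 36,
    "hard drive": 48,
    "motherboard + CPU": 150,
    "graphics card": 66,
}

minimum_watts = sum(loads_watts.values())  # 300 W in this example
recommended_watts = minimum_watts + 200    # keep at least 200 W spare
print(f"Minimum: {minimum_watts} W, recommended supply: {recommended_watts} W")
```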