Watts, Volts, and Amps

May 2, 2005
6 min. read

or How not to look Stupid and Burn Down a House

Note: This was an article written for a now-defunct site about filmmaking, so the intended audience was low-budget independent filmmakers. I wrote it to address many of the problems I witnessed on various sets. However, it applies to everyone who uses extension cords for anything.

This article is going to be a little technical and possibly boring to some, but the subject is important. I'm an electrical engineer who also works on films, so I tend to take all of this knowledge for granted. The math involved is really simple, so hopefully this will be more educational than painful for those of you who hate math.

When lighting is used on a set, you must always be conscious of how the power is getting distributed. If not, possibilities include starting a fire, blowing fuses and breakers, or other dangerous and embarrassing things. It is all about the current.

Lights are rated in wattage, a measurement of electrical power. There is a relationship between Power, Current, and Voltage that states: the Power in Watts (W) is equal to the Current in Amps (A) times the Voltage in Volts (V).

In a formula this would be: Power = Voltage x Current or in units Watts = Volts x Amps.

We have all seen several voltages in the real world. AAA, AA, C, or D batteries are 1.5V, car batteries are 12V, and house wiring in the US is mostly 120V. Current is a measure of the flow volume of electrons and much less familiar to the average person. This is where we have to be careful, as too much current flow makes bad things happen.

Here is how to find the current for a light: Above we learned that Watts = Volts x Amps. If we divide both sides by Volts, we see that Amps = Watts / Volts.

Let's use this in a few examples. A 100W bulb, plugged into a 120V socket, would use 100W/120V or 0.83A of current. A 500W bulb would use 500W/120V or 4.2A of current. Now that we can calculate current, what should we do with it?
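The arithmetic above is trivial enough to sketch in a couple of lines of Python (the `amps` helper name is just for illustration, not anything standard):

```python
def amps(watts, volts=120):
    """Current draw in amps, from Watts = Volts x Amps rearranged."""
    return watts / volts

print(round(amps(100), 2))  # 100W bulb on 120V -> 0.83
print(round(amps(500), 1))  # 500W bulb on 120V -> 4.2
```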

Most of the people reading this article will be using lower wattage lamps (1000W and under) that plug into standard household sockets. These circuits are almost always rated at 15A. This means that if you plug in enough things to draw more than 15A, either a fuse or circuit breaker will blow. Assume that you are lighting a kitchen scene and you require three 500W hot lamps. (OK, I know that this would both light the scene AND cook the food, but just play along.) From the calculation above, we know that a single 500W lamp requires 4.2A and three lamps will pull three times as much or 12.6A.

Looking at that value, you see that you can plug those into the outlets in the kitchen, because the lights draw less than 15A. The scene is lit up, and the actors are in place. "Action." The dialog starts, just as the fridge kicks on and "BOOM." The power goes out to the kitchen and all the lights go off. Since the refrigerator was on the same kitchen circuit, when the compressor kicked on, the circuit went quite a bit over 15A and the breaker or fuse blew. This is embarrassing and annoying. It can also be a sign to the location owner that you really don't know what you are doing.

The easiest solution to this is to run a couple extension cords (or “stingers” in movie vernacular) from other rooms. Typically, each room will have its own 15A circuit. If you are unsure, you can ask to see the fuse box or breaker panel. Hopefully the circuits will be listed in a readable format, where you can determine which circuits are separate. Occasionally you will find a breaker panel that is full of nothing but hieroglyphics. In this case, your best bet is to spread out the outlets used as much as possible.

I'll wrap up with a little advice on the use of stingers. When you purchase extension cords for lighting, make sure to consider the current draws you will use them for. In the US, wire size is specified by gauge. As the gauge number gets smaller, the wire diameter gets larger. While you will most likely be fine with cheaper cords for short runs inside, if you are using quite a few lights outside this is something you need to think about. When lighting up a scene outside, typically only one cord distributes the power out to all the lights. If this run is very long, you need to be careful. Below is a table showing the extension cord gauge that must be used to safely power lights drawing different currents at different cord lengths.

| Length | 10A    | 12A    | 15A    |
|--------|--------|--------|--------|
| 25'    | 16 ga. | 14 ga. | 14 ga. |
| 50'    | 14 ga. | 14 ga. | 12 ga. |
| 100'   | 12 ga. | 12 ga. | 10 ga. |
| 150'   | 12 ga. | 10 ga. | 8 ga.  |
| 200'   | 10 ga. | 8 ga.  | NO!    |
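If you want this check handy on a laptop or phone, the table can be transcribed into a small lookup. This is only a sketch of the table above: it assumes loads of 15A or less and runs of 200' or less, and the `cord_ok` name is made up for illustration.

```python
# Minimum safe gauge (AWG) by cord length (feet) and current draw (amps),
# transcribed from the table above. None means "do not do this at all".
MIN_GAUGE = {
    25:  {10: 16, 12: 14, 15: 14},
    50:  {10: 14, 12: 14, 15: 12},
    100: {10: 12, 12: 12, 15: 10},
    150: {10: 12, 12: 10, 15: 8},
    200: {10: 10, 12: 8,  15: None},
}

def cord_ok(gauge, length_ft, amps):
    """True if a cord of this gauge can safely carry `amps` over `length_ft`."""
    # Round up to the next table row/column that covers the load.
    length = min(l for l in sorted(MIN_GAUGE) if l >= length_ft)
    current = min(a for a in (10, 12, 15) if a >= amps)
    needed = MIN_GAUGE[length][current]
    # Smaller gauge number means thicker wire, so gauge <= needed is safe.
    return needed is not None and gauge <= needed

print(cord_ok(16, 100, 15))  # False: 16 ga. is far too thin for this run
print(cord_ok(10, 100, 15))  # True
```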

If smaller cords (higher gauge numbers) are used, you have the possibility of overheating the wires and starting fires. This is much more dangerous than just blowing a circuit breaker. The circuit breaker will not blow on an overloaded extension cord if the current is within the amperage allowed by the breaker. It has no understanding that you tried to use a 100' cord of 16 or 18 gauge to power 15A worth of lights. But when you grab that cord, it will be warm or possibly hot, and that is not good at all.

I don't expect that everyone will bring a calculator to set to do power calculations, but I hope everyone takes away some respect for the problem. This same problem will cause numerous house fires this holiday season, when people decorating their homes or Christmas trees overload the cords. While Christmas lights don't draw a great deal, they are also built with very small wire. If you chain enough of the lights together, the first light strand will carry all the current for all the lights. When this is next to wood or a Christmas tree, powered 24/7, it is a recipe for a fire. Everyone be careful out there.
