
# Changing power supply to LEDs

• 16-04-2011 12:31am
Registered Users Posts: 352 ✭✭

A DIY electronics venture.......

I have a set of 20 LEDs that are battery powered, and I'd like to connect them to a permanent 12v DC supply that is available.

The LEDs I have are Red (2.0 vd) connected in parallel, supplied by 3 x AA batteries in series (4.5v). There is also a 20 ohm resistor in circuit.

Can I simply replace the batteries with the 12v DC supply?

I'm thinking....

Supply = 12v
Vf across each LED = 2.0v (typical)
LED Current = 20mA (usually)
Number of LEDs = 20
Resistor = 20-25 ohms

Would I be right in saying the LEDs would work under the higher supply?

EDIT: Just tested the circuit with 8 x AA batteries in series (12v). The LEDs are brighter and work fine. I'm guessing they will be fine long term?
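The sizing above can be sketched in a few lines. This is a hypothetical worked example using the "usually" figures from the post (Vf = 2.0 V, If = 20 mA), not measured values, and it assumes all 20 LEDs share a single series resistor:

```python
# Resistor sizing for 20 red LEDs in parallel on a 12 V supply.
# Vf and If are the nominal figures from the post, not measurements.
SUPPLY_V = 12.0   # permanent DC supply
LED_VF = 2.0      # forward voltage of a red LED, volts
LED_IF = 0.020    # nominal current per LED, amps
NUM_LEDS = 20

total_current = NUM_LEDS * LED_IF              # 0.4 A through the resistor
resistor_drop = SUPPLY_V - LED_VF              # 10 V left for the resistor
resistor_ohms = resistor_drop / total_current  # 25 ohms

print(resistor_ohms)  # 25.0
```

That lands at the top of the 20-25 ohm range guessed in the post, so the arithmetic checks out under those assumptions.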

•
#2
Closed Accounts Posts: 13,422 ✭✭✭✭

You could probably do with putting a second 20 ohm resistor in series with the first one.

•
#3
Registered Users Posts: 352 ✭✭

Thanks for the reply Robbie. If I didn't add another resistor, what might be the side effects? I'm thinking the single existing resistor may just generate a little more heat than it did with the 4.5v supply, but the increase would be negligible.

The LEDs would only ever be turned on for a few hours max, not constantly.

•
#4
Closed Accounts Posts: 13,422 ✭✭✭✭

If each LED requires 20 mA, then 20 in parallel will draw 400 mA.

Circuit resistor: 12 V / 0.4 A = 30 Ω.

So a 30 Ω resistor is needed if it's 20 x 20 mA LEDs. But we are assuming the LEDs really draw 20 mA each.

If you take the original setup of 4.5 V and a 20 Ω resistor, that's 4.5 / 20 = 0.225 A in the circuit (we assume the diodes have zero resistance).
That means each LED is taking about 11 mA.

So 12 / 0.225 = 53 Ω, so a 50 Ω resistor would be about the size required.

The power dissipated in the resistor would be 0.225 x 0.225 x 50 = 2.5 W.

If you leave it as is with the 12 V on it, that's 12 V / 20 Ω = 0.6 A. This would be 30 mA per LED, so maybe they can take that, but to keep it the same as it was with the battery supply, put in a 50 Ω resistor or whatever is nearest.

0.6 A through a 20 Ω resistor would be about 7 W of dissipation.

If the LEDs are overloaded, they will fail prematurely.

Basically, by increasing the supply voltage, the circuit resistor has to drop more voltage across it, so it has to be a higher value to keep the circuit current the same. If the resistor is left as is, a higher voltage is still dropped across it as the supply is increased, but the current through it also increases, so the dissipated power (heat) in the resistor goes up faster (I squared) than if the resistor is increased to keep the current the same at the higher voltage.
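The numbers in this post can be reproduced with a short sketch. It follows the same simplifying assumption stated above (the LEDs are treated as having zero resistance, so the resistor sees the full supply voltage); the variable names are illustrative, not from the thread:

```python
# Reproducing the battery-circuit figures and the resized resistor.
# Assumes zero LED resistance, as the post above does.
ORIG_SUPPLY = 4.5    # 3 x AA in series, volts
NEW_SUPPLY = 12.0    # permanent DC supply, volts
RESISTOR = 20.0      # existing series resistor, ohms
NUM_LEDS = 20

orig_current = ORIG_SUPPLY / RESISTOR           # 0.225 A total
per_led_ma = orig_current / NUM_LEDS * 1000     # ~11 mA per LED

new_resistor = NEW_SUPPLY / orig_current        # ~53 ohms to keep 0.225 A
dissipation_50 = orig_current ** 2 * 50         # ~2.5 W in a 50 ohm part

unchanged_current = NEW_SUPPLY / RESISTOR       # 0.6 A if 20 ohms is kept
unchanged_watts = unchanged_current ** 2 * RESISTOR  # ~7 W in the resistor
```

The I-squared point is visible in the last two lines: nearly tripling the current through the unchanged 20 Ω resistor pushes its dissipation from about 1 W (on batteries) to over 7 W.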

•
#5
Registered Users Posts: 352 ✭✭

Nice one for the detailed info! Really appreciate it. I'm currently looking at getting the power supply closer to the original, but if I can't I'll put in another resistor.

Cheers Robbie, much appreciated!

•
#6
Registered Users Posts: 368 ✭✭

See my answers to a similar question in Eng. forum a few years back. Diodes are non-linear components so calculations are not as straightforward as you may think.
If you're impatient, with 20 LEDs with Vd=2.0V and If=20 mA on a 12 V supply, you should make four parallel circuits, each with five diodes and one 100 Ω, 0.125 W resistor in series.
(12 V - 5 * 2.0 V) / 0.02 A = 100 Ω
Power dissipated by resistor, 100 Ω * (0.02 A)^2 = 40 mW
Total current, 4 * 20 mA = 80 mA
Total power requirement, 12 V * 80 mA = 960 mW.
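The four-string layout above can be checked with a quick sketch. This is a hypothetical verification of the figures quoted (five 2.0 V LEDs per string, 20 mA per string, one resistor per string), not a substitute for measuring the actual LEDs:

```python
# Checking the series-string design: 4 strings of 5 LEDs, each
# string with its own current-limiting resistor.
SUPPLY = 12.0          # supply voltage
VF = 2.0               # forward voltage per LED, volts
IF_A = 0.020           # target current per string, amps
LEDS_PER_STRING = 5
STRINGS = 4

resistor = (SUPPLY - LEDS_PER_STRING * VF) / IF_A  # 100 ohms per string
resistor_w = resistor * IF_A ** 2                  # 0.04 W per resistor
total_current = STRINGS * IF_A                     # 0.08 A from the supply
total_power = SUPPLY * total_current               # 0.96 W total
```

A nice property of this layout is that most of the supply voltage is dropped usefully across the LEDs rather than wasted as heat in one big resistor, which is why the per-resistor dissipation comes out at only 40 mW.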