Question about Current.

Volitions Advocate

I'm building an LED array for a project at school and I'm having some trouble with figuring out what to use for a power source.

Each LED has a 1.5V drop and draws 100 mA.

I'll be making an array of around 120 of them.

My plan is to wire them up in parallel banks of 6-8. I don't want to use a resistor for every LED, but I do want parallel circuits so I can easily see which LED dies if one ever does, and fix it. I've attached my schematic (which might not be perfectly accurate; I'm still learning how to do this).

If I have 15 - 16 of those circuits wired up in parallel from the power source then things should work well.
But I need a power source.

I was thinking at first to just use a wall wart and hard-wire it into the array, but most wall warts only supply around 500 mA of current. So a whole 5 LEDs in and I'm done. I found this power supply here:

http://www.ledssuperbright.com/led-power-supply-c-20/12v-150w-power-supply-12-5a-p-235?zenid=12ea276960d3566ebea66e70fb814212

Which would work, I'm pretty sure. But some of the guys who do projects like the one I'm doing say they just use a laptop power supply, and the ones I see only supply about 3.5 amps.

Here's my question: if each LED draws 100 mA and I have 120 of them, does that mean I will be drawing 12 amps of current when I run them all? Or does it not work quite that way? Logically I would assume so, but I'm not an expert.

EDIT: I found a better way to ask my question.  Does current drop the same way that voltage does across circuit components?
 

Attachments

>> I found a better way to ask my question.  Does current drop the same way that voltage does across circuit components?

Current doesn't drop like voltage. It's limited by the load. It's always the same through every component in a series circuit, and it divides among the branches of a parallel circuit according to each branch's resistance.

If your LEDs truly do draw 100 mA each, then yeah, you're going to need 12 amps to drive that load. But unless they're unusual LEDs, they probably won't draw that much, or at least don't need to. You can put current-limiting resistors on them to keep things down to a dull roar.
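The arithmetic behind that 12 A figure is just branch currents adding up in parallel. A quick sanity check, taking the stated 100 mA and 1.5 V per LED at face value:

```python
# Rough load check, assuming the 100 mA / 1.5 V per-LED figures above.
LED_COUNT = 120
CURRENT_PER_LED_A = 0.100
LED_FORWARD_V = 1.5

# Parallel branch currents add, so the supply must source the sum.
total_current_a = LED_COUNT * CURRENT_PER_LED_A
total_power_w = LED_COUNT * LED_FORWARD_V * CURRENT_PER_LED_A
print(total_current_a, total_power_w)  # → 12.0 18.0
```

So 12 A of current, but only 18 W of power, which hints that a lower-current, series-string arrangement is worth a look.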

The part you have spec'd out is an IR LED. It won't make much of a display unless you're looking at it through night-vision goggles. That might be why it's drawing so much power.

I don't know what you're trying to accomplish, but you may need to dig a little further into the wild, woolly world of LEDs. Your choices are incredibly numerous these days.
 
In parallel, the voltage is the same across every branch. If each LED is 1.5 volts, then supply it with 1.5 volts instead of 12 V. Also, 120 LEDs at 1.5 V and 0.100 amps is only 18 watts.
 
Personally, I would run them in series to save current. Don't worry about LEDs burning out; that almost never happens unless you drive them above their maximum rated current.

It's not a good idea to run parallel LEDs sharing a single resistor.
 
Yes, but you can definitely kill them with too high a voltage or current. They're essentially diodes, so many of the same caveats apply to their use.
 
My electrical theory is quite elementary at the moment; the only thing I know about a diode is that it's basically a one-way valve.

Yes, they are IR LEDs; I'll be looking at them through a modified webcam. I'm building a multi-touch surface for my computer. These are going all the way around a frame, pointing into the inside of a 9 mm thick sheet of acrylic.

I could wire them like this, http://ledcalculator.net/default.aspx?values=12,1.5,100,120,0

It says it only draws 1500 mA, but I don't understand how that calculation works. In that case it would be as simple as snipping the end off one of my 1spots, since they'll supply up to 1700 mA.

EDIT: except the 1spot is 9 V, not 12 V, and doing the same circuit boosts it up to 2 A... GRRR, this is confusing.
 
It draws 1.5 A because you have 15 strings of LEDs running in parallel, and each string draws 100 mA. So that's 15 × 0.1 A = 1.5 A.
The eight LEDs in each series string all carry the same 100 mA; current doesn't add up within a string, because current is the same through every component in series. It's the voltage that adds: each LED drops 1.5 V, so eight of them use up the full 12 V. You can also see it with Ohm's law on the equivalent resistance: 1.5 V at 100 mA is 15 ohms (1.5 V / 0.1 A = 15 Ω), so 3 V through 30 Ω is 100 mA, 4.5 V through 45 Ω is 100 mA, and so on.
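The series/parallel split behind the calculator's answer can be spelled out (using the same numbers as the ledcalculator link above: 12 V supply, 1.5 V and 100 mA per LED, 120 LEDs):

```python
# Series strings share one current; parallel strings add their currents.
SUPPLY_V = 12.0
LED_FORWARD_V = 1.5
LED_CURRENT_A = 0.100
TOTAL_LEDS = 120

leds_per_string = int(SUPPLY_V // LED_FORWARD_V)  # 12 V / 1.5 V = 8 LEDs
strings = TOTAL_LEDS // leds_per_string           # 120 / 8 = 15 parallel strings
total_current_a = strings * LED_CURRENT_A         # only the strings add up
print(leds_per_string, strings, total_current_a)  # → 8 15 1.5
```

Same 120 LEDs, but an eighth of the supply current compared with wiring every LED straight across the rails.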
 
Alright, thanks for the help.

I read up on some webpages that talk about why it's unreliable to wire in parallel with shared resistors, so I'll be wiring it up in series-parallel. I've got an 18 V 2 A adapter for a power source, which should do me just fine.

I'll be sure to show off my creation when I'm done; it'll be pretty interesting.
 
Volitions Advocate said:
Alright, thanks for the help.

I read up on some webpages that talk about why it's unreliable to wire in parallel with shared resistors, so I'll be wiring it up in series-parallel. I've got an 18 V 2 A adapter for a power source, which should do me just fine.

I'll be sure to show off my creation when I'm done; it'll be pretty interesting.

At 18 V, you should be able to run twelve 1.5 V LEDs in series. Run ten strings in parallel, each with its own current-limiting resistor, and you're in business. At 100 mA per string, you won't top an amp, which is perfect with your 2 A supply.
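The resistor value falls out of Ohm's law on whatever voltage the LEDs don't drop. One caveat to the figures above: twelve 1.5 V drops consume the full 18 V, leaving nothing for a resistor, so the sketch below assumes eleven LEDs per string purely as an illustration:

```python
# Series resistor sizing: the resistor drops whatever voltage the LEDs
# don't, at the target current. Assumed numbers, not from the thread:
# eleven 1.5 V LEDs per string, so there's headroom left for the resistor.
SUPPLY_V = 18.0
LED_FORWARD_V = 1.5
LEDS_PER_STRING = 11
TARGET_CURRENT_A = 0.100

headroom_v = SUPPLY_V - LEDS_PER_STRING * LED_FORWARD_V  # 18 - 16.5 = 1.5 V
resistor_ohms = headroom_v / TARGET_CURRENT_A            # 1.5 V / 0.1 A = 15 ohms
print(headroom_v, resistor_ohms)  # → 1.5 15.0
```

A bigger headroom also makes the current less sensitive to LED-to-LED variation in forward voltage, at the cost of a little wasted power in the resistor.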

One thing you might want to do is check each LED with a voltmeter after installation, to make sure each one sees an appropriate voltage. Not all LEDs are created equal, though in general they tolerate a decent range of input voltages. And since you can't see IR wavelengths with your eyes, you won't be able to spot any differences in brightness.
 
That's exactly how I wired it:

Here are the buggy results. Preliminary, but promising:

http://www.youtube.com/watch?v=4nDjzyHtD4U&list=UUbpNqkAhCpE6McV31gX-XcA&index=1&feature=plcp
 