From what I read, you need a resistor to limit the current so the LEDs don't burn out right away. In my EE lab I actually set an LED on fire by pushing way too much voltage through it, at something like 2 amps. I was hoping to use a 9V battery, and at first I calculated 9.6V by adding up the "forward voltage" of each LED, which is 3.2V: since they're in series, three LEDs add up to 9.6V. That's more than the 9V battery can supply, which is why I went with 2 LEDs per string below. Then wiring multiple series strings of LEDs in parallel lets each string see the full 9V.
I'm pretty sure I figured it out though; here is what I decided on. I'm gonna do my best to explain it, but I might just give in and draw a picture for you guys!
I also need to go see what RadioShack has in stock for LEDs, but I read that blue LEDs typically run at 20 mA and need a forward voltage of 3.2V to produce maximum output. If those numbers turn out different, it's pretty simple to recalculate what I need; I just want to double-check my method with you guys.
Assuming a 9V source, 3.2V and 20 mA per LED, 2 LEDs per series string, and a total of 20 LEDs in 10 parallel strings:
Resistance = (Vs - Vf) / I = (9V - (3.2V x 2)) / 0.02A = 130 ohms = resistor for each series string of LEDs
This should work since every string sees the same 9V because they're in parallel, and each string's resistor sets the current through that string to 20 mA independently of the other strings. Right?
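Here's a quick sanity check of the math in Python, using the numbers above (9V supply, 3.2V forward voltage, 20 mA per string, 2 LEDs per string, 10 strings). The variable names are just mine, not anything official:

```python
# Sanity check of the series-resistor calculation from the post.
# All values are the assumed ones above, not measured specs.
V_SUPPLY = 9.0        # battery voltage (V)
V_FORWARD = 3.2       # forward voltage per blue LED (V)
I_STRING = 0.020      # target current per series string (A)
LEDS_PER_STRING = 2
STRINGS = 10          # 10 parallel strings x 2 LEDs = 20 LEDs total

v_leds = V_FORWARD * LEDS_PER_STRING      # voltage dropped by the LEDs
v_resistor = V_SUPPLY - v_leds            # voltage the resistor must drop
r = v_resistor / I_STRING                 # Ohm's law: R = V / I
i_total = I_STRING * STRINGS              # battery supplies all strings

print(f"Resistor per string: {r:.0f} ohms")           # 130 ohms
print(f"Total battery current: {i_total * 1000:.0f} mA")  # 200 mA
```

One thing worth noticing from the output: the battery has to source all 10 strings at once, so it sees 200 mA total, which is a fairly heavy load for a small 9V battery.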