Hi All,
I'm tackling an interesting problem. I have a current-limited battery source (for intrinsic safety) that is limited to 250mA. The battery can be either 3V or 6V; if I can get it working at 3V, that would be ideal. The problem is that I need to pulse an IR LED at 600mA for 5us every 26us in order to transmit data. The source is limited to 250mA by a series resistor, and this resistor causes problems because of the voltage drop across it. I want to use capacitors to supply the extra current, but I also need to stay under 100uF for intrinsic safety requirements.
Assuming our source voltage is 6V and we regulate it down to 3V, I've done some very crude calculations to try to determine how much capacitance is required. The required energy per pulse is E = (600mA * 3V) * 5us = 9uJ. Using E = 1/2*C*V^2, the capacitance that stores that much energy at 3V is about 2uF. Since I don't want to drain the capacitor completely in one shot, I could use 20uF so that each pulse only takes roughly 10% of the stored energy. I am not confident that this is the right way to calculate things.
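To make the arithmetic concrete, here is that back-of-envelope calculation as a small Python sketch (all the numbers are just my assumed values, so please correct me if the approach itself is wrong):

```python
# Back-of-envelope sizing of the reservoir capacitor (assumed values, not measured)
I_pulse = 0.6      # LED pulse current, A
V_led   = 3.0      # voltage across the LED branch during the pulse, V
t_pulse = 5e-6     # pulse width, s

E_pulse = I_pulse * V_led * t_pulse              # energy per pulse, J  (~9 uJ)
C_min   = 2 * E_pulse / V_led**2                 # cap that stores one pulse at 3 V (~2 uF)

# Oversize by 10x so each pulse only takes ~10% of the stored energy
C = 10 * C_min                                   # ~20 uF
E_stored = 0.5 * C * V_led**2
V_after  = (2 * (E_stored - E_pulse) / C) ** 0.5 # voltage left after one pulse (~2.85 V)

print(f"E_pulse = {E_pulse*1e6:.1f} uJ, C_min = {C_min*1e6:.1f} uF")
print(f"C = {C*1e6:.0f} uF, V after one pulse = {V_after:.2f} V")
```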
I did some simulations in LTspice and was getting confusing results, especially when a series resistor was placed between the source and the rest of the circuit. In every configuration, the current drawn from the regulator is too large, and I end up with huge voltage drops across the source resistor that throw everything off.
I then simulated without the series resistor to see how I can lower the current drawn from the regulator (and thus the source) and shift more of the current delivery onto the capacitor. See the pictures below for the schematic and the test outputs. Red is the current through R3 (I did not have a SPICE model of the LED, so I used the resistor as the load), blue is the current leaving the regulator (R4), yellow is the current leaving R5, and white/green is the voltage across the capacitor C2.
With 100uF for C2 I can nearly halve the current drawn from the regulator compared to a 30uF capacitor, but even that current will still cause problems once the series resistor is back in.
Does anyone have advice on the circuit, and on the theory for calculating this properly? I'd like to be able to work the numbers to determine whether this is even possible. What I'm particularly interested in is how to manipulate the circuit so that the regulator supplies as little current as possible and most of the pulse current comes from the capacitors, so that I don't cause a large voltage drop across the source resistor. Should I also put a current-limiting resistor on the regulator output? I assume this will affect my capacitor charging time.
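For what it's worth, here is how I've been trying to sanity-check whether the average draw even fits inside the 250mA budget, and how I'd guess a limiting resistor slows the recharge (the resistor value here is just a placeholder I picked for illustration):

```python
# Average-current sanity check using my pulse spec (assumed values)
I_pulse  = 0.6      # A
t_pulse  = 5e-6     # s
t_period = 26e-6    # s

Q_pulse = I_pulse * t_pulse        # charge delivered per pulse (~3 uC)
I_avg   = Q_pulse / t_period       # average current the source must replace (~115 mA)
print(f"Q per pulse = {Q_pulse*1e6:.1f} uC, average current = {I_avg*1e3:.0f} mA")

# Hypothetical current-limiting resistor on the regulator output:
# the time constant for topping the reservoir cap back up through it
R_lim = 10.0        # ohms, placeholder value for illustration
C_res = 100e-6      # F, the 100 uF reservoir cap
tau   = R_lim * C_res
print(f"Recharge time constant = {tau*1e6:.0f} us")   # 1000 us with these values
```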
Any help would be greatly appreciated, thanks everyone!
