Hi,
First of all I would like to wish you a happy New Year!
I'm new to analog design and I'm trying to figure out how to improve the behaviour of my LED driver. My design is powered from a single lithium cell. First I have a step-up stage that boosts the input voltage (2.5 V-4.2 V) up to 6 V using the TPS61089RNRR.
Then, I have an ATtiny85 controlling a MOSFET to drive two high-intensity LEDs. Each LED draws almost 3 A with a VF of 3.1 V. The two are in series, supplied from the step-up stage described above.
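For context, here is my rough worst-case input-current estimate for the boost stage (the 90% efficiency figure is just an assumption, not a measurement):

```python
# Rough input-current estimate for the boost stage.
# Assumed values: 3 A LED string current, 6 V boost output, 90% efficiency.
v_out = 6.0      # boost output voltage (V)
i_led = 3.0      # series LED string current (A)
eff = 0.9        # assumed converter efficiency
v_in_min = 2.5   # minimum battery voltage (V)

p_out = v_out * i_led                 # output power (W)
i_in_max = p_out / (eff * v_in_min)   # input current at minimum battery voltage (A)
print(f"P_out = {p_out:.1f} W, worst-case I_in = {i_in_max:.1f} A")
```

That worst-case input current of about 8 A is already around (or above) the switch current limit of many small boost converters, which may be part of why the regulator cycles on and off near the bottom of the battery range.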
The circuit works perfectly, but I've had some issues that I think are related to resistor tolerance and temperature.
Firstly, I assembled several prototypes, but at the beginning I found that the current drawn by the LEDs was too high for the step-up, which kept switching on and off. I then reduced the step-up's output voltage to reduce the current, and this worked great, but after some time the issue came back.
I've checked the step-up's output voltage and it drifts upward (starting at 6.04 V, rising to 6.16-6.20 V), so the current also increases and, I think, this causes the switching regulator to hit its current limit.
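To illustrate why such a small drift matters so much: LED current is very sensitive to voltage because the string's dynamic resistance is tiny. A quick sketch of the effect (the 50 mOhm per-LED dynamic resistance is an assumed, illustrative figure, not from a datasheet):

```python
# How a small supply-voltage rise translates into a large LED current rise.
# Assumed: ~50 mOhm dynamic resistance per LED near the 3 A operating point.
r_dyn_per_led = 0.05          # assumed dynamic resistance per LED (ohm)
r_string = 2 * r_dyn_per_led  # two LEDs in series (ohm)
delta_v = 6.16 - 6.04         # observed output-voltage drift (V)

delta_i = delta_v / r_string  # approximate current increase (A)
print(f"dV = {delta_v*1000:.0f} mV -> dI ~ {delta_i:.1f} A")
```

With numbers in that ballpark, a 120 mV drift alone can push the current up by an amp or so, which is why regulating voltage instead of current is fragile here.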
To solve this issue, I'm thinking of adding an op-amp at the gate of the MOSFET to regulate the current no matter how much the supply voltage rises.
This is as far as my limited knowledge and research got me, but I don't know what kind of op-amp to use, whether to use high-side or low-side current sensing, or how to improve the circuit.
I've simulated the following circuit, but in order to reduce the power loss in the sense resistor (R7), I ended up with a very low voltage at the op-amp's inverting input. I then had to add a voltage divider on the non-inverting input to be able to dim the LED brightness.
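For what it's worth, this is the trade-off I mean, sketched numerically (the 100 mV target sense voltage and the divider values are just example numbers, not my actual components):

```python
# Low-side current sensing: sense voltage vs. power dissipation trade-off.
i_set = 3.0        # desired LED current (A)
v_sense = 0.1      # example target voltage across the sense resistor (V)

r_sense = v_sense / i_set   # required sense resistance (ohm)
p_sense = v_sense * i_set   # power dissipated in the sense resistor (W)
print(f"R_sense = {r_sense*1000:.1f} mOhm, P_sense = {p_sense:.2f} W")

# The op-amp's non-inverting input then needs a matching reference,
# e.g. from a divider: Vref = Vcc * R2 / (R1 + R2)  (example values)
vcc, r1, r2 = 5.0, 49000.0, 1000.0
v_ref = vcc * r2 / (r1 + r2)
print(f"V_ref = {v_ref*1000:.0f} mV")
```

Shrinking the sense resistor further cuts the dissipation but also shrinks the reference voltage, which is exactly where my divider problem comes from.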
Do you think my approach is correct, or is there a better way to do this?
I'd really appreciate it if somebody could help me solve this issue.
Sorry about the length of the post. Thank you in advance for reading.
Many thanks.
Regards,