Let me first say to the major contributors to this site THANKS! This site was a great help when we covered something in school and I needed some additional info to make it stick. Sometimes it made me wonder why I was spending so much to learn the same things....
Anyway, I am currently designing a Lithium Ion / Polymer battery charger. LiPo chargers at their hearts are just simple constant-current supplies until each cell reaches its maximum voltage (usually 4.2 V), after which they are constant-voltage supplies until the current falls to a negligible level. I am very confident with the microcontroller/code side of things, but I am weak in the transistor/power design department.
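To make the charge algorithm concrete, here is roughly the state machine I have in mind, in C. All of the read_/set_ functions are placeholder names for whatever hardware-access layer I end up with, not real APIs:

/* Hypothetical CC/CV charge state machine -- a minimal sketch.
 * All four extern functions below are placeholders. */
extern int  read_pack_voltage(void);    /* sum of cell voltages, mV */
extern int  read_charge_current(void);  /* charge current, mA       */
extern void set_current_target(int mA);
extern void set_voltage_target(int mV);

#define CELL_VMAX_mV    4200  /* per-cell CV threshold          */
#define TERM_CURRENT_mA 100   /* "negligible" taper-off current */

typedef enum { CHARGE_CC, CHARGE_CV, CHARGE_DONE } charge_state_t;

charge_state_t charge_step(charge_state_t state, int num_cells,
                           int cc_target_mA)
{
    int pack_mV = read_pack_voltage();
    int i_mA    = read_charge_current();

    switch (state) {
    case CHARGE_CC:
        set_current_target(cc_target_mA);
        if (pack_mV >= num_cells * CELL_VMAX_mV)
            state = CHARGE_CV;          /* hold voltage from here on */
        break;
    case CHARGE_CV:
        set_voltage_target(num_cells * CELL_VMAX_mV);
        if (i_mA <= TERM_CURRENT_mA)
            state = CHARGE_DONE;        /* current has tapered off */
        break;
    case CHARGE_DONE:
        set_current_target(0);          /* stop charging */
        break;
    }
    return state;
}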
Key Design Points -
Voltage Measurements - An array of SSRs connected to a single accurate, calibrated buffer+ADC will switch through each cell and measure its voltage. A single buffer+ADC line is preferable for high accuracy/repeatability, and it costs less than having moderate accuracy on every line (a sketch of the scan loop follows this list).
Current Measurements - An Allegro MicroSystems Hall-effect sensor, for high accuracy, low loss, and a safely isolated design.
Input - 12-19 VDC (a commonly used supply range in the hobby; most hobbyists have 20 A+ supplies).
Charging Output - 1S-6S lithium-based battery (3.0 V minimum, 25.2 V max). I want at least 10 A, but if higher current is limited only by a more expensive FET and a bigger heat sink, that would be preferable.
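Here is roughly how I picture the cell-scan loop for the SSR + single-ADC scheme above; all of the driver functions (ssr_select, adc_read_mV, etc.) are placeholder names:

/* Hypothetical per-cell scan: close one SSR tap at a time onto the
 * shared buffer+ADC line. Driver functions are placeholders. */
extern void ssr_open_all(void);
extern void ssr_select(int cell);   /* close the SSR pair for one cell tap */
extern int  adc_read_mV(void);      /* calibrated single-channel ADC read  */
extern void delay_ms(int ms);

#define MAX_CELLS 6

void scan_cells(int num_cells, int cell_mV[MAX_CELLS])
{
    for (int cell = 0; cell < num_cells; cell++) {
        ssr_open_all();             /* break-before-make: never connect two
                                       cell taps to the buffer at once     */
        ssr_select(cell);
        delay_ms(2);                /* settling time for SSR + buffer      */
        cell_mV[cell] = adc_read_mV();
    }
    ssr_open_all();
}

The break-before-make step is the part I would be most careful about, since shorting two adjacent cell taps through the SSRs would be ugly.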
A lot of designs for chargers of this type use some sort of pre-built buck/boost converter chip, or a schematic built from discrete components. If I am already taking so much care to measure the voltage and current with a high level of accuracy, why not just have the microcontroller modulate the converter directly? The only con to this type of design I can think of is that relying on software for power control can be dangerous, but an external watchdog circuit can be devised for that (a sketch of the control loop I have in mind is below). What would the current-supply circuit look like? I am assuming some sort of boost, flyback, Cuk, or SEPIC supply, but at first glance those appear to be designed primarily to take a constant input voltage and deliver a constant output voltage. Can I just switch a FET? Any input on this would be great.
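For the software side, I was picturing a simple PI loop trimming the converter's PWM duty cycle, with the setpoint switched between current (CC phase) and voltage (CV phase). Again, pwm_set_duty and the gains are placeholders that would have to be tuned on real hardware:

/* Hypothetical software PI regulator: the micro adjusts the converter's
 * duty cycle until the measured value matches the setpoint. Integer math
 * only, as is typical on a small micro. */
extern void pwm_set_duty(int duty);  /* 0..1000 = 0..100.0% duty */

#define KP_NUM 1     /* proportional gain = KP_NUM / KP_DEN */
#define KP_DEN 8
#define KI_NUM 1     /* integral gain     = KI_NUM / KI_DEN */
#define KI_DEN 64
#define DUTY_MAX 950 /* hard clamp below 100% as a safety margin */

void pi_regulate(int setpoint, int measured)
{
    static int integral = 0;  /* would be reset when switching CC <-> CV */
    int error = setpoint - measured;

    integral += error;
    /* anti-windup: keep the integrator bounded */
    if (integral >  10000) integral =  10000;
    if (integral < -10000) integral = -10000;

    int duty = (KP_NUM * error) / KP_DEN + (KI_NUM * integral) / KI_DEN;
    if (duty < 0)        duty = 0;
    if (duty > DUTY_MAX) duty = DUTY_MAX;
    pwm_set_duty(duty);
}

The external watchdog would sit outside this loop entirely, cutting the FET drive if the micro stops kicking it or if the output exceeds a hardware-set limit.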