Hi guys, I want to ask something.
I am working on a DC-DC buck-boost converter that feeds a battery and a load in parallel.
The battery can be modeled as an internal voltage source in series with its internal resistance, which is 0.1 Ω.
The output of the converter operates in two stages. In the first stage, while the converter is charging the battery, it feeds the battery with a 20 A current and the battery's internal voltage is 59.7 V, so the voltage across the battery (and the output voltage of the converter) is Vb1 = 59.7 + 20 × 0.1 = 61.7 V.
In the second stage, once the battery is charged, the converter feeds the battery with a 1 A current and the internal voltage of the battery is 61.5 V, so the voltage across the battery (and the output voltage of the converter) is Vb2 = 61.5 + 1 × 0.1 = 61.6 V.
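Just to make the arithmetic explicit, here is a minimal Python sketch of the Thevenin battery model I'm assuming (terminal voltage = internal voltage + charging current × internal resistance); the function name and values are only illustrative:

```python
R_INT = 0.1  # battery internal resistance in ohms

def terminal_voltage(v_internal: float, i_charge: float) -> float:
    """Battery terminal voltage while charging with current i_charge (A)."""
    return v_internal + i_charge * R_INT

vb1 = terminal_voltage(59.7, 20.0)  # stage 1 (bulk charge):    61.7 V
vb2 = terminal_voltage(61.5, 1.0)   # stage 2 (trickle charge): 61.6 V
print(vb1, vb2, vb1 - vb2)          # the difference is only 0.1 V
```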
My question is: in the real world, would this slight difference between the voltages Vb1 and Vb2 of the two stages create problems for the operation of the converter or for the rest of the circuit?