Hello. I'm not sure whether this is my first post, because I have been reading this site for years and have always found answers here before I needed to ask. But this is something that has puzzled me for a long time.
We all know that to charge a battery, say a 12 V lead-acid battery, the voltage should be around 13.5 to 15 V, depending on the state of charge and the charging current.
I found a 12 V lead-acid battery lying around; it read 9.8 V on my multimeter. I don't have a proper charger for it, so I used a variable-voltage battery eliminator, the kind with selectable outputs of 3, 4.5, 6, 7.5, 9 and 12 V.
The one I have is very cheap and rated at 1 A. With 7.5 V selected, the multimeter read 14.00 V with nothing connected, which is roughly the right charging voltage. When I checked its current capability (measuring current with no load, i.e. with just the ammeter across the output), it read 1.5 A. So I thought: it gives 1.5 A at 14 V, which is exactly what I need.
But when I connected the charger to the battery, the meter read only 10 V, just a bit above the already discharged battery's own voltage. I figured the supply wasn't really able to deliver the 1.5 A. So I selected 12 V on the eliminator; with no load the meter read 23.5 V and 1.25 A. I connected it to the battery again and read 13.8 V, exactly what I needed, and the current was 0.37 A.
The voltages and currents above were measured while the battery was charging; to measure the current I disconnected one terminal and put the meter in series.
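In case it helps, here is the rough arithmetic I tried with those readings. This is only my guess at a model, treating the eliminator as an ideal source with some internal resistance in series, so it may be the wrong way to look at a cheap unregulated supply:

$$R_\text{internal} \approx \frac{V_\text{no load} - V_\text{charging}}{I_\text{charging}} = \frac{23.5\,\text{V} - 13.8\,\text{V}}{0.37\,\text{A}} \approx 26\,\Omega$$

and, if the 1.25 A reading with nothing but the ammeter connected is effectively a short-circuit current,

$$R_\text{internal} \approx \frac{V_\text{no load}}{I_\text{no load}} = \frac{23.5\,\text{V}}{1.25\,\text{A}} \approx 19\,\Omega$$

The two estimates are at least in the same ballpark, but I don't know if this is the right way to think about it.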
My problem is that I don't know which one is REALLY the charging voltage. Is the battery receiving 1.25 A at 23.5 V, or 0.37 A at 13.8 V? And why are the voltage and current lower while charging, if the power source is apparently able to deliver more?
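Just to be explicit about why the difference matters, my own rough arithmetic says the two interpretations imply very different charge rates:

$$P_\text{no-load figures} = 23.5\,\text{V} \times 1.25\,\text{A} \approx 29\,\text{W}, \qquad P_\text{in-circuit readings} = 13.8\,\text{V} \times 0.37\,\text{A} \approx 5.1\,\text{W}$$

Roughly 29 W versus about 5 W going into the battery, so it matters a lot which one is real.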
Thank you in advance, and sorry for repeating the numbers; I wanted to be as clear as possible.