I have a battery question that has proved difficult to answer. Apparently it is a tricky question, because consulting over 30 research papers and a 1000-page battery book, and questioning several engineers, has not revealed the answer. This, despite the fact that it's basically a very simple question. I have a high regard for the depth of knowledge and deep-thinking ability of the people here, so here goes!
First, here's a statement of well-established fact. It is known that the maximum battery capacity (i.e. the Ah rating) increases as temperature increases (within reasonable operating limits, of course). This means that a warm battery is capable of storing more energy/charge than a colder one.
This leads to my question.
If you have a colder battery that is fully charged to its maximum capacity, and then heat it, will the battery have more usable energy, or the same energy? In other words, will the battery still be charged to its (now higher) maximum capacity, or will it now be charged at a level below its new maximum? In the second case, you would then need to charge the battery further to take advantage of the higher capacity, while in the first case, you would damage the battery if you charged it further.
I have come up with my own unverified answer, based on logic, but I need a real answer and a scientific explanation of why it is true. My own thought is that a fully charged battery stays near its maximum capacity as temperature changes, with the stored energy going up and down as the temperature goes up and down. I say this because otherwise a battery fully charged at room temperature would be damaged when cooled, since it would then exceed its maximum allowed charge. However, I'm in a situation where guessing, even if rooted in logic, is not allowed.
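To make my argument concrete, here is a tiny numeric sketch; the capacity figures are hypothetical, not from any datasheet:

```python
# Hypothetical numbers to make the argument concrete (not from any datasheet):
Q_warm = 2.0   # Ah, assumed maximum capacity at 25 degC
Q_cold = 1.8   # Ah, assumed maximum capacity at 0 degC

stored = Q_warm          # cell fully charged at room temperature, then cooled

# If the stored charge stayed fixed in amp-hours, the cooled cell would hold
# more than its new (lower) maximum -- which looks like an overcharge:
print(stored > Q_cold)   # True

# My guess instead: the cell stays at ~100% state of charge, so the stored
# charge shrinks along with the capacity and no overcharge occurs:
stored_after_cooling = Q_cold
print(stored_after_cooling <= Q_cold)  # True
```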
The basic reason for my question is that I'm making an equivalent circuit model for batteries based on what is available in the literature. I've now found several usable models that include temperature dependence, but they seem to predict that capacity changes with temperature while stored energy does not. In other words, they contradict my logical argument above.
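To show where my model runs into trouble, here is a stripped-down sketch of the coulomb-counting structure these models appear to use. The linear capacity-temperature relation and all parameter values are my own placeholders, not taken from any particular paper:

```python
def capacity_ah(temp_c, q_ref=2.0, alpha=0.004, t_ref=25.0):
    """Placeholder linear capacity-temperature relation (values are made up)."""
    return q_ref * (1.0 + alpha * (temp_c - t_ref))

# These models coulomb-count the stored charge q independently of temperature,
# while the capacity Q(T) they divide by does vary with temperature:
q = capacity_ah(25.0)              # fully charged at 25 degC -> q = 2.0 Ah

soc_warm = q / capacity_ah(25.0)   # state of charge at 25 degC
soc_cold = q / capacity_ah(0.0)    # state of charge after cooling to 0 degC

print(f"SOC at 25 degC: {soc_warm:.3f}")  # 1.000
print(f"SOC at  0 degC: {soc_cold:.3f}")  # 1.111 -> above 100%
```

If my own hypothesis is right, the model would instead need to rescale the stored charge q whenever the temperature changes, so that the state of charge stays near 100% rather than overshooting it.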