I am currently designing a rocketry flight computer that must fire an ignitor (1-ohm bridgewire resistance, 1 A recommended firing current). In my previous version I used a battery with a high enough discharge rating to fire the ignitor straight from the battery. With this new version it is crucial to make the board, and the device as a whole, as small as possible. The battery I used was the largest component in the device, and its capacity was far more than the application required. I want to use a smaller battery, but with the constraint that I can't use a LiPo (per competition rules), the current required to fire the ignitor is over 3 times the maximum discharge current rating of the battery I have. How can I get 1 A through the ignitor if my battery can only discharge 0.3 A?

I am also trying to gain a better conceptual understanding of the relationship between V, R, and I in practical application. Ohm's law says that my 3.7 V battery connected in series with a 1-ohm resistor (the ignitor) should have 3.7 A flowing through it, right? But the battery can't discharge 3.7 A. If ignition happens when the resistance to the flow of current creates enough heat to light the pyrogen, is it the amount of current that does that, or the power (I*V)?

Could a buck converter do this by stepping the voltage down from ~3.7 V to ~1 V? Is the output current then increased to 1 A, or only if the connected load draws that much? The output power hasn't changed, apart from minor losses, but could that now successfully fire the ignitor?

Are there other solutions? I am aware that a large capacitor would accomplish this, but I am more focused on the conceptual understanding here, as well as on finding a solution with a smaller footprint/volume than the required capacitor. Any explanations or learning resources anyone could share would be greatly appreciated!
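To make the question concrete, here is a quick sanity check of my numbers as a sketch, assuming an ideal source, a purely resistive 1-ohm bridgewire, and a lossless buck converter (real cells sag under load and real converters are ~85-95% efficient, so treat these as upper bounds):

```python
# Sanity-check the ignitor numbers. Assumes an ideal voltage source,
# a purely resistive 1-ohm bridgewire, and a lossless buck converter.

V_BATT = 3.7     # battery open-circuit voltage (V)
R_BRIDGE = 1.0   # ignitor bridgewire resistance (ohms)
I_MAX = 0.3      # battery's maximum discharge current (A)

# Ohm's law: current an ideal 3.7 V source would push through 1 ohm.
i_ideal = V_BATT / R_BRIDGE          # 3.7 A -- far beyond the cell's rating
p_ideal = i_ideal ** 2 * R_BRIDGE    # 13.69 W in the bridgewire

# The bridgewire heats with power, P = I^2 * R. If the cell limits
# (or sags) the current to 0.3 A, the wire only dissipates:
p_limited = I_MAX ** 2 * R_BRIDGE    # 0.09 W -- likely not enough to fire

# Buck converter idea: step ~3.7 V down to ~1 V at the output.
# Across a 1-ohm load, that output voltage sets the current to 1 A;
# the load draws it, the converter doesn't "push" extra current.
v_out = 1.0
i_out = v_out / R_BRIDGE             # 1.0 A through the bridgewire
p_out = i_out ** 2 * R_BRIDGE        # 1.0 W in the bridgewire

# Input side (lossless): power in equals power out, so the converter
# trades voltage for current and the battery only supplies:
i_in = p_out / V_BATT                # ~0.27 A -- just under the 0.3 A rating

print(f"ideal short: {i_ideal} A, current-limited heating: {p_limited:.2f} W")
print(f"buck output: {i_out} A, battery draw: {i_in:.3f} A")
```

If this reasoning is right, the input current draw (~0.27 A) sits uncomfortably close to the 0.3 A limit once converter losses are included, which is part of why I'm asking about alternatives.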
