I'm not a real engineer, so my question is probably a bit naive. I'm trying to figure out how to power a DC device over a 600-meter cable. Here's what I have:
The device has a nickel battery inside; its charger supplies up to 28 V.
The device has an external power mode that requires 24 V.
According to the manufacturer, there is a ±10% voltage tolerance.
I fed in 28 V; when the device started up, the voltage at the far end dropped to 22 V because of cable resistance, but that is still within the tolerance threshold. I think I can raise the input to 29 V, which should still be OK at idle, and when the device starts I should get something close to 23 V, which would be good enough for me.
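To double-check my own arithmetic, here is a quick Python sketch of how I'm estimating the drop. The load current below is just a guess for illustration (I haven't measured the real draw); the cable resistance is back-calculated from the 6 V drop I observed.

# Rough estimate of the far-end voltage over the 600 m cable (DC, Ohm's law only).
# Assumption: I_LOAD is a guessed load current for illustration, not a measured value.

V_IN_MEASURED = 28.0   # V at the supply when I observed the drop
V_OUT_MEASURED = 22.0  # V at the device under load
I_LOAD = 2.0           # A, assumed current while the device runs (my guess)

# Round-trip cable resistance implied by the observed 6 V drop
r_cable = (V_IN_MEASURED - V_OUT_MEASURED) / I_LOAD  # ohms

# Predicted far-end voltage if I raise the supply to 29 V and the current stays the same
V_IN_NEW = 29.0
v_device = V_IN_NEW - I_LOAD * r_cable

print(f"Implied cable resistance: {r_cable:.1f} ohm")
print(f"Expected voltage at device with {V_IN_NEW:.0f} V in: {v_device:.1f} V")

Under those assumptions I get about 23 V at the device, which is where my estimate above comes from.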
The thing I don't fully understand is what to do about current. To my understanding, the device will draw only as much current as it needs, so if I keep the voltage at 29 V and pick a supply whose current rating is, say, 10 or 20 percent higher than what the device requires, there is no risk of damaging anything. But is that really true, or am I missing something?
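Here is how I'm thinking about sizing the current, again as a rough Python sketch. The 50 W worst-case draw is purely an assumed number for illustration, not from the datasheet.

# Sanity check on the supply's current rating.
# Assumption: the device draws only what it needs (that's the part I'm asking about),
# and DEVICE_POWER_W is a guessed worst-case figure, not a datasheet value.

DEVICE_POWER_W = 50.0   # W, assumed worst-case draw (startup + charging)
V_MIN_AT_DEVICE = 23.0  # V, expected far-end voltage under load
MARGIN = 1.2            # 20 % headroom on the supply's current rating

i_worst_case = DEVICE_POWER_W / V_MIN_AT_DEVICE
i_supply_rating = i_worst_case * MARGIN

print(f"Worst-case device current: {i_worst_case:.2f} A")
print(f"Supply rating I'd look for: {i_supply_rating:.2f} A or more")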