Powering DC device via 600 m cable

Thread Starter

Lehmen

Joined Sep 19, 2025
6
I'm not a trained engineer, so my question is probably a little naive. I'm trying to figure out how to power a DC device over a 600 meter cable. Here's what I have:

The device has a nickel battery inside; the charger supplies up to 28 V
The device has an external power mode which requires 24 V
According to the manufacturer, there is a ±10% voltage tolerance

I applied 28 V; when the device started, the voltage dropped to 22 V due to cable resistance, but that is still within the tolerance threshold. I think I can increase the input to 29 V, so it should be fine at idle, and when the device starts I should get something close to 23 V, which would be good enough for me.

The thing I don't fully understand is what to do about current. To my understanding, the device will draw as much current as it needs, so if I keep the voltage at 29 V and the available current, say, 10 or 20 percent higher than what is required, there is no risk of damaging anything. But is that really true, or am I missing something?
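A rough sanity check of those numbers (a sketch only; the function names are just illustrative, it assumes the sag is purely resistive, and it uses the 85 mA maximum draw mentioned later in the thread):

```python
# Sketch: infer round-trip cable resistance from the observed voltage sag,
# then back out the source voltage needed to land a target at the far end.
# Assumptions: the 28 V -> 22 V sag is purely I*R drop; current is 85 mA.

def loop_resistance(v_source, v_device, current_a):
    """Implied round-trip cable resistance from an observed voltage sag."""
    return (v_source - v_device) / current_a

def source_voltage(v_device_target, current_a, r_loop):
    """Source voltage needed to get v_device_target at the far end."""
    return v_device_target + current_a * r_loop

r = loop_resistance(28.0, 22.0, 0.085)   # ~70.6 ohms round trip
v_in = source_voltage(24.0, 0.085, r)    # ~30 V needed to hold 24 V under load
```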
 
Last edited:

crutschow

Joined Mar 14, 2008
38,401
I keep voltage at 29V, and amperage, let say 10 or 20 percent higher of what is required, there is no risk to damage something.
24 V ±10% is 21.6 V to 26.4 V, so 29 V would not be recommended.

One solution is to add a voltage regulator at the far end to supply 24V independent of the voltage drop due to the load current.
If the current is high, you can use a switching regulator for best efficiency, or you can use a simpler linear regulator if the current is small.

What is the maximum load current?
 

Thread Starter

Lehmen

Joined Sep 19, 2025
6
24V±10% is 21.6V to 26.4V so 29V would not be recommended

One solution is to add a voltage regulator at the far end to supply 24V independent of the voltage drop due to the load current.
If the current is high, you can use a switching regulator for best efficiency, or you can use a simpler linear regulator if the current is small.

What is the maximum load current?
24 V is for the external power mode; the charger (connected to the same pin) supplies 28 V. Max current is 85 mA. Adding anything at the device end would be problematic, since this device operates underwater.
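For what it's worth, at 85 mA even a simple linear regulator at the far end would dissipate very little; a quick sketch (function name is illustrative, and the 28 V input is just the charger figure from above):

```python
# Sketch: dissipation of a hypothetical linear regulator dropping the
# incoming voltage down to 24 V at the device's 85 mA maximum draw.

def linear_dissipation(v_in, v_out, current_a):
    """Power burned in a linear regulator: (Vin - Vout) * I."""
    return (v_in - v_out) * current_a

p = linear_dissipation(28.0, 24.0, 0.085)   # ~0.34 W, easily manageable
```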
 

Thread Starter

Lehmen

Joined Sep 19, 2025
6
So I'm confused.
Isn't that what we are talking about?
What then does the 10% tolerance refer to?
The device has two modes, external power (24 V) and battery charging (28 V), plus running on the battery, of course. Both modes use the same physical pins. I don't know exactly how it's implemented; my guess is that when it sees 28 V it charges the battery, and when it sees 24 V it enters external power mode, bypassing the battery.
 

ChuckMcM

Joined May 20, 2024
3
Hi,
You'll find that transmitting AC is easier than DC for the same reason Westinghouse did: higher voltages mean less resistive loss. And 600 m is not a hugely long run for AC. So send AC out, and at the other end put a 24 V power supply with the current capability you need (or the charger) and plug that into your device. If you want to send data back, you can use a power line modulator to send data through the same wire.
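The loss argument can be sketched numerically (all values illustrative; the 70 Ω loop resistance and 2 W delivered power are assumptions, not measured figures):

```python
# Sketch: for a fixed delivered power, I = P / V, so cable loss I**2 * R
# falls with the square of the transmission voltage.

def cable_loss(power_w, volts, r_loop_ohms):
    """I^2 * R loss in the cable for a given delivered power and voltage."""
    i = power_w / volts
    return i * i * r_loop_ohms

R_LOOP = 70.0   # assumed round-trip cable resistance, ohms
P = 2.0         # roughly 24 V * 85 mA delivered power

low  = cable_loss(P, 24.0, R_LOOP)    # ~0.49 W lost sending at 24 V
high = cable_loss(P, 230.0, R_LOOP)   # ~0.005 W lost sending at 230 V
```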
 

Thread Starter

Lehmen

Joined Sep 19, 2025
6
Why do you not know?
How do you expect to power it if you don't know how it works?
Guessing doesn't work well when dealing with electronics. :rolleyes:
I know it is not charging the battery in external power mode, and that it charges from the charger. I just don't know exactly how they do it :) Anyway, what matters to me is that 29 V at the input shouldn't harm anything, since 28 V is a normal voltage for it. But I won't try anything until I have final confirmation from their support regarding 29 V. My question was about regulating the current at the input. Are my assumptions correct that the device will draw as much current as it needs, and that having more available current is not a big deal?
 

crutschow

Joined Mar 14, 2008
38,401
Are my assumptions correct, that device will draw as much current as it needs, and having more available amperage is not a big deal?
Yes, it will draw only what it needs.
The main effect will be the voltage drop due to the wire resistance for whatever current it takes.
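To illustrate: the load, not the supply, sets the current; a supply's current rating is only a ceiling. A hypothetical sketch (the 2 W figure is roughly 24 V × 85 mA from earlier in the thread):

```python
# Sketch: the device draws I = P / V regardless of how much current the
# supply could deliver; the rating only needs to be at or above that draw.

def device_current(power_w, volts):
    """Current a load actually draws at a given voltage."""
    return power_w / volts

i = device_current(2.0, 24.0)    # ~83 mA, whatever the supply is rated for
headroom_ok = i <= 0.1           # True with, say, a 100 mA-capable supply
```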
 

ChuckMcM

Joined May 20, 2024
3
Oh, and the EE math here: let's say you are using 10 gauge wire, which is about a milliohm per foot, and 600 m is about 2000 feet, so that's a wire resistance of about 2 ohms each way (4 ohms for the round trip). As you draw current, the voltage will be pulled down by current × resistance, and the power the wire dissipates as heat will be current² × resistance. Thicker wire, less resistance. If you don't know how much current it will draw, it is hard to make an educated guess as to what you should set the voltage to. Now, you could run a sense wire back from the device and plug it into the power supply's sense input. Then the supply would raise its voltage until the voltage it was reading matched its setpoint, and (since the two wires are, in theory, identical) the voltage at your device would be correct.
 

Thread Starter

Lehmen

Joined Sep 19, 2025
6
Oh, and the EE math here: let's say you are using 10 gauge wire, which is about a milliohm per foot, and 600 m is about 2000 feet, so that's a wire resistance of about 2 ohms each way (4 ohms for the round trip). As you draw current, the voltage will be pulled down by current × resistance, and the power the wire dissipates as heat will be current² × resistance. Thicker wire, less resistance. If you don't know how much current it will draw, it is hard to make an educated guess as to what you should set the voltage to. Now, you could run a sense wire back from the device and plug it into the power supply's sense input. Then the supply would raise its voltage until the voltage it was reading matched its setpoint, and (since the two wires are, in theory, identical) the voltage at your device would be correct.
I did that, and the voltage I actually get is slightly higher than I calculated. I probably have one of the variables wrong. Anyway, my goal was to get peace of mind that I'm not doing something extremely stupid, and I think I got just that. Thanks, everyone!
 

crutschow

Joined Mar 14, 2008
38,401
Now you could run a sense wire back from the device and plug it into the power supply's sense input. Then the supply would raise its voltage until the voltage it was reading was the same (and since the two wires are, in theory, identical) the voltage at your device would be correct.
Another method, which doesn't require an additional wire, is to add a circuit that measures the load current at the power supply and increases the supply's sense voltage based on that current and the calculated wire resistance, which will essentially keep the voltage constant at the load end.
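A minimal sketch of that compensation scheme, assuming a one-time calibrated loop resistance (the 70.6 Ω value and function names are illustrative, derived from the sag reported earlier in the thread):

```python
# Sketch: measure load current at the supply and raise the setpoint by
# I * R_loop so the far end sees a constant voltage.

R_LOOP = 70.6   # assumed calibrated round-trip cable resistance, ohms

def compensated_setpoint(v_target, i_measured, r_loop=R_LOOP):
    """Supply setpoint that lands v_target at the load for this current."""
    return v_target + i_measured * r_loop

v_idle = compensated_setpoint(24.0, 0.010)   # light load -> small boost
v_full = compensated_setpoint(24.0, 0.085)   # ~30 V to hold 24 V at the load
```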
 
Last edited:

Thread Starter

Lehmen

Joined Sep 19, 2025
6
Ok guys, I think I owe you some explanations. Warning: the following is guesswork and speculation on my part (crutschow, sorry!), because the manufacturer doesn't give specific details on how this device really works. The device in question is a USBL beacon. Basically, it is a simple pinger, emitting a short acoustic pulse of up to 187 dB at a rate of 1-2 Hz. On battery, it runs from 21 V (full charge) down to 17 V. How could it work? My guess is that it has a big capacitor (or several) that discharges during the pulse and recharges between pulses. Therefore, in my understanding, it should be quite tolerant of its supply voltage (and it is; 21 V to 17 V is not a small margin).
 
Last edited:

MisterBill2

Joined Jan 23, 2018
27,315
The total resistance of the wire is (600 m + 600 m) × (ohms per meter) of the wire. The device exhibits some effective resistance in each mode, and the current it draws passes through all of those resistances, so there is a voltage drop.
Does the TS know what size of wire is used? Or is that the variable that must be determined?
 