So I have a stack of unused 1150 W power supplies.
They take 110 V in and output 12 V @ 25 A and -52 V @ 16 A. That's 300 W on the 12 V side and 830ish on the -52 V side.
My goal is to power a remote device at 24 V @ 40 A using the thinnest cable possible. That's 960 W; I really only need 20 A, but 40 A is the target.
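For reference, here's the quick power-budget arithmetic I'm working from (numbers straight from the ratings above):

```python
# Power-budget sanity check, using the rated figures above.
p_load = 24 * 40   # target load: 24 V at 40 A = 960 W
p_12v = 12 * 25    # 12 V rail: 300 W
p_52v = 52 * 16    # -52 V rail: 832 W ("830ish")
print(p_load, p_12v, p_52v)
```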
My question is for the geniuses out there, and possibly to keep me from killing myself.
Logic tells me that I can get a combined 64 V if I wire the load's + to the 12 V supply's + and the load's - to the + terminal on the -52 V side. But if one side is rated for 300 W and the other for about 830 W, will I have an issue if I draw too much current? (Read: will the 12 V side "blow" above 300 W?)
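Here's the back-of-envelope math I'm using, assuming the series stack carries one shared load current (so each supply's current rating, not its wattage, is what limits the stack):

```python
# Supplies in series share the same load current, so the stack is
# limited by whichever rail has the lower current rating.
V12, V52 = 12.0, 52.0
I12_MAX, I52_MAX = 25.0, 16.0   # rated output currents

v_total = V12 + V52              # 64 V across the stack
i_max = min(I12_MAX, I52_MAX)    # series current limited by the -52 V rail
p_max = v_total * i_max

print(f"Stack: {v_total:.0f} V, max {i_max:.0f} A -> {p_max:.0f} W")
print(f"12 V side at that current: {V12 * i_max:.0f} W (rated {V12 * I12_MAX:.0f} W)")
```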
Second question: would it be better to run two power supplies in series and send -104 V through the cable, or in parallel for more current at -52 V? Assume on the other side I have a power supply that can handle 19-72 VDC or 72-144 VDC (depending on part number). Since it's a negative voltage, I'd just swap polarity at the DC-DC converter on the other side.
Also, cable thickness: I know there's voltage drop over distance, so I'm guessing 104 V would be better since the current is lower. But to use the thinnest cable possible, could I run three in series to get 156 V and let the cable loss bring it down into the 144 VDC range?
Cable length would be 100-200'. I'm looking for the thinnest wire that makes this work. What gauge would be recommended?
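To sanity-check the trade-off myself, here's a rough sketch. The resistance figures are standard per-1000 ft values for solid copper at 20 °C; the 960 W load and the line voltages come from the setup above:

```python
# Rough voltage-drop check for a 200 ft run (400 ft round trip) of copper.
# Resistances are ohms per 1000 ft, standard AWG copper table at 20 C.
AWG_OHMS_PER_1000FT = {10: 0.9989, 12: 1.588, 14: 2.525, 16: 4.016, 18: 6.385}

def voltage_drop(awg, run_ft, amps):
    """Drop across both conductors of a two-wire run."""
    round_trip_ft = 2 * run_ft
    resistance = AWG_OHMS_PER_1000FT[awg] * round_trip_ft / 1000.0
    return amps * resistance

# Same 960 W delivered at different line voltages -> different line currents.
for volts in (52, 104, 156):
    amps = 960 / volts
    for awg in sorted(AWG_OHMS_PER_1000FT):
        drop = voltage_drop(awg, 200, amps)
        print(f"{volts} V line, AWG {awg}: {amps:.1f} A, "
              f"drop {drop:.1f} V ({100 * drop / volts:.1f} %)")
```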
Can someone check my logic here before I do something stupid?
The power supply at the other end is a MW SD-1000H-24 or SD-1000L-24, depending on the input voltage range.
Thanks in advance!