Hi guys!
Can anyone help me sort out what I am doing wrong?
I am planning to set up addressable RGB LED lighting and run it along the ceiling in my living room, a perimeter of about 20 meters (~65'). Basic math says I will need 4 of these strips (https://www.amazon.com/gp/product/B07QGD3YHH), which I plan to run in parallel.

I used their specifications to calculate the power needed to run them and ended up with ~25 A. According to the specifications, the strip draws 14.4 W/m. A single strip is 5 meters long, so the total power for one full strip should be 72 W. Assuming a 12 V supply, each strip should draw 72 W / 12 V = 6 A. With all four strips connected in parallel, the total draw would be 6 A × 4 = 24 A.
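To double-check my arithmetic, here is a quick Python sketch. The 14.4 W/m and 5 m figures come from the product listing; 12 V and 4 strips are just my plan, not anything from the spec sheet:

```python
# Sanity check of the power math; 14.4 W/m and 5 m are from the product
# listing, the 12 V supply and 4 strips are my own assumptions.
WATTS_PER_METER = 14.4   # claimed power density of the strip
STRIP_LENGTH_M = 5.0     # length of one strip
SUPPLY_VOLTAGE = 12.0    # nominal supply voltage
NUM_STRIPS = 4           # strips wired in parallel

strip_power = WATTS_PER_METER * STRIP_LENGTH_M    # 72.0 W per strip
strip_current = strip_power / SUPPLY_VOLTAGE      # 6.0 A per strip
total_current = strip_current * NUM_STRIPS        # 24.0 A for all four

print(f"Per strip: {strip_power:.0f} W -> {strip_current:.1f} A")
print(f"Total for {NUM_STRIPS} strips in parallel: {total_current:.1f} A")
```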
So far everything seemed fine, until I started to calculate the wire gauge required to power these. For some reason the numbers don't add up, or at least they feel strange, so I really need your help figuring this out. Here is how I do my calculations.
Assuming I want to power the farthest strip, I would need 5 m × 3 strips = 15 meters of wire to reach it. According to this site https://www.engineeringtoolbox.com/amps-wire-gauge-d_730.html (and many others), given that current and length, I should really be using 4 AWG wire. According to several other sites, 4 AWG is about 5 mm in diameter, which seems bizarre to run for some LED strips. So I thought, OK, maybe I've just never had to deal with this before, and decided to accept it and go with it for the time being.
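For context, here is how I understand the voltage-drop side of those gauge charts, as a rough sketch. The Ω/km figures are standard values for solid copper at 20 °C; the 15 m one-way run and 6 A load are from my numbers above:

```python
# Voltage drop over the feed run; the drop occurs over the round trip
# (positive and negative conductors), hence the factor of 2.
# Ohms per km are standard values for copper at 20 C.
OHMS_PER_KM = {4: 0.8152, 12: 5.211, 20: 33.31}

def voltage_drop(awg, one_way_m, amps):
    round_trip_km = 2 * one_way_m / 1000.0
    return OHMS_PER_KM[awg] * round_trip_km * amps

for awg in (4, 12, 20):
    drop = voltage_drop(awg, one_way_m=15.0, amps=6.0)
    print(f"{awg:>2} AWG, 15 m at 6 A: {drop:.2f} V drop "
          f"({100 * drop / 12.0:.1f}% of 12 V)")
```

At 12 V even a ~1 V drop is a big fraction of the supply, which I assume is why the charts push toward such heavy wire for long runs.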
But here is where things get confusing and interesting. Looking at the LED strip itself, I see it uses 20 AWG power wire. If I perform the same calculations for 6 A @ 5 meters, I end up with at least 12 AWG (not 20). Yet when I tried powering it, it worked fine without a hint of heating up, and when I measured voltage and current I got ~11.9 V and 3.12 A with all LEDs on white at full brightness. This tells me that either the manufacturer is misrepresenting their numbers, or the charts assume conditions that never actually occur, or I am calculating something incorrectly.
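To make the discrepancy concrete, here is the same drop formula applied to the strip's own 20 AWG lead. The ~0.15 m lead length is my guess from the product photos; the currents are my spec-sheet and measured values:

```python
# Same round-trip drop formula, applied to the strip's own 20 AWG lead.
OHMS_PER_KM_20AWG = 33.31  # copper, 20 AWG, standard table value

def drop_v(one_way_m, amps):
    return OHMS_PER_KM_20AWG * (2 * one_way_m / 1000.0) * amps

print(f"20 AWG, 0.15 m lead at 6.00 A: {drop_v(0.15, 6.0):.3f} V")   # ~0.06 V
print(f"20 AWG, 0.15 m lead at 3.12 A: {drop_v(0.15, 3.12):.3f} V")  # ~0.03 V
print(f"20 AWG, 5 m run at 6.00 A:     {drop_v(5.0, 6.0):.2f} V")    # ~2.0 V
```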
And now the question (drum roll)... what the hell am I missing?
Anyone got any hints, thoughts, ideas?
Thanks!