Computer USB 1.0, 2.0 Safe Amp Output

Thread Starter

Vegasbob

Joined Feb 18, 2019
18
I currently have a device on the market that is powered by a 12V, 500mA wall adapter. At 12V, it draws about 175mA. We have been asked by a very large company if we can supply a USB 1.x or 2.0 computer-powered version. I have been experimenting with some cheap eBay 5V-to-12V boost (step-up) converters where the PCB is molded right into a USB cable. Using my power supply, my device normally draws about 400mA on the USB 5V side of the converter, but it can momentarily jump to a little over 500mA at certain times. I have tested my setup on 2 laptops and 4 desktop computers and the arrangement is fine. That is, I haven't fried any USB ports yet by exceeding 500mA. I'm a chemical engineer, so I'm somewhat lost when it comes to electronics and double-E stuff. I don't want the liability of frying thousands of this company's computers, so is this safe, or should I try another approach?
 

Ya’akov

Joined Jan 27, 2019
9,071
As a rule, you should not count on a USB port providing more than 500mA, even though most will do so for at least short periods. The standard for those ports says 500mA, and that should be your peak if you want to be sure your device will operate.

Can you find a more efficient boost converter than the cheap one?
Can you improve the circuit under power so it uses less current?

Your current arrangement (that might be a pun, and I am not going to disclaim it if it is) will probably work on a very large percentage of USB ports in the wild, but there will also probably be cases where the USB port shuts down with an overcurrent error. I'd guess that actually damaging a port with brief, small excursions over 0.5A is unlikely.
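
As a back-of-the-envelope check (a minimal sketch; the 12V/175mA load is the figure quoted above, and the efficiency values are assumed round numbers, not measurements of any particular converter), here is how converter efficiency translates into USB-side current:

Code:
# Rough USB-side current estimate for a boost converter.
# The 12 V / 175 mA load comes from the original post; the efficiency
# values are assumed, not measured for the eBay module.
P_OUT = 12.0 * 0.175  # watts delivered to the device (2.1 W)

for eta in (0.75, 0.85, 0.95):
    for v_usb in (4.75, 5.0):  # USB 2.0 allows 4.75 to 5.25 V at the port
        i_usb = P_OUT / (eta * v_usb)  # current drawn from the port
        print(f"eta={eta:.0%}  Vusb={v_usb} V  ->  {i_usb * 1000:.0f} mA")

Note that at an assumed 85% efficiency and a worst-case 4.75V port, the estimate already lands around 520mA, just over the budget, which is why a more efficient converter matters.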
 

ArakelTheDragon

Joined Nov 18, 2016
1,362
If you want it to be certainly safe and of high quality, you need to add a current limiter or regulator so the draw never exceeds 500mA; USB 3 will provide more current. Alternatively, you can charge a battery from the USB port at 500mA or less and then power the device from the battery, without worrying about peaks or outages. The available USB current also depends on the protocol negotiated with the host OS; if nothing is negotiated, a USB 2.0 port is only guaranteed to supply the 100mA unconfigured default.

https://www.extremetech.com/computi...ks-or-how-to-avoid-blowing-up-your-smartphone
 

djsfantasi

Joined Apr 11, 2010
9,156
USB 3.0 will supply up to 900mA of current. Contemporary computers will have one or more USB 3.0 ports. They are usually identified by a blue connector or an "SS" (SuperSpeed) logo next to the port; a lightning bolt icon generally marks a dedicated charging port instead.
 

Thread Starter

Vegasbob

Joined Feb 18, 2019
18
The specification for a USB 2.0 port is 5V at 500mA, or 2.5 watts of power. One of the USB 2.0 ports on my desktop only puts out 4.75V. To provide 2.5 watts of power, that port would need to put out 526mA of current. Is watts, i.e. power, the actual limiting factor for a USB port, or is it volts and amps separately? That is, could I have any combination of volts and amps and not risk frying anything so long as I didn't exceed 2.5 watts of power? This makes sense to me since a watt is the amount of energy, in joules, that flows through a device per second. If the energy is the same, does it really matter what the individual volts and amps are? Maybe I'm missing something. It's been a very long time since I studied electricity and magnetism, and I really didn't understand it then.
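
Putting numbers on that question (just the arithmetic from the paragraph above, worked at a few port voltages):

Code:
# Current needed to deliver 2.5 W at a few port voltages (I = P / V).
P = 2.5  # watts, the USB 2.0 budget of 5 V * 0.5 A
for v in (5.0, 4.75, 4.40):
    print(f"{v:.2f} V  ->  {P / v * 1000:.0f} mA")
# 5.00 V -> 500 mA, 4.75 V -> 526 mA, 4.40 V -> 568 mA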
 

ArakelTheDragon

Joined Nov 18, 2016
1,362
There is a tolerance on both. Let's say 10%. It's not perfect; it doesn't need to be perfect for USB. Every supply is either a voltage source or a current source; there is no such thing as a power source rated in watts. You always have to apply the tolerance to the voltage and the tolerance to the current, and that gives you the maximum and minimum wattage.

Example:
5VDC × 0.5ADC = 2.5W
4.5VDC × 0.45ADC = 2.025W
 

Thread Starter

Vegasbob

Joined Feb 18, 2019
18
I’ve been testing the limits of a couple of USB 2.0 ports on two different old throwaway computers. So far, I’ve been able to get up to 857 milliamps at 4.79 volts on one computer and 926 milliamps at 4.40 volts on the other. I only keep it at this level for about 10 seconds, and it hasn’t fried either port. I could even go higher if I wanted. I find this amazing since the USB 2.0 spec is only 5.0 volts at 500 milliamps.

I’m beginning to think the USB 2.0 spec of 5.0 volts and 500 milliamps only applies to the underlying power, in watts, that the port should supply on a continuous basis. That is, the 2.5 watts the port and its electronics are subjected to is the important factor, and not so much the individual volts or amps.

Say I run at 4.1 volts and 610 milliamps, which is 2.5 watts. Does that have the same effect on the electronics (which I guess would be heat) as running at 5.0 volts and 500 milliamps, which is also 2.5 watts?
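
Here's a rough sketch of the comparison I'm asking about; the series resistance standing in for the port's contacts and traces is an assumed figure, not anything I've measured:

Code:
# Same 2.5 W delivered two ways, but heating in the port's own contacts
# and traces goes as I squared, not as delivered power. The 0.2 ohm
# series resistance is a made-up illustrative figure, not a spec value.
R_SERIES = 0.2  # ohms, assumed

for v, i in ((5.0, 0.500), (4.1, 0.610)):
    p_delivered = v * i
    p_heat = i ** 2 * R_SERIES  # dissipated in the port hardware itself
    print(f"{v} V @ {i * 1000:.0f} mA: {p_delivered:.2f} W delivered, "
          f"{p_heat * 1000:.0f} mW heating the connector")

If those numbers are representative, the total wattage matches but the connector heating does not, so the current figure seems to be the one the port's wiring and protection circuitry care about.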
 

Lo_volt

Joined Apr 3, 2014
316
As far as I recall, USB 2.0 specifies that a normal port (not a charging port) outputs up to 500mA. The voltage at the port must be no more than 5% below the nominal 5V, so 4.75 volts at a USB port is within spec. Note that lots of chargers currently on the market are designed to put out more than 0.5 amps, but that doesn't help you if you're plugging into a PC.

Most internal USB ports will give you an overcurrent fault if you attempt to draw more than 0.5 amps. As you've found out, this is a bit loosey-goosey. I've drawn enough current through a USB port to trip this fault. It didn't kill the port, but I had to unplug the device and plug it back in; the port automatically shut down the power when it happened.

0.5 amps will give you 2.375 watts if the port is supplying 4.75 volts. As long as your step-up converter is better than about 89% efficient, you might be OK. Note as well that you will have some loss through your cable that will bite into that 89%.
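
For reference, here's where that roughly 89% comes from (a quick sketch using the 12V, 175mA draw quoted at the top of the thread):

Code:
# Where the ~89% figure comes from (using the 12 V / 175 mA draw
# quoted at the top of the thread).
p_needed = 12.0 * 0.175   # 2.1 W at the device
p_available = 4.75 * 0.5  # 2.375 W from a worst-case-voltage port
print(f"required converter efficiency: {p_needed / p_available:.1%}")
# about 88.4%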

The alternative is to use two USB ports. There are cables that parallel the power from two ports to the device (used a lot for external USB hard drives) to double the available current (and therefore power). See this link for an example:

https://www.startech.com/Cables/USB...l-Hard-Drive-Dual-USB-A-to-Micro-B~USB2HAUBY1

Lastly, for a Chem E, you've got a great grasp of power, current and potential. You're not as lost as you think you are!
 

Thread Starter

Vegasbob

Joined Feb 18, 2019
18
Now a question for some of you smart double-E guys.

Experimentally, it's difficult to use my device to measure current draw at various voltages. I'd like to replace my device with a potentiometer, dial in a resistance at different voltages, and then measure current draw. How does the resistance of a circuit vary with voltage? My results are strange: they show a fairly linear region, then the graph flattens out as voltage increases. It looks like the equation for resistance versus volts for this circuit is some sort of 1/x function. Is that correct? I was hoping to use least squares to find a simple linear equation, but I guess the relationship of ohms versus volts for this circuit is some sort of exponential function.
 

djsfantasi

Joined Apr 11, 2010
9,156
Your final statement is incorrect for resistive loads. Ohm's Law is simply:

V = I × R

That is, volts equals current (I, in amps) times resistance (R, in ohms). As you linearly increase resistance with your potentiometer, the current will decrease.

The other forms of Ohm's Law may apply to your experiment:

I = V / R
R = V / I

Take the results of your experiment and use Ohm's Law to predict the results.
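
A quick sketch of what that predicts (the resistance value is an arbitrary example, not your actual load):

Code:
# Ohm's law sanity check: for a fixed resistance, current is linear in
# voltage. The 10 ohm value is an arbitrary example.
R = 10.0  # ohms
for v in (1.0, 2.0, 3.0, 4.0, 5.0):
    print(f"{v:.1f} V across {R:.0f} ohm  ->  {v / R * 1000:.0f} mA")
# A purely resistive load plotted as I vs. V is a straight line
# through the origin with slope 1/R.

If your measured points don't fall on a line like this, the load isn't behaving as a fixed resistance.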
 

Thread Starter

Vegasbob

Joined Feb 18, 2019
18
When I change the voltage on my device, both the resistance and the amperage change. If the resistance were constant, which it isn't, then I could use Ohm's Law to calculate amperage as a function of voltage. Since the resistance also changes as a function of voltage, I have two variables with only one equation, so I need a second equation for resistance as a function of voltage. For each voltage on my device there is a unique resistance, and it seems to vary as 1/x rather than as a linear function like y = mx + b. So I'm wondering if it's typical for the resistance of a circuit to vary as the inverse of the voltage, like a 1/x function, or alternatively as R = e^(kV), where k is a constant. If that's the case, I could do some fancy math, use least squares to develop an equation for resistance as a function of voltage, and then use Ohm's Law to calculate current.
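
One model worth testing: if the device holds its power roughly constant behind an internal regulator (only a guess on my part, since I haven't confirmed what's inside it), then I = P/V and R = V²/P, and least squares reduces to fitting a single number P. A minimal sketch with made-up data:

Code:
# Sketch: if the device holds its power roughly constant behind an
# internal regulator (an assumption, not a confirmed fact about this
# device), then I = P/V and R = V*V/P. Least-squares estimate of P
# from (V, I) pairs: minimizing sum((I_k - P/V_k)**2) over P gives
#     P = sum(I_k / V_k) / sum(1 / V_k**2)

# Made-up illustrative measurements, not real data from the device:
measurements = [(5.0, 0.420), (6.0, 0.350), (8.0, 0.262), (12.0, 0.175)]

p_fit = (sum(i / v for v, i in measurements)
         / sum(1.0 / v ** 2 for v, _ in measurements))
print(f"fitted power: {p_fit:.2f} W")
for v, i in measurements:
    print(f"V={v:5.1f}  measured I={i:.3f} A  "
          f"model I={p_fit / v:.3f} A  R={v * v / p_fit:.1f} ohm")

Under this model the current, not the resistance, is the 1/x quantity; the apparent resistance grows as V squared.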
 

Thread Starter

Vegasbob

Joined Feb 18, 2019
18
What is not typical: that the resistance varies with voltage, or that it varies like an exponential function? Sorry, I can't discuss what the device is at this time.
 

bassbindevil

Joined Jan 23, 2014
824
I'd look carefully at what the actual power supply requirements of the device are. Perhaps the innards run at 3.3V or 5V and the 12V is converted down by a switching regulator. Maybe the design can be updated with operation from USB power in mind.
 