Powering an electronic device....

Discussion in 'General Electronics Chat' started by antennaboy, May 1, 2012.

  1. antennaboy

    Thread Starter Active Member

    Jan 31, 2008
    45
    0
    Hello Forum,

    In general, a generic electronic device is rated by the maximum input current that it can withstand and the optimal voltage that needs to be applied to it....

    So, to power an electronic device, do we need to worry more about the current going into it or the voltage applied to it? (Real) Power is given by the product current times voltage.

    I guess we need to respect the voltage recommendation on the back of the electronic device (i.e. provide the device with the suggested voltage), but can have some leeway to vary the input current: the higher the current going into the device, the more input power, up to the limit (the max current reported on the back of the device). Too much power can be dangerous.

    Things may be simple to explain for a load that is a simple carbon resistor, but an electronic device contains much more (transistors, etc.). Some devices may not work at all if the voltage is lower than the recommended voltage. But the current can always be adjusted, correct?

    On the other hand, I would think that once a certain voltage is applied to the electronic device's terminals, the current flowing into the device is automatically determined. Voltage and current are not independent... not sure if I am making my point clear here...
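    That last intuition is exactly right for a simple resistive load: fixing the voltage fixes the current via Ohm's law, so the power cannot be set independently. A toy sketch (hypothetical 12 V / 6 Ω numbers, just for illustration):

    ```python
    def load_current(voltage, resistance):
        """Current drawn by a purely resistive load (Ohm's law: I = V / R)."""
        return voltage / resistance

    def load_power(voltage, resistance):
        """Real power dissipated by the load: P = V * I = V^2 / R."""
        return voltage * load_current(voltage, resistance)

    # A 12 V supply across a 6-ohm resistor draws 2 A and dissipates 24 W.
    print(load_current(12.0, 6.0))  # 2.0
    print(load_power(12.0, 6.0))    # 24.0
    ```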

    thanks,
    antennaboy
     
  2. crutschow

    Expert

    Mar 14, 2008
    13,016
    3,235
    Yes, this question comes up often.

    Virtually all electronic devices are voltage operated, and the voltage limits are stated in their spec. The current or power is also stated, but that is just what the device will draw at the rated voltage; it is not a "maximum" that it can withstand. The device will not take more current than it needs, so as long as you apply the proper voltage, it will draw only the current it requires.

    The only reason to be concerned about current is to be sure that the power source can supply at least as much current as the device requires at the rated voltage. The source can be capable of delivering much more current than required (such as a car battery) but that's OK as long as the voltage is correct.
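    In other words, the only check you need is "voltage matches, supply can source at least enough current." A minimal sketch of that check (the device ratings, supply figures, and 5% voltage tolerance below are made-up illustrative numbers):

    ```python
    def supply_is_adequate(device_voltage, device_current,
                           supply_voltage, supply_max_current,
                           tolerance=0.05):
        """A source is suitable if its voltage matches the device rating
        (within a tolerance) and it can deliver at least the current the
        device draws. Extra current capability is harmless: the device
        only draws what it needs."""
        voltage_ok = abs(supply_voltage - device_voltage) <= tolerance * device_voltage
        current_ok = supply_max_current >= device_current
        return voltage_ok and current_ok

    # A 5 V / 1 A phone on a 5 V / 10 A bench supply: fine.
    print(supply_is_adequate(5.0, 1.0, 5.0, 10.0))    # True
    # The same phone straight on a 12 V car battery (no converter): not fine.
    print(supply_is_adequate(5.0, 1.0, 12.0, 100.0))  # False
    ```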
     
  4. antennaboy

    Thread Starter Active Member

    Jan 31, 2008
    45
    0
    I am trying to understand how to transfer the maximum power from a solar cell to an electronic device. At standard irradiance (1000 W/m^2), the solar cell produces a certain voltage; the current depends on the connected load.

    The solar cell has an I-V curve indicating the current and voltage at which maximum power is delivered by the cell. To transfer the maximum amount of power from the cell to an electronic device (like a cell phone), a DC-DC converter is probably needed to give the device the correct voltage...
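    To see where that maximum power point sits on the I-V curve, you can scan a cell model numerically. The sketch below uses a simple single-diode model, I(V) = Isc − I0·(exp(V/Vt) − 1), with made-up parameters (Isc = 3 A, I0 = 1 nA, Vt ≈ 39 mV), not any particular real cell:

    ```python
    import math

    def cell_current(v, i_sc=3.0, i_0=1e-9, v_t=0.039):
        """Single-diode solar-cell model with illustrative parameters:
        I(V) = Isc - I0 * (exp(V / Vt) - 1)."""
        return i_sc - i_0 * (math.exp(v / v_t) - 1.0)

    def maximum_power_point(v_max=0.9, steps=9000):
        """Scan the I-V curve and return (V, I, P) at maximum power."""
        best = (0.0, 0.0, 0.0)
        for k in range(steps + 1):
            v = v_max * k / steps
            i = cell_current(v)
            if i <= 0.0:          # past open-circuit voltage, stop
                break
            p = v * i
            if p > best[2]:
                best = (v, i, p)
        return best

    v_mp, i_mp, p_mp = maximum_power_point()
    print(round(v_mp, 3), round(i_mp, 3), round(p_mp, 3))
    ```

    Note that the maximum power voltage sits a little below the open-circuit voltage, which is why a converter is needed to hold the cell there while feeding the load its own required voltage.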

    How about the maximum power transfer theorem, which states that the impedances must be complex conjugates of each other? We don't know the impedance of the cell phone, or the impedance of the solar cell either...

    thanks
    antennaboy
     
  5. crutschow

    Expert

    Mar 14, 2008
    13,016
    3,235
    You don't have to know the impedance of the cell phone. You just need an MPPT (maximum power point tracker) that automatically operates the solar cell at its maximum power point and converts the output to the required voltage for your load.
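    The classic way an MPPT finds that operating point is "perturb and observe": nudge the operating voltage, and keep moving in whichever direction makes the power go up. A toy sketch against the same illustrative single-diode model as above (all parameters are made up; a real MPPT does this in hardware ahead of a DC-DC converter):

    ```python
    import math

    def cell_power(v, i_sc=3.0, i_0=1e-9, v_t=0.039):
        """Power from an illustrative single-diode solar-cell model."""
        i = i_sc - i_0 * (math.exp(v / v_t) - 1.0)
        return v * max(i, 0.0)

    def perturb_and_observe(measure_power, v_start=0.4, step=0.005,
                            iterations=300):
        """Toy perturb-and-observe MPPT loop: step the operating voltage,
        and reverse direction whenever the measured power drops."""
        v = v_start
        direction = 1.0
        p_prev = measure_power(v)
        for _ in range(iterations):
            v += direction * step
            p = measure_power(v)
            if p < p_prev:          # power fell: reverse the perturbation
                direction = -direction
            p_prev = p
        return v

    v_op = perturb_and_observe(cell_power)
    print(round(v_op, 3))  # settles near the cell's maximum power point
    ```

    The tracker ends up oscillating within a step or two of the maximum power point, which is good enough in practice; the DC-DC converter then maps that operating point onto whatever voltage the load wants.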
     