I have a very simple circuit: a 12 V 500 mA wall wart feeding an LM2940 voltage regulator, whose output goes through a 130 Ω resistor in series with an LED. When I ordered the regulator I specified the part number for the 5 V version, and the case markings confirm it (LM2940T-5.0). All signs point to this being a 5 V regulator.
HOWEVER, in my circuit, measuring from the regulator's Vout pin to its ground pin I read approximately 10.7 V. Current draw should be roughly 5 V / 130 Ω ≈ 38 mA (a bit less once the LED's forward drop is accounted for). The datasheet says that for 5 mA < Io < 1 A the typical output is 5 V (it being a 5 V regulator and all).
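For reference, here is the load-current arithmetic spelled out. The ~2 V LED forward drop is an assumption on my part (the post doesn't say what color/type of LED is used), so treat the second figure as a rough sketch:

```python
# Sanity-check of the expected LED current, assuming the regulator
# actually puts out its rated 5 V.
V_REG = 5.0   # rated regulator output, volts
V_LED = 2.0   # ASSUMED typical LED forward drop, volts (not stated in the post)
R = 130.0     # series resistor, ohms

i_ignoring_led = V_REG / R              # the post's quick estimate
i_with_led = (V_REG - V_LED) / R        # closer to the real LED current

print(f"ignoring LED drop: {i_ignoring_led * 1000:.1f} mA")  # ~38.5 mA
print(f"with ~2 V LED drop: {i_with_led * 1000:.1f} mA")     # ~23.1 mA
```

Either way the load is comfortably inside the datasheet's 5 mA–1 A range, so the 10.7 V reading shouldn't be a minimum-load issue.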
Why am I seeing 10.7 V? I tried this with two regulators and both behave the same way. I'm afraid to go any further with the project (attaching the microcontroller, etc.) without making sure it will only ever see 5 V.
EDIT: Without the LED I measure 9.6 V...