AC to DC Adaptor rated output vs tested?

Thread Starter

SBRuss

Joined Aug 30, 2019
4

Hi All,

I’m adding some accessories to my telescope for astrophotography, and want to repurpose one of my old AC-DC adaptors for my autofocuser, as it didn’t come with a power supply. It requires 12V 0.5A input. So I went through my old box of adaptors collected from various old telephones, routers, modems etc. Below is a picture of several that have labels saying they output in the 12V, 0.5-1.0 amp range.

The one on the left doesn’t show the output polarity, so I put my multimeter on it, but the readout just flickered between 0.1V and 0.2V. So just to make sure I was testing correctly, I started testing the other adaptors, and this is where my question arises.

After I went through and tested all the others, they ended up showing voltage outputs at the jack of between 15.6V and 20V!!

So my question is, is this normal to get output voltage from a 12V adaptor, that is much higher than 12 volts? Is it an artefact of only using a multimeter to test, and real world usage would not show this high a voltage, or what? I always assumed these were regulated outputs and you could rely on a label saying 12V output, to be pretty close to 12V. Surely having 20V output would fry some things if they were expecting 12V.

For my particular use case, the focuser, I’m informed by the manufacturer that it should accept 12V-15V. None of these adaptors were below 15V, although one was close. So would none of these be acceptable to use?

Thanks. Russell.


[Attached image: B0723930-8E4F-409C-9ADD-04B8BE5D0460.jpeg]
 

dl324

Joined Mar 30, 2015
16,916
Welcome to AAC!
After I went through and tested all the others, they ended up showing voltage outputs at the jack of between 15.6V and 20V!!

So my question is, is this normal to get output voltage from a 12V adaptor, that is much higher than 12 volts?
Not with regulated adapters.
Is it an artefact of only using a multimeter to test, and real world usage would not show this high a voltage, or what?
If the adapter is unregulated, a DVM won't give the whole picture; you'd need an oscilloscope to get a better idea (see the rough ripple estimate at the end of this post).
I always assumed these were regulated outputs and you could rely on a label saying 12V output, to be pretty close to 12V.
Not safe to make that assumption.
Surely having 20V output would fry some things if they were expecting 12V.
That would be a reasonable expectation.
For my particular use case, the focuser, I’m informed by the manufacturer that it should accept 12V-15V. None of these adaptors were below 15V, although one was close. So would none of these be acceptable to use?
Yes, none of these would be suitable. 12VDC adapters are common, so there's no reason not to use a voltage recommended by the manufacturer.
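Regarding the multimeter-only measurement: here's a rough, hedged ripple estimate, a sketch assuming a simple full-wave rectifier with a filter capacitor and an illustrative (not measured) capacitor value:

```python
# Rough peak-to-peak ripple of an unregulated transformer adapter under load:
# delta_V ≈ I_load / (2 * f_mains * C)   (full-wave rectification)
f_mains = 50        # Hz mains frequency (assumed; 60 Hz in some countries)
C = 1000e-6         # F, a typical filter capacitor value (assumption)
I_load = 0.5        # A, the focuser's rated current

ripple_pp = I_load / (2 * f_mains * C)
print(f"Estimated peak-to-peak ripple: {ripple_pp:.1f} V")   # ~5.0 V
```

A DVM on a DC range averages that out, which is why a scope tells you more.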
 

Thread Starter

SBRuss

Joined Aug 30, 2019
4
Oh oops, that one is AC output. I wonder what I used to have that required 12V AC?

Also thanks Dennis for the comprehensive reply. Bit of a bummer as I can’t easily get to a store to pick a new one up, so have to wait for mail order. I also found an old Jaycar “battery eliminator” adaptor, which allows selection between 3V & 12V. At 12V, even it is showing 15.8V. I guess in the old days simple resistors and capacitors were a lot more voltage tolerant than today’s ICs, so variations didn’t matter as much. My other short-term alternative is to throw on a buck converter, I guess.
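As a quick sanity check on the buck-converter idea, here's a sketch using assumed figures (the 15.8V reading above and the focuser's 12V 0.5A rating), treating the converter as ideal and lossless:

```python
# Rough buck-converter sanity check (ideal, lossless approximation).
V_in = 15.8    # V, the Jaycar adaptor's lightly-loaded reading (it will sag under load)
V_out = 12.0   # V, what the focuser wants
I_out = 0.5    # A, the focuser's rated current

duty_cycle = V_out / V_in       # ideal continuous-conduction duty cycle
I_in = V_out * I_out / V_in     # input current if the converter were lossless
print(f"Duty cycle ≈ {duty_cycle:.2f}, input current ≈ {I_in:.2f} A")
# Duty cycle ≈ 0.76, input current ≈ 0.38 A -- within the adaptor's rating.
```

A regulated buck module would also absorb the input sag as the adaptor loads up, provided its input stays above 12V plus the module's dropout.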

Thanks again for the feedback folks.

Russell.
 

Audioguru

Joined Dec 20, 2007
11,248
An AC to DC adapter is supposed to produce its rated output voltage only when it has the rated input voltage (did you measure yours?) and is loaded with the rated current. Your multimeter provided almost no load current.
Most adapters have NO voltage regulation, which is why they are cheap. If the input voltage is higher than rated, or the load current is low or zero, then of course the output voltage will be higher than it is at the rated input voltage and rated output current. If the input voltage or the load current varies, the cheap adapter's output voltage varies with it.
Some buck converters also have no or poor voltage regulation.
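As a hedged back-of-the-envelope illustration of why an unregulated "12V" adapter reads well above 12V on an unloaded multimeter, assuming a plain transformer-rectifier-capacitor design with a nominal 12V RMS secondary:

```python
import math

# With no load, the filter capacitor charges to roughly the PEAK of the
# rectified secondary voltage, minus the bridge-rectifier diode drops.
V_secondary_rms = 12.0   # V RMS, nominal secondary voltage (assumption)
V_diode_drops = 1.4      # V, two diodes conducting per half-cycle in a bridge

V_no_load = V_secondary_rms * math.sqrt(2) - V_diode_drops
print(f"Approximate no-load output: {V_no_load:.1f} V")   # ~15.6 V
```

Transformer regulation pushes the real figure even higher, so unloaded readings of 16-20V from a nominal 12V adapter are plausible.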
 

Thread Starter

SBRuss

Joined Aug 30, 2019
4
No, I didn’t test the input. I have no real way of checking AC. I could chuck a dummy load on the output, though, and see what happens, but it's probably simpler if I just try to find a better power supply.
 

dl324

Joined Mar 30, 2015
16,916
Bit of a bummer as I can’t easily get to a store to pick a new one up, so have to wait for mail order.
I usually go to a second-hand store when I need an adapter. There are several Goodwill stores in my area and they almost always have what I'm looking for. If it's something that's not common, I'll order online.
I guess in the old days simple resistors and capacitors were a lot more voltage tolerant than today’s IC’s, so variations didn’t matter as much.
Some (many?) devices include internal voltage regulators, but it doesn't make sense for manufacturers to incur any additional expense to tolerate voltages other than what they recommend.
 

AlbertHall

Joined Jun 4, 2014
12,346
Most adapters have NO voltage regulation, which is why they are cheap. If the input voltage is higher than rated, or the load current is low or zero, then of course the output voltage will be higher than it is at the rated input voltage and rated output current. If the input voltage or the load current varies, the cheap adapter's output voltage varies with it.
I don't think this is true of current adaptors, as they are generally SMPS (switch-mode power supplies). The older, unregulated ones used a mains-frequency iron-cored transformer, which is heavy and expensive.
 

Alec_t

Joined Sep 17, 2013
14,313
All the adapters shown look to be old-school transformer type with a narrow input voltage range. Those generally are unregulated and their output voltage droops as the load increases. Modern switch-mode types are usually recognisable by having a broad input voltage range, e.g. '90V-250V AC', and are regulated.
 

Andrei Suditu

Joined Jul 27, 2016
52
First, do you need AC or DC output from them? Some are marked for AC output and some for DC. Also, the 0.1-0.2V values probably come from the AC ones when measured with the DC setting on the multimeter.
If you could post a picture with the input specification of the autofocuser, it would be helpful.
Assuming it's DC powered, I'd go and grab a regulated wall wart.
 

Thread Starter

SBRuss

Joined Aug 30, 2019
4
Yes, these adaptors are all 5-10 years old. I was just hoping to make use of them, as they’ve been sitting in a cupboard taking up space. Would have been nice to find a use for them.

The focuser is DC 12V 0.5A input. There’s no range given on the paperwork. The manufacturer said “it should handle up to 15V but don’t go much higher”.
 

Alec_t

Joined Sep 17, 2013
14,313
Try the Netgear adaptor with a dummy load. As it's rated at 800mA, it's the only one that can handle your 0.5A motor current with some margin. My guess is that even a 100mA or so load would bring the volts down to a safe level for the motor. Any electronics associated with the motor would be more of a concern regarding over-voltage.
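A hedged sizing sketch for that dummy load, using assumed numbers (roughly 16V lightly loaded, targeting the ~100mA suggested above):

```python
# Sizing a dummy-load resistor to draw about 100 mA from the adaptor.
V_est = 16.0      # V, roughly what the adaptor reads when lightly loaded (assumption)
I_target = 0.1    # A, the ~100 mA load suggested above

R = V_est / I_target      # Ohm's law: resistance needed
P = V_est * I_target      # power the resistor has to dissipate
print(f"R ≈ {R:.0f} ohm, dissipating ≈ {P:.1f} W")
# ≈ 160 ohm at ≈ 1.6 W -- use a 150 or 180 ohm, 5 W part for margin.
```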
 

MrChips

Joined Oct 2, 2009
30,802
All power transformers have internal resistance.
The rated output current is specified at the rated voltage. Hence the output voltage at no load will be higher than the specified voltage, and it will fall as the output current increases.
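As a sketch of that behaviour with assumed numbers (a 16V no-load reading and a nameplate rating of 12V at 1A; real adapters aren't perfectly linear, so treat this as an estimate only):

```python
# Simple source-resistance model of an unregulated adapter:
#   V_out(I) = V_no_load - I * R_internal
V_no_load = 16.0   # V, assumed no-load (multimeter) reading
V_rated = 12.0     # V, nameplate voltage at full rated load
I_rated = 1.0      # A, nameplate current (assumption)

R_internal = (V_no_load - V_rated) / I_rated    # ~4 ohm effective source resistance
V_at_focuser = V_no_load - 0.5 * R_internal     # output at the focuser's 0.5 A draw
print(f"R_internal ≈ {R_internal:.1f} ohm, ≈ {V_at_focuser:.1f} V at 0.5 A")
# Lands around 14 V -- inside the 12-15 V window, if the model holds.
```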
 