Need circuit for simple AC Adjustable Constant Current Source (6.4VAC compliance voltage @ 0-5A)

Thread Starter

Carolsboy

Joined Feb 7, 2018
27
Looking for help with making my own low-cost AC constant current source using the attached simplified block diagram. This source will be used to check the accuracy of AC current transducers used in our component reliability test cabinet. I'd like a fairly stable output with minimal drift across the test range (current is checked at 0.250A, 0.500A, 0.750A, 1.000A, 1.500A, 2.000A, 2.500A, 3.000A..... up to 5.000A).

I am currently using a simple AC current source: a very small variac (0.23kVA) varies the voltage into a 120/6.4VAC step-down transformer with a max current output of 20A. The output current varies too much to get accurate measurements. The variation is still too large even when the current source is fed from a programmable AC voltage supply set for 120VAC.
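For what it's worth, here is a minimal sketch of the ideal-transformer arithmetic behind the variac + 120/6.4VAC step-down arrangement described above, just to show how the secondary current scales with the primary voltage. The load impedance value is an assumption for illustration only; it ignores winding resistance, leakage inductance, and magnetizing current.

```python
# Ideal-transformer estimate of secondary current for the
# variac + 120/6.4 VAC step-down arrangement (losses ignored).
# The 0.5-ohm load value below is an assumed example, not a
# measured value from the fixture.

TURNS_RATIO = 120.0 / 6.4   # primary:secondary voltage ratio (18.75:1)

def secondary_current(v_primary_rms, z_load_ohms):
    """Secondary current (A rms) for a given primary rms voltage
    and secondary-side load impedance, ideal-transformer model."""
    v_secondary = v_primary_rms / TURNS_RATIO
    return v_secondary / z_load_ohms

# Example: variac set so 30 V rms reaches the primary, 0.5-ohm load:
i_out = secondary_current(30.0, 0.5)   # -> 3.2 A rms
```

The point of the sketch is that near the bottom of the range the required primary voltage gets very small, which is where an open-loop variac setup tends to be least repeatable.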

Thanks for any help offered.
 


crutschow

Joined Mar 14, 2008
34,281
What is causing the variation?
The output should be stable when feeding it from the programmable AC supply.
 

Thread Starter

Carolsboy

Joined Feb 7, 2018
27
What is causing the variation?
The output should be stable when feeding it from the programmable AC supply.
Good question... I was just trying our Chroma 61501 programmable power supply feeding the AC current source test fixture again, but this time I wanted to see if I had better control if I varied the output voltage of the programmable AC supply and left the variac on the current source set for maximum. I believe I can increase and decrease the output current more accurately that way, but the problem is a fluctuating output current that occurs at a very low rate... about 4 seconds from min to max and vice versa. Not sure if it has something to do with the transformer or variac (heating/cooling, eddy currents) or if it may be the Chroma 66200 power meter I am using to measure the current, which uses a shunt for its current measurements. When I do the DC current transducer calibration the current level is stable, and I am using the same power meter, so I don't think it is the culprit.

I'll guess at the range of values I am seeing just to complete this post: with the current set at 1Arms, the actual reading on the power meter is around 0.990-1.010A, but with it set around 5Arms the output varies much more (roughly 4.990-5.250A).

I am also not sure if it has to do with the load impedance... like using a short for all readings between 0.250A and 5.000A, versus using a low-value power resistor so the AC input voltage into the transformer is higher, which increases the magnetic field and may improve the accuracy of the induced voltage on the secondary. To get 250mA into a shorted load from a 20A, 5 or 6.4VACrms secondary would require a very low input voltage on the primary, whereas increasing the load resistance would raise that input voltage. I really have no clue... lol!!
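To put rough numbers on that last point, here is a hedged sketch that inverts the ideal-transformer relation to estimate the primary voltage needed for a target secondary current. The secondary-side resistance values are assumptions (a near-short of 0.05 ohm for winding plus leads, and a hypothetical 1-ohm power resistor); the real fixture's impedances would differ, and magnetizing current and core losses are ignored.

```python
# Estimate the primary rms voltage needed to push a target current
# through the secondary, ideal-transformer model. The resistance
# values in the examples are assumed, not measured.

TURNS_RATIO = 120.0 / 6.4   # 18.75:1 step-down

def primary_voltage_for(i_secondary_rms, z_secondary_ohms):
    """Primary rms voltage that drives i_secondary_rms through the
    total secondary-side impedance (winding + leads + load)."""
    v_secondary = i_secondary_rms * z_secondary_ohms
    return v_secondary * TURNS_RATIO

# 250 mA into a near-short (assume 0.05 ohm total secondary path):
v_short = primary_voltage_for(0.250, 0.05)   # ~0.23 V rms on the primary

# Same 250 mA with an added 1-ohm power resistor (1.05 ohm total):
v_res = primary_voltage_for(0.250, 1.05)     # ~4.9 V rms on the primary
```

Under these assumed numbers, a shorted load at 250mA puts the variac down near a fraction of a volt on the primary, while the series resistor moves the operating point up to several volts, which supports the intuition that a resistive load would be easier to set repeatably.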
 