Hi all,
I have a brushed DC motor rated at 225 W, 24 V, 3000 rpm. Datasheet available here if needed: https://www.parvalux.com/media/wysiwyg/parvalux/datasheets/pm90-data-sheet-10-10-2016.pdf
I have mechanically attached the output shaft of this motor to an off-the-shelf AC induction machine running at 3000rpm. The purpose of this experiment is to use the DC machine as a brake to apply a known load torque to the AC machine to evaluate the performance of the AC machine controller.
As far as I am aware, the torque developed by a DC machine operating as a generator is proportional to the armature current. In other words, if I can control the current coming out of the machine, I can control the torque. My plan is to connect the DC motor output to a programmable four-quadrant amplifier and set it to sink a precise amount of current, thereby allowing me to set the load torque. I am not sure where the terminal voltage fits into this, though: if the motor speed drops as a result of the applied torque, the back-EMF (and hence the terminal voltage) will drop too. Do I just let the four-quadrant amplifier deal with this on its own and maintain a constant current?
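To put rough numbers on the torque-per-amp relationship, here is a quick sanity-check sketch. The torque constant below is back-calculated from the rated figures only (losses ignored), so it is a ballpark estimate, not a datasheet value, and the helper name is just my own:

```python
import math

# Back-of-envelope estimate of the torque constant from rated figures.
# Assumption: friction and copper/iron losses ignored, so these numbers
# are approximate, not from the datasheet.
P_rated = 225.0   # W
n_rated = 3000.0  # rpm
V_rated = 24.0    # V

omega_rated = n_rated * 2 * math.pi / 60  # rad/s
T_rated = P_rated / omega_rated           # N*m at rated point
I_rated = P_rated / V_rated               # A, ignoring losses
k_t = T_rated / I_rated                   # N*m/A (numerically equal to the back-EMF constant in V*s/rad)

def current_setpoint(torque_nm):
    """Armature current to command on the four-quadrant amp for a target load torque."""
    return torque_nm / k_t

print(f"k_t ~ {k_t:.3f} N*m/A")
print(f"To apply 0.5 N*m, sink ~ {current_setpoint(0.5):.2f} A")
```

The point being: the amplifier only needs to regulate current; the terminal voltage simply floats with the back-EMF as speed changes.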
My other thought was to feed the DC machine into a fixed-output buck converter and convert to, say, 12 V. I would then sink the output of the converter into the four-quadrant amplifier. If the voltage is fixed and I am controlling current, I am therefore controlling power. If I can control power and I am measuring rotor speed, it follows that I can deduce the applied torque (at least neglecting machine and converter losses), so this would be another way of controlling torque. Is this correct?
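The power-based route reduces to T = P / omega. A minimal sketch of the arithmetic, assuming a lossless conversion chain (which it won't be, so the real shaft torque would be somewhat higher than this estimate):

```python
import math

def torque_from_power(v_fixed, i_sink, rpm):
    """Deduce shaft torque from sunk electrical power and measured speed.

    Assumption: no machine or converter losses, so this UNDER-estimates
    the true braking torque on the shaft.
    """
    omega = rpm * 2 * math.pi / 60  # mechanical speed in rad/s
    return (v_fixed * i_sink) / omega

# e.g. sinking 10 A at a fixed 12 V while the shaft turns at 3000 rpm:
print(f"{torque_from_power(12.0, 10.0, 3000.0):.3f} N*m")  # ~0.382 N*m
```

This is why I am unsure the second approach buys anything over direct current control, given the losses sit between the shaft and the measured electrical power.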
I would be interested to hear if anyone has experience with a similar setup, as I am not sure which approach to go with!
Thanks