I'm getting overloaded trying to figure this problem out. My calculus is weak, which is my first problem. I have been working on this one and can't get it. Any help or advice would be greatly appreciated.
The error voltage input to an integral controller goes from 0 V to 3 V at t = 1 second and then remains at 3 V until t = 5 seconds. What is the integral controller's output voltage at t = 5 seconds, and what is the slope of the output voltage waveform from t = 1 to t = 5? Assume the integration constant Ki for the controller is 2.5/sec.
I am looking for the integral controller output voltage and the slope.
Thanks for all the help you can give me.
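In case it helps to see the arithmetic: for an ideal integrator with zero initial output, the output is Ki times the running integral of the error, so a constant error input produces a linear ramp. Here is a minimal Python sketch of that relationship under those assumptions (the variable names are just illustrative):

```python
# Ideal integral controller: v_out(t) = Ki * integral of e(t) dt,
# assuming the output starts at 0 V before the error step.

Ki = 2.5       # integration constant, 1/sec (given)
e_step = 3.0   # error input in volts, held from t = 1 s to t = 5 s (given)
t_start, t_end = 1.0, 5.0

# With a constant error, the integral is just (error) * (elapsed time),
# so the output ramps linearly at a constant slope.
slope = Ki * e_step                 # V/s, the slope of the output ramp
v_out = slope * (t_end - t_start)   # output voltage at t = 5 s

print(slope)   # 7.5
print(v_out)   # 30.0
```

The key idea is that integrating a constant turns a step input into a ramp, so the slope is simply Ki times the error voltage, and the output at t = 5 s is that slope times the 4 seconds the error has been applied.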