Amplitude variation changes output signal

Thread Starter

Seehat

Joined Oct 14, 2013
12
Hello all,
I'm working on a circuit that lights an LED whenever the input signal frequency is between 970 Hz and 1030 Hz.
The problem is the amplitude of the input signal, which can be anywhere in the range 1 V <= A <= 4 V.
I used a bandpass filter to pass the designated frequencies and followed it with a comparator, but I can't use a single reference voltage there, because the filter's output voltage changes with the input amplitude.
Any suggestions?
Thanks
 

Alec_t

Joined Sep 17, 2013
14,313
You already have a signal of the appropriate frequency at the filter output. Why not just amplify it up to or beyond the amp's clipping level and drive the LED from the amp output (perhaps via a diode/capacitor integrator)?
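The clip-then-integrate idea above can be checked numerically. This is a rough behavioural sketch, not a circuit simulation: the gain, clipping rail, and RC values below are arbitrary assumptions, and the diode is treated as ideal. The point it illustrates is that once the amp is driven hard into clipping, the rectified-and-filtered output is essentially the same for a 1 V input as for a 4 V input.

```python
import numpy as np

def detector_output(amplitude, fs=1e6, f=1000.0, t_end=0.05,
                    gain=100.0, rail=5.0, rc=0.01):
    """Amplify, hard-clip, rectify, and RC-filter a sine input.
    Returns the detector voltage at the end of the run."""
    t = np.arange(0, t_end, 1/fs)
    x = amplitude * np.sin(2*np.pi*f*t)
    clipped = np.clip(gain*x, -rail, rail)   # amp driven into clipping
    rectified = np.maximum(clipped, 0.0)     # ideal diode
    # simple RC integrator step: v += (vin - v) * dt/RC
    alpha = (1/fs) / rc
    v = 0.0
    for vin in rectified:
        v += (vin - v) * alpha
    return v

v1 = detector_output(1.0)   # 1 V input
v4 = detector_output(4.0)   # 4 V input
print(v1, v4)               # nearly identical despite the 4:1 input range
```

Both runs settle near rail/2 (the average of the half-wave-rectified square wave), so a fixed comparator threshold after the integrator would work for any input amplitude in the stated range.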
 

Thread Starter

Seehat

Joined Oct 14, 2013
12
@MikeML
The only components I'm allowed to use are the LM741 or LM747 op-amps, plus resistors, capacitors, and diodes.
@Alec_t
Please have a look at the attached schematic snapshot.
Thanks
 

MikeML

Joined Oct 2, 2009
5,444
Sounds like a school assignment. If so, how come it is not in the homework forum?

Your required Q is f0/Δf = 1000/(1030 − 970) ≈ 17. That will take a much different filter...

Your present filter is far too simplistic. To get such a narrow passband, you will need a much sharper (higher-Q) bandpass filter.
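The Q estimate above can be turned into starting component values. The sketch below uses the standard equal-capacitor multiple-feedback (MFB) bandpass equations; the capacitor value (100 nF) and centre-frequency gain (A = 2) are arbitrary choices for illustration, and R1/R2/R3 refer to the usual MFB positions, not to anything in the attached schematic.

```python
import math

# Target passband from the thread: 970-1030 Hz
f_low, f_high = 970.0, 1030.0
f0 = math.sqrt(f_low * f_high)        # geometric centre, ~1000 Hz
bw = f_high - f_low                   # 60 Hz
Q = f0 / bw                           # ~16.7, matching the estimate above

# Equal-capacitor MFB bandpass design equations
C = 100e-9          # chosen value, 100 nF
A = 2.0             # desired centre-frequency gain (must be < 2*Q**2)
w0 = 2 * math.pi * f0
R1 = Q / (A * w0 * C)                 # input resistor
R2 = Q / ((2*Q**2 - A) * w0 * C)      # resistor from the summing node to ground
R3 = 2 * Q / (w0 * C)                 # feedback resistor

print(f"f0 = {f0:.1f} Hz, Q = {Q:.1f}")
print(f"R1 = {R1:.0f} ohm, R2 = {R2:.0f} ohm, R3 = {R3:.0f} ohm")
```

Note that a single stage with Q ≈ 17 asks a lot of a 741 (its ~1 MHz gain-bandwidth leaves limited excess loop gain even at 1 kHz), so cascading two lower-Q stages is often the more robust choice.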
 
Last edited: