Amplitude variation changes output signal

Discussion in 'The Projects Forum' started by Seehat, Nov 4, 2013.

  1. Seehat

    Thread Starter New Member

    Oct 14, 2013
    12
    0
    Hello all,
    I'm working on a circuit that lights an LED whenever the input signal frequency is between 970 Hz and 1030 Hz.
    The problem is the amplitude of the input signal, which can be anywhere in the range 1 V <= A <= 4 V.
    I used a bandpass filter to select the designated frequency band, followed by a comparator, but I can't use a single reference voltage there, because the filter's output level changes with the input amplitude.
    Any suggestions?
    Thanks
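
    The fixed-threshold problem described above can be illustrated numerically. This is a hypothetical sketch (the 2 V reference and sample rate are assumed, not from the thread): an in-band 1 kHz tone at A = 1 V never crosses a reference chosen for larger amplitudes, so a single comparator threshold cannot work across the whole 1-4 V range.

```python
import numpy as np

# Assumed values for illustration only: a 2 V comparator reference
# and a 100 kHz simulation sample rate.
fs = 100_000                      # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)    # 50 ms of signal
vref = 2.0                        # fixed comparator reference, V

for amp in (1.0, 4.0):
    sig = amp * np.sin(2 * np.pi * 1000 * t)   # in-band 1 kHz tone
    trips = bool(np.any(sig > vref))           # does the comparator ever fire?
    print(f"A = {amp} V: comparator trips = {trips}")
```

    With these assumed numbers the 1 V tone never trips the comparator while the 4 V tone does, even though both are in-band, which is exactly the OP's complaint.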
     
  2. Alec_t

    AAC Fanatic!

    Sep 17, 2013
    5,779
    1,103
    You already have a signal of the appropriate frequency at the filter output. Why not just amplify it to (or beyond) the amp's clipping level and drive the LED from the amp output (perhaps via a diode/capacitor integrator)?
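
    Alec_t's amplify-to-clipping idea can be sketched numerically. All component values here are assumptions for illustration (gain, clip rail, and RC time constant are not specified in the thread): once the op-amp is driven hard into clipping, any in-band amplitude produces roughly the same square-ish swing, so the rectified and RC-integrated DC level becomes amplitude-independent and can drive the LED through a simple threshold.

```python
import numpy as np

# Assumed values: open-loop-ish gain of 1000, +/-10 V clip rails,
# 10 ms diode/capacitor integrator time constant.
fs = 100_000
t = np.arange(0, 0.1, 1 / fs)
gain, vclip = 1000.0, 10.0
tau = 10e-3
alpha = 1 / (fs * tau)            # one-pole RC update coefficient

def led_level(amp, f=1000.0):
    """DC level after clip -> ideal-diode rectify -> RC integrate."""
    v = np.clip(gain * amp * np.sin(2 * np.pi * f * t), -vclip, vclip)
    v = np.maximum(v, 0.0)        # ideal diode (half-wave rectifier)
    dc = 0.0
    for s in v:                   # simple RC low-pass integrator
        dc += alpha * (s - dc)
    return dc

for amp in (1.0, 2.5, 4.0):
    print(f"A = {amp} V -> integrator output ~ {led_level(amp):.2f} V")
```

    With these assumptions every amplitude from 1 V to 4 V settles near the same DC level (about half the clip rail), so one fixed LED threshold works for all of them.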
     
  3. MikeML

    AAC Fanatic!

    Oct 2, 2009
    5,450
    1,066
    Look at an LM567 chip.
     
  4. Seehat

    Thread Starter New Member

    Oct 14, 2013
    12
    0
    @MikeML
    The only components I'm allowed to use are the LM741 or LM747, plus resistors, capacitors and diodes.
    @Alec_t
    Please have a look at the attached schematic snapshot.
    Thanks
    [attached schematic image]
     
  5. MikeML

    AAC Fanatic!

    Oct 2, 2009
    5,450
    1,066
    Sounds like a school assignment. If so, why isn't it in the homework forum?

    Your required Q is f/Δf = 1000/(1030 − 970) ≈ 17. That will take a much different filter...

    Your filter is way too simplistic. To get such a narrow frequency response, you will need a much sharper bandpass filter.
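
    MikeML's Q figure can be turned into component values. This is a design sketch under stated assumptions (a single multiple-feedback band-pass stage with equal capacitors; the 10 nF capacitor value and midband gain of 2 are assumed, not from the thread). The standard equal-C MFB design equations are R3 = Q/(π·f0·C), R1 = Q/(2π·f0·C·A0) and R2 = Q/(2π·f0·C·(2Q² − A0)).

```python
import math

# Assumed design targets: single MFB band-pass, equal caps C1 = C2 = C.
f0, bw = 1000.0, 1030.0 - 970.0
Q = f0 / bw              # = 1000/60 ~ 16.7, MikeML's figure
A0 = 2.0                 # assumed midband gain
C = 10e-9                # assumed 10 nF capacitors

R3 = Q / (math.pi * f0 * C)                       # feedback resistor
R1 = Q / (2 * math.pi * f0 * C * A0)              # input resistor
R2 = Q / (2 * math.pi * f0 * C * (2 * Q**2 - A0)) # Q-setting shunt resistor

print(f"Q  = {Q:.1f}")
print(f"R1 = {R1/1e3:.1f} kohm, R2 = {R2:.0f} ohm, R3 = {R3/1e3:.1f} kohm")
```

    Note that Q ≈ 17 in a single stage is marginal for a 741 (roughly 1 MHz gain-bandwidth), and component tolerances at this Q shift f0 noticeably; cascading two lower-Q stages tuned near 1 kHz would likely be a safer build.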
     
    Last edited: Nov 4, 2013