Here's a very strange problem I've come across. I need to shift a high-frequency (50 kHz - 10 MHz) 5 V amplitude square wave signal down to -100 V, in order to drive the gate of a MOSFET. I did this with a high-pass filter:
This works fine at high frequencies, but for some reason not at lower frequencies. In this case, 200 kHz.
I measured the voltage at IN+, and as you might expect, it is a clean 200 kHz square wave.
HOWEVER, when I measure the voltage at V3.pin1 (between C5 and R8), I get this horrible thing (200 kHz):
This doesn't make any sense to me. It's a high-pass filter with a cut-off frequency of 1.5 kHz, so why is it doing that?
Does it have something to do with the gate capacitance? Or the Miller capacitance?
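For what it's worth, here's a quick sanity check of my expectation. Assuming a simple first-order RC high-pass with the stated 1.5 kHz cutoff (the exact R and C values don't matter, only their product), the droop per half-cycle of a 200 kHz square wave should be tiny:

```python
import math

f_c = 1.5e3    # high-pass cutoff frequency stated above (Hz)
f_sig = 200e3  # square-wave frequency (Hz)

# RC time constant from the cutoff: f_c = 1 / (2*pi*R*C)
tau = 1 / (2 * math.pi * f_c)  # about 106 us

# Fractional sag of the flat top over one half-period T/2 = 1/(2*f_sig),
# from the exponential decay v(t) = V * exp(-t/tau)
droop = 1 - math.exp(-1 / (2 * f_sig * tau))

print(f"tau = {tau * 1e6:.1f} us")
print(f"droop per half-cycle = {droop * 100:.2f} %")
```

So the filter alone should only sag the flat top by a couple of percent at 200 kHz, which is why the distorted waveform surprises me.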
Here's the entire circuit:
http://imgwiz.com/images/2013/05/06/BJgDu.png
The signals IN+ and IN- are as follows (5V digital):
Thanks for any help you can give me. If you need more information, I'll be happy to supply it.