The pitfalls of high impedance?

Thread Starter

xox

Joined Sep 8, 2017
838
I tend to use a lot of 1M resistors, mostly due to concerns over energy consumption. Almost every voltage divider or bias point in my circuits has at least one of them. Problem is, I'm not exactly sure what I'm trading off for the gain in efficiency.

I know that, generally speaking, high-impedance paths are more prone to picking up noise and more sensitive to temperature fluctuations. But when it comes to actual examples of where this sort of thing could be a real problem, I just don't have enough experience to know what to look out for. Signal amplification is probably a big one.

Any others I should be thinking about?
 

WBahn

Joined Mar 31, 2012
29,979
Whether it is a problem or not depends on the specific application. If you are using them to establish the DC bias of most BJT-based amplifier topologies, then you have a very high-impedance bias source, which means that your circuit has poor bias rejection, as well as all the other noise issues you mentioned. In many FET-based designs this is less of an issue. But whether or not it is "good enough" in a particular design depends both on the specifics of the design and on what constitutes "good enough".
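
To put rough numbers on the BJT case, here's a quick sketch in Python (the supply voltage, resistor values, and transistor figures are assumptions for illustration only, not from any particular design):

Code:
# Rough check of a 1M/1M base-bias divider driving a BJT (values assumed).
VCC = 9.0                      # supply voltage, V
R1 = R2 = 1e6                  # divider resistors, ohms
V_TH = VCC * R2 / (R1 + R2)    # unloaded (Thevenin) bias voltage: 4.5 V
R_TH = R1 * R2 / (R1 + R2)     # Thevenin source resistance: 500 kohm

BETA = 100                     # assumed current gain
I_C = 1e-3                     # desired collector current, A
I_B = I_C / BETA               # base current the divider must supply: 10 uA

droop = I_B * R_TH             # bias shift caused by the base current
print(f"Unloaded bias {V_TH:.2f} V, shift from I_B: {droop:.2f} V")
# The 5 V shift exceeds the 4.5 V bias itself, so this divider cannot
# establish the intended operating point at all. With 10k resistors the
# same base current would shift the bias by only 50 mV.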
 

Thread Starter

xox

Joined Sep 8, 2017
838
Thanks, so bias rejection can be an issue. But what exactly does that mean? A cursory search for "transistor bias rejection" didn't seem to yield much. Is there a more standard term for it?
 

dl324

Joined Mar 30, 2015
16,846
You need to post examples.

Dividers with 1M resistors are going to be pretty wimpy unless they're driving CMOS inputs.
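
To put a number on "wimpy" (a minimal sketch; the 5 V supply and load values are picked purely for illustration):

Code:
# Output of a 1M/1M divider into various resistive loads (5 V supply assumed).
VCC = 5.0
R1 = R2 = 1e6
for r_load in (10e6, 1e6, 100e3):
    r_bottom = R2 * r_load / (R2 + r_load)   # bottom leg with load in parallel
    v_out = VCC * r_bottom / (R1 + r_bottom)
    print(f"{r_load/1e3:7.0f}k load -> {v_out:.2f} V (unloaded: 2.50 V)")
# 10M load: 2.38 V; 1M load: 1.67 V; 100k load: 0.42 V.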
 

Thread Starter

xox

Joined Sep 8, 2017
838
dl324 said:
You need to post examples.

Dividers with 1M resistors are going to be pretty wimpy unless they're driving CMOS inputs.
I honestly didn't think a voltage divider would be much good at driving any substantial load. (Then again, maybe I just don't know the right approach.) No, I'm specifically talking about a divider used as a voltage reference. A good example of where I typically use them is this buffered SR flip-flop circuit, which employs single-supply op amps (or comparators):

[Attached schematic: BufferedFlipFlop.png]


The lower set provides a 1/2 Vcc reference, while the cross-coupling resistors establish the feedback network necessary to produce the latching effect. High values for the latter pair were chosen simply because they seemed like the natural choice for reducing the possibility of stray feedback oscillations. Overkill? Maybe. Anything over 10k probably would have worked fine; I just went with what I thought was the safest bet.

But of course there has to be an upper limit. A trillion-ohm resistor (if such a thing even exists) would just look like a disconnected wire in some situations. I'm not sure where that limit is, really, which is why I'm asking.
 

OBW0549

Joined Mar 2, 2015
3,566
WBahn said:
...you have a very high-impedance bias source which means that your circuit has poor bias rejection...
xox said:
Thanks, so bias rejection can be an issue. But what exactly does that mean? A cursory search for "transistor bias rejection" didn't seem to yield much. Is there a more standard term for it?
"Bias rejection" isn't a commonly-used term; I've never encountered it until now, and I've been "doing electronics" since the late 1950's.

I think what he's referring to is the fact that all op amps have some amount of input bias current, ranging (depending on the technology) from a few femtoamps up to a few microamps. BJT-input op amps have the most input bias current; JFET-input op amps have far less, and CMOS op amps usually the least.

In any circuit, whatever is connected to the op amp inputs must include a DC path which provides this current. If that path has a very high resistance (like your voltage divider using two 1 MΩ resistors), the op amp's input bias current flowing through it may cause the bias voltage to depart excessively (either positively or negatively, depending on the direction of the input bias current) from what it would be with no load.

If the circuit is processing large signal voltages, like hundreds of mV or more, and if high accuracy is not needed, this might not be a very big deal; but when microvolts matter, or when extreme precision is required, the voltage deviations caused by amplifier input bias current flowing through source resistance can be a major source of error.
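
To put numbers on that for a divider like yours (a sketch; the bias-current figures are typical orders of magnitude, not from any specific datasheet):

Code:
# Offset at a 1M/1M divider tap caused by op amp input bias current.
R_TH = 500e3    # two 1 Mohm resistors in parallel (Thevenin resistance)
for tech, i_bias in (("CMOS (~10 fA)", 10e-15),
                     ("JFET (~50 pA)", 50e-12),
                     ("BJT (~500 nA)", 500e-9)):
    offset_uV = i_bias * R_TH * 1e6    # shift at the divider tap, microvolts
    print(f"{tech:14s}: {offset_uV:12.3f} uV")
# CMOS: 0.005 uV (negligible); JFET: 25 uV; BJT: 250,000 uV (a quarter volt!)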
 

crutschow

Joined Mar 14, 2008
34,285
So, in general, you need to look at the current drawn from the bias network by everything connected to it, to determine whether a 1 megohm bias resistor is suitable for the application.
 

Thread Starter

xox

Joined Sep 8, 2017
838
crutschow said:
So, in general, you need to look at the current drawn from the bias network by everything connected to it, to determine whether a 1 megohm bias resistor is suitable for the application.
That's just the thing, though: how do I go about determining "suitability" in the first place?

In the case of op amp inputs (which ideally have infinite impedance), using a high-resistance voltage reference seems perfectly safe. Is that a correct assumption? In contrast, a BJT bias point typically draws a non-trivial amount of current. In that case, can a higher source impedance therefore lead to problems? (Ignoring amplification noise issues; that's a given.)

Or how about this: The output impedance of a bias network (or voltage reference) must be such that the maximum current available is greater than or equal to that which is required by the connected subcircuit. Is that a fairly accurate summary of the issue?
 

Thread Starter

xox

Joined Sep 8, 2017
838
"Bias rejection" isn't a commonly-used term; I've never encountered it until now, and I've been "doing electronics" since the late 1950's.

I think what he's referring to is the fact that all op amps have some amount of input bias current, ranging (depending on the technology) from a few femtoamps up to a few microamps. BJT-input op amps have the most input bias current; JFET-input op amps have far less, and CMOS op amps usually the least.

In any circuit, whatever is connected to the op amp inputs must include a DC path which provides this current. If that path has a very high resistance (like your voltage divider using two 1 MΩ resistors), the op amp's input bias current flowing through it may cause the bias voltage to depart excessively (either positively or negatively, depending on the direction of the input bias current) from what it would be with no load.

If the circuit is processing large signal voltages, like hundreds of mV or more, and if high accuracy is not needed, this might not be a very big deal; but when microvolts matter, or when extreme precision is required, the voltage deviations caused by amplifier input bias current flowing through source resistance can be a major source of error.
Sorry, I didn't see your post. Let me think on it a bit, but that may be the answer I was looking for (at least insofar as biasing and voltage references are concerned).
 

OBW0549

Joined Mar 2, 2015
3,566
xox said:
Or how about this: The output impedance of a bias network (or voltage reference) must be such that the maximum current available is greater than or equal to that which is required by the connected subcircuit. Is that a fairly accurate summary of the issue?
Ummm... sort of, but not quite. It would be more accurate to say "The output impedance of a bias network (or voltage reference) must be such that it can supply the current required by the connected subcircuit without an unacceptable voltage shift in the bias (or reference) voltage."

What constitutes "unacceptable," of course, will be different from case to case.
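
Stated as a test, that's roughly the following (the tolerance and load-current numbers are stand-ins, not recommendations):

Code:
# "No unacceptable shift" test for a bias network (all numbers illustrative).
def bias_ok(r_source, i_load, v_tolerance):
    """True if the load current shifts the bias by less than the tolerance."""
    return i_load * r_source < v_tolerance

# 1M/1M divider (500k Thevenin resistance) with a 10 mV acceptable shift:
print(bias_ok(500e3, 10e-9, 0.010))   # True:  10 nA shifts it only 5 mV
print(bias_ok(500e3, 1e-6, 0.010))    # False: 1 uA shifts it 500 mV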
 

WBahn

Joined Mar 31, 2012
29,979
xox said:
That's just the thing, though: how do I go about determining "suitability" in the first place?

In the case of op amp inputs (which ideally have infinite impedance), using a high-resistance voltage reference seems perfectly safe. Is that a correct assumption?
No. Think about the word you used: they "ideally" have infinite impedance. Real op amps do NOT have infinite impedance, so there is a limit to how large you can make the impedances connected to them before you have to start taking the op amp's actual input impedance (and other real-world parameters) into account.

xox said:
In contrast, a BJT bias point typically draws a non-trivial amount of current. In that case, can a higher source impedance therefore lead to problems? (Ignoring amplification noise issues; that's a given.)
Yes, it can cause problems.

xox said:
Or how about this: The output impedance of a bias network (or voltage reference) must be such that the maximum current available is greater than or equal to that which is required by the connected subcircuit. Is that a fairly accurate summary of the issue?
It's in the right direction, but the typical rule of thumb is at least 10x, and often 100x is used. You want to be able to assume that your bias network is unaffected by the currents drawn from it. But at the same time, you can't make it so stiff that the signal can't move it around.
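
As a sketch of that rule of thumb (the 10x and 100x factors are the ones above; the impedances are just example numbers):

Code:
# Rule-of-thumb stiffness check: load impedance vs. bias-network impedance.
def stiff_enough(r_bias, r_load, factor=10):
    """True if the load is at least `factor` times the network's Thevenin
    resistance, so loading moves the bias point by roughly 1/factor or less."""
    return r_load >= factor * r_bias

print(stiff_enough(500e3, 10e6))         # True:  1M/1M divider into 10 Mohm
print(stiff_enough(500e3, 1e6))          # False: fails the 10x rule
print(stiff_enough(500e3, 100e6, 100))   # True even by the stricter 100x rule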
 

Thread Starter

xox

Joined Sep 8, 2017
838
Thanks for the excellent responses everyone, highly appreciated! I think I have a pretty decent grasp of the issues involved now. :)
 