Voltage divider biasing method?

Discussion in 'The Projects Forum' started by rougie, Sep 27, 2012.

  1. rougie

    Thread Starter Active Member

    Dec 11, 2006
    410
    2
    Hello,

    I am experimenting with the voltage divider bias configuration and I am having some difficulty. Can someone please take a look at my circuit and let me know where I am going wrong.

    With a voltage divider connected to the base of a transistor I am trying to obtain 1.2 VDC at the base.

    Please view circuit attachment! All power sources are 3.3VDC!

    I am trying to calculate R1....

The measured current at Vc is about 12ma and VRE is very close to 0.24VDC. I don't get this... there really isn't any single Rb resistor to calculate, since we have a voltage divider??

If I assume Ib is very small and negligible, can I figure the current for R1 and R2 like so:

    Ir2= 2.1/10K = 210ua

    and therefore:

    R1 = 1.2/~210ua = 10K

    But this gives 0.67VDC at the base ???

    confused!

    any help is greatly appreciated!
    r
     
    • Attached: R1.jpg
  2. crutschow

    Expert

    Mar 14, 2008
    13,011
    3,233
    If you assume Ib is negligible (it actually will likely be somewhere around 100μA) then for R2=10k its current would be (3.3-1.2)/10kΩ = 210μA as you calculated. Then R1 would equal 1.2 / 210μA = 5.7kΩ (don't know how you calculated 10kΩ). This gives a voltage of 1.2V at the base.

    To minimize the effect of Ib you could reduce the value of the resistors by say a factor of 10 to 1kΩ and 570Ω.
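    Not from the original post, but the arithmetic above can be sanity-checked with a short Python sketch (it assumes, as in this thread, that R2 is the top resistor from the 3.3 V supply to the base and R1 the bottom one from base to ground):

    ```python
    # Sanity check of the divider arithmetic above, ignoring Ib.
    # R2 = top resistor (3.3 V supply to base), R1 = bottom (base to ground).
    V_SUPPLY = 3.3   # V
    V_BASE = 1.2     # V, target base voltage
    R2 = 10e3        # ohms

    i_divider = (V_SUPPLY - V_BASE) / R2   # current through R2
    R1 = V_BASE / i_divider                # bottom resistor that drops 1.2 V

    print(f"divider current = {i_divider * 1e6:.0f} uA")  # 210 uA
    print(f"R1 = {R1:.0f} ohms")                          # 5714 ohms, ~5.7 k
    ```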
     
  3. t_n_k

    AAC Fanatic!

    Mar 6, 2009
    5,448
    782
    One thing also worth noting is that for the circuit as drawn, the transistor is probably close to or at saturation, rather than being biased in the linear region.

    In any event, as a simple check, suppose the transistor has an HFE of 150 and is actually operating in the linear region. With Ic=12mA this would require a base current of 80uA.

    If 1.2V is the target base voltage, then as shown on the schematic, the current in R2 [10k] =210uA. With the base taking 80uA [say] then R1 would only carry 130uA. To set the voltage across R1 to 1.2V would therefore require a resistance of R1=1.2V/130uA=9.23kΩ.

    Even if HFE=300, the base current would still be 40uA. This would mean R1 would be around 7.06kΩ to give the target 1.2V base voltage.
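    The correction for base-current loading described above can be written as a small helper (a sketch, not from the original post; the function name and defaults are mine):

    ```python
    # Redo the divider calculation when Ib is NOT negligible.
    # R2 = 10k from the 3.3 V supply to the base, R1 from base to ground.
    def r1_for_target_vb(vb, ic, hfe, v_supply=3.3, r2=10e3):
        ib = ic / hfe                  # base current drawn from the divider node
        i_r2 = (v_supply - vb) / r2    # current arriving through the top resistor
        i_r1 = i_r2 - ib               # what is left for the bottom resistor
        return vb / i_r1

    print(f"{r1_for_target_vb(1.2, 12e-3, 150):.0f} ohms")  # ~9231 (the 9.23 k above)
    print(f"{r1_for_target_vb(1.2, 12e-3, 300):.0f} ohms")  # ~7059 (the 7.06 k above)
    ```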
     
    Last edited: Sep 28, 2012
  4. rougie

    Thread Starter Active Member

    Dec 11, 2006
    410
    2
    Hello guys,

    Thanks for replying!

    Ooops, my error.... yes you are correct, 5.7K

    Okay this isn't making sense...

    With crutschow's solution, at R1 = 5.7K I get a base voltage of only 0.921VDC.

    With t_n_k's solution, if I use a 10K as R1 (I did not have a 9.23K ohm handy) I get a base voltage of 0.931VDC ???? The hFE of this transistor is 165.

    Ic = approximately 11 ma!

    The way I see it is:

    if hFE = 165 and I have 11ma as Ic, my Ib should be about 66.6 ua. Given 2.1V across R2, then, Ir2 should be I = e/r which is: 2.1V/10K = 210ua.

    Ir1 should be: Ir1 = 210ua - 66.6ua = 144ua

    Given that I want VR1 to be 1.2VDC, I can do: 1.2V/144ua= 8333 ohms ... I used an 8.8K instead.

    when I measure Vb with R1 being 8.8K I still get 0.930VDC....

    why is it so hard to obtain 1.2VDC at Vb??

    discouraged... help!
     
    Last edited: Sep 28, 2012
  5. t_n_k

    AAC Fanatic!

    Mar 6, 2009
    5,448
    782
    The other obvious matter is that

    Vb=Ve+Vbe

    Suppose Vbe=0.65 V; then you would need Ve=0.55 V to obtain Vb=1.2 V. At the moment you have Ve at about 0.22V, which won't help you reach the goal.

    Suppose you want to set Vb to 1.2 V with Ie=10mA and Vbe=0.65. To obtain Ve=0.55V then one needs ....

    Re=0.55V/10mA=55Ω.

    If, with HFE=165, you set Vs=3.3V, Rc=120Ω [i.e. lose the LED and the 180Ω], Re=56Ω, R1=8.2kΩ & R2=10kΩ [all preferred values], then you would get Vb close to 1.2V. Also the transistor won't be operating in the saturation region.

    If you wanted to keep Re as 20Ω then you would require an Ie value of around 27mA to obtain Vb=1.2 V. In that case you'd probably need an R1 of ~27kΩ and a much lower Rc (say 56Ω) - to keep the transistor out of saturation.
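    As a check of the suggested operating point (a sketch, not from the original post; it assumes Vbe = 0.65 V and HFE = 165 as stated above, and Thevenin-reduces the divider):

    ```python
    # Bias-point check for Vs=3.3, Rc=120, Re=56, R1=8.2k (bottom), R2=10k (top).
    Vs, Rc, Re = 3.3, 120.0, 56.0
    R1, R2 = 8.2e3, 10e3
    Vbe, hfe = 0.65, 165

    Vth = Vs * R1 / (R1 + R2)   # open-circuit divider voltage
    Rth = R1 * R2 / (R1 + R2)   # divider source resistance

    # KVL around the base loop: Vth = Ib*Rth + Vbe + (hfe+1)*Ib*Re
    Ib = (Vth - Vbe) / (Rth + (hfe + 1) * Re)
    Vb = Vth - Ib * Rth
    Ic = hfe * Ib
    Vce = Vs - Ic * Rc - (hfe + 1) * Ib * Re

    print(f"Vb  = {Vb:.2f} V")     # close to the 1.2 V target
    print(f"Ic  = {Ic*1e3:.1f} mA")
    print(f"Vce = {Vce:.2f} V")    # comfortably out of saturation
    ```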
     
    Last edited: Sep 28, 2012
  6. rougie

    Thread Starter Active Member

    Dec 11, 2006
    410
    2
    but I need the led???
     
  7. t_n_k

    AAC Fanatic!

    Mar 6, 2009
    5,448
    782
    Well try Re=56Ω and see what happens. Otherwise forget the requirement of getting Vb=1.2V
     
  8. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    The hFE of a transistor is measured when it has plenty of collector-to-emitter voltage so it is not saturated. When a transistor is saturated (like you want yours to be), its hFE is very low; it is guaranteed to be 10 or more, but probably not as high as 165.
     
  9. t_n_k

    AAC Fanatic!

    Mar 6, 2009
    5,448
    782
    Attached is a simulation result of an arrangement similar to the original circuit, except with the LED and Rc replaced by a single resistor. The transistor is "approaching" but not in saturation. Note that Vb=1.2V with Re=20Ω and Ic≈25mA.

    As an aside comment [since rougie wanted to retain the LED] - I'm not sure why one would bother with voltage divider bias if the goal is simply to turn on an LED. Why not simply operate the transistor as a saturated switch?
     
  10. crutschow

    Expert

    Mar 14, 2008
    13,011
    3,233
    In summary, the problem is that the transistor is near saturation due to the resistor in series with the LED. This means you cannot obtain the desired current through Re without significant current coming from the base (giving an apparently low hFE), which lowers the voltage at the base. Since you are apparently trying to use the transistor to limit the current through the LED, you can eliminate the resistor in series with the LED. The circuit should then behave as you would expect.
     
  11. rougie

    Thread Starter Active Member

    Dec 11, 2006
    410
    2
    Hello fellas,

    Yes, I see now... when we do a voltage divider like this, the resistance of R1 is in parallel with the impedance of the base circuit (the Vbe drop plus the emitter)... and the combined resistance will determine Vb... or at least I think so!!!! And if I am right, then it makes sense that no matter what resistors I use for R1 and R2, I don't seem to be able to get a Vb greater than Ve + Vbe, which is 0.225 + 0.7 = approximately 0.930VDC !!!!

    Well, that's because the goal is not simply to turn on an LED. The goal is to turn on an infra-red LED with a ramping voltage. Let me explain:

    I want the base of the transistor to start at 0.4VDC and ramp up to 1.2VDC. The only problem I have is that the ramping voltage out of my DAC must start at 2.0VDC minimum and could ramp up to something like 10VDC maximum.

    You see, I tried making my DAC start at 0.4VDC and ramp to 1.2VDC and IT DID WORK! However, the ramp was sort of sawtooth-like (see the sawtoothVb sketch in the attachment), and that was not acceptable. It seems that with my DAC, the maximum voltage of the ramp range has to be >= 2V for a steady ramp. So now I am providing a voltage ramp starting at 2.5VDC and going up to 3.3VDC, and this seems to give a very steady ramp!!!! See the steady-ramp sketch in the attachment.

    Therefore, I need to bias my transistor in the linear region according to the min/max voltages from my DAC output (2.5VDC min to 3.3VDC max), as follows:

    a) When my DAC is 2.5VDC, I need 0.4VDC at the base of the transistor.
    b) When my DAC is 3.3VDC, I need 1.2VDC at the base of the transistor.
    c) When my DAC is at any voltage between 2.5 and 3.3VDC, the voltage at the base is linearly proportional. ex: lets say DAC output = 2.9 volts, I require 0.8VDC at the base.
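    Not part of the original post, but the three conditions above pin down a straight-line mapping from Vdac to Vb, and it is worth seeing what slope and offset they imply:

    ```python
    # The three bias conditions (a)-(c) define a straight line from Vdac to Vb.
    v_dac_lo, v_dac_hi = 2.5, 3.3   # DAC output range (V)
    v_b_lo, v_b_hi = 0.4, 1.2       # required base-voltage range (V)

    slope = (v_b_hi - v_b_lo) / (v_dac_hi - v_dac_lo)
    offset = v_b_lo - slope * v_dac_lo

    print(f"Vb = {slope:.2f} * Vdac + ({offset:.2f})")      # Vb = 1.00 * Vdac + (-2.10)
    print(f"Vb at Vdac=2.9: {slope * 2.9 + offset:.2f} V")  # 0.80 V, matching (c)
    ```

    The slope works out to exactly 1, i.e. a pure -2.1 V level shift rather than an attenuation. A plain two-resistor divider to ground always has a gain below 1 with zero offset, which is worth keeping in mind when comparing circuits A, B and C below.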

    Also, another thing I should mention: approximately 12ma through the infra-red LED is sufficient to emit the correct maximum amount of light for a specific distance. Even though the LED's datasheet says Id may be 100ma, 100ma through the LED is waaaaay too much infra-red for what I need. This has been measured over and over... so my infra-red LED *MUST* go from 0ma to 12ma MAXIMUM!

    I have been experimenting as much as I can with voltage divider circuits, but I don't really know which type of circuit is best to go with, which makes it difficult to get a circuit running according to the specs mentioned above.

    So my question to all of you is: According to the attachment below, which biasing circuit method would be the best for achieving my goal....

    A) Voltage divider...
    B) Voltage divider with base resistor... or
    C) Base resistor only

    Your selection will at least guide me towards the right circuit and calculations. This for me would be an enormous help!

    PS. For now I would like to stick with transistors... so no opamps or FETs etc., please. I really need to get familiar with transistors before I move on to any other active components.

    thanks for all your help
    very appreciated
    r
     
    Last edited: Sep 29, 2012
  12. rougie

    Thread Starter Active Member

    Dec 11, 2006
    410
    2
    hello crutschow,

    I am not 100% sure about what I am about to say here...! But:

    If I do this, then I will not be able to use all of the transistor's linear region. For example, when the transistor is fully saturated there is way too much infra-red light emitting from the LED. Therefore, to reduce the infra-red emissions I use Rc. This way I can use *all* of the transistor's linear region, so that I can get the maximum resolution out of the transistor.

    Please see my last post which explains my requirements....

    thanks crutschow, your help is appreciated!
    r
     
    Last edited: Sep 29, 2012