Voltage reduction?

Thread Starter

snowdrifter

Joined Aug 13, 2013
43
I have a project where I need to drop voltage by exactly 1 volt, +/- no more than 0.05v.

The hang up is this has a varying electrical demand, varying temperature, and varying voltage. And it needs to be responsive - if the voltage is changing quickly, this too needs to change quickly.

This IS a DC application with a voltage range between 11 and 17v, and a current draw of <500mA.


I've toyed with diodes, but those only create a drop of about 0.7v. And resistors don't do well with fluctuating current demands. Plus I'll have temperature variations between about 0 and 175 degrees Fahrenheit. I've been banging my head against the wall trying to come up with a way to do this, but I'm just not getting anywhere.
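Putting rough numbers on the resistor problem (assuming a hypothetical 2:1 current swing; the exact currents here are made up for illustration):

```python
# Sketch: why a fixed resistor can't hold a fixed 1 V drop
# when the load current varies (hypothetical 2:1 swing, 50-100 mA).

def resistor_drop(r_ohms, i_amps):
    """Ohm's law: V = I * R."""
    return i_amps * r_ohms

# Size R for a 1 V drop at the midpoint current (75 mA):
r = 1.0 / 0.075                    # ~13.3 ohms
v_min = resistor_drop(r, 0.050)    # drop at 50 mA
v_max = resistor_drop(r, 0.100)    # drop at 100 mA
print(round(v_min, 2), round(v_max, 2))  # 0.67 1.33 -- way outside +/-0.05 V
```

So even a modest current swing moves the drop by a third of a volt either way, which is why a plain resistor is out.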

Ideas?

Trying to explain the project I'm doing would be tough and it's not really pertinent. So I'm going to say don't worry about it
 

studiot

Joined Nov 9, 2007
4,998
Trying to explain the project I'm doing would be tough and it's not really pertinent. So I'm going to say don't worry about it
Oh but we do worry about it.

Are you a student electronics engineer and is this a college electronics project?

We are happy to help but we do not do your college work for you.
 

wayneh

Joined Sep 9, 2010
17,498
You need this voltage drop to occur in the "main power" path, the <500mA current? Or do you just need a small reference voltage 1V below some other voltage?

There are lots of ways to do either, but the current load matters. Do you have any more detailed information than "<500mA"? I mean, can you narrow it to 300-500mA?
 

#12

Joined Nov 30, 2010
18,224
Hmm...a 1.00 volt zener that can pass 1/2 amp and stay stable within 5% across time, temperature, and load changes. Did I get that right?

Of course it will not drop a volt if there is no current at all, which brings up the point that we need to know the minimum current. There is a lot of difference between a nanoamp and half an amp.
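That nanoamp-versus-half-an-amp point in rough numbers, using the Shockley diode equation with assumed (but typical) saturation current and ideality factor, not values for any particular part:

```python
import math

# Shockley diode equation: V = n * Vt * ln(I/Is + 1).
# Is and n are assumed typical values for a small silicon diode.
def diode_drop(i_amps, i_sat=1e-12, n=1.8, vt=0.02585):
    return n * vt * math.log(i_amps / i_sat + 1)

print(round(diode_drop(1e-9), 2))  # ~0.32 V at a nanoamp
print(round(diode_drop(0.5), 2))   # ~1.25 V at half an amp
```

Nearly a volt of difference across that current range, which is exactly why the minimum current matters.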
 

Thread Starter

snowdrifter

Joined Aug 13, 2013
43
Just measured the current. I got between 66mA and 84mA, so figure a range between 50 and 100mA to be on the safe side. I may have been a bit overzealous when I said 500mA LOL


Edit: I just poked around on Mouser. The lowest zener diode rating they have is 2.4v
 

Thread Starter

snowdrifter

Joined Aug 13, 2013
43
Although... I suppose a 2v zener diode would work in this application


@studiot: College doesn't start for another couple weeks :p
 

#12

Joined Nov 30, 2010
18,224
What? :eek:

You want 1.00 volts +/- 5% or 2 volts?
This is entirely contradictory.
Please get specific.
 

Thread Starter

snowdrifter

Joined Aug 13, 2013
43
Right... I could see how that would be confusing lol. It's less about it being 1 or 2v and more about it being an even, known number. I'll explain.


I have a volt meter; it's this slick analogue-style interface with an LED representation. The advantage is that it is SUPER responsive and can pick up millisecond changes in voltage. But the issue is that it's limited in its display range, and running at 16v+ just pegs it constantly and more or less defeats the point.

So if I can fool the meter into seeing a lower voltage than it really is, by 1 or 2 volts, then I can get some readability out of it. Yes, it's a bit hackish/shoddy, but I'll be the only one reading it and mentally adding a volt or two is easy.

 

Thread Starter

snowdrifter

Joined Aug 13, 2013
43
Could you expand on that a bit? That's a voltage regulator, no? I'm having trouble visualizing how it would still allow the output voltage to vary.
 

#12

Joined Nov 30, 2010
18,224
Like this...

I just noticed: you are using a volt meter and expecting 66 to 84 mA to go through it. That doesn't make sense. It also renders my idea for an adjustable zener nonsense. Where is this 66 to 84 mA going?
 

Attachments


WBahn

Joined Mar 31, 2012
30,058
Now you can start to see why it does matter what you are trying to do.

Things work out best when you start with what you NEED, not what you think you WANT.

1) Where did the ±0.05V come from? It looks like the meter only has a resolution of ±0.15V. So what would be wrong with ±0.1V? Can make meeting your spec a whole lot easier.

2) You've got a voltmeter that is drawing 66mA to 84mA from whatever you are measuring the voltage of? Are you sure you are measuring the current at the correct spot, namely the place where you really need to be dropping your voltage?

3) This handheld meter needs to work up to 175°F? Just where is it going to be, why does it need to be there, and how are you going to read it?

4) Not that it matters a lot, but you say that this responds to millisecond changes in voltage. It looks like you read this thing by visually seeing which LED is currently on. Right? So what does a voltage that is 14V and that spikes up to 15V and back in, say, 5ms look like when you are trying to read it?

5) You said you need the input to be between 11V and 17V, for a range of 6V. Yet the range of the meter is only 5.7V. Perhaps that's close enough if you drop 1V, but then you said that dropping 2V was okay. Is it? What if the voltage is 11.5V and you drop 2V?

The first thing that came to my mind was a Vbe-multiplier, but I'm concerned about meeting the precision over that big a temperature range.
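A back-of-envelope on that temperature worry. A Vbe multiplier sets V = Vbe * (1 + R1/R2), so the roughly -2 mV/°C Vbe tempco gets multiplied by the same factor. The Vbe and tempco below are assumed typical silicon values, not from a datasheet:

```python
# Rough drift estimate for an uncompensated Vbe multiplier set to 1 V.
vbe = 0.65            # V, assumed typical
tempco = -0.002       # V/C, typical silicon Vbe drift
v_set = 1.0           # desired drop
multiplier = v_set / vbe       # = 1 + R1/R2

t_span_c = (175 - 0) * 5 / 9   # 0-175 F is about a 97 C swing
drift = abs(tempco) * multiplier * t_span_c
print(round(drift, 2))  # ~0.3 V of drift -- far beyond even +/-0.1 V
```

So without temperature compensation, the drift alone eats several times the requested tolerance over that range.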
 

#12

Joined Nov 30, 2010
18,224
The 431 is an odd kind of Vbe multiplier, but the definitions I'm getting from snowdrifter are contradictory.
 

WBahn

Joined Mar 31, 2012
30,058
The 431 is an odd kind of Vbe multiplier
Yeah, I just went and looked at the datasheet you linked, and that's pretty much exactly what it is, except with a lot of effort to effect temperature compensation. But with a dynamic impedance of about half an ohm, that will chew up a good chunk of his requested precision over the requested range of currents.
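That dynamic-impedance point in numbers, using the 50-100 mA range quoted earlier in the thread and the roughly half-ohm figure mentioned above:

```python
# Voltage shift from the 431's dynamic impedance alone, across the
# thread's stated 50-100 mA load range.
z_dyn = 0.5                    # ohms, roughly the datasheet figure cited above
i_min, i_max = 0.050, 0.100    # amps
delta_v = z_dyn * (i_max - i_min)
print(round(delta_v, 3))  # 0.025 -- half of the original +/-0.05 V budget
```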

, but the definitions I'm getting from snowdrifter are contradictory.
Nothing new, there. He is making the same mistake that so many posters (and customers) do -- and it's one that we all tend to make, even after we really should know better. He has a problem that he needs a solution to. He dreams up a possible solution. Then he asks how to solve the problem of implementing his solution instead of how to solve the real problem.

But he seems to be coming around quicker than most, so there is hope.

@snowdrifter: Don't take comments like these personally. They are multipurpose. We get to discuss peripheral issues that many readers might benefit from (including you), we get to prod you, specifically, into more effectively engaging in the search for something that will work for you, and we get to vent and rant a bit.
 

LDC3

Joined Apr 27, 2013
924
Since it is going to a meter, why not use an op-amp circuit to drop the voltage. You could even use a switch to change the amount of voltage removed.
 

WBahn

Joined Mar 31, 2012
30,058
Since it is going to a meter, why not use an op-amp circuit to drop the voltage. You could even use a switch to change the amount of voltage removed.
That was another thought I had -- and it may work -- but the question is how would it be powered? I suppose you could assume that the voltage being measured will be between 11V and 17V and power it off that. But being sure that you stayed within the allowed input voltage range might be easier said than done.
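A sketch of that headroom concern. Assume a unity-gain subtractor (Vout = Vin - Vref) powered from the same 11-17 V rail it measures, and assume a non-rail-to-rail op-amp whose output must stay some margin below its positive supply; the 1.5 V margin here is a made-up but plausible figure, not from any particular part:

```python
# Can the op-amp output (v_in - v_ref) stay 'headroom' volts
# below its own supply rail (also v_in)?
def headroom_ok(v_in, v_ref, headroom=1.5):
    return (v_in - v_ref) <= (v_in - headroom)

print(headroom_ok(17.0, 1.0))  # False: a 1 V drop leaves the output too close to the rail
print(headroom_ok(17.0, 2.0))  # True: a 2 V drop clears the assumed 1.5 V margin
```

Which suggests the 1 V version of the idea would need a rail-to-rail output part, or a separate supply.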
 

studiot

Joined Nov 9, 2007
4,998
This type of circuit has been around for a very long time.
Automotive engineers used to call it a suppressed zero voltmeter. It was used to display the top 6 or so volts of the auto battery voltage only ie spread the 6 volts over the full scale.
There were mechanical and zener based versions.
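The suppressed-zero idea in numbers: subtract a fixed offset so the meter's window covers only the interesting part of the input range. Using the 5.7 V meter span and 11-17 V input from earlier in the thread:

```python
# Suppressed-zero check: does a fixed 1 V offset fit the 11-17 V
# input range into the meter's 5.7 V display window?
METER_SPAN = 5.7   # volts the LED meter can display (from the thread)

def meter_sees(v_actual, offset):
    return v_actual - offset

lo, hi = meter_sees(11.0, 1.0), meter_sees(17.0, 1.0)
print(lo, hi, (hi - lo) <= METER_SPAN)  # 10.0 16.0 False -- 6 V just exceeds 5.7
```

Which is the same mismatch WBahn flagged: the offset shifts the window but can't shrink a 6 V input range into a 5.7 V display.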
 

Thread Starter

snowdrifter

Joined Aug 13, 2013
43
Now you can start to see why it does matter what you are trying to do.

Things work out best when you start with what you NEED, not what you think you WANT.

1) Where did the ±0.05V come from? It looks like the meter only has a resolution of ±0.15V. So what would be wrong with ±0.1V? Can make meeting your spec a whole lot easier.

2) You've got a voltmeter that is drawing 66mA to 84mA from whatever you are measuring the voltage of? Are you sure you are measuring the current at the correct spot, namely the place where you really need to be dropping your voltage?

3) This handheld meter needs to work up to 175°F? Just where is it going to be, why does it need to be there, and how are you going to read it?

4) Not that it matters a lot, but you say that this responds to millisecond changes in voltage. It looks like you read this thing by visually seeing which LED is currently on. Right? So what does a voltage that is 14V and that spikes up to 15V and back in, say, 5ms look like when you are trying to read it?

5) You said you need the input to be between 11V and 17V, for a range of 6V. Yet the range of the meter is only 5.7V. Perhaps that's close enough if you drop 1V, but then you said that dropping 2V was okay. Is it? What if the voltage is 11.5V and you drop 2V?

The first thing that came to my mind was a Vbe-multiplier, but I'm concerned about meeting the precision over that big a temperature range.
1. +/- .05v is a want, not a need. +/- .1v is acceptable, but I'd like to know within a tenth of a volt where the actual voltage is. Compounding a possible .1v variance on top of the meter's own .3v resolution puts the error close to half a volt. I'd really /like/ to minimize that as much as possible, since once we get to that level of error I'm no better off than using a resistor and dealing with the effects of varying current.

2. Voltage is being measured from an ~850 farad capacitor bank, which is also wired to a 14v lead-acid battery. The reason I'm trying to minimize the voltage variance is so I can more closely monitor battery voltage without needing a second, digital volt meter. If I were to decrease the voltage of the thing I'm measuring, the battery wouldn't charge properly; the voltage would be more suitable for long-term storage than for actually charging it.

3. It will be exposed to sun for long periods. It's in a car. Winter, summer heat, etc. I know the rules say no automotive projects, but this isn't necessarily related to the car as much as it is something separate that happens to be installed in one.

4. Correct. I'll post a video at the end of this.

5. Then the meter will remain "pegged" at the low end of the range. The opposite of what I'm at right now where it's "pegged" at the high end of the range



4. cont.
Skip to 0:45. Ignore the audio
http://youtu.be/CuR00e3yjBo




Getting the ideas in my head out in a manner that makes sense has always been difficult for me. So I apologize if I'm being confusing.
 

WBahn

Joined Mar 31, 2012
30,058
2) You've got a voltmeter that is drawing 66mA to 84mA from what you measuring the voltage of? Are you sure you are measuring the current at the correct spot, namely the place that you really need to be dropping your voltage at?
2. Voltage is being measured from an ~850 farad capacitor bank, which is also wired with a 14v lead/acid battery. The reason why I'm trying to minimize the voltage variance is so I can more closely monitor battery voltage without necessitating a second, digital volt meter. If I were to decrease the voltage of the thing I am measuring the battery wouldn't charge properly and the voltage would be more suitable for long-term storage rather than actually charging it.
That doesn't answer the question. The question is WHERE and HOW are you measuring the current that you did?



The top is what you are describing that you presently have. The bottom is what you are describing you want. The labeled arrow is the current you SHOULD be measuring.

4) Not that it matters a lot, but you say that this responds to millisecond changes in voltage. It looks like you read this thing by visually seeing which LED is currently on. Right? So what does a voltage that is 14V and that spikes up to 15V and back in, say, 5ms look like when you are trying to read it?
4. Correct. I'll post a video at the end of ths
I certainly don't see anything that would indicate that this is responding on a millisecond time scale. Figure that the time between frames of the video is probably 20ms to 50ms; no video (that wasn't taken at high speed and slowed down considerably for replay) could show it, and it's very doubtful that the human eye could perceive it anyway. Where is that millisecond responsiveness claim coming from? I'm not seeing anything that would indicate a responsiveness faster than a couple hundred milliseconds from one side to the other, and I don't think you would even want it faster; otherwise you would not see the illusion of a meter-like movement. Again, not that it really matters.
 

Attachments
