Which is better: voltmeters or voltage comparators?

Thread Starter

Raxnos

Joined May 21, 2015
18
Hi people! I'm currently working on a voltage monitoring system to measure quite a few different voltages from different sources, and I was wondering which would be better for monitoring them: voltmeters with an associated program, or voltage comparators? I could possibly use references so each comparator would have a comparison voltage local to itself. Any thoughts?
 

Thread Starter

Raxnos

Joined May 21, 2015
18
What output from this monitoring system do you want, Go/No-Go or values?
How is the output to be monitored?
I haven't worked out the details yet, but the principle is that the system will monitor the voltages of a range of sources and, when one voltage reaches a certain value (something I'm not completely clear on yet, as I have no clue whether the value will come from a reference or from pre-constructed data), it will send a response signalling that that specific voltage has been reached.
 

Thread Starter

Raxnos

Joined May 21, 2015
18
What output from this monitoring system do you want, Go/No-Go or values?
How is the output to be monitored?
I just realised that I didn't give enough detail. I was basically asking whether, from another person's view, a considerably more expensive voltmeter setup would be beneficial compared to a much cheaper voltage comparator setup, as both can do the same thing in different ways. There are a few upsides I can see to the voltmeter setup, for example the ability to calibrate it easily and inexpensively. Follow-up question: do you know anywhere with a store in the UK that sells cheap, accurate, small voltmeters with no display? I need a couple of hundred of them, which is why cost is a bit of a drag. Thanks by the way!
 

crutschow

Joined Mar 14, 2008
23,494
An A/D converter approach would give more flexibility and require less calibration than a comparator based system but would likely require a microcontroller or computer to do the monitoring. Any changes in the voltage limits can then be easily done in software.
A PC or micro with a USB port could use a USB A/D converter module, such as one of these, to do the voltage monitoring, so little or no hardware would need to be built.

A comparator based approach would require no computer but the voltage limits would likely have to be calibrated, and would need to be manually readjusted if you needed any changes in the limits.

My choice would be the A/D computer approach.
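To illustrate the flexibility of doing the limits in software, here is a rough Python sketch of the limit-checking side of an A/D based monitor. The function names and sample values are illustrative assumptions, not part of any real DAQ module's API:

```python
# Sketch of software limit-checking for an A/D (DAQ) based monitor.
# read_voltages() is a stand-in for whatever your USB A/D module's
# driver actually provides; the values below are made up.

def read_voltages():
    # Placeholder: a real system would query the DAQ hardware here.
    return [1.2, 3.3, 5.1, 0.4]

def check_limits(voltages, limits):
    """Return the indices of channels whose voltage meets or exceeds its limit."""
    return [ch for ch, (v, lim) in enumerate(zip(voltages, limits))
            if v >= lim]

# Per-channel thresholds: changing a limit is a one-line software edit,
# with no hardware recalibration needed.
limits = [2.0, 3.0, 5.0, 1.0]
tripped = check_limits(read_voltages(), limits)
print(tripped)  # -> [1, 2]: channels 1 and 2 have reached their limits
```

With a comparator-based design, each of those threshold changes would instead be a manual trimpot adjustment.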
 

Thread Starter

Raxnos

Joined May 21, 2015
18
An A/D converter approach would give more flexibility and require less calibration than a comparator based system but would likely require a microcontroller or computer to do the monitoring. Any changes in the voltage limits can then be easily done in software.
A PC or micro with a USB port could use a USB A/D converter module, such as one of these, to do the voltage monitoring, so little or no hardware would need to be built.

A comparator based approach would require no computer but the voltage limits would likely have to be calibrated, and would need to be manually readjusted if you needed any changes in the limits.

My choice would be the A/D computer approach.
Thanks that looks really good! Novice question.. But what do channel and bit mean in this scenario? I know what they generally mean, but I have no clue what they mean here..
 

blocco a spirale

Joined Jun 18, 2008
1,546
I just realised that I didn't give enough detail. I was basically asking whether, from another person's view, a considerably more expensive voltmeter setup would be beneficial compared to a much cheaper voltage comparator setup, as both can do the same thing in different ways. There are a few upsides I can see to the voltmeter setup, for example the ability to calibrate it easily and inexpensively. Follow-up question: do you know anywhere with a store in the UK that sells cheap, accurate, small voltmeters with no display? I need a couple of hundred of them, which is why cost is a bit of a drag. Thanks by the way!
You need "a couple of hundred......cheap, accurate, small voltmeters with no display"?

Are you serious?
 

MrChips

Joined Oct 2, 2009
19,382
Thanks that looks really good! Novice question.. But what do channel and bit mean in this scenario? I know what they generally mean, but I have no clue what they mean here..
A channel refers to your signal input. Hence an 8-channel data acquisition system (DAQ) will allow you to monitor 8 separate sources.

A bit is a single binary unit. When the voltage being monitored is converted to numbers it is represented by a quantity of bits.
The more bits you have, the finer the resolution. Hence if the DAQ converts voltages using 12 bits, you can measure the voltage on a scale of 2^12 = 4096 steps. If the DAQ full-scale range is 0 to 10 V, then the finest step is 10 V / 4096 = 2.44 mV.
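That resolution arithmetic can be captured in a couple of lines of Python (a small sketch of the formula above, nothing more):

```python
# Smallest resolvable step of an N-bit ADC over a given full-scale range.
def adc_step(full_scale_volts, bits):
    # An N-bit converter divides the range into 2**N steps.
    return full_scale_volts / (2 ** bits)

# 12-bit ADC, 0-10 V full scale: 10 V / 4096 steps, about 2.44 mV per step
print(round(adc_step(10.0, 12) * 1000, 2), "mV")  # -> 2.44 mV
```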
 

Thread Starter

Raxnos

Joined May 21, 2015
18
A channel refers to your signal input. Hence an 8-channel data acquisition system (DAQ) will allow you to monitor 8 separate sources.

A bit is a single binary unit. When the voltage being monitored is converted to numbers it is represented by a quantity of bits.
The more bits you have, the finer the resolution. Hence if the DAQ converts voltages using 12 bits, you can measure the voltage on a scale of 2^12 = 4096 steps. If the DAQ full-scale range is 0 to 10 V, then the finest step is 10 V / 4096 = 2.44 mV.
Thanks! This makes a LOT more sense now..
 

Thread Starter

Raxnos

Joined May 21, 2015
18
How many voltages do you need to monitor at the same time?
How fast do you need to respond to a change in voltage?
I'll need to monitor around 200 different voltages. And I have no clue what sample rates the A/Ds can reach, so I'll just say that one reading per channel every 1 ms (i.e. 1 kHz) would quite easily suffice. Thanks!
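Those two numbers together set the aggregate throughput a single multiplexed DAQ would need; a quick back-of-envelope check (the figures are taken straight from the post):

```python
# Aggregate sample rate for scanning many channels with one multiplexed ADC.
channels = 200
per_channel_rate_hz = 1000  # one sample per channel per millisecond

aggregate_rate = channels * per_channel_rate_hz
print(aggregate_rate)  # -> 200000 samples/s, i.e. 200 kS/s total
```

200 kS/s aggregate is well within reach of ordinary multichannel DAQ hardware, though few off-the-shelf modules offer 200 input channels in one box.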
 

DerStrom8

Joined Feb 20, 2011
2,390
An A/D converter takes an Analog signal and converts it to a Digital value that represents the voltage on the input at the time the sample was taken. Your sampling time is what determines the maximum measurable frequency of the signal. You have not described the nature of the signals you are measuring--Are they sinusoidal? DC? Square wave? What is their maximum amplitude? If you're measuring, say, 240VAC, you're going to need to drop the voltage before measuring it, and that would introduce a wide variety of possible errors and inaccuracies.
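To illustrate the voltage scaling DerStrom8 mentions, here is a resistive-divider sketch. The resistor values are illustrative assumptions only; safely measuring mains requires proper isolation, not just a divider:

```python
# Resistive divider: scales a large input voltage down into ADC range.
# Values below are illustrative; mains measurement needs isolation too.
def divider_out(v_in, r_top, r_bottom):
    # Standard two-resistor divider: Vout = Vin * Rb / (Rt + Rb)
    return v_in * r_bottom / (r_top + r_bottom)

# ~340 V peak (240 VAC) through a 990k/10k divider -> 3.4 V peak
print(divider_out(340.0, 990e3, 10e3))  # -> 3.4
```

Note that the resistor tolerances (e.g. 1%) then appear directly as measurement error, which is the kind of inaccuracy the post warns about.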
 

alfacliff

Joined Dec 13, 2013
2,458
The only place I have seen voltage comparators used is in production, where things are adjusted to a specific level; they are too complicated to set up individually for each voltage.
 

Thread Starter

Raxnos

Joined May 21, 2015
18
That's a lot of voltages to monitor.
Exactly what is this for?
Do all the voltages have a common ground?
It's for an EEG experiment I was working on. Also, what do you mean by common ground? As in they all share the same ground reference? If so, then yes, they do.
 

kubeek

Joined Sep 20, 2005
5,621
Most of the time it is better to have the whole picture right at the beginning.
The preamps for the tiny EEG signals will be a lot more complex and expensive than the acquisition part. Sampling, say, 16 channels at 1 kHz is not that big of a deal, but amplifying the signals will be more demanding.
 