Seeking some USB controlled Signal Relay Modules

Thread Starter

ErnieM

Joined Apr 24, 2011
8,415
For an upcoming project I need multiple (20+?) signal relays to build a DMM tree (many pairs of inputs, one output pair).

I'm seeking preassembled boards with USB (or similar) control and multiple DPST or DPDT relays with contacts rated 2 A or less, since higher-current contacts tend to oxidize when they never carry the large currents needed to keep them clean.

Something similar to the SainSmart 16-Channel 9-36V USB Relay Module would be great, but with signal relays.

Anyone know of any?
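For anyone following along, the bookkeeping for such a tree is simple enough to sketch. Here's a minimal model (all names hypothetical, not any real board's API), with the one safety rule that matters: only one input pair is ever connected to the output pair at a time, break-before-make:

```python
# Minimal model of a DMM tree: N input pairs multiplexed onto one
# output pair, one DPST/DPDT signal relay per channel.
# All names here are hypothetical illustration, not a real board's API.

class DmmTree:
    def __init__(self, n_channels):
        self.n_channels = n_channels
        self.closed = None  # index of the one closed relay, or None

    def select(self, channel):
        """Close exactly one relay, opening whatever was closed before
        (break-before-make, so two sources are never shorted together)."""
        if not 0 <= channel < self.n_channels:
            raise ValueError(f"channel {channel} out of range")
        self.open_all()          # break ...
        self.closed = channel    # ... before make

    def open_all(self):
        self.closed = None

tree = DmmTree(24)
tree.select(7)
print(tree.closed)   # 7
tree.select(12)
print(tree.closed)   # 12
```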
 

MrSalts

Joined Apr 2, 2020
2,767
A young engineer used some cheap USB modules with 10 A relays for a hack-a-thon, controlling a 16x16 array of incandescent bulbs to make a walking man like on a traffic light. That was three years ago, and that thing was flashing away in our lobby all during Covid. The 10 A relays never failed even though they've only been powering some little flashlight bulbs. I think you are overplaying the risk of big relays for small jobs. I still don't know why he used incandescent bulbs.
 

MisterBill2

Joined Jan 23, 2018
27,211
There can be a serious problem caused by using higher current contacts to switch instrumentation level signals. Those high current contacts need a fair amount of current to maintain a low contact resistance.
So for the meter multiplexing scheme I suggest reed relays with a fairly low current rating.
There are indeed companies that sell assemblies made for exactly that purpose and intended for computer control interface. National Instruments is the only name that comes to mind, but they are the most expensive one.
Avoid Amazon or any similar bulk seller of everything; they are fully useless for this.
 

Irving

Joined Jan 30, 2016
5,012
Given the stupid price of multiplexing boards I'd roll my own (and did).

What sort of signals are you switching?

<edit> FWIW the NI PXI arrays use the Coto 9104-05-11 reed relay with a single NO contact rated at 0.5 A and 150 mΩ contact resistance. The cheapest starts at around $2500 (£2000). You could roll your own for a tenth of that!
 
Last edited:

Thread Starter

ErnieM

Joined Apr 24, 2011
8,415
A young engineer used some cheap USB modules with 10 A relays for a hack-a-thon, controlling a 16x16 array of incandescent bulbs to make a walking man like on a traffic light. That was three years ago, and that thing was flashing away in our lobby all during Covid. The 10 A relays never failed even though they've only been powering some little flashlight bulbs. I think you are overplaying the risk of big relays for small jobs. I still don't know why he used incandescent bulbs.
Since an incandescent bulb has a much lower resistance when cold, the initial inrush of current at turn-on can and will act as a contact-cleaning current. So pairing an incandescent bulb (as opposed to LEDs) with a cheap and easy-to-source relay card is actually a wise choice.
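To put a rough number on that inrush: a cold tungsten filament's resistance is commonly quoted as about a tenth of its hot resistance, so the turn-on surge is about ten times the steady-state current. A quick sketch with illustrative (not measured) figures:

```python
# Rough inrush estimate for an incandescent bulb.  The 1/10 cold
# resistance fraction is a commonly quoted ballpark, not a measurement.

def bulb_currents(v_supply, p_rated, cold_fraction=0.1):
    r_hot = v_supply ** 2 / p_rated   # hot (steady-state) filament resistance
    r_cold = r_hot * cold_fraction    # cold filament resistance
    i_steady = v_supply / r_hot
    i_inrush = v_supply / r_cold
    return i_steady, i_inrush

# A hypothetical small flashlight bulb: 6 V, 1.2 W
i_ss, i_peak = bulb_currents(v_supply=6.0, p_rated=1.2)
print(round(i_ss, 2), round(i_peak, 2))   # 0.2 2.0
```

So even a tiny bulb gives the contacts a brief amp-scale wetting pulse, which is the point being made above.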

Given the stupid price of multiplexing boards I'd roll my own (and did).

What sort of signals are you switching?
Actually I am proposing a budget for some new fixturing and thus I can afford to purchase very nice things.

We have a decent in-house design of a 25-relay card using Omron G5V-2-DC5 relays that have proven themselves quite reliable, working thousands of times a day in numerous fixtures with failures per year one could count on a single hand, with fingers left over. However, the card has no drive circuitry, so something has to sink the coil current for every relay to turn it on. That's not an issue in our fixed test stands, as they have built-in drivers, but my job is to build something stand-alone, so I need to supply all the relays and the drivers.

Getting a USB controlled card will simplify the wiring and construction.

I'm switching a bunch of logic-level states and monitoring several voltages, most of them logic level, but the power is some 370 VDC, which will need some special attention (those G5V's are only rated for 125 VAC/VDC).
 

Irving

Joined Jan 30, 2016
5,012
For 370 V DC you'll need 1000 V rated relays and a PCB layout to suit. I doubt you'll find that off-the-shelf (at least not at a price I'd be prepared to pay!), so I'd definitely consider rolling my own, not least to have a consistent command structure, with one controller and two or three slave PCBs with different relay types. I did something very similar a few years back with an Arduino Nano as the intelligence and an I²C port extender to control 48 channels in various N x M configurations. The advantage over the 'LabVIEW' type of products was being able to have tight integration with the experiment rig and minimise the wiring.
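A 48-channel arrangement like that maps naturally onto three 16-bit expanders. The addressing arithmetic, sketched in Python (the base address and MCP23017-class part are assumptions for illustration; the post doesn't say which expander was used):

```python
# Sketch of the channel-to-expander bookkeeping for driving many relay
# coils through I2C port extenders: 48 channels -> 3 devices x 16 pins.
# The MCP23017-class part and its 0x20 base address are assumptions
# for illustration, not taken from the post.

PINS_PER_EXPANDER = 16
BASE_I2C_ADDR = 0x20   # typical base address for MCP23017-class parts

def channel_to_pin(channel):
    """Map a flat relay channel number to (i2c_address, pin)."""
    device, pin = divmod(channel, PINS_PER_EXPANDER)
    return BASE_I2C_ADDR + device, pin

print(channel_to_pin(0))    # (32, 0)
print(channel_to_pin(17))   # (33, 1)
print(channel_to_pin(47))   # (34, 15)
```

The controller then only has to write one bit per coil; a driver IC or transistor per pin still does the actual current sinking.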
 

MisterBill2

Joined Jan 23, 2018
27,211
I did caution that N.I. was the most expensive; evidently that is still correct. A build-your-own array is always an option if one is able to build, but a lot of places can only do plug-and-play construction plus write code. And there could certainly be more than one array, or a segment made with higher-voltage relays.
 

Thread Starter

ErnieM

Joined Apr 24, 2011
8,415
For 370v DC you'll need 1000v rated relays and a PCB layout to suit... <snip>
370 VDC is a singular exception and thus can be handled in an exceptional way, such as using one different HV rated relay (still dumps HV into the DMM tree) or a simple voltage divider (which has calibration implications).
 

Irving

Joined Jan 30, 2016
5,012
Or use an active solution (like using a HV probe on a 'scope). How accurately do you need to measure 370 V? 0.5% (±1.9 V)? 0.1% (±0.4 V)?
 

Thread Starter

ErnieM

Joined Apr 24, 2011
8,415
I have yet to make an error budget, but however accurate the measurement is, it certainly must have an accuracy traceable to NIST.

I have found some decent signal relay boards; the only drawback is there is no driver, so each relay needs its own driver hardware. Amazon Link
 

Irving

Joined Jan 30, 2016
5,012
I have yet to make an error budget, but however accurate the measurement is, it certainly must have an accuracy traceable to NIST.
Not sure what the traceability of a relay contact would be. Even NI don't list that, other than it's warranted as <1.0Ω contact resistance over an unspec'd lifespan. Given the relay itself is spec'd at <0.15Ω typical and <0.2Ω max, that's not much of a warranty! I guess if you measured contact resistance with a 6+ digit DMM with NIST traceability, that would suffice.

As to the voltage measurement, I'd argue the same applies. Design it right so no calibration is needed, then measure an appropriate set of input and output voltages at a known temperature with a suitably traceable, high-digit DMM, and divide output by input to get the ratio; if that's within spec for both absolute points and overall linearity, job done. That's what I did before, and it was accepted by BSI and TUV in the UK; as long as the process is reasonable and documented, all's well. The main thing is not to over-spec it to start with. I'd guess that 370 V ±1% is more than good enough (especially if this is rectified/smoothed AC off a phase of a 3-phase supply?)
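That verification procedure is only a few lines of arithmetic. A sketch, with invented DMM readings standing in for the traceable measurements:

```python
# The verification described above, as arithmetic: measure several
# (input, output) pairs with a traceable DMM, compute the ratio at each
# point, and check the worst deviation from the nominal ratio (covering
# both absolute error and linearity).  Readings are invented.

def check_divider(points, nominal_ratio, tol):
    """points: list of (v_in, v_out) traceable DMM readings.
    Returns (pass/fail, worst relative deviation from nominal)."""
    ratios = [v_in / v_out for v_in, v_out in points]
    worst = max(abs(r / nominal_ratio - 1.0) for r in ratios)
    return worst <= tol, worst

# Hypothetical readings for a nominal 100:1 divider, ±0.1% spec
points = [(100.0, 1.0003), (200.0, 2.0004), (370.0, 3.7011)]
ok, worst = check_divider(points, nominal_ratio=100.0, tol=0.001)
print(ok)   # True
```

Document the readings, the DMM's cert number, and the ambient temperature, and that's the paperwork trail.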
 
Last edited:

MisterBill2

Joined Jan 23, 2018
27,211
The cheating trick is to include a calibration check voltage point along with the measured variable points. Then on every scan of the inputs the calibration reading can be verified. And if the data is saved, it will provide a verification of the accuracy of each set of data.
 

Irving

Joined Jan 30, 2016
5,012
The cheating trick is to include a calibration check voltage point along with the measured variable points. Then on every scan of the inputs the calibration reading can be verified. And if the data is saved, it will provide a verification of the accuracy of each set of data.
I get your thinking, but not sure how you'd implement that on a simple voltage divider?

Unless you mean connect it to a calibration voltage. But that would need to be a high voltage to validate the divider and that's potentially a significant cost.
 

MisterBill2

Joined Jan 23, 2018
27,211
I get your thinking, but not sure how you'd implement that on a simple voltage divider?

Unless you mean connect it to a calibration voltage. But that would need to be a high voltage to validate the divider and that's potentially a significant cost.
Presently the various voltages being monitored are within some range, as they are "mostly logic level" voltages, so a single reference voltage within that range should be adequate. I have not seen any reference to voltage dividers, which are normally rather stable anyway.
A single reference source powered by the system power supply should be adequate. Adding a zero-voltage data point will provide an indication that the data system zero has not shifted as well. These two points do not serve as calibration points, but as proof that the system has not changed. That is the purpose: proving stability, not calibration.
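The check described above reduces to two window comparisons per scan. A sketch with hypothetical reference value and limits:

```python
# Per-scan stability check: every scan includes a known reference point
# and a zero point; if either drifts outside its window, flag the whole
# scan.  This proves the system hasn't changed between calibrations; it
# is not itself a calibration.  The 5 V reference and the windows are
# hypothetical values for illustration.

def scan_is_valid(ref_reading, zero_reading,
                  ref_nominal=5.000, ref_window=0.005, zero_window=0.002):
    ref_ok = abs(ref_reading - ref_nominal) <= ref_window
    zero_ok = abs(zero_reading) <= zero_window
    return ref_ok and zero_ok

print(scan_is_valid(5.002, 0.0004))   # True  -> trust this scan's data
print(scan_is_valid(5.020, 0.0004))   # False -> reference has drifted
```

Saving the two check readings alongside each data set gives the per-set accuracy evidence mentioned above.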
 

Thread Starter

ErnieM

Joined Apr 24, 2011
8,415
There’s this for what it’s worth.
Oh cool, I like that. A lot. I wonder how "sticky" their COM number is. Have to get me some samples.

Not sure what the traceability of a relay contact would be. Even NI don't list that, other than it's warranted as <1.0Ω contact resistance over an unspec'd lifespan. Given the relay itself is spec'd at <0.15Ω typical and <0.2Ω max, that's not much of a warranty! I guess if you measured contact resistance with a 6+ digit DMM with NIST traceability, that would suffice.

As to the voltage measurement, I'd argue the same applies. Design it right so no calibration is needed, then measure an appropriate set of input and output voltages at a known temperature with a suitably traceable, high-digit DMM, and divide output by input to get the ratio; if that's within spec for both absolute points and overall linearity, job done. That's what I did before, and it was accepted by BSI and TUV in the UK; as long as the process is reasonable and documented, all's well. The main thing is not to over-spec it to start with. I'd guess that 370 V ±1% is more than good enough (especially if this is rectified/smoothed AC off a phase of a 3-phase supply?)
"Design it right so no calibration needed" is not a thing. Calibration needs to be periodic and traceable to NIST. Period.

The cheating trick is to include a calibration check voltage point along with the measured variable points. Then on every scan of the inputs the calibration reading can be verified. And if the data is saved, it will provide a verification of the accuracy of each set of data.
I've used the "cheating trick" before, very successfully. The issue was measuring a motor current via a 'scope; using a high-side current shunt I got a representation of it, but uncalibrated. So I switched out the motor, sank current through a resistor, and fed it into the current input of a DMM. Now I have the ratio of voltage to current of the high-side current shunt. It takes but a few seconds, so it gets done every time the unit is (automatically) tested.

A voltage divider can be calibrated similarly: input known voltage, measure divided output, calculate cal ratio.
I get your thinking, but not sure how you'd implement that on a simple voltage divider?

Unless you mean connect it to a calibration voltage. But that would need to be a high voltage to validate the divider and that's potentially a significant cost.
The problem with doing that here is switching the high voltage. The resistive divider means only a low voltage is switched, but to check the divider ratio the HV needs to be switched to something known, and the point is to avoid switching the HV in the first place.

I bet the whole thing comes down to one DPDT high-voltage relay at the DMM input to select either the DMM tree or the 370 volts. No divider, no cal issues.
 

MisterBill2

Joined Jan 23, 2018
27,211
That 370 volts can be constantly connected to the 100:1 divider and then the divider output can be scanned just like the other signals. The accuracy will be the same but the resolution will be less. A voltage divider can be verified by standard methods, and then it is just one more term used to convert from the A/D converter output bits to an actual voltage number.
 

Irving

Joined Jan 30, 2016
5,012
This is similar to what I used before, just at a higher voltage. The LT1236-10 is ±0.05% accuracy, so 10 in series gives 100 V ±0.016% (RSS 3σ method), which I reckon is good enough. R5-R7 are standard 1% 1206-sized parts, just to keep the voltage per resistor <200 V with margin; you could probably get away with just two 39k resistors. R1-R4 are 0.1% 1206 size, e.g. TE RQ73-2B rated at 200 V. That gives a 100:1 divider ratio accurate to better than ±0.09%. With the 'calibrator' that gives a known accuracy of better than ±0.15%.
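The RSS arithmetic behind those figures can be checked directly (pure arithmetic on the numbers in the post; the combined figure lands comfortably inside the quoted ±0.15% bound):

```python
# Root-sum-square combination of the error terms quoted above.
import math

def rss(*terms):
    """Combine independent relative errors in quadrature."""
    return math.sqrt(sum(t * t for t in terms))

# Ten ±0.05% references in series: independent errors add in
# quadrature, so the relative error shrinks by sqrt(10).
ref_pct = 0.05 / math.sqrt(10)
print(round(ref_pct, 3))        # 0.016  (the 100 V ±0.016% figure)

# Divider (±0.09%) checked against the reference (±0.016%):
overall_pct = rss(0.09, ref_pct)
print(round(overall_pct, 3))    # 0.091  (inside the quoted ±0.15%)
```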

1658342166454.png
 

Thread Starter

ErnieM

Joined Apr 24, 2011
8,415
OK, let's cut to the chase. What is missing in these schemes is calibration traceable to NIST, the National Institute of Standards and Technology. That is no small thing, and it is vital to keep in mind when doing any task that takes metrology seriously.

Below is the scheme I will probably be using. Note there will be multiple "Other" inputs, as many as necessary:

TREE.jpg
KX1 is the only high-voltage-rated component needed. It is used to switch between the high voltage and a much lower voltage, and that lower voltage is also included in the "regular" tree. Thus, by reading the direct 28 V and the divided 28 V, the precise divider ratio can be determined at any time (like each time the test is performed). There is no loss of resolution, as the range on the DMM may be adjusted.
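The ratio bookkeeping that scheme enables is one division per test run. A sketch with invented readings:

```python
# In-situ ratio determination per the scheme above: the tree reads the
# 28 V rail directly and also through the divider, so each test run can
# compute the exact divider ratio and use it to scale the divided
# 370 V reading.  The readings below are invented for illustration.

def measure_hv(v28_direct, v28_divided, hv_divided):
    ratio = v28_direct / v28_divided      # live divider ratio
    return ratio * hv_divided             # reconstructed high voltage

v = measure_hv(v28_direct=28.014, v28_divided=0.28011, hv_divided=3.7021)
print(round(v, 1))   # 370.2
```

Because the ratio is remeasured on every run with the same (periodically calibrated) DMM, divider drift drops out of the error budget.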

Now where does that traceability to the National Institute of Standards and Technology come from? Simple: the DMM. We send them out for calibration each and every year, which generates lots of paperwork for traceability. We also put a sticker on each instrument to indicate when cal was performed and when it expires.

Meantime, thanks to everyone for their input. That USB controller looks really interesting Ya’akov.
 

Irving

Joined Jan 30, 2016
5,012
Agreed, the DMM is the traceable element, in my case to BSI. I was, at the time, a certified BS 5750 (later ISO 900x) practitioner and wrote much of the in-house quality documentation for the organisation I was working for.
 
Last edited: