Sampling of an unlimited number of sensors in parallel: photon-pixel coupling

Thread Starter

Mobius Scutav

Joined May 17, 2019
7
This post is meant to raise awareness of a fascinating subject discussed yesterday with @Wendy in particular. Our discussion focused on a new method called photon-pixel coupling, which allows an enormous number of sensors to be sampled in parallel.

The photon-pixel coupling method is truly ingenious because it addresses two main problems in engineering: 1) reading a practically unlimited number of sensors and 2) sampling them in parallel at video-rate frequencies. Just imagine: we can read as many sensors as we wish.

How it works:

Basically, each sensor output is an LED. If you have 10,000 sensors, the LED (output) of each is inserted into an LED array of 10,000 elements, a "LED matrix" as the authors say. The LED array is then filmed by a video camera and the images are processed in real time by a computer. Software reads one pixel from each LED in the captured picture/frame and converts its brightness to a numerical value. Thus, your LED array is converted into a matrix (with 10,000 elements) filled with numbers that can be processed as you wish in your software. I don't know if I was clear enough, but you can read their article here: https://www.sciencedirect.com/science/article/pii/S2215016119300901

Note that classic multiplexing is serial and photon-pixel coupling is parallel.
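To make the idea concrete, here is a minimal sketch of the readout step in plain C++ (the names and the 8-bit grayscale frame format are my own assumptions, not from the article): given one camera frame and a precomputed list of LED pixel coordinates, a single pass produces one numeric value per sensor.

Code:
#include <cstdint>
#include <cstdio>
#include <vector>

// One LED's known location in the camera frame.
struct LedPos { int x; int y; };

// Read one pixel per LED from an 8-bit grayscale frame (row-major,
// width*height bytes). Pixel brightness 0..255 stands in for the
// sensor's analog value.
std::vector<uint8_t> readSensors(const uint8_t* frame, int width,
                                 const std::vector<LedPos>& leds) {
    std::vector<uint8_t> values;
    values.reserve(leds.size());
    for (const LedPos& p : leds)
        values.push_back(frame[p.y * width + p.x]);
    return values;
}

int main() {
    uint8_t frame[4 * 4] = {0};     // tiny synthetic 4x4 "camera frame"
    frame[1 * 4 + 2] = 200;         // one lit "LED" at (x=2, y=1)
    std::vector<LedPos> leds = {{2, 1}};
    printf("sensor 0 = %d\n", readSensors(frame, 4, leds)[0]);
    return 0;
}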

Schematics of the Photon-pixel coupling from the original article:




What I wonder is whether we can adapt photon-pixel coupling to Arduino. I am new to the world of microcontrollers, but I know an Arduino can support a camera, a low-fps one, so it should be possible.

If you want the full reference:
P.A. Gagniuc, C. Ionescu-Tirgoviste, R.G. Serban, E. Gagniuc. Photon-pixel coupling: A method for parallel acquisition of electrical signals in scientific investigations. MethodsX, 6:968-979, 2019.
 


Thread Starter

Mobius Scutav

Joined May 17, 2019
7
That is the main problem I am thinking of too. In theory, the number of pixels the Arduino should read from the camera images is equal to the number of sensors in the LED array (e.g., if I have 20 sensors, the Arduino reads just 20 pixels once or twice per second). I understand that an Arduino camera manages 1 or 2 fps. So photon-pixel coupling should be reasonable to implement.

Sampling 100 to, let's say, 10,000 sensors once or twice a second would be sufficient for the majority of measuring or automation applications. But can an Arduino read 10,000 values once or twice per second? That I don't know. However, I am certain that a few hundred pixels can be processed by an Arduino. What I don't know is whether the Arduino can run the video camera and do some simple processing at the same time ...
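As a rough back-of-envelope check (my own numbers, just for orientation): the UNO's ATmega328P runs at 16 MHz, so reading 10,000 pixels twice per second leaves a budget of about 800 CPU cycles per pixel, before any camera handling is even counted.

Code:
#include <cstdio>

int main() {
    const double cpuHz   = 16e6;    // ATmega328P (UNO) clock
    const int    sensors = 10000;   // pixels to read per frame
    const double fps     = 2.0;     // frames (samples) per second
    const double readsPerSec = sensors * fps;
    // Cycle budget per pixel read, ignoring camera I/O entirely.
    printf("reads/s: %.0f, cycles per read: %.0f\n",
           readsPerSec, cpuHz / readsPerSec);
    return 0;
}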

My experience with Arduino is fairly limited. I see that the Arduino Due has everything needed for this implementation and more. But the Arduino Due is almost a mini-mini-PC :)

Arduino (for me) means full independence from the complexity of PCs. That is why I think, like many others, that it is the future. I wish to see photon-pixel coupling implemented on a simple Arduino UNO, to expand what can be done with this microcontroller.
 

bogosort

Joined Sep 24, 2011
696
Note that classic multiplexing is serial and photon-pixel coupling is parallel.
If I understand how this works, it is still serial in that each pixel in the image will be processed serially, one after the other. With a single processor, such as an Arduino, I don't foresee this providing a performance advantage over traditional multiplexing. In fact, I'm pretty sure it will underperform -- and have far more software complexity -- until some fairly high threshold of sensors is reached. I suppose it could be worth it if you need to record hundreds of sensors that can easily be converted to LED outputs; then the system could be made semi-parallel by sharing the image between a group of processors, with each working on a submatrix of pixels.
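Roughly what I have in mind, sketched on a multi-core host rather than an Arduino (the slicing scheme and all names are illustrative only): each worker thread reads its own contiguous slice of the LED coordinate list from the shared frame.

Code:
#include <algorithm>
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

struct LedPos { int x; int y; };

// Each worker reads its own slice of the LED list from the shared frame.
void readSlice(const uint8_t* frame, int width,
               const std::vector<LedPos>& leds,
               size_t begin, size_t end, uint8_t* out) {
    for (size_t i = begin; i < end; ++i)
        out[i] = frame[leds[i].y * width + leds[i].x];
}

// Split the coordinate list into contiguous chunks, one per worker.
std::vector<uint8_t> readParallel(const uint8_t* frame, int width,
                                  const std::vector<LedPos>& leds,
                                  unsigned workers) {
    std::vector<uint8_t> out(leds.size());
    std::vector<std::thread> pool;
    size_t chunk = (leds.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end   = std::min(leds.size(), begin + chunk);
        if (begin >= end) break;
        pool.emplace_back(readSlice, frame, width,
                          std::cref(leds), begin, end, out.data());
    }
    for (auto& t : pool) t.join();  // wait for all slices to finish
    return out;
}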

But I suspect that significant sensor precision will be lost in the sensor output → LED brightness → camera → pixel brightness conversion. It's a neat idea, but I'm not sure how generally applicable this camera-as-multiplexer method will be.
 

nsaspook

Joined Aug 27, 2009
13,079
This is novel? It's another variation of a signal-modulation constellation. The camera sensor provides a sample-and-hold of the LED constellation image, while the image-sensor encoder serially sends each set of sensor pixels as an encoded symbol of X/Y position and amplitude by mapping into a JPEG image stream.


QAM constellation.
 

djsfantasi

Joined Apr 11, 2010
9,156
But only if the camera on the Arduino has just 20 pixels. If it is a more 'normal' camera with say 640x480 pixels then the Arduino has to sort through 300,000 input pixels.
Maybe. Maybe not.

If the LED array and the camera are fixed relative to each other, then the significant pixels in the image are fixed as well. Hence, the algorithm only has to sort through n pixels, where n is the number of elements in the LED array.

Even if there is some slight variation, the number of pixels to be examined is still significantly smaller than a full image. Let's imagine a scenario where we examine 1 pixel to each side of a position; that would be 4 pixels per LED array position*, or 40,000 tests versus 300,000, i.e. ~13% of the image.



* where position is a virtual position, calculated as the center point of every 4 pixels in a square. There are other algorithms.
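For instance, here is that neighborhood test as a small C++ sketch (my own convention: the caller stores the top-left corner of each square): average the 2x2 block around the virtual position, so a one-pixel drift of the LED image still lands inside the block.

Code:
#include <cstdint>

// Average the 2x2 pixel block whose top-left corner is (x, y); the
// virtual LED position is the center point of that square, so a
// one-pixel drift of the LED image still lands inside the block.
// Caller must keep x < width-1 and y < height-1.
uint8_t readLed2x2(const uint8_t* frame, int width, int x, int y) {
    int sum = frame[y * width + x]       + frame[y * width + x + 1]
            + frame[(y + 1) * width + x] + frame[(y + 1) * width + x + 1];
    return static_cast<uint8_t>(sum / 4);
}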
 

Thread Starter

Mobius Scutav

Joined May 17, 2019
7
But only if the camera on the Arduino has just 20 pixels. If it is a more 'normal' camera with say 640x480 pixels then the Arduino has to sort through 300,000 input pixels.
That is the thing. With regard to the example from the previous comment, only 20 pixels need to be read, at fixed locations in the images, regardless of the Arduino video camera (just the pixels at the bright spots of the LEDs). The Arduino video camera can have any resolution, because the number of pixels read will equal the number of sensors regardless. So an [x,y] map of each pixel is needed in advance, to avoid scanning the entire image (which would be far too time-consuming).
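One possible way to build that map, sketched in plain C++ under my own assumptions (drive all LEDs bright once, grab a calibration frame, keep the bright local maxima): after this one-time scan, per-frame readout touches only the stored coordinates, never the whole image.

Code:
#include <cstdint>
#include <vector>

struct LedPos { int x; int y; };

// One-time calibration: with every LED driven bright, record each pixel
// that is above threshold and is a local maximum of its 3x3 neighborhood.
// (A flat plateau of equal pixels would be recorded more than once;
// good enough for a sketch.)
std::vector<LedPos> buildPixelMap(const uint8_t* frame, int w, int h,
                                  uint8_t threshold) {
    std::vector<LedPos> map;
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            uint8_t v = frame[y * w + x];
            if (v < threshold) continue;
            bool isPeak = true;
            for (int dy = -1; dy <= 1 && isPeak; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    if (frame[(y + dy) * w + (x + dx)] > v) {
                        isPeak = false;
                        break;
                    }
            if (isPeak) map.push_back({x, y});
        }
    }
    return map;
}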
 

Thread Starter

Mobius Scutav

Joined May 17, 2019
7
This is novel? It's another variation of a signal-modulation constellation. The camera sensor provides a sample-and-hold of the LED constellation image, while the image-sensor encoder serially sends each set of sensor pixels as an encoded symbol of X/Y position and amplitude by mapping into a JPEG image stream.


QAM constellation.

Photon-pixel coupling is a novel method, no doubt about that. Quadrature amplitude modulation is quite different in my opinion ... just see the info below:


http://www.wirelesscommunication.nl/pdfandps/qam.pdf
https://en.wikipedia.org/wiki/Quadrature_amplitude_modulation
 

Thread Starter

Mobius Scutav

Joined May 17, 2019
7
Maybe. Maybe not.

If the LED array and the camera are fixed relative to each other, then the significant pixels in the image are fixed as well. Hence, the algorithm only has to sort through n pixels, where n is the number of elements in the LED array.

Even if there is some slight variation, the number of pixels to be examined is still significantly smaller than a full image. Let's imagine a scenario where we examine 1 pixel to each side of a position; that would be 4 pixels per LED array position*, or 40,000 tests versus 300,000, i.e. ~13% of the image.



* where position is a virtual position, calculated as the center point of every 4 pixels in a square. There are other algorithms.

Yes, spot on!
 

Thread Starter

Mobius Scutav

Joined May 17, 2019
7
If I understand how this works, it is still serial in that each pixel in the image will be processed serially, one after the other. With a single processor, such as an Arduino, I don't foresee this providing a performance advantage over traditional multiplexing. In fact, I'm pretty sure it will underperform -- and have far more software complexity -- until some fairly high threshold of sensors is reached. I suppose it could be worth it if you need to record hundreds of sensors that can easily be converted to LED outputs; then the system could be made semi-parallel by sharing the image between a group of processors, with each working on a submatrix of pixels.

But I suspect that significant sensor precision will be lost in the sensor output → LED brightness → camera → pixel brightness conversion. It's a neat idea, but I'm not sure how generally applicable this camera-as-multiplexer method will be.

True, all software is ultimately serial. We can not change that one :) However, if we have time-sensitive sampling, we can take the software out of the loop and just record the video. After that we can use the software to process the video frame by frame. The advantage I see with photon-pixel coupling is that: 1) it costs nothing compared with multiplexing, and 2) it avoids sampling skew: with multiplexing, if we have 10,000 sensors and we wish to correlate sensor 1 with sensor 10,000, we have a big problem, because by the time the multiplexer reaches sensor 10,000, sensor 1 may long since hold a totally different value. Also, with multiplexing, the per-sensor sampling rate is inversely proportional to the number of inputs/sensors.
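To put a number on that skew (illustrative arithmetic only; the ADC rate is an assumed example): a mux scanning N channels through one ADC at rate f samples channel 1 and channel N about (N-1)/f apart, while every pixel in one camera frame is exposed over the same interval.

Code:
#include <cstdio>

int main() {
    const int    channels = 10000;   // sensors behind one multiplexer
    const double adcRate  = 20000;   // ADC conversions per second (example)
    // Time between sampling channel 1 and channel N in one scan:
    double skew = (channels - 1) / adcRate;      // ~0.5 s
    double perSensorRate = adcRate / channels;   // 2 samples/s each
    printf("skew: %.3f s, per-sensor rate: %.1f Hz\n",
           skew, perSensorRate);
    return 0;
}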

You are right about the threshold: you need many sensor inputs to gain an advantage over traditional multiplexing. But science today is about large scale in every area. In the end we are limited by the fps ... :)
 

bogosort

Joined Sep 24, 2011
696
The advantage I see with photon-pixel coupling is that: 1) it costs nothing compared with multiplexing
What costs are we talking about? A bag full of MUX chips is cheaper than a camera, and the software costs -- in terms of development time and product complexity -- of camera multiplexing surely exceed the simple logic of traditional multiplexing.

2) it avoids sampling skew: with multiplexing, if we have 10,000 sensors and we wish to correlate sensor 1 with sensor 10,000, we have a big problem, because by the time the multiplexer reaches sensor 10,000, sensor 1 may long since hold a totally different value. Also, with multiplexing, the per-sensor sampling rate is inversely proportional to the number of inputs/sensors.
This is a very good point -- certain applications require simultaneous sampling, which simply can't be done with traditional multiplexing to a single ADC.

I'm still skeptical about sensor transduction. LEDs are not what anyone would call precision devices, so even if one could easily and cheaply retrofit existing sensors to drive an LED output -- and that's certainly not a given -- a new, significant source of signal error has just been introduced. Then the signal + error brightness levels have to be captured by a camera, which -- even if high quality and capable of producing 24-bit grayscale bitmaps -- will introduce even more error. As such, I'd think this is only viable for low-precision applications.

Personally, if I were designing a large parallel-sampling application, I'd rather use off-the-shelf sensors and a distributed network of ADCs with the exact precision required. But I'd be interested to know if the camera system finds wider use.
 

MrAl

Joined Jun 17, 2014
11,389
Hello there,

It is not so much about how fast you can process the ACQUIRED data; it's about how fast you can ACQUIRE it.

In other words, you don't always have to process data in real time, but you may have to acquire it in real time, because life happens in real time.

So say you have 100 sensors in a 10x10 array. You acquire two days' worth of data as a video.
Then later you take it back to the lab and process it all, not in real time but in your particular computer's time.

You get results you could not have gotten if you just used the computer alone.

A similar argument can be made for having 100 voltmeters viewed by a web cam. You can read the 7-segment displays long after the real-life measurements were made, and you have a video record of it all too.
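A sketch of that record-first workflow, assuming OpenCV is available on the lab PC and the pixel map is already known (the file name and coordinates here are placeholders): the loop runs at the computer's own speed, long after the acquisition.

Code:
#include <opencv2/opencv.hpp>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    cv::VideoCapture cap("recording.avi");   // placeholder file name
    std::vector<cv::Point> leds = {{12, 34}, {56, 34}};  // placeholder map
    cv::Mat frame, gray;
    long frameNo = 0;
    while (cap.read(frame)) {        // runs at lab speed, not real time
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        for (const cv::Point& p : leds)
            printf("frame %ld led(%d,%d) = %d\n",
                   frameNo, p.x, p.y, gray.at<uint8_t>(p.y, p.x));
        ++frameNo;
    }
    return 0;
}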
 

Thread Starter

Mobius Scutav

Joined May 17, 2019
7
What costs are we talking about? A bag full of MUX chips is cheaper than a camera, and the software costs -- in terms of development time and product complexity -- of camera multiplexing surely exceed the simple logic of traditional multiplexing.


This is a very good point -- certain applications require simultaneous sampling, which simply can't be done with traditional multiplexing to a single ADC.

I'm still skeptical about sensor transduction. LEDs are not what anyone would call precision devices, so even if one could easily and cheaply retrofit existing sensors to drive an LED output -- and that's certainly not a given -- a new, significant source of signal error has just been introduced. Then the signal + error brightness levels have to be captured by a camera, which -- even if high quality and capable of producing 24-bit grayscale bitmaps -- will introduce even more error. As such, I'd think this is only viable for low-precision applications.

Personally, if I were designing a large parallel-sampling application, I'd rather use off-the-shelf sensors and a distributed network of ADCs with the exact precision required. But I'd be interested to know if the camera system finds wider use.
A web camera is $1 to $3, so it may be (for whatever reason) cheaper than a multiplexer PCB + components + design and testing time. But it is not just the camera; the LED matrix also costs ... and the LEDs cost much more than the camera (I know that from experience). There is also time spent making the LED array, which may be equivalent to the time spent making a simple multiplexer. I don't know what to say.

For large arrays of sensors, what a multiplexer samples comes under the same scrutiny as the LEDs in the end. Photon-pixel coupling is limited by the fps, so we can not talk about GHz-speed sampling ... but I think it will win for large-scale applications, for the simple reason that it is simple/direct and gives power to the software side, which is the trend these days ... Who knows; time and engineering needs will tell.
 