calculate the chance of collision

Thread Starter

bug13

Joined Feb 13, 2012
2,002
Hi guys

I got some code here (in C). Basically, I have a send() function that returns true if the data is sent successfully. This send() function handles the CSMA/CA and CSMA/CD details, so I need to add a random delay between a failed send() and the next retry.

Code:
uint8_t random = 0;
uint8_t counter = 0;
while (send() == false)
{
    random = getRandomNumber(min, max);  // generate a random number in [min, max]
    delay_ms(random);                    // random back-off delay
    counter++;
    if (counter > max_retry) break;      // give up after too many failed attempts
}
So my question is: how can I estimate the best range for getRandomNumber(min, max) to minimize collisions?

Let's say I have 100 wireless devices, each needing to send about 1 packet of data per second. The wireless devices are in a mesh network.

Thanks guys
 

WBahn

Joined Mar 31, 2012
30,072
How long does it take to send one packet (when successful)?

Is that one packet per second the total comm burden for the entire network, or is it per device?

How many devices are in range of each other (i.e., what is the mesh size)?

This sounds like homework. Is it?
 

Thread Starter

bug13

Joined Feb 13, 2012
2,002
I thought someone would ask me that:
This sounds like homework. Is it?
The answer is no. I just want a general idea of how to calculate this, or maybe some pointers so I can get started.

How long does it take to send one packet (when successful)?
I am not sure. My packet-sniffing device shows it usually needs 100 µs or less, but sometimes it takes 1 ms (CSMA/CA at work, I assume?). For this estimate, say 1 ms per packet.

Is that one packet per second the total comm burden for the entire network, or is it per device?
It's kind of random, but my estimate is about 1 packet per second per device.

How many devices are in range of each other (i.e., what is the mesh size)?
The one I am testing now has about 12 devices, but I would also like to know about, say, 100 devices.

Extra info:
I am using the MRF24J40 from Microchip with the Microchip MiWi Pro stack. The reason I want to estimate the probability of collision is that when I program one device to send one packet per second (as a test), it seems to lose some packets. And I don't want that.

First I changed the code to resend up to a set number of times (on failure) before giving up. Things seemed to improve, but it still lost some packets.

Then I added a random delay between resends. My results improved a lot. I tested it over the weekend and didn't see any lost packets.

That leads me to ask: what are the best numbers to use for those delays, chosen methodically?
 

Thread Starter

bug13

Joined Feb 13, 2012
2,002
PS:

I first thought it was a birthday problem (the birthday paradox), but I don't think that's the case. I'm pretty sure the packets will collide one way or another; I just want to minimize collisions, not avoid them entirely.
 

WBahn

Joined Mar 31, 2012
30,072
I am not sure. My packet-sniffing device shows it usually needs 100 µs or less, but sometimes it takes 1 ms (CSMA/CA at work, I assume?). For this estimate, say 1 ms per packet.
I don't think I asked the question well enough. What I'm looking for is how long (time-wise) a packet itself is. It sounds like you gave me how long a successful transmission can take, which could include several unsuccessful attempts first.

it's kind of random, but my estimate is about 1 packet per second per device.


The one I am testing now has about 12 devices, but I would also like to know about, say, 100 devices.
If it takes 1 ms per packet (if that's the actual packet length), then 100 devices at one packet per second is about 10% channel utilization at 100% efficiency; if a packet is more like 100 µs, you're in the 1% range. So you've got some room to maneuver.

Extra info:
I am using the MRF24J40 from Microchip with the Microchip MiWi Pro stack. The reason I want to estimate the probability of collision is that when I program one device to send one packet per second (as a test), it seems to lose some packets. And I don't want that.

First I changed the code to resend up to a set number of times (on failure) before giving up. Things seemed to improve, but it still lost some packets.

Then I added a random delay between resends. My results improved a lot. I tested it over the weekend and didn't see any lost packets.

That leads me to ask: what are the best numbers to use for those delays, chosen methodically?
I don't know anything about MiWi -- and don't want to spend time looking anything up. In general, if you don't have a random backoff then transmission attempts will tend to cluster unacceptably. That's probably what's happening to you.
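To make that concrete: the standard fix is binary exponential backoff (which is what 802.15.4's own CSMA-CA does internally). Instead of a fixed [min, max], draw the delay from a window that doubles after each failure. A sketch of the retry loop from the first post rewritten that way; send() and delay_ms() are stubbed out here so the snippet stands alone, and the window constants are illustrative, not tuned values:

```c
#include <stdbool.h>
#include <stdlib.h>

#define BASE_WINDOW_MS 4U    /* initial back-off window (illustrative) */
#define MAX_WINDOW_MS  256U  /* cap so the window cannot grow without bound */
#define MAX_RETRY      8U

/* --- stand-ins for the real radio/MCU functions from the original post --- */
static int fails_left = 2;
static bool send(void) { return fails_left-- <= 0; }  /* fails twice, then succeeds */
static void delay_ms(unsigned ms) { (void)ms; }       /* no-op on a PC */

/* Back-off window for the k-th retry: doubles each time, capped. */
unsigned backoff_window_ms(unsigned attempt)
{
    unsigned w = BASE_WINDOW_MS << attempt;  /* 4, 8, 16, ... */
    return (w > MAX_WINDOW_MS) ? MAX_WINDOW_MS : w;
}

/* Retry loop: wait a random time in [0, window) after each failure and
 * double the window, so devices that just collided spread apart quickly. */
bool send_with_backoff(void)
{
    for (unsigned attempt = 0; attempt < MAX_RETRY; attempt++)
    {
        if (send())
            return true;
        delay_ms((unsigned)rand() % backoff_window_ms(attempt));
    }
    return false;  /* gave up */
}
```

The point of the doubling is that the window adapts: light contention costs almost no delay, while a burst of collisions quickly spreads the contenders out.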

Are you using, or planning to use, any type of ARQ system to create a reliable channel?
 

dannyf

Joined Sep 13, 2015
2,197
my question is: how can I estimate the best range for getRandomNumber(min, max) to minimize collisions?
the minimum collision probability is achieved by having an infinite delay: you will never collide, for sure.

Most people don't try to achieve minimum collision - they try a compromise between throughput and collision.

If I were you, I would get a sense of how fast or slow send() executes. Say it executes with some mean and standard deviation; you then need to decide how likely you want the next transmission attempt to go through. From that, you pick a delay in the range mean - k*standard_deviation to mean + k*standard_deviation.

For example, if you want the probability of successful transmission to be 95%, k = 2; for 99.7%, k = 3, and so on.

Now, that assumes a normal distribution. Technically the delay should follow a Poisson distribution, but the basic idea is the same.
 

Thread Starter

bug13

Joined Feb 13, 2012
2,002
If it takes 1 ms for a packet (if that's the actual packet length), then 100 devices with one packet per second is going to be about 10% channel utilization under 100% efficiency, and if a packet length is more like 100 us then you are in the 1% range, so you've got some room to maneuver within.
Thanks, this thinking is useful!

I don't know anything about MiWi -- and don't want to spend time looking anything up.
Think of MiWi as the stack/driver/network protocol for Microchip's MRF24J40; it's like ZigBee but with a smaller footprint, and it's free (if you use it with Microchip parts).

Are you using, or planning to use, any type of ARQ system to create a reliable channel?
No, but the stack does come with an acknowledgement mechanism with a CRC-16; if the TX device doesn't get an ACK, I re-transmit.
 

Thread Starter

bug13

Joined Feb 13, 2012
2,002
Most people don't try to achieve minimum collision - they try a compromise between throughput and collision.
I think that's what I really wanted to ask, but you already knew :)

If I were you, I would get a sense of how fast or slow send() executes. Say it executes with some mean and standard deviation; you then need to decide how likely you want the next transmission attempt to go through. From that, you pick a delay in the range mean - k*standard_deviation to mean + k*standard_deviation.

For example, if you want the probability of successful transmission to be 95%, k = 2; for 99.7%, k = 3, and so on.
I can understand this:
mean - k*standard_deviation to mean + k*standard_deviation
But how do you get the relationship between the percentage and k here?
95%, k = 2; 99.7%, k=3
Thanks
 