# calculate the chance of collision

Discussion in 'Math' started by bug13, Jul 24, 2016.

1. ### bug13 Thread Starter Senior Member

Feb 13, 2012
1,521
55
Hi guys

I've got some code here (in C). Basically, I have a send() function that returns true if the data is sent successfully. This send() function handles the CSMA/CA and CSMA/CD side of things. So I need to add some random delay between a failed send() and the next retry.

Code (Text):
uint8_t random = 0;
uint8_t counter = 0;
while (send() == false)
{
    random = getRandomNumber(min, max);   // generate a random number
    delay_ms(random);                     // random delay
    counter++;
    if (counter > max_retry) break;       // give up after too many failures
}
So my question is: how can I estimate the best range for getRandomNumber(min, max) to get the fewest collisions?

Let's say I have 100 wireless devices, and about 1 data packet needs to be sent per second. The wireless devices are in a mesh network.

Thanks guys

2. ### WBahn Moderator

Mar 31, 2012
23,400
7,109
How long does it take to send one packet (when successful)?

Is that one packet per second the total comm burden for the entire network, or is it per device?

How many devices are in range of each other (i.e., what is the mesh size)?

This sounds like homework. Is it?

3. ### bug13 Thread Starter Senior Member

Feb 13, 2012
1,521
55
I thought someone would ask me that:
The answer is no. I just want a general idea of how to calculate this, or maybe some pointers so I can get started.

I am not sure. My packet-sniffing device shows it usually needs 100 us or less, but sometimes it takes 1 ms (CSMA/CA at work here, I assume?). For this estimate, say 1 ms per packet.

It's kind of random, but my estimate is about 1 packet per second per device.

The network I am testing now has about 12 devices, but I would also like to know what happens with, say, 100 devices.

Extra info:
I am using the MRF24J40 from Microchip with the Microchip MiWi Pro stack. The reason I want to estimate the probability of collision is that when I program each device to send one packet per second (as a test), it seems to lose some packets, and I don't want that.

First I changed the code to resend up to a fixed number of times (on failure) before giving up. Things seemed to improve, but it still lost some packets.

Then I added a random delay between resends. My results improved a lot. I just tested it over the weekend, and I don't see any lost packets.

That leads me to ask: what are the best numbers to use for those delays, chosen methodically, to get the best result?

4. ### bug13 Thread Starter Senior Member

Feb 13, 2012
1,521
55
PS:

I first thought it was a birthday problem (the birthday paradox), but I don't think that's the case. I am pretty sure the packets will collide one way or another; I just want to minimize collisions, not avoid them entirely.

5. ### WBahn Moderator

Mar 31, 2012
23,400
7,109
I don't think I asked the question well enough. What I'm looking for is how long (time-wise) a packet is. It sounds like you gave me how long a successful transmission can take, which could include several unsuccessful attempts first.

If it takes 1 ms per packet (if that's the actual packet length), then 100 devices at one packet per second gives about 10% channel utilization assuming 100% efficiency; if the packet length is more like 100 us, you are in the 1% range, so you've got some room to maneuver.

I don't know anything about MiWi -- and don't want to spend time looking anything up. In general, if you don't have a random backoff, then transmission attempts will tend to cluster unacceptably. That's probably what's happening to you.

Are you using, or planning to use, any type of ARQ system to create a reliable channel?

6. ### dannyf Well-Known Member

Sep 13, 2015
2,196
421
The minimum collision probability is achieved by having an infinite delay: you will never collide, for sure.

Most people don't try to achieve minimum collision -- they aim for a compromise between throughput and collisions.

If I were you, I would get a sense of how fast/slow send() executes. Say it executes with some mean and standard deviation; you then need to decide how likely you want the transmission to go through on the next attempt. From that, you pick a delay in the range from mean − k·(standard deviation) to mean + k·(standard deviation).

For example, say you want the probability of successful transmission to be 95%: use k = 2; for 99.7%, k = 3, etc.

Now, that assumes a normal distribution. Technically packet arrivals follow a Poisson process, but the basic idea is the same.

7. ### bug13 Thread Starter Senior Member

Feb 13, 2012
1,521
55
Thanks, this thinking is useful!

Think of MiWi as the stack/driver/network protocol for the Microchip MRF24J40. It's like ZigBee but with a smaller footprint, and it's free (if you use it with Microchip parts).

No, but the stack does come with an acknowledgement mechanism with CRC16. If the TX device doesn't get an ACK, it will re-transmit.

8. ### bug13 Thread Starter Senior Member

Feb 13, 2012
1,521
55
I think that's what I really wanted to ask, but you already knew.

I can understand this, but how do you get the relationship between the percentage and k here?
Thanks
