0603 Resistor Bias

Thread Starter

joeyd999

Joined Jun 6, 2011
5,234
Ha! I bet you think this thread is about biasing with resistors.

Nope.

I've been spending the day soldering a whole ton of 0603 resistors. They have two sides: the top side which is black, and the bottom side which is white. Call it heads and tails.

Granted, the resistors work regardless if they're installed heads up or down, but I am OCD about component orientation, so I always place them heads up.

I usually dump a quantity of parts on the work surface. Some land heads up, some land heads down. I group them according to up or down.

Then I press my finger (they stick) onto the down ones, and drop them. Again, some up, some down. I do this till they are all heads up.

What I notice (and this is an anecdotal observation) is that the 0603 package seems to have a bias toward landing heads down, and it is annoying. I wonder why this is? (Actually, it could be that heads-down annoys me so much that I "notice" it more.)

In any case, this made me think of a probability problem (@WBahn, maybe you're interested in playing with this):

Take n coins. Assume P(heads) = 0.5 when tossed (no bias). Re-toss all the heads-down coins (as a group) until only heads-up remain.

What is the average number of tosses required until all coins are heads up, and what does the distribution look like?

Theoretically, the problem could take an infinite number of tosses (i.e. the last coin never lands heads up), or only one toss. In reality, the problem usually quickly resolves in a few tosses. O(log(n)) I suspect.
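The process is easy to simulate. A quick Monte Carlo sketch (fair coins assumed; function names are just illustrative):

```python
import random

def tosses_until_all_heads(n):
    """Toss n fair coins; re-toss the heads-down ones as a group
    until every coin shows heads. Returns the number of group tosses."""
    remaining = n  # coins still heads-down
    tosses = 0
    while remaining > 0:
        tosses += 1
        # each remaining coin stays heads-down with probability 1/2
        remaining = sum(1 for _ in range(remaining) if random.random() < 0.5)
    return tosses

def simulate(n, trials=100_000):
    """Estimate the mean toss count and its empirical distribution."""
    counts = {}
    total = 0
    for _ in range(trials):
        t = tosses_until_all_heads(n)
        total += t
        counts[t] = counts.get(t, 0) + 1
    return total / trials, counts

mean, dist = simulate(10)
print(f"n=10: mean group tosses over 100k trials ~ {mean:.3f}")
```

For n = 10 the mean comes out near 4.7, with a sharply peaked distribution and a long geometric tail.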

Have fun.
 

Hymie

Joined Mar 30, 2018
1,277
I think 1 in 2 to the power n gives the probability.

e.g. if you have 10 resistors the odds of them all falling one side up would be 1 in 1024.
 

MrChips

Joined Oct 2, 2009
30,708
"OCD about component orientation"... no such thing!
  • Never mount SMD resistors white-side up.
  • Resistors are mounted so that resistance value is readable right side up without having to turn the board around.
  • Color coded resistors are mounted so that they are readable from left to right, tolerance band on the right.
  • Axial components such as diodes, capacitors, etc. are rotated so that values are readable.
  • Radial capacitors are mounted so that markings are readable.
 

Hymie

Joined Mar 30, 2018
1,277
After reading your post again, I see that I have not given the answer to your question, only the odds of them all falling one side up in one go.

Imagine you have 10 coins and throw them at random on the table.

On average 5 would land heads and 5 tails; on average it takes 2 tosses to turn each tail into a head, so the total number of tosses would be 20 (if you count the original action of throwing the 10 coins on the table as 10 tosses).

So the answer to your question is that the average number of tosses would be twice the number of coins.
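That figure holds if you count individual coin-flips rather than group tosses: each coin needs on average 2 flips in total, so by linearity the expected flip total is 2n. A quick sketch (fair coins assumed, illustrative names) tracking both quantities:

```python
import random

def flips_and_tosses(n):
    """One run of the re-toss process, counting both individual
    coin-flips and group tosses (the initial throw is toss 1, n flips)."""
    remaining, flips, tosses = n, 0, 0
    while remaining:
        tosses += 1
        flips += remaining  # every still-heads-down coin gets flipped
        remaining = sum(1 for _ in range(remaining) if random.random() < 0.5)
    return flips, tosses

trials = 50_000
total_flips = total_tosses = 0
for _ in range(trials):
    f, t = flips_and_tosses(10)
    total_flips += f
    total_tosses += t
avg_flips = total_flips / trials
avg_tosses = total_tosses / trials
print(f"average individual flips ~ {avg_flips:.2f} (2n = 20)")
print(f"average group tosses     ~ {avg_tosses:.2f}")
```

The flip total does average about 2n, but the group-toss count (what the puzzle asks for) averages far fewer.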
 

Thread Starter

joeyd999

Joined Jun 6, 2011
5,234
On average 5 would land heads and 5 tails; on average it takes 2 tosses to turn each tail into a head, so the total number of tosses would be 20 (if you count the original action of throwing the 10 coins on the table as 10 tosses).

So the answer to your question is that the average number of tosses would be twice the number of coins.
Thanks for playing. But I'm looking for some mathematical rigor and a distribution.
 

billnow

Joined Aug 4, 2010
23
Murphy's Law and the laws of probability are in conflict. You want the top side up so Murphy states that this won't happen; probability states that you have a 50% chance of the top being up.
 

jpanhalt

Joined Jan 18, 2008
11,087
We can probably agree that the probability of tossing 4 coins and having them all be heads is 1/2^4 or 1/16 and that resolves to 16 tosses. However, Joey had made it simpler. There are no "failed" tosses, and each toss cuts the pile that needs to be tossed again in half.

I may be repeating what is meant by "O(log(n)) I suspect," as I do not understand that format.

Thus, with 4 coins, toss 1 gives 2H + 2T, toss 2 gives 1H + 1T, and toss 3 gives the final head. So my answer is log (base 2) of n, plus 1, where n is the number of coins.

\(Log_{2}n +1 \)
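As a rough check of this (my own sketch; fair coins, illustrative names), simulated means grow at the same logarithmic rate, though they appear to sit slightly above log2(n) + 1:

```python
import math
import random

def tosses_until_all_heads(n):
    """Group tosses needed until all n fair coins show heads."""
    remaining, tosses = n, 0
    while remaining:
        tosses += 1
        remaining = sum(1 for _ in range(remaining) if random.random() < 0.5)
    return tosses

results = {}
for n in (4, 16, 64, 256):
    trials = 20_000
    results[n] = sum(tosses_until_all_heads(n) for _ in range(trials)) / trials
    print(f"n={n:4d}  simulated mean = {results[n]:5.2f}   log2(n)+1 = {math.log2(n) + 1:5.2f}")
```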
 

DickCappels

Joined Aug 21, 2008
10,153
Yikes!!! You mean I can no longer salvage SMD resistors off discarded electronics???:(
You can get out your meter (LC or resistance/DVM) and measure the parts (done that). Just be careful when pressing the probes down onto the chip lest you end up making an unplanned tiddlywink shot.
 

WBahn

Joined Mar 31, 2012
29,976
We can probably agree that the probability of tossing 4 coins and having them all be heads is 1/2^4 or 1/16 and that resolves to 16 tosses.
I'm not sure I understand just what you are claiming here. I'm assuming you are talking about the equivalent of a "game" in which I toss four (fair) coins and if they all come up heads I win but if they don't then I have to retoss all four. If that's correct, are you saying that it requires 16 tosses to throw a winning hand? That I will throw a winning hand somewhere within the first 16 tosses? Or that, on average, it will take 16 tosses to get a winning hand?

However, Joey had made it simpler. There are no "failed" tosses, and each toss cuts the pile that needs to be tossed again in half.
What would constitute a "failed" toss? Do you mean a toss in which you have to retoss all the coins (resistors)? How are you concluding that each toss cuts the pile that needs to be tossed again in half? Joey explicitly included the possibilities that an infinite number of tosses might not yield the desired result or that it could be yielded on the first toss.

Are you limiting yourself to the "expected" results only?

I may be repeating what is meant by this, as I do not understand that format.
That is what is known as Big-O notation. Basically, if I have a function, f(n) (which is traditionally the cost of solving a problem of size n by a particular algorithm or, in this case, process), and I want to know how f(n) behaves as n gets larger and larger, I am hampered by the fact that f(n) is usually a pretty complicated function and, more often than not, its exact form isn't even known (or would take far more time than it is worth to figure out). So I settle for finding a simpler function, g(n), whose behavior is no smaller than that of f(n), at least for sufficiently large n.

The formal definition is that

f(n) = O(g(n))

if there exist positive constants C and N such that

f(n) ≤ C·g(n) for every n ≥ N

Thus, with 4 coins, toss 1 gives 2H + 2T, toss 2 gives 1H + 1T, and toss 3 gives the final head. So my answer is log (base 2) of n, plus 1, where n is the number of coins.

And trying Latex [latex]Log_{2}n +1 [/latex] (didn't work)
By this last part, do you mean trying to use the LaTeX engine in your post here? If so, then the BB Code tags are [tex] and [/tex]

You might try this:

[tex]Log_2 n + 1[/tex]

\(Log_2 n + 1\)

Note that, while not uncommon, this is quite ambiguous since you don't know for sure whether you are taking the logarithm of n and then adding one, or taking the logarithm of (n+1). It's generally considered better to remove all doubt and write it as

[tex]log_2 (n) \; + \; 1[/tex]

\(log_2 (n) \; + \; 1\)
 

jpanhalt

Joined Jan 18, 2008
11,087
when pressing the probes down onto the chip lest you end up making an unplanned tiddlywink shot.
I've done that. The good thing is they don't hurt as much as Legos when you step on them. One of the tricks of an old watchmaker was to wear cuffs on your trousers. It's amazing how much stuff they catch.
 

jpanhalt

Joined Jan 18, 2008
11,087
@WBahn
Thanks for the reminder. I use LaTeX less than once a year and just had a mental block for the proper tag.

I used "average" in what I understood joeyd999 to mean. Obviously, one could go a very long time and never get 4 heads in a single toss. I gave that example to illustrate the difference between the scenario of "all heads in a single toss," which will have lots of failed tosses, and what was effectively a sorting problem.

I thought about using parentheses, but doesn't PEMDAS resolve that ambiguity? For the sake of clarity, I meant,

\(Log_{2}(n) +1\)

Thanks, John
 

Thread Starter

joeyd999

Joined Jun 6, 2011
5,234
Do you have a solution?
 

Thread Starter

joeyd999

Joined Jun 6, 2011
5,234
Murphy's Law and the laws of probability are in conflict. You want the top side up so Murphy states that this won't happen; probability states that you have a 50% chance of the top being up.
Yebbut, Murphy's Law also states that if the probability of something happening is a million to one, odds are 50/50 that it will.
 

dendad

Joined Feb 20, 2016
4,451
I too think that Murphy has the upper hand here.
It can be very frustrating to get the resistors the correct way up. Sometimes it seems to take many tries. Definitely greater than 50% of the time they will be upside down. At least it seems that way in my experience ;)
 

Thread Starter

joeyd999

Joined Jun 6, 2011
5,234
I was attempting to work the problem, and I realize now it is not trivial.

Here is the case for just 1 coin (n=1):

1. What is the expected number of coin flips for getting a head?
Ans:
Let the expected number of coin flips be x. Then we can write an equation for it -
a. If the first flip is heads, then we are done. The probability of this event is 1/2 and the number of coin flips for this event is 1.
b. If the first flip is tails, then we have wasted one flip. Since consecutive flips are independent events, the solution in this case can be recursively framed in terms of x - the probability of this event is 1/2 and the expected number of coin flips from now on is x. But we have already wasted one flip, so the total number of flips is x+1.
The expected value x is the sum of the expected values of these two cases. Using linearity of expectation and the definition of expected value, we get
x = (1/2)(1) + (1/2)(1+x)
Solving, we get x = 2.
Thus the expected number of coin flips for getting a head is 2.
And this is only for calculating the "expected" number of flips, not a distribution.

I imagine the problem explodes considerably as n gets larger (i.e. lots of recursion).

Perhaps I am missing something, but this doesn't seem an easy problem.
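One way forward (a sketch, not a full derivation): each coin independently needs a Geometric(1/2) number of tosses, so the group-toss count T is the maximum of n such variables. That gives P(T <= t) = (1 - 2^-t)^n, hence E[T] = sum over t >= 0 of [1 - (1 - 2^-t)^n]. A small script (illustrative names) can compute this exactly and check it against simulation:

```python
import random

def expected_tosses(n, terms=200):
    """Exact E[T] for T = max of n iid Geometric(1/2) variables:
    E[T] = sum_{t>=0} P(T > t) = sum_{t>=0} [1 - (1 - 2**-t)**n]."""
    return sum(1 - (1 - 2.0 ** -t) ** n for t in range(terms))

def simulated_mean(n, trials=50_000):
    """Monte Carlo estimate of the same expectation."""
    total = 0
    for _ in range(trials):
        remaining, tosses = n, 0
        while remaining:
            tosses += 1
            remaining = sum(1 for _ in range(remaining) if random.random() < 0.5)
        total += tosses
    return total / trials

for n in (1, 10, 100):
    print(f"n={n:3d}  exact = {expected_tosses(n):.4f}  simulated ~ {simulated_mean(n):.3f}")
```

For n = 1 the formula collapses to the x = 2 computed above, and the full distribution follows the same way: P(T = t) = (1 - 2^-t)^n - (1 - 2^-(t-1))^n.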
 