# Entropy

Discussion in 'Physics' started by boks, Aug 24, 2009.

1. ### boks Thread Starter Active Member

I have seen two different entropy equations:

1. $S = -k /Sigma_i P_i ln P_i$

and

2. $S = k \ln W$, where W is the multiplicity.

I'm trying to calculate the entropy for an experiment where I sample the sock colors of 30 students and find the distribution of the colors of the socks they are wearing. I'm a little surprised to find that the two equations give me different entropy values. Why is that?

2. ### rspuzio Active Member

The former equation is the more general one; the latter is what it reduces to when all the $P_i$'s are equal (uniform probability distribution): substituting $P_i = 1/W$ for each of the $W$ equally likely states gives $S = -k \sum_i (1/W) \ln(1/W) = k \ln W$. Your sock-color counts are almost certainly not uniform, which is why the two formulas give different values. However, as you wrote them, there is a typo --- there should be no forward slash before the Sigma; the first equation should read

$S = -k \sum_i P_i \ln P_i$
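A quick numerical sketch of the point above (the function names and the sock counts here are illustrative, not from your experiment; $k$ is set to 1):

```python
import math
from collections import Counter

def shannon_entropy(counts, k=1.0):
    """General formula: S = -k * sum_i p_i ln p_i, from a list of category counts."""
    n = sum(counts)
    return -k * sum((c / n) * math.log(c / n) for c in counts if c > 0)

def uniform_entropy(num_states, k=1.0):
    """Special case: S = k ln W, valid only when all W states are equally likely."""
    return k * math.log(num_states)

# Hypothetical sample of 30 students' sock colors (made-up numbers).
socks = ["black"] * 15 + ["white"] * 10 + ["blue"] * 5
counts = list(Counter(socks).values())  # [15, 10, 5] -- not uniform

s_general = shannon_entropy(counts)      # ~1.011
s_uniform = uniform_entropy(len(counts)) # ln 3 ~ 1.099, an overestimate here

# The two formulas agree exactly only for a uniform distribution:
assert abs(shannon_entropy([10, 10, 10]) - uniform_entropy(3)) < 1e-12
```

For any non-uniform distribution, $k \ln W$ exceeds the general formula, since the uniform distribution maximizes entropy for a fixed number of states.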