Hello y'all, I have a doubt about the physical significance of entropy in my example. According to a famous professor from EPFL, entropy is basically the average number of binary (yes/no) questions you need to ask to determine the answer. OK, say an employee asks his manager whether tomorrow is a holiday or a working day, with probabilities Ph = 0.99 and Pw = 0.01 (Ph is the probability of a holiday, Pw of a working day). If you calculate the entropy, the answer turns out to be about 0.08 bits. WHAT? Does that mean 0.08 questions? How?
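Just to show where the 0.08 figure comes from, here is a quick sketch of the binary entropy calculation for the two probabilities in my example (the variable names are mine, not from the lecture):

```python
import math

# Two outcomes: holiday with probability p_h, working day with p_w.
p_h, p_w = 0.99, 0.01

# Shannon entropy in bits: H = -sum(p * log2(p)) over the outcomes.
entropy = -(p_h * math.log2(p_h) + p_w * math.log2(p_w))
print(f"H = {entropy:.4f} bits")  # roughly 0.0808 bits
```

So the entropy really is about 0.08 bits, well below one full yes/no question.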
Let me post a link to the professor's claim, see 2:39.