Watch the video "What is Information Entropy? Shannon's formula" on آی-ویدئو

Entropy is a measure of the uncertainty in a random variable (a message source). Claude Shannon defines the "bit" as the unit of entropy: the uncertainty of a fair coin flip. In this video, information entropy is introduced intuitively using bounce machines and yes/no questions. Note: this analogy also applies to higher-order approximations; we simply create a machine for each state and average over all machines.
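Shannon's formula, which the video builds up to, is H(X) = −Σ p(x) log₂ p(x), measured in bits when the logarithm is base 2. A minimal Python sketch of that computation (the distributions below are illustrative examples, not taken from the video):

import math

def entropy(probabilities):
    """Return the Shannon entropy of a discrete distribution, in bits."""
    # Terms with p = 0 contribute nothing (lim p*log p = 0), so skip them.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # fair coin flip -> 1.0 bit
print(entropy([0.25] * 4))   # four equally likely symbols -> 2.0 bits
print(entropy([0.9, 0.1]))   # biased coin -> about 0.47 bits

Note how the biased coin carries less than one bit per flip: the more predictable the source, the lower its entropy.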
15 Dey 1396 (5 January 2018)