Chapter 9: Q. 9.15 (page 413)
A coin having probability p = 2/3 of coming up heads is flipped 6 times. Compute the entropy of the outcome of this experiment.
Short Answer
The entropy of the outcome of this experiment is H = 6[log₂3 − 2/3] ≈ 5.51 bits: the six flips are independent, so the total entropy is six times the entropy of a single flip, H(2/3) = −(2/3)log₂(2/3) − (1/3)log₂(1/3) ≈ 0.918 bits.
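Since entropy is additive over independent trials, the answer above can be checked with a few lines of Python (a quick numerical verification, not part of the original solution):

```python
from math import log2

def binary_entropy(p):
    """Entropy (in bits) of a single coin flip with P(heads) = p."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Six independent flips: total entropy = 6 * H(2/3).
H = 6 * binary_entropy(2 / 3)
print(round(H, 2))  # 5.51
```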
Show that for any discrete random variable X and function f, H(f(X)) ≤ H(X).
A transition probability matrix is said to be doubly stochastic if its columns, as well as its rows, sum to 1; that is, if

∑i Pij = 1 for all states j = 0, 1, ..., M.

Show that if such a Markov chain is ergodic, then its limiting probabilities are πj = 1/(M + 1), j = 0, 1, ..., M.
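A small numerical illustration of this fact (not a proof): take any ergodic, doubly stochastic matrix and iterate it; the rows of Pⁿ converge to the uniform distribution 1/(M + 1). The 3-state matrix below is an arbitrary example chosen for the demonstration.

```python
import numpy as np

# A 3-state doubly stochastic matrix: every row AND every column sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])

# For an ergodic chain, the rows of P^n all converge to the limiting
# distribution pi; here that should be the uniform vector (1/3, 1/3, 1/3).
Pn = np.linalg.matrix_power(P, 100)
print(Pn[0])  # each entry is approximately 1/3
```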
This problem refers to Example 2f.
(a) Verify that the proposed value of πj satisfies the necessary equations.
(b) For any given molecule, what do you think is the (limiting) probability that it is in urn 1?
(c) Do you think that the events that molecule j, j ≥ 1, is in urn 1 at a very large time would be (in the limit) independent?
(d) Explain why the limiting probabilities are as given.
In transmitting a bit from location A to location B, if we let X denote the value of the bit sent at location A and Y denote the value received at location B, then H(X) − HY(X) is called the rate of transmission of information from A to B. The maximal rate of transmission, as a function of P{X = 1} = 1 − P{X = 0}, is called the channel capacity. Show that for a binary symmetric channel with P{Y = 1|X = 1} = P{Y = 0|X = 0} = p, the channel capacity is attained by the rate of transmission of information when P{X = 1} = 1/2 and its value is 1 + p log p + (1 − p)log(1 − p).
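The claimed capacity can be checked numerically: for a binary symmetric channel the rate of transmission equals h(P{Y = 1}) − h(p), where h is the binary entropy function, and maximizing over q = P{X = 1} on a grid recovers the stated value 1 − h(p). The sketch below uses p = 0.9 as an illustrative parameter (an assumption, not from the problem):

```python
from math import log2

def h(x):
    """Binary entropy function, in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

def rate(q, p):
    """Rate of transmission I(X; Y) for a BSC with P{X=1} = q,
    P{Y=1|X=1} = P{Y=0|X=0} = p.  P{Y=1} = q*p + (1-q)*(1-p)."""
    return h(q * p + (1 - q) * (1 - p)) - h(p)

p = 0.9                                          # illustrative channel
best = max(rate(q / 1000, p) for q in range(1, 1000))
capacity = 1 + p * log2(p) + (1 - p) * log2(1 - p)  # i.e. 1 - h(p)
print(best, capacity)  # the two values agree; maximum is at q = 1/2
```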
A pair of fair dice is rolled. Let
and let Y equal the value of the first die. Compute (a) H(Y), (b) HY(X), and (c) H(X, Y).
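The definition of X was lost in extraction, but the computational pattern for (a)–(c) can still be sketched. The code below uses a stand-in indicator, X = 1 if the sum of the dice is even (purely hypothetical, not the book's X), and exploits the chain rule H(X, Y) = H(Y) + HY(X):

```python
from math import log2
from collections import Counter

def entropy(dist):
    """Entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# All 36 equally likely rolls; Y = value of the first die.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
n = len(outcomes)

# HYPOTHETICAL stand-in for the missing X: indicator that the sum is even.
X = lambda d1, d2: (d1 + d2) % 2 == 0

pY = {k: v / n for k, v in Counter(d1 for d1, d2 in outcomes).items()}
pXY = {k: v / n for k, v in Counter((X(d1, d2), d1) for d1, d2 in outcomes).items()}

HY = entropy(pY)           # (a) H(Y) = log2(6), Y uniform on {1,...,6}
HXY = entropy(pXY)         # (c) H(X, Y) from the joint distribution
HY_X = HXY - HY            # (b) chain rule: HY(X) = H(X, Y) - H(Y)
print(HY, HY_X, HXY)
```

With this particular X, the parity of the sum is a fair coin given the first die, so HY(X) = 1 bit; for the book's X the same three lines apply once X is defined.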