Show that the expected number of heads in a single toss of a coin is \(\frac{1}{2}\). Show in two ways that the expected number of heads in two tosses of a coin is 1: (a) Let \(x=\) number of heads in two tosses and find \(\bar{x}\). (b) Let \(x=\) number of heads in toss 1 and \(y=\) number of heads in toss 2; find the average of \(x+y\) by Problem 9. Use this method to show that the expected number of heads in \(n\) tosses of a coin is \(\frac{1}{2} n\).

Short Answer

Expert verified
The expected number of heads in two tosses is 1, and in n tosses it is \(\frac{1}{2} n\).

Step by step solution

Step 1: Expected number of heads in a single toss

For a single coin toss, the outcomes are heads (H) and tails (T), each with probability \(\frac{1}{2}\). Counting heads as 1 and tails as 0, the expected value is \(\text{E}(X) = \sum (\text{outcome} \times \text{probability}) = 1 \times \frac{1}{2} + 0 \times \frac{1}{2} = \frac{1}{2}\).
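This one-line expectation sum can be checked exactly with a short Python sketch (not part of the original solution; the variable names are illustrative, and `fractions.Fraction` is used to keep the arithmetic exact):

```python
from fractions import Fraction

# Outcomes of a single fair toss: 1 head with probability 1/2, 0 heads with probability 1/2.
outcomes = {1: Fraction(1, 2), 0: Fraction(1, 2)}

# E(X) = sum of (outcome * probability) over all outcomes.
expected_heads = sum(x * p for x, p in outcomes.items())
print(expected_heads)  # 1/2
```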
Step 2: Expected number of heads in two tosses (Part a)

When a coin is tossed twice, the possible outcomes are HH, HT, TH, TT. We can represent the number of heads as X, which can take values 0 (TT), 1 (HT or TH), or 2 (HH). The probabilities are: P(X=0) = \(\frac{1}{4}\), P(X=1) = \(\frac{2}{4} = \frac{1}{2}\), P(X=2) = \(\frac{1}{4}\). Now, we calculate the expected value \(\text{E}(X) = 0 \times \frac{1}{4} + 1 \times \frac{1}{2} + 2 \times \frac{1}{4} = 0 + \frac{1}{2} + \frac{1}{2} = 1\).
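The same enumeration over HH, HT, TH, TT can be automated; here is a minimal sketch (not from the original solution) that builds the sample space and applies the expectation formula directly:

```python
from fractions import Fraction
from itertools import product

# Enumerate the four equally likely outcomes of two tosses: HH, HT, TH, TT.
sample_space = list(product("HT", repeat=2))
p = Fraction(1, len(sample_space))  # each outcome has probability 1/4

# X = number of heads; E(X) = sum over outcomes of X(outcome) * probability.
expected_heads = sum(outcome.count("H") * p for outcome in sample_space)
print(expected_heads)  # 1
```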
Step 3: Expected number of heads in two tosses (Part b)

Let X represent the number of heads in the first toss and Y represent the number of heads in the second toss. For both tosses: \(\text{E}(X) = \text{E}(Y) = \frac{1}{2} \). By the linearity property of expectation: \(\text{E}(X + Y) = \text{E}(X) + \text{E}(Y) = \frac{1}{2} + \frac{1}{2} = 1\).
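The point of linearity is that no joint distribution is needed; a brief sketch (illustrative, not from the original solution) computes \(\text{E}(X+Y)\) both ways and confirms they agree:

```python
from fractions import Fraction
from itertools import product

# Linearity of expectation: E(X + Y) = E(X) + E(Y).
e_single = Fraction(1, 2)  # E(X) = E(Y) = 1/2 for one fair toss
e_linearity = e_single + e_single

# Cross-check by brute force over the joint sample space of (X, Y).
p = Fraction(1, 4)  # each of the four (x, y) pairs is equally likely
e_direct = sum((x + y) * p for x, y in product([0, 1], repeat=2))

print(e_linearity, e_direct)  # 1 1
```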
Step 4: Expected number of heads in n tosses

Using the same method, let \(X_1, X_2, \dots, X_n\) be the number of heads in each of the \(n\) tosses. The expected value for each toss is \(\text{E}(X_i) = \frac{1}{2}\). By the linearity of expectation: \(\text{E}(X_1 + X_2 + \dots + X_n) = \text{E}(X_1) + \text{E}(X_2) + \dots + \text{E}(X_n) = n \times \frac{1}{2} = \frac{1}{2} n\).
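As an independent check on the \(\frac{1}{2}n\) result, one can also compute \(\text{E}(X_1 + \dots + X_n) = \sum_k k \binom{n}{k} \left(\frac{1}{2}\right)^n\) from the binomial distribution of the total head count. A minimal sketch (assumptions: `expected_heads` is an illustrative helper, not from the text):

```python
from fractions import Fraction
from math import comb

def expected_heads(n):
    """E(total heads in n fair tosses), summed over the binomial pmf."""
    return sum(k * comb(n, k) * Fraction(1, 2) ** n for k in range(n + 1))

# Matches n/2, the answer given by linearity of expectation.
for n in (1, 2, 5, 10):
    assert expected_heads(n) == Fraction(n, 2)
print(expected_heads(10))  # 5
```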


Key Concepts

These are the key concepts you need to understand to accurately answer the question.

Probability Theory
Probability theory is the branch of mathematics that deals with the likelihood of different events occurring. It's used to predict the outcomes of random events and is essential in fields like statistics, finance, and science. One key aspect of probability theory is understanding the concept of probability itself, which measures how likely an event is to happen.
For example, in a coin toss, there are two possible outcomes: heads (H) or tails (T). Each outcome has an equal chance of occurring, so the probability of getting heads is \(\frac{1}{2}\), and the probability of tails is also \(\frac{1}{2}\). These probabilities must add up to 1, as they cover all possible outcomes.
Probability theory uses formulas and models to predict these outcomes over numerous trials, giving us insights into long-term trends.
Expected Value Calculation
The expected value is a fundamental concept in probability theory. It represents the average that you would expect to get if you could repeat an experiment infinitely many times. It's calculated as the sum of all possible outcomes, each multiplied by their respective probabilities.
For instance, for a single coin toss, the expected value of getting heads can be computed as: \[ \text{E}(\text{X}) = (1 \times \frac{1}{2}) + (0 \times \frac{1}{2}) = \frac{1}{2} \] Here, we consider getting heads as 1, and tails as 0.
This method can be extended to more complex scenarios, like calculating the expected number of heads in multiple coin tosses. Understanding how to calculate the expected value helps in predicting expected outcomes in various probabilistic situations.
Coin Toss Experiment
The coin toss experiment is a classic example used to explain probability concepts. It involves tossing a coin to see whether it lands on heads or tails. Each toss is an independent event, meaning the result of one toss does not affect the next.
In this exercise, we analyze the expected number of heads in one, two, and multiple coin tosses. With one toss, the probability of getting heads is \( \frac{1}{2} \), so the expected value is \(\frac{1}{2} \). For two tosses, there are four possible outcomes: HH, HT, TH, and TT. The expected number of heads can be calculated using these outcomes and their probabilities: \[ \text{E}(X) = (0 \times \frac{1}{4}) + (1 \times \frac{1}{2}) + (2 \times \frac{1}{4}) = 0 + \frac{1}{2} + \frac{1}{2} = 1 \]
By conducting this experiment, we can better understand the mechanics of probability and expected value in a straightforward scenario.
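The experiment is easy to run in software rather than by hand; the following Monte Carlo sketch (illustrative, with an arbitrary fixed seed) averages the head count over many simulated pairs of tosses:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Simulate many runs of two fair tosses and average the head counts.
trials = 100_000
total_heads = sum(random.random() < 0.5 for _ in range(2 * trials))
average_heads = total_heads / trials  # should be close to the exact value 1

print(round(average_heads, 3))
```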
Linearity of Expectation
Linearity of expectation is a useful property in probability theory that simplifies calculations of expected values for complex scenarios. It states that the expected value of the sum of random variables is equal to the sum of their individual expected values.
In other words, if you have random variables \(X_1, X_2, \dots, X_n\), then \[ \text{E}(X_1 + X_2 + \dots + X_n) = \text{E}(X_1) + \text{E}(X_2) + \dots + \text{E}(X_n) \] This property is particularly handy in the coin toss experiment. For two tosses, let \(X\) represent heads in the first toss and \(Y\) represent heads in the second toss. Each has an expected value of \( \frac{1}{2} \). Therefore, using the linearity of expectation: \[ \text{E}(X + Y) = \text{E}(X) + \text{E}(Y) = \frac{1}{2} + \frac{1}{2} = 1 \]
This method extends to \(n\) tosses, where the expected number of heads is \( \frac{1}{2} n \). Linearity makes computing the expected values manageable, even in larger and more complex probabilistic experiments.

