
2.3: Probability and Expected Value


    Many games have an element of chance. In order to model such games and determine strategies, we should understand how mathematicians use probability to represent chance.

    2.3.1: Some Basic Probability

    You are probably a little bit familiar with the idea of probability. People often talk about the chance of some event happening. For example, a weather forecast might say there is a \(20 \%\) chance of rain. Now determining the chance of rain can be difficult, so we will stick with some easier examples.

    Consider a standard deck of \(52\) playing cards. What is the chance of drawing a red card? What is the probability of drawing a red card? Is there a difference between chance and probability? Yes! The probability of an event has a very specific meaning in mathematics.

    The probability of an event \(E\) is the number of different outcomes resulting in \(E\) divided by the total number of equally likely outcomes. In mathematical symbols,

    \begin{equation*} P(E)=\dfrac{\mbox{number of different outcomes resulting in \(E\)} }{\mbox{total number of equally likely outcomes} }. \end{equation*}

    Notice that the probability of \(E\) will always be a number between \(0\) and \(1\). An impossible event will have probability \(0\); an event that always occurs will have probability \(1\).

    Thus, the probability of drawing a red card is \(\dfrac{1}{2}\text{,}\) not \(50 \%\). Although we can convert between probability and percent (since \(0.5\) converted to percent is \(50\%\)), it is important to answer a question about probability with a probability, not a percent.
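The counting definition above can be sketched in a few lines of Python. This snippet is an illustrative addition, not part of the original text; the deck representation is just one convenient choice, and `fractions.Fraction` keeps the answer exact rather than a decimal.

```python
from fractions import Fraction

# Build a standard 52-card deck as (rank, suit) pairs.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = [(rank, suit) for rank in ranks for suit in suits]

# P(E) = (number of outcomes resulting in E) / (total equally likely outcomes).
red_cards = [card for card in deck if card[1] in ("hearts", "diamonds")]
p_red = Fraction(len(red_cards), len(deck))

print(p_red)  # 1/2
```

Note that the answer comes back as the fraction \(\dfrac{1}{2}\text{,}\) a probability, rather than the percent \(50\%\).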

    Example 2.3.1 : Drawing a Particular Suit

    Given a standard deck of playing cards, what is the probability of drawing a heart?

    Solution

You might say that since there are four suits, and one of the suits is hearts, the probability is \(\dfrac{1}{4}\text{.}\) You'd be correct, but be careful with this reasoning: it works only because each suit has the same number of cards, so each suit is equally likely. Another way to calculate the probability is to divide the number of hearts \((13)\) by the number of cards \((52)\). Thus, we get a probability of \(\dfrac{13}{52}=\dfrac{1}{4}=0.25\text{.}\)

    Example 2.3.2 : A Card is Missing

    Now suppose the ace of spades is missing from the deck. What is the probability of drawing a heart?

    Solution

As before, there are still four suits in the deck, so it might be tempting to say the probability is still \(\dfrac{1}{4}\text{.}\) But we'd be wrong! The suits are no longer equally likely, since it is now slightly less likely that we draw a spade. Each individual card is still equally likely, though. So now

\begin{equation*} P(\mbox{drawing a heart} )= \dfrac{\mbox{number of hearts} }{\mbox{number of cards} }=\dfrac{13}{51}\approx 0.255. \end{equation*}

    As you can see, it is now slightly more likely that we draw a heart if the ace of spades is removed from the deck.
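One way to convince yourself of this is a quick simulation. The sketch below is an illustrative addition (assuming only Python's standard library): it removes the ace of spades, draws from the remaining \(51\) cards many times, and compares the empirical frequency of hearts to the exact value \(\dfrac{13}{51}\text{.}\)

```python
import random
from fractions import Fraction

ranks = list(range(1, 14))  # 1 = ace, 11-13 = face cards
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = [(r, s) for r in ranks for s in suits]
deck.remove((1, "spades"))  # the ace of spades is missing

exact = Fraction(13, len(deck))  # 13/51

rng = random.Random(0)  # fixed seed so the run is reproducible
trials = 100_000
hearts = sum(rng.choice(deck)[1] == "hearts" for _ in range(trials))
estimate = hearts / trials

print(exact, float(exact), estimate)
```

The empirical frequency should land close to \(0.255\text{,}\) slightly above the \(0.25\) for a full deck.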

    Now try to compute some probabilities on your own.

    Exercise 2.3.1 : Probability with a Single Die

    Consider rolling a single die. List the possible outcomes. Assuming that it is a fair die, are all the outcomes equally likely? What is the probability of rolling a 2? What is the probability of rolling an even number?

    Exercise 2.3.2 : Probability with Red and Green Die

    Now consider rolling two fair dice, say a red die and a green die.

    1. How many equally likely outcomes are there? List them.
    2. What is the probability that you get a two on the red die and a four on the green die?
    3. What is the probability that you roll a three on the red die?
    4. What is the probability that you roll a two and a four?
    5. What is the probability that you roll a three?
    6. Compare your answers in (b) and (c) with your answers in (d) and (e). Are they the same or different? Explain.
    Exercise 2.3.3 : Probability with Two of the Same Dice

    Again consider rolling two fair dice, but now we don't care what color they are.

    1. Does this change the number of equally likely outcomes from Exercise \(2.3.2\)? Why or why not? It may be helpful to list the possible outcomes.
    2. What is the probability that you get snake eyes (two ones)?
    3. What is the probability that you roll a two and a four?
    4. What is the probability that you roll a three?
    5. What is the probability that you roll a two OR a four?
Exercise 2.3.4 : Sums of Dice

    Suppose we roll two dice and add them.

    1. List the possible sums.
    2. What is the probability that you get a total of seven on the two dice?
    3. What is the probability that you get a total of four when you roll two dice?
    4. Are the events of getting a total of seven and getting a total of four equally likely? Explain.

    It is important to note that just because you can list all of the possible outcomes, they may not be equally likely. As we see from Exercise \(2.3.4\), although there are \(11\) possible sums, the probability of getting any particular sum (such as seven) is not \(\dfrac{1}{11}\text{.}\)
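The point of Exercise \(2.3.4\) can be checked by brute force: enumerate the \(36\) equally likely ordered outcomes and count how often each sum occurs. The sketch below is an illustrative addition, not part of the original text.

```python
from collections import Counter
from fractions import Fraction

# All 36 equally likely ordered outcomes for two fair dice.
outcomes = [(r, g) for r in range(1, 7) for g in range(1, 7)]
sums = Counter(r + g for r, g in outcomes)

# Probability of each sum: (number of ways) / 36 -- not 1/11.
p = {total: Fraction(count, 36) for total, count in sums.items()}

print(sorted(p.items()))
```

A sum of seven can occur \(6\) ways, so \(P(7)=\dfrac{6}{36}=\dfrac{1}{6}\text{,}\) while a sum of four can occur only \(3\) ways, so \(P(4)=\dfrac{3}{36}=\dfrac{1}{12}\text{:}\) the \(11\) possible sums are not equally likely.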

    2.3.2: Expected Value

    Definition: Expected Value

The expected value of a game of chance is the average net gain or loss that we would expect per game if we played the game many times. We compute the expected value by multiplying the value of each outcome by its probability of occurring and then adding up all of the products.

    For example, suppose you toss a fair coin: Heads, you win \(25\) cents, Tails, you lose \(25\) cents. The probability of getting Heads is \(\dfrac{1}{2}\text{,}\) as is the probability of getting Tails. The expected value of the game is

\begin{equation*} \biggl(\dfrac{1}{2}\times 0.25\biggr)+\biggl(\dfrac{1}{2}\times(-0.25)\biggr)=0. \end{equation*}

Thus, you would expect an average payoff of \($0\) if you were to play the game many times. Note that the expected value is not necessarily the actual payoff from any single play of the game.
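The computation just described, multiply each payoff by its probability and add the products, can be written as a short function. The helper name below is an illustrative addition, not part of the text.

```python
from fractions import Fraction

def expected_value(outcomes):
    """Sum of probability * payoff over a list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# The coin game: Heads wins $0.25, Tails loses $0.25, each with probability 1/2.
coin_game = [(Fraction(1, 2), Fraction(25, 100)),
             (Fraction(1, 2), Fraction(-25, 100))]

print(expected_value(coin_game))  # 0
```

The same function works for any finite game of chance once you list each outcome's probability and net payoff.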

    Exercise 2.3.5 : Expected Value and a Two-Coin Game

Consider a game where you toss two coins. If you get two Heads, you win \($2\). If you get a Head and a Tail, you win \($1\). If you get two Tails, you lose \($4\). Find the expected value of the game. (Caution: first you need to find the probability of each event; think about "equally likely" events.)

    Exercise 2.3.6 : Play the Two-Coin Game

    Now play the game in Exercise \(2.3.5\) the indicated number of times. Give your actual payoff and compare it to the expected value.

    1. One time.
    2. Ten times.
    3. Twenty-five times.
    4. Is there a single possible outcome where you would actually win or lose the exact amount computed for the expected value? If not, why do we call it the expected value?
    Exercise 2.3.7 : Expected Value of Roulette

A standard roulette wheel has \(38\) numbered slots for a small ball to land in: \(36\) are marked from \(1\) to \(36\), with half of those black and half red; two green slots are numbered \(0\) and \(00\). An allowable bet is to bet on either red or black. This bet is an even money bet, which means if you win you receive twice what you bet. Many people think that betting on black or red is a fair game. What is the expected value of betting \($1000\) on red? Is this a fair game? Explain.
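After you have worked this exercise by hand, the sketch below (an illustrative addition, not part of the original text) lets you check your result both exactly, with fractions, and empirically, by simulating many spins.

```python
import random
from fractions import Fraction

# 18 red slots out of 38; an even money bet pays back twice the bet on a win.
p_red = Fraction(18, 38)
bet = 1000

# Exact expected value: win +bet with probability 18/38, lose -bet otherwise.
exact_ev = p_red * bet + (1 - p_red) * (-bet)

# Monte Carlo check with a fixed seed for reproducibility.
rng = random.Random(1)
trials = 100_000
total = sum(bet if rng.random() < 18 / 38 else -bet for _ in range(trials))
simulated_ev = total / trials

print(exact_ev, simulated_ev)
```

Comparing the exact and simulated values illustrates what "average net gain or loss per game over many games" means in practice.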

    Exercise 2.3.8 : Another Roulette Example

    Considering again the roulette wheel, if you bet \($100\) on a particular number and the ball lands on that number, you win \($3600\). What is the expected value of betting \($100\) on Red \(4\)?


    After finding the expected value of the games in the above exercises, what do you think the expected value can tell us about a game? Can you use it to decide whether you should play that game of chance or not? When will a game be advantageous for the player? We often care whether a game is “fair.” Can the expected value help you determine if a game is fair?

    Exercise 2.3.9 : Expected Value and Fairness

    Use the idea of expected value to explain “fairness” in a game of chance.

    The last exercise is a good challenge for exploring expected value.

    Exercise 2.3.10 : A Betting Game with Two Dice

    You place a bet and roll two fair dice. If you roll a 7 or an 11, you receive your bet back (you break even). If you roll a 2, a 3, or a 12, then you lose your bet. If you roll anything else, you receive half of the sum you rolled in dollars. How much should you bet to make this a fair game?

    Hint

    It might be helpful to begin with a table showing the possible sums, their probability, and the payoff for each.

    In the next section, we use the ideas of probability and expected value to understand how to set up a payoff matrix for a game of chance.


    This page titled 2.3: Probability and Expected Value is shared under a CC BY-SA 4.0 license and was authored, remixed, and/or curated by Jennifer A. Firkins Nordstrom via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.