In tennis, players play 'sets', and the first player to win 2 sets wins the match. Up to 3 sets can be played if each player wins one of the first two sets. If you didn't know the probability of each player winning a set, would you bet on a match going to 3 sets or ending in 2 sets? Assume you are offered even (\( 50/50 \)) odds on either outcome.
Let \( p \) be the probability that player A wins a set, and let \( 1 - p \) be the probability that player B wins a set, with sets assumed to be independent.
Therefore, the probability that the match ends in 2 sets is \( p^2 + (1 - p)^2 \) (one player wins both of the first two sets), and the probability that the match ends in 3 sets is \( 2p(1 - p) \) (the first two sets are split). Comparing the two, consider the inequality: \[ p^2 + (1 - p)^2 \ge 2p(1 - p) \] Expanding and moving everything to one side, we get: \[ 2p^2 - 2p + 1 - 2p + 2p^2 \ge 0 \] \[ 4p^2 - 4p + 1 \ge 0 \] We need to check whether this holds for \( 0 < p < 1 \). The left-hand side is a convex quadratic, so its minimum occurs where the derivative vanishes: \[ \frac{d}{dp}\left(4p^2 - 4p + 1\right) = 8p - 4 = 0 \] \[ p = \frac{1}{2} \] Substituting \( p = \frac{1}{2} \) back into the left-hand side: \[ 4 \left(\frac{1}{2}\right)^2 - 4 \cdot \frac{1}{2} + 1 = 1 - 2 + 1 = 0 \ge 0 \] So the minimum value of the left-hand side is exactly \( 0 \) (indeed, \( 4p^2 - 4p + 1 = (2p - 1)^2 \)), and the inequality holds.
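As a minimal numerical sanity check of the derivation above, the sketch below evaluates both closed-form probabilities on a grid of set-win probabilities and confirms that the gap between them equals \( (2p - 1)^2 \). The function names `p_two_sets` and `p_three_sets` are illustrative choices, not part of the original argument.

```python
# Numerical check of P(2 sets) >= P(3 sets) for a best-of-3 match,
# assuming independent sets with a fixed win probability p.

def p_two_sets(p: float) -> float:
    """Probability the match ends in 2 sets: one player wins both."""
    return p**2 + (1 - p)**2

def p_three_sets(p: float) -> float:
    """Probability the match goes to 3 sets: the first two sets are split."""
    return 2 * p * (1 - p)

for i in range(101):
    p = i / 100
    assert p_two_sets(p) >= p_three_sets(p), f"inequality fails at p={p}"
    # The gap equals (2p - 1)^2, which is zero only at p = 0.5.
    assert abs(p_two_sets(p) - p_three_sets(p) - (2 * p - 1) ** 2) < 1e-12

print("P(2 sets) >= P(3 sets) holds for all sampled p")
```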
Therefore, the inequality \( p^2 + (1 - p)^2 \ge 2p(1 - p) \) holds for every value of \( p \), with equality only at \( p = \frac{1}{2} \). A match ending in 2 sets is at least as probable as one ending in 3 sets, and strictly more probable whenever the players are not evenly matched, so the better bet is the 2-set match.
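For illustration, a quick Monte Carlo simulation under the same independence assumption gives the same conclusion; the helper `simulate_match` and the sampled values of \( p \) are assumptions made for this sketch, not figures from the original question.

```python
# Monte Carlo sanity check: simulate best-of-3 matches and estimate
# how often they end in 2 sets versus 3 sets.
import random

def simulate_match(p: float) -> int:
    """Play sets until one player reaches 2 wins; return sets played."""
    wins_a = wins_b = 0
    sets_played = 0
    while wins_a < 2 and wins_b < 2:
        if random.random() < p:
            wins_a += 1
        else:
            wins_b += 1
        sets_played += 1
    return sets_played

random.seed(0)
trials = 100_000
for p in (0.5, 0.6, 0.75):
    two = sum(simulate_match(p) == 2 for _ in range(trials))
    print(f"p={p}: P(2 sets) ~ {two / trials:.3f}, P(3 sets) ~ {1 - two / trials:.3f}")
```

With \( p = 0.5 \) the two estimates come out roughly equal, and as \( p \) moves away from \( \frac{1}{2} \) the 2-set outcome becomes clearly more frequent, matching the analysis.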