
Basic Concepts


Chapter 1 of the “Stats 2 Book” establishes the fundamental vocabulary and axiomatic framework necessary for studying probability and statistics. This chapter introduces the structure needed to discuss the likelihood of occurrences.

Here is a detailed explanation of the key concepts, followed by illustrative examples and exercises based on the source material:


1.1 Definitions and Properties

The foundation of probability theory rests on defining what is possible before determining what is likely.

Key Definitions

| Concept | Definition | Example |
|---|---|---|
| Sample Space ($S$) | A set listing all possibilities (outcomes) that might occur. | Rolling a six-sided die: $S = \{1, 2, 3, 4, 5, 6\}$. |
| Outcome | An element of the sample space $S$. | The result of rolling a die, e.g., 4. |
| Experiment | The process of actually selecting one of the outcomes listed. | Flipping a coin or waiting for the winner of the World Cup. |
| Event ($E$) | Any subset of the sample space $S$. | Rolling a number greater than 2: $E = \{3, 4, 5, 6\}$. |

Probability Axioms

A probability $P$ is a function that assigns a chance (a number between 0 and 1) to each event $E$. This assignment formally relies on Kolmogorov’s axioms.

💡

Axiom 1: Certainty

The probability of the entire sample space is 1: $P(S) = 1$. Interpretation: There is a 100% chance that an experiment will result in some outcome included in $S$.

💡

Axiom 2: Additivity

For any countable collection of disjoint events $E_1, E_2, \dots$, their combined probability is the sum of their individual probabilities: $P(E_1 \cup E_2 \cup \dots) = P(E_1) + P(E_2) + \dots$ Interpretation: If events cannot happen simultaneously, their probabilities add up.

Basic Properties

From the two fundamental axioms, several properties can be proven that simplify probability calculations:

| Property | Formula | Description |
|---|---|---|
| Empty Set | $P(\emptyset) = 0$ | The probability of nothing happening is zero. |
| Finite Additivity | $P(\cup_i E_i) = \sum_i P(E_i)$ | Sum rule for finitely many disjoint events. |
| Monotonicity | If $E \subset F$, then $P(E) \le P(F)$ | A subset cannot be more likely than its superset. |
| Difference Rule | $P(F \setminus E) = P(F) - P(E)$ | Probability of $F$ occurring but $E$ not (if $E \subset F$). |
| Complement Rule | $P(E^c) = 1 - P(E)$ | Probability that $E$ does NOT occur. |
| General Addition | $P(E \cup F) = P(E) + P(F) - P(E \cap F)$ | Subtract the intersection to avoid double counting. |
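These properties can be sanity-checked numerically in R (the tool introduced in Section 1.5). A minimal sketch for a fair six-sided die; the helper `P` and all variable names here are my own, not from the book:

```r
# Assign uniform probabilities to the sample space S = {1, ..., 6}.
p <- rep(1/6, 6)
names(p) <- 1:6
P <- function(E) sum(p[as.character(E)])   # probability of an event E (a subset of S)

E <- c(3, 4, 5, 6)                  # "roll a number > 2"
p_complement <- 1 - P(E)            # Complement Rule: P(E^c) = 1 - P(E)

A <- c(1, 2, 3)
B <- c(3, 4)
lhs <- P(union(A, B))                       # P(A ∪ B)
rhs <- P(A) + P(B) - P(intersect(A, B))     # General Addition formula
```

Here `p_complement` agrees with `P(c(1, 2))` and `lhs` equals `rhs`, illustrating the Complement and General Addition rules on a concrete sample space.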

Example and Solution

Q1

Coin Flip Axioms

A fair coin flip has a sample space $S = \{\text{heads}, \text{tails}\}$. Use the axioms to show that the probability of observing heads is 0.5.

📝 View Detailed Solution
  1. Let $E = \{\text{heads}\}$ and $F = \{\text{tails}\}$ be two disjoint events.
  2. Since the coin is “fair,” $P(E) = P(F) = p$ for some value $p$.
  3. The union is the sample space: $S = E \cup F$.
  4. Using Axiom 1 and Axiom 2: $1 = P(S) = P(E \cup F) = P(E) + P(F)$.
  5. Substituting $p$: $1 = 2p \implies p = 0.5$.
Q2

Fishing Tonnage

A town’s fishing fleet has a 35% chance of catching over 400 tons ($P(F) = 0.35$) and a 10% chance of catching over 500 tons ($P(E) = 0.10$). How likely is it that they will catch between 400 and 500 tons?

📝 View Detailed Solution
  1. Event $E$ (“over 500”) is a subset of $F$ (“over 400”).
  2. “Between 400 and 500” is the difference set $F \setminus E$.
  3. Using the Difference Rule: $P(F \setminus E) = P(F) - P(E) = 0.35 - 0.10 = 0.25$.

Answer: There is a 25% chance that between 400 and 500 tons of fish will be caught.
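The same arithmetic can be mirrored in R (see Section 1.5); a minimal sketch with variable names of my own choosing:

```r
# Difference Rule for the fishing example: P(F \ E) = P(F) - P(E).
p_over_400 <- 0.35                     # P(F): catch exceeds 400 tons
p_over_500 <- 0.10                     # P(E): catch exceeds 500 tons, with E ⊂ F
p_between  <- p_over_400 - p_over_500  # P(F \ E) = 0.25
```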


1.2 Equally Likely Outcomes

When outcomes are finite and equally likely, we have a Uniform Distribution.

Probability calculations in this setting reduce to a pure counting problem: $P(E) = \frac{|E|}{|S|}$.

Q3

Rolling Two Dice

Two dice are rolled. How likely is it that their sum will equal eight?

📝 View Detailed Solution
  1. Total sample space: $|S| = 6 \times 6 = 36$.
  2. Event $E$ (sum is 8): $E = \{(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)\}$.
  3. Count: $|E| = 5$.
  4. Probability: $P(E) = \frac{5}{36}$.
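Because the 36 outcomes are equally likely, the count can be automated. A sketch in R using base `expand.grid` (the variable names are illustrative):

```r
# Enumerate the sample space of two dice and count outcomes summing to 8.
rolls  <- expand.grid(die1 = 1:6, die2 = 1:6)   # |S| = 36 equally likely pairs
hits   <- sum(rolls$die1 + rolls$die2 == 8)     # |E| = 5
p_sum8 <- hits / nrow(rolls)                    # P(E) = 5/36
```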
Q4

Group Selection

A group of 12 people includes Grant and Dilip. Three are picked at random. How likely is it that the selection includes Grant but not Dilip?

📝 Combinatorial Solution
  1. Total outcomes: choose 3 from 12, so $|S| = \binom{12}{3} = 220$.
  2. Event $E$: Grant is fixed (1 way), Dilip is excluded, and we need 2 more from the remaining 10.
  3. Count: $|E| = \binom{10}{2} = 45$.
  4. Probability: $P(E) = \frac{45}{220} = \frac{9}{44}$.
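The same count is a one-liner with R’s `choose` function (Section 1.5); names are illustrative:

```r
# Grant in, Dilip out: pick the remaining 2 members from the other 10 people.
total_picks <- choose(12, 3)    # |S| = 220
favourable  <- choose(10, 2)    # |E| = 45
p_grant_not_dilip <- favourable / total_picks   # 9/44
```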

1.3 Conditional Probability

How is likelihood “altered” by knowledge that another event $B$ has occurred?

💡

Conditional Probability Formula

$P(A|B) = \frac{P(A \cap B)}{P(B)}$, provided $P(B) > 0$.
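As a concrete check of the formula, take a fair die with $A$ = “roll a 6” and $B$ = “roll an even number” (an example of my own, not from the book); a sketch in R:

```r
# P(A|B) on a fair die: A = {6}, B = {2, 4, 6}.
S <- 1:6
A <- 6
B <- c(2, 4, 6)
P <- function(E) length(E) / length(S)      # equally likely outcomes
p_A_given_B <- P(intersect(A, B)) / P(B)    # (1/6) / (1/2) = 1/3
```

Knowing the roll is even shrinks the effective sample space to three outcomes, so the chance of a 6 rises from 1/6 to 1/3.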

Bayes’ Theorem

One of the most powerful theorems in statistics allows us to reverse conditional probabilities.

💡

Bayes' Theorem

$P(B|A) = \frac{P(A|B)P(B)}{\sum_j P(A|B_j)P(B_j)}$, where the events $B_j$ partition the sample space and $B$ is one of them. This updates the belief $P(B)$ given new evidence $A$.

Q5

The Swine Flu Test

  • Detects the flu 95% of the time if infected: $P(\text{Pos}|\text{Flu}) = 0.95$.
  • The false positive rate is 2%: $P(\text{Pos}|\text{Healthy}) = 0.02$.
  • The population infection rate is 1%: $P(\text{Flu}) = 0.01$.

If a person tests positive, what is the probability they actually have the flu?

📝 Bayesian Check
  1. Let $A = \text{Flu}$ and $B = \text{Positive Test}$.
  2. We want $P(A|B)$.
  3. Apply Bayes’ theorem: $P(A|B) = \frac{P(B|A)P(A)}{P(B|A)P(A) + P(B|A^c)P(A^c)} = \frac{(0.95)(0.01)}{(0.95)(0.01) + (0.02)(0.99)} \approx 0.324$

Result: Only a 32.4% chance they actually have the flu!
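The posterior can be recomputed in R as a direct transcription of the calculation above (variable names are my own):

```r
# Bayes' theorem for the swine-flu test.
p_flu         <- 0.01   # prior P(Flu)
p_pos_flu     <- 0.95   # sensitivity P(Pos|Flu)
p_pos_healthy <- 0.02   # false positive rate P(Pos|Healthy)
posterior <- (p_pos_flu * p_flu) /
  (p_pos_flu * p_flu + p_pos_healthy * (1 - p_flu))   # ≈ 0.324
```

The counterintuitive result comes from the low base rate: the 2% of false positives among the large healthy population outnumber the true positives.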


1.4 Independence

Events $A$ and $B$ are independent if the occurrence of one has no effect on the other; equivalently, $P(A|B) = P(A)$ whenever $P(B) > 0$.

$P(A \cap B) = P(A)P(B)$
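For a concrete instance (an example of my own), take two dice with $A$ = “first die is even” and $B$ = “sum is 7”, and check the product rule in R:

```r
# Independence check: P(A ∩ B) = P(A) P(B) over the 36 equally likely rolls.
rolls <- expand.grid(d1 = 1:6, d2 = 1:6)
A <- rolls$d1 %% 2 == 0          # first die even:  P(A) = 1/2
B <- rolls$d1 + rolls$d2 == 7    # sum equals 7:    P(B) = 1/6
lhs <- mean(A & B)               # P(A ∩ B) = 3/36
rhs <- mean(A) * mean(B)         # (1/2)(1/6) = 3/36
```

Both sides equal 1/12, so knowing the first die is even tells you nothing about whether the sum is 7.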


1.5 Using R

R makes the counting and arithmetic behind these calculations quick to carry out.

| Function | Syntax | Purpose |
|---|---|---|
| Vector creation | `c(1, 2, 3)` | Creates a vector of numbers. |
| Sequence | `1:100` | Creates the integers from 1 to 100. |
| Combinations | `choose(n, k)` | Calculates $\binom{n}{k}$. |
| Summation | `sum(x)` | Adds all elements in vector `x`. |
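A short sketch exercising all four functions from the table, including a recomputation of the Q4 count from Section 1.2:

```r
x <- c(1, 2, 3)             # vector creation
s <- 1:100                  # the integers 1 through 100
n_groups <- choose(12, 3)   # binomial coefficient: 220 ways to pick 3 of 12
total    <- sum(s)          # 5050
```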