Phase 1: The Core Concepts
To solve this problem, you need to understand Maximum Likelihood Estimation (MLE).
1. What is a Likelihood Function (L)?
Imagine you have a die, but you don’t know if it’s fair. You roll it a few times and get specific numbers.
The Likelihood is the probability of getting that specific sequence of numbers, assuming a specific value of the parameter (θ) is true.
- Formula: L(θ)=P(x₁)⋅P(x₂)⋅P(x₃)…
(We multiply the probabilities because each roll is an “independent event” — one roll doesn’t affect the next).
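If it helps to see this in code, here is a minimal sketch (my own illustration, not part of the original question) that multiplies the independent per-roll probabilities together; the fair-die probabilities are purely an assumption for the example.

```python
from math import prod

def likelihood(rolls, prob):
    """Probability of the whole sequence: multiply the independent per-roll probabilities."""
    return prod(prob[r] for r in rolls)

# Illustrative assumption: a fair six-sided die.
fair = {face: 1 / 6 for face in range(1, 7)}
print(likelihood([2, 5, 5, 1], fair))  # (1/6)**4 ≈ 0.000772
```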
2. What is Maximum Likelihood Estimation (MLE)?
In statistics, we work backward. We have the data (the rolls), and we want to find the “best guess” for the probability (θ).
The MLE is simply the value of θ that makes the observed data most likely to have happened.
To find this “maximum” point, we use calculus (derivatives).
3. Why use the Log-Likelihood (ln L)?
The Likelihood function usually involves multiplying many small numbers together (x⋅y⋅z). Calculus is awkward with multiplication (every factor drags in the Product Rule).
Calculus is easy with addition.
By taking the natural log (ln) of the function, we turn multiplication into addition:
- Log Rule: ln(a⋅b)=ln(a)+ln(b)
- Power Rule: ln(xⁿ)=n⋅ln(x)
This makes the derivative much easier to solve, and the maximum point remains at the same spot.
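A quick numerical illustration (my own addition, not part of the original solution): the log of a product equals the sum of the logs, so switching to the log-likelihood changes nothing except how easy the algebra is.

```python
import math

probs = [0.2, 0.05, 0.1]                      # some small probabilities
product = math.prod(probs)                    # 0.001
sum_of_logs = sum(math.log(p) for p in probs)
print(math.isclose(product, math.exp(sum_of_logs)))  # True
```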
Phase 2: Step-by-Step Solution
Step 1: Define the Probabilities
From the question text:
- The probability of getting a 1, 2, or 3 is θ.
- P(1)=θ
- P(2)=θ
- P(3)=θ
- The probability of getting a 4 is the remaining probability.
- P(4)=1−3θ
Step 2: Analyze the Data
The observed outcomes are: 1, 4, 3, 4, 3, 2, 4
Let’s count how many times each type of probability appears:
- Type A (Outcome 1, 2, or 3):
- We have a 1, a 3, a 3, and a 2.
- Total count (n_A) = 4
- Type B (Outcome 4):
- We have a 4, a 4, and a 4.
- Total count (n_B) = 3
Total rolls = 7.
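As a sketch, the counting step can be reproduced in a few lines of Python (the names n_A and n_B are just my labels for the two types of outcome):

```python
from collections import Counter

rolls = [1, 4, 3, 4, 3, 2, 4]
counts = Counter(rolls)

n_A = counts[1] + counts[2] + counts[3]  # outcomes with probability θ
n_B = counts[4]                          # outcomes with probability 1 - 3θ
print(n_A, n_B, len(rolls))              # 4 3 7
```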
Step 3: Build the Likelihood Function
We multiply the probability of each specific roll that occurred.
L(θ)=θ⋅(1−3θ)⋅θ⋅(1−3θ)⋅θ⋅θ⋅(1−3θ)
Group them together using exponents:
L(θ)=θ⁴⋅(1−3θ)³
(Read this as: We got the "θ" outcome 4 times, and the "1−3θ" outcome 3 times).
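To see that this function really does peak where the calculus below says it should, here is a rough grid search (my own sanity check, not part of the derivation). Note that θ must lie in (0, 1/3) so that every probability stays positive.

```python
def L(theta):
    # Likelihood from Step 3: theta appears 4 times, (1 - 3*theta) appears 3 times.
    return theta**4 * (1 - 3 * theta)**3

grid = [i / 10000 for i in range(1, 3333)]  # θ values inside (0, 1/3)
best = max(grid, key=L)
print(best)  # ≈ 0.1905, close to 4/21
```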
Step 4: Take the Log-Likelihood
Now, apply the natural log (ln) to make it solvable.
l(θ)=ln(L(θ))
l(θ)=ln(θ⁴⋅(1−3θ)³)
Using the Log Rules (ln(a⋅b)=ln(a)+ln(b) and ln(xⁿ)=n⋅ln(x)):
l(θ)=4ln(θ)+3ln(1−3θ)
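If you want to double-check the algebra, the expanded log-likelihood should agree with ln(L(θ)) at any valid θ; here is a quick check of my own (not part of the original solution):

```python
import math

def L(theta):
    return theta**4 * (1 - 3 * theta)**3

def log_L(theta):
    return 4 * math.log(theta) + 3 * math.log(1 - 3 * theta)

theta = 0.2  # any value in (0, 1/3) works
print(math.isclose(math.log(L(theta)), log_L(theta)))  # True
```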
Step 5: Differentiate and Maximize
To find the maximum, we take the derivative with respect to θ and set it equal to 0.
Recall the derivative formulas:
- d/dx ln(x)=1/x
- Chain Rule: d/dx ln(u)=(1/u)⋅u′
Let’s differentiate our equation:
l′(θ)=d/dθ[4ln(θ)]+d/dθ[3ln(1−3θ)]
- First part: 4⋅(1/θ)=4/θ
- Second part (Chain Rule applies here because the inside is 1−3θ, not just θ):
- Derivative of the outer ln function: 3⋅1/(1−3θ)
- Multiplied by the derivative of the inside function (1−3θ): −3
- Result: 3⋅1/(1−3θ)⋅(−3)=−9/(1−3θ)
Combine them:
l′(θ)=4/θ−9/(1−3θ)
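If you have sympy installed, you can confirm the derivative symbolically (this is just a cross-check; the hand calculation above is the intended method):

```python
import sympy as sp

theta = sp.symbols("theta", positive=True)
log_lik = 4 * sp.log(theta) + 3 * sp.log(1 - 3 * theta)
print(sp.diff(log_lik, theta))  # an expression equivalent to 4/theta - 9/(1 - 3*theta)
```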
Step 6: Solve for θ
Set the derivative to 0 to find the maximum likelihood estimate.
4/θ−9/(1−3θ)=0
Move the negative term to the right side:
4/θ=9/(1−3θ)
Cross-multiply:
4(1−3θ)=9θ
Expand:
4−12θ=9θ
Add 12θ to both sides:
4=21θ
Divide by 21:
θ=4/21
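The same kind of sympy cross-check (again assuming sympy is available) confirms the root of the derivative:

```python
import sympy as sp

theta = sp.symbols("theta", positive=True)
derivative = 4 / theta - 9 / (1 - 3 * theta)
print(sp.solve(sp.Eq(derivative, 0), theta))  # [4/21]
```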
Step 7: Final Calculation
Now we convert the fraction to a decimal.
4÷21≈0.190476...
The question asks for the answer correct to two decimal places.
0.1904...→0.19
Final Answer
The maximum likelihood estimate of θ is 0.19.
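As a final sanity check (assuming scipy is available), a numerical optimizer applied to the negative log-likelihood lands on the same value:

```python
import math
from scipy.optimize import minimize_scalar

def neg_log_L(theta):
    # Negative log-likelihood; minimizing this maximizes l(θ).
    return -(4 * math.log(theta) + 3 * math.log(1 - 3 * theta))

result = minimize_scalar(neg_log_L, bounds=(1e-6, 1 / 3 - 1e-6), method="bounded")
print(round(result.x, 2))  # 0.19
```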