Module 02 • Part 1

Bayesian Inference

Understanding the mathematical framework for updating beliefs with new evidence.

Aryan S.
Posted 2 days ago

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available.

The Core Equation

The fundamental equation relates the Posterior, Likelihood, Prior, and Marginal Likelihood. Unlike in frequentist statistics, parameters are treated as random variables.

P(A|B) = P(B|A) · P(A) / P(B)
  • P(A): The Prior probability (what we believe before seeing data).
  • P(B|A): The Likelihood (how probable the data is, given our hypothesis).
  • P(B): The Marginal Likelihood (the overall probability of the data, which normalizes the result).
  • P(A|B): The Posterior (our updated belief).
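To make the equation concrete, here is a small worked example with hypothetical numbers (not from the text): a diagnostic test for a condition with 1% prevalence, a 95% true-positive rate, and a 5% false-positive rate.

```python
# Hypothetical numbers for illustration only
p_a = 0.01              # Prior P(A): 1% of people have the condition
p_b_given_a = 0.95      # Likelihood P(B|A): positive test given condition
p_b_given_not_a = 0.05  # False-positive rate: positive test without condition

# Marginal likelihood P(B): total probability of a positive test
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
posterior = p_b_given_a * p_a / p_b
print(round(posterior, 3))  # ≈ 0.161
```

Even with an accurate test, the posterior is only about 16%, because the low prior dominates: this is exactly the prior-times-likelihood interplay the equation encodes.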

Python Implementation

Let's simulate a simple coin-flip scenario to update our belief about the fairness of a coin. We will use grid approximation, which is an intuitive way to understand the mechanics.

bayes.py
import numpy as np

# Prior: uniform over 100 candidate bias values (any bias equally likely)
prior = np.ones(100) / 100
p_grid = np.linspace(0, 1, 100)

def update(prior, heads, total):
    # Binomial likelihood of `heads` in `total` flips at each candidate bias
    likelihood = p_grid**heads * (1 - p_grid)**(total - heads)
    posterior = likelihood * prior
    posterior /= posterior.sum()  # Normalize so the posterior sums to 1
    return posterior
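As a sketch of the grid approximation in use (self-contained, restating the setup so it runs on its own; the 7-heads-in-10-flips data is illustrative), we can find the most probable bias under a uniform prior:

```python
import numpy as np

# Same setup as the listing above: uniform prior over 100 candidate biases
p_grid = np.linspace(0, 1, 100)
prior = np.ones(100) / 100

# Binomial likelihood for 7 heads in 10 flips, times the prior, normalized
likelihood = p_grid**7 * (1 - p_grid)**3
posterior = likelihood * prior
posterior /= posterior.sum()

# Maximum a posteriori (MAP) estimate: grid point with the highest posterior
map_estimate = p_grid[np.argmax(posterior)]
print(round(map_estimate, 2))  # ≈ 0.7
```

As expected, the posterior peaks near 0.7, the observed fraction of heads, since the uniform prior contributes no pull toward any particular bias.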

Conclusion

Unlike frequentist methods, Bayesian approaches give us a distribution of probabilities rather than a single point estimate, allowing for richer uncertainty quantification. This is particularly useful in decision-making processes under uncertainty.
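One way to see this richer uncertainty quantification in code is to summarize the full grid posterior rather than a single point. This sketch reuses the illustrative 7-heads-in-10-flips data and computes a posterior mean and a central 90% credible interval from the posterior's cumulative distribution:

```python
import numpy as np

# Grid posterior for 7 heads in 10 flips under a uniform prior, as above
p_grid = np.linspace(0, 1, 100)
prior = np.ones(100) / 100
posterior = p_grid**7 * (1 - p_grid)**3 * prior
posterior /= posterior.sum()

# Posterior mean and a central 90% credible interval via the cumulative sum
mean = np.sum(p_grid * posterior)
cdf = np.cumsum(posterior)
lower = p_grid[np.searchsorted(cdf, 0.05)]
upper = p_grid[np.searchsorted(cdf, 0.95)]
print(f"mean ≈ {mean:.2f}, 90% interval ≈ [{lower:.2f}, {upper:.2f}]")
```

A frequentist point estimate would report only 0.7; the Bayesian posterior additionally tells us how wide the plausible range of biases still is after only ten flips.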