Lesson 1

Bayesian Inference

Understanding priors, likelihoods, and posterior distributions through mathematical intuition and code.

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available.

Bayes' Theorem
P(A|B) = P(B|A) · P(A) / P(B)

Unlike frequentist statistics, which treats parameters as fixed but unknown constants, Bayesian statistics treats them as random variables described by probability distributions.
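As a quick numeric check of the theorem, here is the classic diagnostic-test calculation. The prevalence and test accuracies below are made-up illustrative numbers, not data from the lesson:

```python
# Hypothetical numbers: 1% prevalence, 95% sensitivity, 10% false-positive rate
p_disease = 0.01
p_pos_given_disease = 0.95   # P(B|A): positive test given disease
p_pos_given_healthy = 0.10   # P(B|not A): positive test given no disease

# P(B): total probability of a positive test (law of total probability)
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ≈ 0.088
```

Even with an accurate test, the posterior probability of disease is under 9%, because the low prior (1% prevalence) dominates: most positives come from the large healthy population.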

Python Implementation

Let's look at how we can implement a simple updater function in Python.

bayes.py
import numpy as np

# Grid of candidate values for the coin's bias p
p_grid = np.linspace(0, 1, 100)

# 1. Define a uniform prior over the grid
prior = np.ones(100) / 100

def update(prior, heads, total):
    # Binomial likelihood (up to a constant) of `heads` in `total` flips
    likelihood = p_grid**heads * (1 - p_grid)**(total - heads)
    posterior = likelihood * prior
    # Normalize so the posterior sums to 1
    return posterior / posterior.sum()
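A quick usage sketch of the updater above; the flip counts (7 heads in 10 flips) are illustrative, and the snippet repeats the grid setup so it runs on its own:

```python
import numpy as np

p_grid = np.linspace(0, 1, 100)
prior = np.ones(100) / 100

def update(prior, heads, total):
    likelihood = p_grid**heads * (1 - p_grid)**(total - heads)
    posterior = likelihood * prior
    return posterior / posterior.sum()

# Observe 7 heads in 10 flips
posterior = update(prior, heads=7, total=10)

# The grid point with highest posterior probability sits near 7/10
print(p_grid[np.argmax(posterior)])
# The posterior mean is a weighted average over the grid
print((p_grid * posterior).sum())
```

With a flat prior, this grid posterior approximates a Beta(8, 4) distribution, whose mean is 8/12 ≈ 0.667.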

Key Concepts

  • Prior: What we know before seeing data.
  • Likelihood: How probable the data is given our hypothesis.
  • Posterior: Our updated belief after seeing data.
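These three concepts compose: today's posterior becomes tomorrow's prior. A minimal sketch of that sequential updating, using the same coin-flip setup (the two illustrative batches below total 7 heads in 10 flips):

```python
import numpy as np

p_grid = np.linspace(0, 1, 100)
belief = np.ones(100) / 100  # start from a flat prior

# Update sequentially: each batch's posterior is the next batch's prior
for heads, total in [(3, 5), (4, 5)]:
    likelihood = p_grid**heads * (1 - p_grid)**(total - heads)
    belief = belief * likelihood
    belief /= belief.sum()

# Two small updates give the same belief as one update on all 10 flips
print((p_grid * belief).sum())  # posterior mean, near 8/12
```

Because likelihoods multiply, processing the data in batches or all at once yields the same posterior; this is what makes Bayesian updating a natural model of incremental learning.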