Priors & Posteriors

Module 2 / Lesson 1

Bayesian Inference

Aryan S.
15 min read

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available.

Core Formula

P(A|B) = [P(B|A) · P(A)] / P(B)

Where P(A|B) is the posterior probability, P(B|A) is the likelihood, P(A) is the prior, and P(B) is the evidence, a normalizing constant that makes the posterior a valid probability. Unlike in frequentist statistics, parameters are treated as random variables rather than fixed unknowns.
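Plugging numbers into the formula shows how a strong likelihood can still leave the posterior modest when the prior is small. The values below are made up purely for illustration:

```python
# Hypothetical numbers: prior P(A) = 0.01, likelihood P(B|A) = 0.90,
# and false-positive rate P(B|not A) = 0.05
p_a = 0.01
p_b_given_a = 0.90
p_b_given_not_a = 0.05

# Evidence via the law of total probability:
# P(B) = P(B|A) P(A) + P(B|not A) P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem
posterior = p_b_given_a * p_a / p_b
print(posterior)  # roughly 0.15 despite the 90% likelihood
```

Even with P(B|A) = 0.90, the posterior stays near 15% because the prior was only 1%.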

Python Implementation

Let's work through a simple coin-flip scenario, using a grid approximation to update our belief about the coin's bias.

bayes.py
import numpy as np

# 1. Grid of candidate values for the coin's heads probability
p_grid = np.linspace(0, 1, 100)

# 2. Uniform prior: every candidate value equally likely
prior = np.ones(100) / 100

def update(prior, heads, total):
    """Return the posterior over p_grid after observing `heads` in `total` flips."""
    # Binomial likelihood (up to a constant) at each candidate p
    likelihood = p_grid**heads * (1 - p_grid)**(total - heads)
    posterior = likelihood * prior
    # Normalize so the posterior sums to 1
    return posterior / posterior.sum()
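A quick sketch of how this update might be used, with made-up data of 6 heads in 9 flips (the definitions are repeated so the snippet runs on its own):

```python
import numpy as np

# Definitions from bayes.py above, repeated so this snippet is standalone
p_grid = np.linspace(0, 1, 100)
prior = np.ones(100) / 100

def update(prior, heads, total):
    likelihood = p_grid**heads * (1 - p_grid)**(total - heads)
    posterior = likelihood * prior
    return posterior / posterior.sum()

# Hypothetical observation: 6 heads in 9 flips
posterior = update(prior, heads=6, total=9)

# The posterior sums to 1, and its peak (the MAP estimate)
# lands near the observed frequency 6/9
map_estimate = p_grid[np.argmax(posterior)]
print(map_estimate)
```

Because the output is a proper distribution over p_grid, it can serve as the prior for the next batch of flips, which is the sequential-updating idea at the heart of Bayesian inference.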