Summarising Continuous Random Variables
Chapter 6 of your statistics book, Summarising Continuous Random Variables, is the natural continuation of Chapter 4, translating the concepts of average (Expected Value) and spread (Variance/Standard Deviation) from the discrete world (sums) to the continuous world (integrals). It also introduces Moment Generating Functions as powerful tools for working with distributions and concludes by defining Bivariate Normal distributions.
6.1 Expected Value ($E[X]$): The Continuous Average
The expected value, or mean ($\mu$), of a continuous random variable $X$ is the measure of the “long-run average” or the theoretical center of its probability density function (PDF), $f(x)$.
Concept and Calculation
Continuous Expectation
In the continuous setting, the summation used for discrete RVs is replaced by integration over the entire range of possible outcomes:

$$E[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx$$

If this integral converges absolutely, the variable has finite expectation.
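As a quick numerical sanity check (a sketch of mine, not from the chapter), the defining integral can be evaluated with scipy's `quad`. The Exponential($\lambda$) density and the rate $\lambda = 2$ are illustrative choices; the computed mean should match the known value $1/\lambda$.

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0  # illustrative rate; any positive value works

# Exponential(lam) density: f(x) = lam * exp(-lam * x) for x >= 0
f = lambda x: lam * np.exp(-lam * x)

# E[X] = integral of x * f(x) over the support [0, infinity)
mean, _err = quad(lambda x: x * f(x), 0, np.inf)
print(mean)  # ~0.5, matching the known mean 1/lam
```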
Example 1: Expected Value of a Uniform Distribution
Uniform Expectation
Concept: The uniform distribution is the simplest continuous distribution, where the density is constant over a given interval. Intuitively, the average should be the midpoint of that interval.
Question: What is the expected value of $X \sim \text{Uniform}(a, b)$?
Solution: The density is $f(x) = \frac{1}{b-a}$ for $a \le x \le b$ (and $0$ otherwise). Then:

$$E[X] = \int_a^b \frac{x}{b-a} \, dx = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2},$$

the midpoint of the interval, as intuition suggests.
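The same integral can be checked symbolically; this is a small sympy sketch of mine, not part of the book's solution:

```python
import sympy as sp

a, b, x = sp.symbols('a b x', positive=True)

# Uniform(a, b) density on [a, b]
f = 1 / (b - a)

# E[X] = integral of x * f(x) from a to b
EX = sp.simplify(sp.integrate(x * f, (x, a, b)))
print(EX)  # a/2 + b/2, i.e. the midpoint (a + b)/2
```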
Key Properties of Expected Value
The linearity properties established for discrete variables hold true for continuous variables as well, using integrals instead of sums:
- $E[aX + b] = aE[X] + b$
- $E[X + Y] = E[X] + E[Y]$
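A one-off Monte Carlo illustration of linearity (my own sketch; the distribution and constants are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)  # any continuous RV works
a, b = 3.0, -1.0

print(np.mean(a * x + b))   # ~ a*E[X] + b = 5
print(a * np.mean(x) + b)   # same value up to sampling noise
```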
6.2 Variance and Standard Deviation: The Continuous Spread
Variance and standard deviation quantify the spread of a continuous random variable around its expected value.
Concept and Calculation
Continuous Variance
The variance ($\text{Var}(X)$, or $\sigma^2$) is the expected value of the squared distance between $X$ and its mean $\mu$:

$$\text{Var}(X) = E\left[(X - \mu)^2\right] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x) \, dx$$
The standard deviation ($\sigma$ or $\text{SD}(X)$) is the square root of the variance.
Key Properties of Variance
The properties mirror the discrete case:
- Alternate Formula: $\text{Var}(X) = E[X^2] - (E[X])^2$.
- Scaling: $\text{Var}(aX) = a^2 \, \text{Var}(X)$.
- Shifting: $\text{Var}(X + b) = \text{Var}(X)$.
- Independence (Product): If $X$ and $Y$ are independent, $E[XY] = E[X] \, E[Y]$.
- Independence (Sum): If $X$ and $Y$ are independent, $\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y)$.
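These properties are straightforward to sanity-check by simulation; the following sketch (mine, with arbitrary constants) verifies scaling, shifting, and the independent-sum rule:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.uniform(0, 1, n)
y = rng.uniform(0, 1, n)   # independent of x
a, b = 4.0, 7.0

print(np.var(a * x), a**2 * np.var(x))        # scaling: Var(aX) = a^2 Var(X)
print(np.var(x + b), np.var(x))               # shifting leaves variance unchanged
print(np.var(x + y), np.var(x) + np.var(y))   # independent sum: variances add
```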
Example 2: Variance of a Uniform Distribution
Uniform Variance
Question: What is the variance of $X \sim \text{Uniform}(a, b)$? (We know $E[X] = \frac{a+b}{2}$.)
Solution: Using the formula $\text{Var}(X) = E[X^2] - (E[X])^2$:

First, calculate $E[X^2]$:

$$E[X^2] = \int_a^b \frac{x^2}{b-a} \, dx = \frac{b^3 - a^3}{3(b-a)} = \frac{a^2 + ab + b^2}{3}$$

Next, subtract the squared mean:

$$\text{Var}(X) = \frac{a^2 + ab + b^2}{3} - \left(\frac{a+b}{2}\right)^2 = \frac{(b-a)^2}{12}$$

The standard deviation is $\sigma = \frac{b-a}{\sqrt{12}} = \frac{b-a}{2\sqrt{3}}$.
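A Monte Carlo check of this result (my own sketch; the endpoints $a = 2$, $b = 10$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = 2.0, 10.0
x = rng.uniform(a, b, 1_000_000)

print(np.var(x))          # empirical variance
print((b - a)**2 / 12)    # theoretical (b-a)^2 / 12 = 16/3 ≈ 5.333
```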
Application: Probability Inequalities
The well-known inequalities from discrete probability generalize to the continuous domain. These bounds apply universally, regardless of the specific shape of the continuous distribution.
Markov’s Inequality
For a non-negative continuous RV $X$ with finite mean $E[X]$, and any $a > 0$:

$$P(X \ge a) \le \frac{E[X]}{a}$$

Chebyshev’s Inequality
For any continuous RV $X$ with mean $\mu$, finite non-zero variance $\sigma^2$, and any $k > 0$:

$$P(|X - \mu| \ge k) \le \frac{\sigma^2}{k^2}$$
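Both bounds can be observed empirically; in this sketch of mine, an Exponential(1) sample (mean 1, variance 1) stays well inside each bound:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=1_000_000)  # non-negative, E[X] = 1, Var = 1

a = 3.0
print(np.mean(x >= a), 1.0 / a)  # Markov: empirical P(X >= 3) vs bound 1/3

k = 2.0
mu, sigma2 = np.mean(x), np.var(x)
print(np.mean(np.abs(x - mu) >= k), sigma2 / k**2)  # Chebyshev: vs bound 1/4
```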
6.3 Conditional Expectation and Variance
For continuous random variables $X$ and $Y$ with a joint density $f(x, y)$, knowing the outcome of $X$ changes the expected value and spread of $Y$. This leads to the concepts of conditional expectation and conditional variance.
Concept: Conditional Expectation
The conditional expectation of $Y$ given $X = x$ uses the conditional density $f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)}$:

$$E[Y \mid X = x] = \int_{-\infty}^{\infty} y \, f_{Y|X}(y \mid x) \, dy$$

where $f_X(x)$ is the marginal density of $X$.
Laws of Total Expectation and Variance
These powerful theorems allow the overall (unconditional) average and variance of a variable to be computed from its conditional characteristics.
| Theorem | Formula | Description |
|---|---|---|
| Law of Total Expectation | $E[Y] = E\left[E[Y \mid X]\right]$ | Expectation of the conditional expectation. |
| Law of Total Variance | $\text{Var}(Y) = E\left[\text{Var}(Y \mid X)\right] + \text{Var}\left(E[Y \mid X]\right)$ | Sum of expected conditional variance and variance of conditional expectation. |
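A simulation sketch of both laws (the hierarchical model here is my own illustrative choice): with $X \sim \text{Uniform}(0, 1)$ and $Y \mid X \sim N(X, 1)$, the laws predict $E[Y] = E[X] = \frac{1}{2}$ and $\text{Var}(Y) = E\left[\text{Var}(Y \mid X)\right] + \text{Var}\left(E[Y \mid X]\right) = 1 + \frac{1}{12}$.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
x = rng.uniform(0, 1, n)          # X ~ Uniform(0, 1)
y = rng.normal(loc=x, scale=1.0)  # Y | X = x ~ Normal(x, 1)

print(np.mean(y), 0.5)            # Law of Total Expectation: E[Y] = E[X]
print(np.var(y), 1 + 1/12)        # Law of Total Variance: E[Var(Y|X)] + Var(E[Y|X])
```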
6.4 Covariance and Correlation: Measuring Relationships
When dealing with two continuous random variables, covariance and correlation measure the strength and direction of their linear relationship.
Concept: Covariance
Covariance Formula
The covariance ($\text{Cov}(X, Y)$) measures the degree to which $X$ and $Y$ move together:

$$\text{Cov}(X, Y) = E\left[(X - \mu_X)(Y - \mu_Y)\right]$$

The alternate computational formula is:

$$\text{Cov}(X, Y) = E[XY] - E[X] \, E[Y]$$
A key consequence: if $X$ and $Y$ are independent, then $\text{Cov}(X, Y) = 0$ (they are uncorrelated). (Note: the converse is generally not true.)
Concept: Correlation
The correlation coefficient ($\rho$) is the standardized version of covariance, restricted between $-1$ and $1$:

$$\rho(X, Y) = \frac{\text{Cov}(X, Y)}{\sigma_X \, \sigma_Y}$$
It is dimensionless and measures the degree of linear association.
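Empirically, covariance and correlation can be estimated directly from samples (a sketch of mine; the linear model $Y = 0.6X + \varepsilon$ is an arbitrary illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
x = rng.normal(size=n)
y = 0.6 * x + rng.normal(size=n)  # Y depends linearly on X plus independent noise

print(np.cov(x, y)[0, 1])       # sample covariance, ~0.6
print(np.corrcoef(x, y)[0, 1])  # sample correlation, ~0.6/sqrt(1.36) ≈ 0.51
```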
6.5 Moment Generating Functions (MGFs)
The moment generating function ($M_X(t)$) is a mathematical tool that, when it exists, uniquely determines the distribution of a random variable and simplifies many otherwise complex calculations.
Definition and Moments
The $n$-th moment of $X$ is $E[X^n]$. The MGF is defined as the expected value of $e^{tX}$:

$$M_X(t) = E\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f(x) \, dx$$

The MGF generates moments via its derivatives evaluated at $t = 0$: $M_X^{(n)}(0) = E[X^n]$.
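A symbolic sketch (mine, using sympy): differentiating the standard normal MGF $M(t) = e^{t^2/2}$ at $t = 0$ recovers the moments $0, 1, 0, 3$.

```python
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)  # MGF of the standard normal N(0, 1)

# n-th moment = n-th derivative of M evaluated at t = 0
for n in range(1, 5):
    print(n, sp.diff(M, t, n).subs(t, 0))  # 0, 1, 0, 3
```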
Key Properties of MGFs
- Linear Transformation: $M_{aX + b}(t) = e^{bt} \, M_X(at)$.
- Sum of Independents: If $X$ and $Y$ are independent, the MGF of their sum is the product of their individual MGFs: $M_{X+Y}(t) = M_X(t) \, M_Y(t)$.
- Uniqueness Theorem: If two random variables have the same MGF over an open interval containing 0, they have the exact same distribution.
Example 3: Sum of Independent Normals (MGF Application)
MGF of Normal Sum
Question: If $X \sim N(\mu_1, \sigma_1^2)$ and $Y \sim N(\mu_2, \sigma_2^2)$ are independent, what is the distribution of their sum?
Solution: Since the MGF for a single normal variable is $M(t) = \exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)$, the resulting MGF of the sum is:

$$M_{X+Y}(t) = M_X(t) \, M_Y(t) = \exp\left((\mu_1 + \mu_2)t + \frac{(\sigma_1^2 + \sigma_2^2) t^2}{2}\right)$$
By the uniqueness theorem, $X + Y$ must be distributed as $N(\mu_1 + \mu_2, \ \sigma_1^2 + \sigma_2^2)$.
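A quick simulation agrees (my own sketch; the parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000
x = rng.normal(1.0, 2.0, n)    # N(mu1 = 1, sigma1 = 2)
y = rng.normal(-3.0, 1.5, n)   # N(mu2 = -3, sigma2 = 1.5), independent of x
s = x + y

print(np.mean(s), 1.0 + (-3.0))    # means add: -2
print(np.var(s), 2.0**2 + 1.5**2)  # variances add: 6.25
```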
6.6 Bivariate Normal Distributions
This section discusses the properties of a joint distribution where linear combinations of the constituent variables are always normally distributed.
Definition and Properties
Bivariate Normal
A pair of random variables $(X, Y)$ is bivariate normal if every linear combination $aX + bY$ is a normally distributed random variable.
- Marginal Normality: $X$ and $Y$ individually are also normally distributed.
- Determination by Moments: The joint distribution is completely determined by the means $\mu_X, \mu_Y$, the variances $\sigma_X^2, \sigma_Y^2$, and the correlation $\rho$.
- Independence is Uncorrelation: For bivariate normal variables, $\rho = 0$ if and only if $X$ and $Y$ are independent.
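A sampling sketch of these properties (mine; the correlation $\rho = 0.8$ and the combination $2X - Y$ are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(7)
rho = 0.8
mean = [0.0, 0.0]
cov = [[1.0, rho], [rho, 1.0]]  # unit variances, correlation rho

xy = rng.multivariate_normal(mean, cov, size=1_000_000)
x, y = xy[:, 0], xy[:, 1]

print(np.corrcoef(x, y)[0, 1])         # ~0.8, the chosen correlation

# Every linear combination should be normal; check mean and variance of 2X - Y
z = 2 * x - y
print(np.mean(z), 0.0)                 # ~0
print(np.var(z), 4 + 1 - 2 * 2 * rho)  # Var(2X - Y) = 4 + 1 - 4*rho = 1.8
```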
Analogy: Summarising a continuous random variable is like weighing an object. The Expected Value is the measurement of the object’s mass (its central tendency), while the Variance is a measure of the precision of the scale (how spread out the possible readings are). Covariance, then, is like simultaneously weighing two interconnected objects to see how much one influences the other’s reading.