Chapter 2: Monte Carlo Simulation

Monte Carlo simulation transforms intractable problems into experiments we can run on a computer. When analytical solutions are impossible—high-dimensional integrals, complex probability calculations, distributions without closed forms—we replace exact calculation with simulation. The law of large numbers guarantees that averages of random samples converge to expectations; the central limit theorem tells us how fast. These fundamental results turn simulation from ad-hoc experimentation into rigorous methodology.
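
The core recipe—average random samples, and let the central limit theorem quantify the error—can be sketched in a few lines. This is an illustrative sketch (the function name `mc_estimate` is ours, not the chapter's); it estimates \(E[e^U] = e - 1\) for \(U \sim \text{Uniform}(0,1)\) and reports the Monte Carlo standard error:

```python
import math
import random

def mc_estimate(f, n, seed=0):
    """Estimate E[f(U)] for U ~ Uniform(0,1) by a sample average."""
    rng = random.Random(seed)
    xs = [f(rng.random()) for _ in range(n)]
    mean = sum(xs) / n
    # The sample variance yields the Monte Carlo standard error via the CLT:
    # se = s / sqrt(n), shrinking at the O(1/sqrt(n)) rate.
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    se = math.sqrt(var / n)
    return mean, se

# Example: E[e^U] = e - 1 ≈ 1.71828
est, se = mc_estimate(math.exp, 100_000)
```

Note that nothing in the recipe depends on the dimension of the integrand's domain—the same standard-error formula applies to a 100-dimensional integral.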

The chapter opens with a paradox at the heart of computational randomness: how can deterministic algorithms produce sequences that behave randomly? Pseudo-random number generators create long sequences from short seeds, sequences that pass stringent statistical tests for uniformity and independence. Understanding this machinery—linear congruential generators, the Mersenne Twister, modern PCG algorithms—demystifies the random() function and enables proper seed management for reproducibility.
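
To make the determinism concrete, here is a minimal linear congruential generator of the kind the chapter analyzes. The multiplier and increment below are the well-known Numerical Recipes constants; with modulus \(2^{32}\) they satisfy the full-period conditions, so the sequence cycles through all \(2^{32}\) states:

```python
class LCG:
    """Minimal linear congruential generator: x_{n+1} = (a*x_n + c) mod m.

    Constants from Numerical Recipes; this choice achieves the full
    period m = 2**32. Identical seeds reproduce identical sequences.
    """
    def __init__(self, seed=42):
        self.m = 2**32
        self.a = 1664525
        self.c = 1013904223
        self.state = seed % self.m

    def next_uint(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

    def random(self):
        # Map the integer state to a float in [0, 1).
        return self.next_uint() / self.m

rng = LCG(seed=1)
sample = [rng.random() for _ in range(5)]
```

Same seed, same sequence—which is exactly why seed management gives reproducibility, and why LCGs' lattice structure (successive outputs falling on hyperplanes) motivates the stronger generators covered later.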

From uniform variates, we build generators for arbitrary distributions. The inverse CDF method exploits a beautiful theorem: if \(U \sim \text{Uniform}(0,1)\), then \(F^{-1}(U)\) has distribution \(F\). When inverse CDFs lack closed forms, transformation methods like Box-Muller convert uniforms to normals through clever geometry. When all else fails, rejection sampling generates from essentially any target distribution with a computable density by accepting or rejecting proposals from a simpler distribution. The chapter culminates with variance reduction techniques—antithetic variates, control variates, importance sampling, stratification—that can improve Monte Carlo efficiency by orders of magnitude, transforming naive simulation into precision computation.
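
The inverse CDF method is easiest to see with a distribution whose inverse CDF is closed-form. For the exponential with rate \(\lambda\), \(F(x) = 1 - e^{-\lambda x}\), so \(F^{-1}(u) = -\ln(1-u)/\lambda\). A sketch (function name ours):

```python
import math
import random

def exponential_inverse_cdf(lam, n, seed=0):
    """Sample Exponential(lam) via the inverse CDF: F^{-1}(u) = -ln(1-u)/lam.

    Each uniform draw u in [0, 1) maps through the inverse CDF, so the
    outputs have exactly the target distribution.
    """
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

samples = exponential_inverse_cdf(2.0, 100_000)
# The sample mean should be close to 1/lam = 0.5.
```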

Learning Objectives: Upon completing this chapter, you will be able to:

Monte Carlo Fundamentals

  • Define Monte Carlo estimators as sample averages approximating expectations

  • Apply the law of large numbers to establish consistency of Monte Carlo estimates

  • Derive confidence intervals using the central limit theorem and Monte Carlo standard errors

  • Analyze the \(O(1/\sqrt{n})\) convergence rate and its dimension-independence advantage
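
The confidence-interval and convergence-rate objectives above can be sketched together with the classic \(\pi\) example: estimate the probability that a uniform point in the unit square lands inside the quarter circle, then wrap the estimate in a CLT interval (function name ours):

```python
import math
import random

def estimate_pi_with_ci(n, seed=0, z=1.96):
    """Monte Carlo estimate of pi with a 95% CLT confidence interval.

    Each draw is a Bernoulli indicator that a uniform point in the unit
    square falls inside the quarter circle; 4 * P(inside) = pi.
    """
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    p_hat = hits / n
    est = 4.0 * p_hat
    # Bernoulli variance p(1-p), scaled by 4 for the estimator;
    # the interval width shrinks at the O(1/sqrt(n)) rate.
    se = 4.0 * math.sqrt(p_hat * (1.0 - p_hat) / n)
    return est - z * se, est, est + z * se

lo, est, hi = estimate_pi_with_ci(1_000_000)
```

Quadrupling \(n\) only halves the interval width—the dimension-independent but slow \(O(1/\sqrt{n})\) rate that motivates the variance reduction techniques later in the chapter.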

Pseudo-Random Number Generation

  • Implement linear congruential generators and analyze their period and lattice structure

  • Evaluate PRNG quality using statistical tests (chi-square, runs, spectral)

  • Design seed management systems for reproducibility and parallel streams

  • Compare algorithms (LCG, Mersenne Twister, PCG) and their trade-offs
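
As one concrete pattern for the seed-management objective, NumPy's `SeedSequence.spawn` derives statistically independent child streams from a single root seed—useful for parallel workers. This sketch assumes NumPy is available; its `default_rng` is backed by a PCG-family generator (PCG64):

```python
import numpy as np

# One root seed; spawn() derives independent child SeedSequences,
# each seeding its own PCG64-backed Generator for a parallel worker.
root = np.random.SeedSequence(12345)
child_seqs = root.spawn(4)
streams = [np.random.default_rng(s) for s in child_seqs]

# Each worker draws from its own stream; the whole experiment is
# reproducible from the root seed alone.
draws = [rng.random(3) for rng in streams]
```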

Random Variate Generation

  • Apply the inverse CDF method to continuous and discrete distributions

  • Implement efficient algorithms for discrete sampling (binary search, alias method)

  • Transform uniform variates to normal using Box-Muller and Marsaglia polar methods

  • Design rejection sampling algorithms with optimal proposal distributions
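
The rejection sampling objective can be illustrated with a target whose density is bounded: Beta(2,2), with density \(f(x) = 6x(1-x)\) on \([0,1]\), sampled with a Uniform(0,1) proposal. The envelope constant \(M = 1.5\) is the density's maximum, so the acceptance rate is \(1/M = 2/3\) (function name ours):

```python
import random

def sample_beta22(n, seed=0):
    """Rejection sampling from Beta(2,2), density f(x) = 6x(1-x) on [0,1].

    Proposal g is Uniform(0,1); accept x with probability f(x) / (M * g(x)),
    where M = 1.5 bounds f/g everywhere.
    """
    rng = random.Random(seed)
    M = 1.5
    out = []
    while len(out) < n:
        x = rng.random()                  # proposal draw from g
        u = rng.random()                  # uniform for the acceptance test
        if u <= 6.0 * x * (1.0 - x) / M:  # accept with prob f(x)/(M*g(x))
            out.append(x)
    return out

samples = sample_beta22(50_000)
# Beta(2,2) has mean 1/2; accepted points concentrate near the middle.
```

A tighter envelope (smaller \(M\)) wastes fewer proposals—the "optimal proposal" design problem named in the objective above.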

Variance Reduction

  • Implement antithetic variates by exploiting negative correlation

  • Apply control variates using known expectations of correlated quantities

  • Design importance sampling schemes that concentrate samples in high-contribution regions

  • Construct stratified and Latin hypercube sampling for improved space coverage

  • Analyze efficiency gains and determine when each technique applies
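
The first of these techniques can be sketched end to end. To estimate \(\int_0^1 e^x\,dx = e - 1\), antithetic variates pair each draw \(U\) with \(1-U\); since \(e^x\) is monotone, \(e^U\) and \(e^{1-U}\) are negatively correlated and the pair-average has much lower variance than two independent draws at the same budget (function names ours):

```python
import math
import random

def naive_mc(n, seed=0):
    """Plain Monte Carlo estimate of E[e^U] with its standard error."""
    rng = random.Random(seed)
    xs = [math.exp(rng.random()) for _ in range(n)]
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    return mean, math.sqrt(var / n)

def antithetic_mc(n_pairs, seed=0):
    """Antithetic estimate: average f(U) and f(1-U) within each pair.

    Monotonicity of exp makes the pair negatively correlated, so the
    pair-averages have lower variance than independent draws.
    """
    rng = random.Random(seed)
    ys = []
    for _ in range(n_pairs):
        u = rng.random()
        ys.append(0.5 * (math.exp(u) + math.exp(1.0 - u)))
    mean = sum(ys) / n_pairs
    var = sum((y - mean) ** 2 for y in ys) / (n_pairs - 1)
    return mean, math.sqrt(var / n_pairs)

# Same budget of 20,000 function evaluations for both estimators.
m1, se1 = naive_mc(20_000)
m2, se2 = antithetic_mc(10_000)
```

For this integrand the antithetic standard error is several times smaller at equal cost—a preview of the orders-of-magnitude gains the chapter develops.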

Sections