Chapter 2: Monte Carlo Simulation
Monte Carlo simulation transforms intractable problems into experiments we can run on a computer. When analytical solutions are impossible—high-dimensional integrals, complex probability calculations, distributions without closed forms—we replace mathematics with simulation. The law of large numbers guarantees that averages of random samples converge to expectations; the central limit theorem tells us how fast. These fundamental results turn simulation from ad-hoc experimentation into rigorous methodology.
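The core idea can be sketched in a few lines. This is a minimal illustration (not a fragment of the chapter's code): it estimates \(\mathbb{E}[g(U)]\) for \(g(x) = x^2\) with \(U \sim \text{Uniform}(0,1)\), whose true value is \(1/3\), and reports the CLT-based standard error alongside the sample average.

```python
import math
import random

def mc_estimate(g, n, seed=0):
    """Return the Monte Carlo estimate of E[g(U)] and its standard error."""
    rng = random.Random(seed)
    samples = [g(rng.random()) for _ in range(n)]
    mean = sum(samples) / n
    # Unbiased sample variance, then the O(1/sqrt(n)) standard error.
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    se = math.sqrt(var / n)
    return mean, se

mean, se = mc_estimate(lambda x: x * x, 100_000)
# A 95% confidence interval from the CLT is mean +/- 1.96 * se.
print(f"estimate = {mean:.4f}, standard error = {se:.5f}")
```

The standard error shrinks like \(1/\sqrt{n}\): quadrupling the sample size halves the error bar, regardless of the dimension of the underlying integral.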
The chapter opens with a paradox at the heart of computational randomness: how can deterministic algorithms produce sequences that behave randomly? Pseudo-random number generators create long sequences from short seeds, sequences that pass stringent statistical tests for uniformity and independence. Understanding this machinery—linear congruential generators, the Mersenne Twister, modern PCG algorithms—demystifies the random() function and enables proper seed management for reproducibility.
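A linear congruential generator makes the deterministic-yet-random-looking paradox concrete. The sketch below uses the classic Park-Miller "minimal standard" parameters for illustration only; it is not the generator behind any particular library's random() function.

```python
# Minimal linear congruential generator: state_{k+1} = (a * state_k) mod m.
class LCG:
    MODULUS = 2**31 - 1   # m: a Mersenne prime
    MULTIPLIER = 16807    # a: a primitive root mod m, giving full period m - 1

    def __init__(self, seed=1):
        if not 0 < seed < self.MODULUS:
            raise ValueError("seed must lie in (0, modulus)")
        self.state = seed

    def next_uniform(self):
        """Advance the state and return a float in (0, 1)."""
        self.state = (self.MULTIPLIER * self.state) % self.MODULUS
        return self.state / self.MODULUS

rng = LCG(seed=42)
print([round(rng.next_uniform(), 4) for _ in range(3)])
```

Because the sequence is a pure function of the seed, rerunning with the same seed reproduces the same stream exactly, which is what makes seed management for reproducibility possible.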
From uniform variates, we build generators for arbitrary distributions. The inverse CDF method exploits a beautiful theorem: if \(U \sim \text{Uniform}(0,1)\), then \(F^{-1}(U)\) has distribution \(F\). When inverse CDFs lack closed forms, transformation methods like Box-Muller convert uniforms to normals through clever geometry. When all else fails, rejection sampling generates from any target distribution by accepting or rejecting proposals from a simpler distribution. The chapter culminates with variance reduction techniques—antithetic variates, control variates, importance sampling, stratification—that can improve Monte Carlo efficiency by orders of magnitude, transforming naive simulation into precision computation.
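The inverse CDF theorem in action, as a small sketch: the Exponential(rate) distribution has CDF \(F(x) = 1 - e^{-\lambda x}\), which inverts in closed form to \(F^{-1}(u) = -\ln(1-u)/\lambda\), so applying that inverse to uniform variates yields exponential draws.

```python
import math
import random

def sample_exponential(rate, n, seed=0):
    """Draw n Exponential(rate) variates via the inverse CDF method."""
    rng = random.Random(seed)
    # Apply F^{-1}(u) = -ln(1 - u) / rate to each uniform variate.
    # (Since random() returns values in [0, 1), 1 - u stays positive.)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

draws = sample_exponential(rate=2.0, n=50_000)
print(f"sample mean = {sum(draws) / len(draws):.3f} (theory: 1/rate = 0.5)")
```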
Learning Objectives: Upon completing this chapter, you will be able to:
Monte Carlo Fundamentals
- Define Monte Carlo estimators as sample averages approximating expectations
- Apply the law of large numbers to establish consistency of Monte Carlo estimates
- Derive confidence intervals using the central limit theorem and Monte Carlo standard errors
- Analyze the \(O(1/\sqrt{n})\) convergence rate and its dimension-independence advantage

Pseudo-Random Number Generation
- Implement linear congruential generators and analyze their period and lattice structure
- Evaluate PRNG quality using statistical tests (chi-square, runs, spectral)
- Design seed management systems for reproducibility and parallel streams
- Compare algorithms (LCG, Mersenne Twister, PCG) and their trade-offs

Random Variate Generation
- Apply the inverse CDF method to continuous and discrete distributions
- Implement efficient algorithms for discrete sampling (binary search, alias method)
- Transform uniform variates to normal using Box-Muller and Marsaglia polar methods
- Design rejection sampling algorithms with optimal proposal distributions

Variance Reduction
- Implement antithetic variates by exploiting negative correlation
- Apply control variates using known expectations of correlated quantities
- Design importance sampling schemes that concentrate samples in high-contribution regions
- Construct stratified and Latin hypercube sampling for improved space coverage
- Analyze efficiency gains and determine when each technique applies
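As a preview of the variance-reduction objectives, here is a hedged sketch of antithetic variates (one illustrative example, not the chapter's reference implementation). It estimates \(\mathbb{E}[e^U]\) for \(U \sim \text{Uniform}(0,1)\), whose true value is \(e - 1\): pairing each draw \(u\) with \(1-u\) makes the two function evaluations negatively correlated, so their average has lower variance than two independent draws.

```python
import math
import random

def antithetic_estimate(g, n_pairs, seed=0):
    """Estimate E[g(U)] using antithetic pairs (u, 1 - u)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_pairs):
        u = rng.random()
        total += 0.5 * (g(u) + g(1.0 - u))  # average over the antithetic pair
    return total / n_pairs

est = antithetic_estimate(math.exp, 10_000)
print(f"antithetic estimate = {est:.4f} (theory: e - 1 = {math.e - 1:.4f})")
```

For a monotone integrand like \(e^u\), the pair's negative correlation cancels much of the sampling noise; the same number of function evaluations spent on independent draws would give a noticeably wider confidence interval.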
Sections
- Section 2.1 Monte Carlo Fundamentals
- The Historical Development of Monte Carlo Methods
- The Core Principle: Expectation as Integration
- Theoretical Foundations
- Variance Estimation and Confidence Intervals
- Worked Examples
- Comparison with Deterministic Methods
- Sample Size Determination
- Convergence Diagnostics and Monitoring
- Practical Considerations
- Chapter 2.1 Exercises: Monte Carlo Fundamentals Mastery
- Bringing It All Together
- Transition to What Follows
- References
- Section 2.2 Uniform Random Variates
- Why Uniform? The Universal Currency of Randomness
- The Paradox of Computational Randomness
- Chaotic Dynamical Systems: An Instructive Failure
- Linear Congruential Generators
- Shift-Register Generators
- The KISS Generator: Combining Strategies
- Modern Generators: Mersenne Twister and PCG
- Statistical Testing of Random Number Generators
- Practical Considerations
- Chapter 2.2 Exercises: Uniform Random Variates Mastery
- Bringing It All Together
- Transition to What Follows
- References
- Section 2.3 Inverse CDF Method
- Mathematical Foundations
- Continuous Distributions with Closed-Form Inverses
- Numerical Inversion
- Discrete Distributions
- Mixed Distributions
- Practical Considerations
- Chapter 2.3 Exercises: Inverse CDF Method Mastery
- Bringing It All Together
- Transition to What Follows
- References
- Section 2.4 Transformation Methods
- Why Transformation Methods?
- The Box–Muller Transform
- The Polar (Marsaglia) Method
- Method Comparison: Box–Muller vs Polar vs Ziggurat
- The Ziggurat Algorithm
- The CLT Approximation (Historical)
- Distributions Derived from the Normal
- Multivariate Normal Generation
- Implementation Guidance
- Chapter 2.4 Exercises: Transformation Methods Mastery
- Bringing It All Together
- References
- Section 2.5 Rejection Sampling
- The Dartboard Intuition
- The Accept-Reject Algorithm
- Efficiency Analysis
- Choosing the Proposal Distribution
- Python Implementation
- The Squeeze Principle
- Geometric Example: Sampling from the Unit Disk
- Worked Examples
- Limitations and the Curse of Dimensionality
- Connections to Other Methods
- Practical Considerations
- Chapter 2.5 Exercises: Rejection Sampling Mastery
- Bringing It All Together
- References
- Section 2.6 Variance Reduction Methods
- The Variance Reduction Paradigm
- Importance Sampling
- Control Variates
- Antithetic Variates
- Stratified Sampling
- Common Random Numbers
- Conditional Monte Carlo (Rao–Blackwellization)
- Combining Variance Reduction Techniques
- Practical Considerations
- Bringing It All Together
- Chapter 2.6 Exercises: Variance Reduction Mastery
- References
- Section 2.7 Chapter 2 Summary