Multi-Fidelity Monte Carlo
PyApprox Tutorial Library
Learning Objectives
After completing this tutorial, you will be able to:
- Explain how the MFMC recursive derivation builds the estimator from two-model ACV
- Describe the nested sample structure \(\mathcal{Z}_0 \subset \cdots \subset \mathcal{Z}_M\) and contrast it with MLMC’s independent sets
- Explain why only \(f_1\) directly reduces HF variance in both MFMC and MLMC
- Identify settings where MFMC and MLMC perform similarly vs differently
Prerequisites
Complete the Multi-Level Monte Carlo tutorial before this one.
Building the Estimator Recursively
MFMC is constructed by starting from the two-model ACV estimator and recursively improving each LF mean estimate with the next cheaper model.
Step 1. Start with the two-model ACV estimator:
\[ \hat{\mu}_0 = \hat{\mu}_0(\mathcal{Z}_0) + \eta_1\bigl(\hat{\mu}_1(\mathcal{Z}_0) - \hat{\mu}_1(\mathcal{Z}_1)\bigr). \]
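Step 1 can be sketched numerically with plain numpy. The two model functions below are hypothetical stand-ins (not PyApprox models), and the weight is a fixed placeholder rather than the optimised value:

```python
import numpy as np

# Numerical sketch of the two-model ACV estimator with a nested pair
# Z0 ⊂ Z1. The models here are hypothetical stand-ins, not PyApprox.
rng = np.random.default_rng(0)
f0 = lambda z: np.sin(np.pi * z) + 0.1 * z**2  # "high fidelity", E[f0] = 1/30 on U(-1, 1)
f1 = lambda z: np.sin(np.pi * z)               # cheap, highly correlated surrogate

z1 = rng.uniform(-1.0, 1.0, 1000)  # full low-fidelity set Z1
z0 = z1[:100]                      # nested high-fidelity subset Z0

eta1 = -0.9  # any fixed weight leaves the estimator unbiased
mu_hat = f0(z0).mean() + eta1 * (f1(z0).mean() - f1(z1).mean())
print(mu_hat)
```

Because `z0` is a slice of `z1`, the nesting \(\mathcal{Z}_0 \subset \mathcal{Z}_1\) holds by construction, and \(f_1\) is evaluated on both sets exactly as the formula requires.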
Step 2. Rather than taking \(\hat{\mu}_1(\mathcal{Z}_1)\) as a raw mean, improve it with \(f_2\) using the same ACV pattern, \(\hat{\mu}_1(\mathcal{Z}_1) \mapsto \hat{\mu}_1(\mathcal{Z}_1) + \eta_2\bigl(\hat{\mu}_2(\mathcal{Z}_1) - \hat{\mu}_2(\mathcal{Z}_2)\bigr)\), then substitute back:
\[ \hat{\mu}_0 = \hat{\mu}_0(\mathcal{Z}_0) + \eta_1\bigl(\hat{\mu}_1(\mathcal{Z}_0) - \hat{\mu}_1(\mathcal{Z}_1)\bigr) - \underbrace{\eta_1 \eta_2}_{\text{weight product}} \bigl(\hat{\mu}_2(\mathcal{Z}_1) - \hat{\mu}_2(\mathcal{Z}_2)\bigr). \]
(The minus sign on the product term is immaterial: the weights are free parameters, so it is absorbed when they are re-labelled in the next step.)
Step 3. Repeating this substitution for all \(M\) LF models and re-labelling the accumulated weight products as a single effective weight \(\eta_\alpha\) per level gives the compact MFMC estimator:
\[ \hat{\mu}_0^{\text{MFMC}} = \hat{\mu}_0(\mathcal{Z}_0) + \sum_{\alpha=1}^{M} \eta_\alpha \bigl(\hat{\mu}_\alpha(\mathcal{Z}_{\alpha-1}) - \hat{\mu}_\alpha(\mathcal{Z}_\alpha)\bigr). \tag{1}\]
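Equation 1 translates directly into a short function. The following is a plain-numpy sketch of the formula, not the PyApprox API; the three-model hierarchy used to exercise it is hypothetical:

```python
import numpy as np

def mfmc_estimate(models, samples, eta):
    """Evaluate the MFMC estimator of Eq. (1) with plain numpy.

    models:  [f0, f1, ..., fM], highest fidelity first
    samples: nested 1D arrays Z0 ⊂ Z1 ⊂ ... ⊂ ZM
    eta:     weights; eta[0] is unused, eta[1..M] multiply the corrections
    """
    mu = models[0](samples[0]).mean()  # high-fidelity baseline
    for a in range(1, len(models)):
        # f_alpha is run on the shared subset Z_{alpha-1} AND its full set Z_alpha
        mu += eta[a] * (models[a](samples[a - 1]).mean()
                        - models[a](samples[a]).mean())
    return mu

# Hypothetical three-model hierarchy (placeholder weights, not optimised)
rng = np.random.default_rng(1)
f0 = lambda z: np.cos(z) + 0.05 * z   # "high fidelity", E[f0] = sin(1) on U(-1, 1)
f1 = lambda z: np.cos(z)              # drops the cheap perturbation
f2 = lambda z: 1.0 - z**2 / 2         # Taylor surrogate of cos

z_all = rng.uniform(-1.0, 1.0, 4000)
samples = [z_all[:50], z_all[:500], z_all]  # Z0 ⊂ Z1 ⊂ Z2 by construction
mu_hat = mfmc_estimate([f0, f1, f2], samples, eta=[0.0, -1.0, -0.8])
```

Building every \(\mathcal{Z}_\alpha\) as a prefix slice of one array makes the nesting explicit and guarantees the shared-subset evaluations line up.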
The nested structure \(\mathcal{Z}_0 \subset \mathcal{Z}_1 \subset \cdots \subset \mathcal{Z}_M\) ensures that model \(f_\alpha\) is evaluated on both \(\mathcal{Z}_{\alpha-1}\) (the shared subset with the model above) and \(\mathcal{Z}_\alpha\) (the full set including its exclusive samples), mirroring the ACV structure at every level.
The estimator is unbiased for any choice of \(\eta_\alpha\), since each correction term has mean zero. The weights are then chosen to minimise variance — see MFMC Analysis for the full derivation.
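The unbiasedness claim is easy to check empirically: average the two-model estimator over many replicates for several deliberately arbitrary weights and confirm the bias stays near zero. The toy models below are hypothetical, not part of PyApprox:

```python
import numpy as np

# Empirical unbiasedness check for the two-model correction term.
rng = np.random.default_rng(2)
f0 = lambda z: z**2   # E[f0] = 1/3 on U(0, 1)
f1 = lambda z: z      # correlated low-fidelity stand-in

biases = {}
for eta in (-1.0, 0.3, 5.0):  # deliberately arbitrary weights
    trials = []
    for _ in range(2000):
        z1 = rng.uniform(0.0, 1.0, 200)
        z0 = z1[:20]  # nested high-fidelity subset
        trials.append(f0(z0).mean() + eta * (f1(z0).mean() - f1(z1).mean()))
    biases[eta] = np.mean(trials) - 1.0 / 3.0

# Each correction term has mean zero, so every bias should be ~0,
# even for the badly chosen eta = 5.0 (which only inflates the variance).
print(biases)
```

A poor weight costs variance, never bias; that is exactly why the weights can be optimised freely afterwards.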
Key Takeaways
- MFMC builds the estimator recursively: start from two-model ACV and replace each raw LF mean with an improved ACV estimate at the next level
- The result is a sum of corrections with nested sample sets and optimised weights
- MFMC optimises weights; MLMC fixes them at \(-1\)
Exercises
Write out the three-model MFMC estimator explicitly using the weight-product form from the recursive derivation. Verify that it matches Equation 1.
If \(\rho_{0,1} = 0.99\) and \(\rho_{0,2} = 0.50\), does adding \(f_2\) significantly improve variance over using \(f_1\) alone in MFMC? Why or why not?
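As a back-of-the-envelope hint for the second exercise (not the full MFMC variance formula, which is derived in MFMC Analysis): the best-case variance reduction from a single control variate, ignoring cost, is the factor \(1-\rho^2\):

```python
# Best-case single-control-variate variance reduction factor, 1 - rho^2.
# Cost effects and multi-model interactions are deliberately ignored here.
for rho in (0.99, 0.50):
    print(f"rho = {rho}: variance reduced to {1 - rho**2:.4f} of plain MC")
```

Comparing the two factors suggests how much headroom is left for \(f_2\) once \(f_1\) has already been exploited.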
Next Steps
- MFMC Analysis — Derive the MFMC variance formula and optimal sample ratios \(r_\alpha^*\)
- API Cookbook — Use the PyApprox MFMC API end-to-end
Ready to try this? See API Cookbook → MFMCEstimator.
References
- [PWGSIAM2016] B. Peherstorfer, K. Willcox, M. Gunzburger. Optimal Model Management for Multifidelity Monte Carlo Estimation. SIAM Journal on Scientific Computing, 38(5):A3163–A3194, 2016.