Tutorials

Browse 80+ validated, executable tutorials covering forward UQ, multifidelity estimation, inverse UQ, experimental design, surrogate modeling, and sensitivity analysis.

Tutorial Library

80+ validated, executable tutorials organized by topic. Each tutorial is dual-backend tested (NumPy and PyTorch) and reproducible. Start with the learning path below, or jump to the topic that fits your application.

Start Here

New to PyApprox? Follow this learning path:

4. Pick a Topic: Choose a topic below that matches your application.


Run Tutorials in Google Colab

Every tutorial is available as a Jupyter notebook — download, upload, and run in three steps.

1. Download: Download any .ipynb file from the notebooks folder.

2. Upload: Open Google Colab and upload the notebook (File → Upload notebook).

3. Run: Run the first cell to install PyApprox, then run all remaining cells.
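The steps above assume PyApprox is installable from PyPI under the package name `pyapprox` (check the notebook's own install cell if it differs). The first Colab cell then contains a single shell command, prefixed with `!` inside a notebook cell:

```shell
# First notebook cell: install PyApprox into the Colab runtime
pip install pyapprox
```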


Forward UQ

Propagate input uncertainty through computational models to characterize output variability.

Forward Uncertainty Quantification (Concept): The forward UQ problem: propagating input uncertainty through a model to characterize output uncertainty.

Random Variables and Distributions (Concept): Specifying parameter uncertainty with independent and copula-based distributions.

Monte Carlo Sampling (Hands-on): Estimating QoI statistics by random sampling: variability, sample size effects, and multi-output covariance.

Estimator Accuracy and MSE (Analysis): Bias, variance, and MSE of Monte Carlo estimators for scalar and vector-valued statistics.

Quasi–Monte Carlo Sampling (Concept): Low-discrepancy sequences fill space more evenly than random samples, yielding faster convergence for smooth integrands.

Practical QMC: Scrambling, Error Bars, and High Dimensions (Concept): Owen scrambling for error estimation, star discrepancy as a measure of uniformity, and how the QMC advantage depends on dimension.

Importance Sampling for Rare Events (Concept): When Monte Carlo needs millions of samples to see a single failure, importance sampling targets the tail and corrects with likelihood ratios.

Monte Carlo Budget Estimation (Hands-on): Using pilot samples and the estimator API to predict the number of samples needed for a target accuracy.
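The Monte Carlo workflow described above can be sketched in a few lines of plain NumPy (an illustrative sketch, not the PyApprox estimator API): estimate a QoI statistic by the sample mean and attach a standard error that shrinks like 1/sqrt(N). The model f(x) = x**2 with x ~ Uniform(0, 1) is a made-up example whose exact mean is 1/3.

```python
import numpy as np

# Illustrative sketch in plain NumPy (not the PyApprox API): estimate the
# mean of the QoI f(X) = X**2 with X ~ Uniform(0, 1) by Monte Carlo.
# The exact mean is 1/3, and the standard error shrinks like 1/sqrt(N).
rng = np.random.default_rng(0)

def qoi(x):
    return x**2

nsamples = 100_000
samples = rng.uniform(0.0, 1.0, nsamples)
values = qoi(samples)

mean_est = values.mean()
std_err = values.std(ddof=1) / np.sqrt(nsamples)  # estimator standard error
print(mean_est, std_err)
```

Doubling the accuracy requires quadrupling the sample count, which is the motivation for the multifidelity estimators in the next section.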


Multifidelity Estimation

Reduce computational cost by combining multiple model fidelities with optimal sample allocation.

Control Variate Monte Carlo (Concept): Reducing MC estimator variance using a correlated low-fidelity model with a known statistic.

Control Variate Analysis (Analysis): Deriving the optimal control variate weight, variance reduction formula, and unbiasedness.

ACV Concept: Approximate control variates: extending CVMC to multiple low-fidelity models without known statistics.

General ACV Concept: How using all LF models as direct control variates breaks the CV-1 variance ceiling.

General ACV Analysis: Deriving the optimal weight matrix, minimum covariance, allocation matrices, and covariance formulas for all ACV estimators.

MFMC Concept: Multi-fidelity Monte Carlo: exploiting a hierarchy of model fidelities for variance reduction.

MFMC Analysis: Optimal sample ratios, variance reduction formulas, and numerical verification of MFMC and MLMC.

Multi-Level Monte Carlo (Concept): Estimating high-fidelity model statistics using a hierarchy of models by accumulating corrections between consecutive levels.

MLMC Analysis: Deriving the optimal MLMC sample allocation, minimum achievable cost, and the MLMC complexity theorem.

Multi-Output ACV Concept: Why estimating several QoIs jointly with multi-fidelity corrections improves each individual estimate.

Multi-Output ACV Analysis: Deriving the MOACV optimal weight matrix and the condition under which joint estimation beats independent scalar ACVs.

PACV Concept: How a single recursion index unifies MLMC, MFMC, ACVMF, and ACVIS as special cases of one family.

PACV Analysis: Formal definition of the GMF, GRD, and GIS families and the covariance formulas enabling enumeration-based best-estimator selection.

MLBLUE Concept: The multi-level best linear unbiased estimator: optimal weights from a single linear system.

MLBLUE Analysis: Deriving the MLBLUE weight matrix, variance bound, and comparison with ACV families.

Pilot Studies Concept: Using small pilot samples to estimate the covariance structure needed by multi-fidelity estimators.

Pilot Studies Analysis: Analyzing pilot study accuracy and its effect on downstream estimator performance.

Ensemble Selection Concept: Choosing which subset of available models to include in a multi-fidelity estimator.

API Cookbook (Cookbook): One-stop reference for all multi-fidelity estimators.
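The control variate idea that underlies this whole section can be sketched in plain NumPy (an illustrative sketch, not the PyApprox estimator API). The high-fidelity model f(x) = exp(x) and the low-fidelity model g(x) = 1 + x are made-up examples; g has the known mean 1.5 for x ~ Uniform(0, 1), so it qualifies as a classical control variate.

```python
import numpy as np

# Illustrative control-variate Monte Carlo sketch in plain NumPy (not the
# PyApprox estimator API). High-fidelity model: f(x) = exp(x); correlated
# low-fidelity model: g(x) = 1 + x, whose mean over Uniform(0, 1) is
# exactly 1.5 and therefore usable as a control variate.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 50_000)

f = np.exp(x)      # high-fidelity samples
g = 1.0 + x        # low-fidelity samples with known mean
mu_g = 1.5

# Optimal control-variate weight: alpha = cov(f, g) / var(g)
alpha = np.cov(f, g)[0, 1] / np.var(g, ddof=1)

mc_est = f.mean()                             # plain Monte Carlo
cv_est = f.mean() + alpha * (mu_g - g.mean())  # control-variate estimate

# Variance reduction factor is 1 / (1 - rho**2) with rho = corr(f, g)
rho = np.corrcoef(f, g)[0, 1]
print(mc_est, cv_est, rho)
```

The ACV, MFMC, MLMC, and MLBLUE tutorials generalize exactly this construction to multiple models whose statistics are not known in advance.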


Inverse UQ

Infer model parameters from data using Bayesian inference, MCMC, and variational methods.

Introduction to Bayesian Inference (Concept): From data to parameters: updating beliefs using Bayes' theorem.

Sampling the Posterior with MCMC (Concept): Using Markov chain Monte Carlo to generate posterior samples when grid evaluation is infeasible.

Practical MCMC: MH and DRAM (Hands-on): Tuning proposal distributions, adaptive methods, and convergence diagnostics for MCMC.

Introduction to Variational Inference (Concept): Approximating the posterior with optimization instead of sampling.

The Variational Objective (Analysis): Deriving the ELBO and understanding the KL divergence minimization.

Optimizing the ELBO (Analysis): Gradient-based optimization of the evidence lower bound.

Variational Families (Concept): Mean-field, full-rank, and normalizing flow variational families.

Amortized Variational Inference (Hands-on): Learning inference networks that generalize across observations.
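The MCMC entries above can be illustrated with a minimal random-walk Metropolis sampler in plain NumPy (an illustrative sketch, not the PyApprox MCMC utilities). The target is a made-up posterior chosen so the answer is known: a Gaussian with mean 2 and standard deviation 1.

```python
import numpy as np

# Minimal random-walk Metropolis sketch (plain NumPy, not the PyApprox
# MCMC utilities). Target: unnormalized log-density of N(2, 1).
rng = np.random.default_rng(2)

def log_post(theta):
    return -0.5 * (theta - 2.0) ** 2

nsteps, step_size = 20_000, 1.0
chain = np.empty(nsteps)
theta, lp = 0.0, log_post(0.0)
accepted = 0
for i in range(nsteps):
    prop = theta + step_size * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
        accepted += 1
    chain[i] = theta

burned = chain[5000:]  # discard burn-in
print(burned.mean(), burned.std(), accepted / nsteps)
```

The DRAM and variational tutorials address what this sketch leaves out: tuning `step_size` adaptively and replacing sampling with optimization.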


Experimental Design

Choose optimal experiments to maximize information gain or minimize prediction uncertainty.

Introduction to Experimental Design (Concept): Why sensor placement matters: comparing designs by posterior covariance and Expected Information Gain.

KL-Based OED

OED: KL Concept: What Expected Information Gain measures and why it is the right objective for choosing experiments.

OED: Double-Loop Estimator (Analysis): How the double-loop MC estimator approximates EIG by treating the marginal evidence as a nested average.

OED: Gradients (Analysis): The reparameterization trick, the C1+C2+C3 gradient decomposition, and the corrected Hessian.

OED: Convergence (Hands-on): Using KLOEDDiagnostics to measure MSE, decompose bias and variance, and confirm convergence.

OED: MC vs Halton (Analysis): Why randomly shifted Halton sequences reduce MSE compared to i.i.d. MC for the double-loop EIG estimator.

OED: Nonlinear Usage (Hands-on): Applying EIG-based OED to a Lotka-Volterra predator-prey system.

OED: Design Stability (Analysis): Demonstrating that OED designs depend on MC realizations at a fixed budget and stabilize as the budget grows.

Goal-Oriented OED

Goal-Oriented OED: Concept: Why targeting prediction uncertainty rather than parameter uncertainty changes which experiments are optimal.

Goal-Oriented OED: Gaussian Analysis: Closed-form posterior mean distribution, entropic risk deviation, and AVaR deviation for Gaussian push-forwards.

Goal-Oriented OED: Log-Normal Analysis: Deriving the distribution and expected AVaR deviation of the standard deviation of a log-normal QoI.

Goal-Oriented OED: Convergence (Hands-on): Verifying that numerical risk-aware utility estimates converge to analytical values with MC vs. Halton sampling.

Goal-Oriented OED: Nonlinear Usage (Hands-on): Comparing std-deviation and entropic deviation objectives for goal-oriented OED on Lotka-Volterra.

OED Workflows

OED Data Workflow (Hands-on): Generating, saving, and reusing expensive model evaluations so OED optimization avoids re-running the forward model.
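For linear Gaussian models the design-ranking idea from the introduction has a closed form, which can be sketched in plain NumPy (an illustrative sketch, not the PyApprox OED API). With y = A(design) theta + noise and prior theta ~ N(0, I), the Expected Information Gain is EIG = 0.5 * logdet(I + A^T A / sigma^2), so two sensor designs can be compared exactly. The model here is a made-up line u(x) = theta_0 + theta_1 * x observed at chosen sensor locations.

```python
import numpy as np

# Illustrative linear-Gaussian OED sketch (plain NumPy, not the PyApprox
# OED API). Rows of the design matrix correspond to sensors placed at
# locations x for the model u(x) = theta_0 + theta_1 * x.
sigma2 = 0.1**2  # observation noise variance

def design_matrix(xs):
    return np.column_stack([np.ones_like(xs), xs])

def eig(xs):
    # Closed-form EIG for prior N(0, I): 0.5 * logdet(I + A^T A / sigma^2)
    A = design_matrix(np.asarray(xs, dtype=float))
    M = np.eye(2) + A.T @ A / sigma2
    return 0.5 * np.linalg.slogdet(M)[1]

clustered = eig([0.5, 0.5])  # two sensors at the same location
spread = eig([0.0, 1.0])     # sensors at opposite ends of the domain
print(clustered, spread)
```

Spreading the sensors identifies both the intercept and the slope, so the spread design attains the larger information gain; the double-loop tutorials replace this closed form with nested Monte Carlo for nonlinear models.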


Approximation Theory

Mathematical foundations for interpolation, quadrature, and polynomial approximation.

Lagrange Interpolation (Concept): Lagrange basis polynomials, the Runge phenomenon, and optimal node placement.

Piecewise Interpolation (Concept): Local basis functions for smooth and discontinuous functions.

Tensor Product Interpolation (Concept): Multivariate approximation via Cartesian products of 1D bases.

Gauss Quadrature (Concept): Optimal nodes from orthogonal polynomials with exponential convergence.

Piecewise Polynomial Quadrature (Analysis): Rectangle, trapezoid, and Simpson's rules with error derivations.

Leja Sequences (Concept): Nested points for adaptive interpolation via greedy determinant maximization.
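The Gauss quadrature claim above is easy to demonstrate with NumPy's built-in Gauss-Legendre rule (an illustrative sketch, not the PyApprox quadrature API): an n-point rule integrates polynomials of degree 2n - 1 exactly, so 3 points suffice for x**5 on [-1, 1], and convergence is exponentially fast for smooth integrands such as exp(x).

```python
import numpy as np

# Illustrative Gauss-Legendre quadrature sketch (plain NumPy, not the
# PyApprox quadrature API). leggauss(n) returns nodes and weights for
# the interval [-1, 1].
nodes, weights = np.polynomial.legendre.leggauss(3)

# Degree-5 polynomial: the integral of x**5 over [-1, 1] is exactly 0,
# and a 3-point rule reproduces it to machine precision.
poly_quad = weights @ nodes**5

# Smooth integrand: the integral of exp(x) over [-1, 1] is e - 1/e.
exp_quad = weights @ np.exp(nodes)
exact = np.e - 1.0 / np.e
print(poly_quad, exp_quad, exact)
```

Three evaluations already place the exp integral within a fraction of a percent, which is the exponential convergence the tutorial derives.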


Surrogate Modeling

Build fast approximations of expensive models using polynomials, sparse grids, and Gaussian processes.

The Surrogate Modeling Workflow (Concept): Build your first surrogate model and learn the four-step workflow that every surrogate method follows.

What Makes a Good Surrogate? (Concept): How the choice of ansatz, training data, and fitting strategy affects surrogate quality, and how different surrogates exploit different kinds of function structure.

Polynomial Chaos

Building a Polynomial Chaos Surrogate (Hands-on): Approximate an expensive model with a polynomial chaos expansion.

Overfitting and Cross-Validation (Hands-on): Choose the polynomial degree using LOO cross-validation.

Sparse Polynomial Approximation (Hands-on): OMP and BPDN/LASSO for high-dimensional polynomial chaos.

UQ with Polynomial Chaos (Hands-on): Extract analytical moments, marginal densities, and convergence comparisons from a fitted PCE.

Function Trains

Function Train Surrogates (Concept): Build intuition for low-rank functions and the function train decomposition.

UQ with Function Trains (Hands-on): Extract analytical moments, marginal densities, and input marginals from a fitted function train.

Sparse Grids

Isotropic Sparse Grids (Concept): The Smolyak combination technique for breaking the curse of dimensionality.

Adaptive Sparse Grids (Hands-on): Data-driven refinement for anisotropic functions.

Multifidelity Sparse Grids (Hands-on): Multi-index collocation for model hierarchies with varying cost and accuracy.

Sparse Grid to PCE Conversion (Analysis): Spectral projection converting Lagrange-based sparse grid interpolants into polynomial chaos expansions.

UQ with Sparse Grids (Hands-on): Compute moments from sparse grid quadrature, convert to PCE for Sobol indices, and compare against MC.

Gaussian Processes

Building a GP Surrogate (Hands-on): Approximate an expensive model with a Gaussian process and quantify prediction uncertainty.

UQ with Gaussian Processes (Hands-on): Compute analytical moments, marginal densities, and main effects from a fitted GP.

Adaptive GP Sampling (Hands-on): Choose training points that maximally reduce GP posterior uncertainty using greedy Cholesky and IVAR strategies.

Multi-Fidelity Gaussian Processes (Hands-on): Exploit correlations between model fidelities to build accurate surrogates within a fixed computational budget.
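The four-step workflow mentioned at the top of this section (choose an ansatz, evaluate the model at training points, fit, validate) can be sketched with a simple polynomial surrogate in plain NumPy (an illustrative least-squares sketch, not the PyApprox surrogate API). The model sin(pi * x) on [-1, 1] is a made-up smooth example.

```python
import numpy as np

# Illustrative surrogate workflow sketch (plain NumPy least squares, not
# the PyApprox surrogate API): fit a degree-10 Legendre expansion to the
# smooth model f(x) = sin(pi * x) and validate on held-out points.
def model(x):
    return np.sin(np.pi * x)

# 1. Ansatz: Legendre polynomial basis up to degree 10.
degree = 10

# 2. Training data: 50 equispaced model evaluations on [-1, 1].
x_train = np.linspace(-1.0, 1.0, 50)
V = np.polynomial.legendre.legvander(x_train, degree)  # basis matrix

# 3. Fit: solve the least-squares problem for the expansion coefficients.
coef, *_ = np.linalg.lstsq(V, model(x_train), rcond=None)

# 4. Validate: measure the worst-case error on a dense test grid.
x_test = np.linspace(-1.0, 1.0, 201)
surrogate = np.polynomial.legendre.legval(x_test, coef)
max_err = np.max(np.abs(surrogate - model(x_test)))
print(max_err)
```

For smooth functions the error decays spectrally with degree, which is why the PCE tutorials pair this fit with cross-validation to choose the degree automatically.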


Sensitivity Analysis

Identify which uncertain inputs drive output variability using variance decomposition and screening.

Sensitivity Analysis Concept: Variance decomposition and Sobol indices: identifying which uncertain inputs drive output variability.

Sobol Sensitivity Analysis (Hands-on): Variance-based sensitivity indices for identifying important parameters.

PCE-Based Sensitivity (Hands-on): Computing Sobol indices analytically from polynomial chaos expansion coefficients.

GP-Based Sensitivity (Hands-on): Computing Sobol indices analytically from GP kernel integrals and verifying against fix-and-freeze estimates.

Morris Screening (Hands-on): One-at-a-time elementary effects for cheap parameter screening.

Bin-Based Sensitivity (Hands-on): Variance-based sensitivity indices via binning estimation.

FT Sensitivity: Math (Analysis): Deriving Sobol indices analytically from the PCE FunctionTrain structure.

FT Sensitivity: Usage (Hands-on): Computing Sobol indices analytically with FunctionTrainSensitivity.
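A first-order Sobol index can be estimated with the pick-and-freeze trick in plain NumPy (an illustrative sketch, not the PyApprox sensitivity API). The additive model f(x) = x1 + 2*x2 + 3*x3 with independent Uniform(0, 1) inputs is a made-up example whose indices are known exactly: S_i = a_i**2 / 14.

```python
import numpy as np

# Illustrative pick-and-freeze Sobol estimator (plain NumPy, not the
# PyApprox sensitivity API) for f(x) = x1 + 2*x2 + 3*x3 with independent
# Uniform(0, 1) inputs; exact indices are (1/14, 4/14, 9/14).
rng = np.random.default_rng(4)
coeffs = np.array([1.0, 2.0, 3.0])

def model(x):
    return x @ coeffs

nsamples, nvars = 100_000, 3
A = rng.uniform(size=(nsamples, nvars))
B = rng.uniform(size=(nsamples, nvars))
fA, fB = model(A), model(B)
total_var = np.var(np.concatenate([fA, fB]), ddof=1)

sobol = np.empty(nvars)
for i in range(nvars):
    ABi = A.copy()
    ABi[:, i] = B[:, i]  # copy of A with only column i taken from B
    # Saltelli-style first-order estimator
    sobol[i] = np.mean(fB * (model(ABi) - fA)) / total_var
print(sobol)
```

For an additive model the indices sum to one; the PCE-, GP-, and FT-based tutorials show how to obtain the same quantities analytically from a fitted surrogate instead of by sampling.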


Random Fields

Represent spatially correlated uncertainty using Karhunen-Loève expansions.

KLE Introduction (Concept): The Karhunen-Loève expansion for representing random fields as truncated eigenfunction series.

KLE: Mesh and Data-Driven (Hands-on): Building a KLE from mesh-based covariance or data-driven empirical covariance.

KLE: SPDE Approach (Hands-on): The stochastic PDE approach to efficient random field generation.
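The truncated-eigenfunction idea can be sketched on a 1D grid with plain NumPy (an illustrative sketch, not the PyApprox KLE classes): eigendecompose a discretized exponential covariance, keep the modes that capture 95% of the variance, and draw one realization of the field. The correlation length and grid are made-up example values.

```python
import numpy as np

# Illustrative discrete Karhunen-Loève sketch (plain NumPy, not the
# PyApprox KLE classes): covariance C(x, y) = exp(-|x - y| / ell) on a
# uniform 1D grid.
rng = np.random.default_rng(5)

npoints, ell = 200, 0.2
x = np.linspace(0.0, 1.0, npoints)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

# Symmetric eigendecomposition; reorder eigenpairs to descending order.
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Truncate at 95% of the total variance (the trace of the covariance).
energy = np.cumsum(eigvals) / eigvals.sum()
nmodes = int(np.searchsorted(energy, 0.95) + 1)

# One realization: sum_k sqrt(lambda_k) * phi_k(x) * xi_k, xi_k ~ N(0, 1).
xi = rng.standard_normal(nmodes)
field = eigvecs[:, :nmodes] @ (np.sqrt(eigvals[:nmodes]) * xi)
print(nmodes, field.shape)
```

A short correlation length needs many modes, which is the motivation for the mesh-based and SPDE constructions in the tutorials above.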


Optimization Under Uncertainty

Optimize system design when model inputs are uncertain.

Design Under Uncertainty (Hands-on): Optimizing system design when model inputs are uncertain.
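The core idea can be sketched via sample average approximation in plain NumPy (an illustrative sketch, not the PyApprox optimization API): replace the expected cost with an average over a fixed sample set, then optimize the design deterministically. The cost E[(d - X)**2] with X ~ Uniform(0, 1) is a made-up example whose exact optimum is d* = E[X] = 0.5.

```python
import numpy as np

# Illustrative sample-average-approximation sketch (plain NumPy, not the
# PyApprox optimization API): choose a design d minimizing the expected
# cost E[(d - X)**2] with uncertain input X ~ Uniform(0, 1).
rng = np.random.default_rng(6)
x_samples = rng.uniform(0.0, 1.0, 10_000)  # fixed sample set (common random numbers)

# Evaluate the sample-average cost on a grid of candidate designs and
# pick the minimizer; a gradient-based optimizer would work equally well.
designs = np.linspace(0.0, 1.0, 501)
costs = ((designs[:, None] - x_samples[None, :]) ** 2).mean(axis=1)
d_opt = designs[np.argmin(costs)]
print(d_opt)
```

Fixing the sample set before optimizing makes the objective deterministic and smooth, which is what lets standard optimizers be applied under uncertainty.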