.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_tutorials/analysis/plot_sensitivity_analysis.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_tutorials_analysis_plot_sensitivity_analysis.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_tutorials_analysis_plot_sensitivity_analysis.py:


Sensitivity Analysis
====================
Quantifying the sensitivity of a model output :math:`f` to the model
parameters :math:`\rv` can be an important component of any modeling
exercise. This section demonstrates how to use popular local and global
sensitivity analysis methods.

Sobol Indices
-------------
Any function :math:`f` with finite variance parameterized by a set of
independent variables :math:`\rv` with
:math:`\pdf(\rv)=\prod_{j=1}^d\pdf(\rv_j)` and support
:math:`\rvdom=\bigotimes_{j=1}^d\rvdom_j` can be decomposed into a finite
sum, referred to as the ANOVA decomposition,

.. math:: f(\rv) = \hat{f}_0 + \sum_{i=1}^d \hat{f}_i(\rv_i) + \sum_{i,j=1}^d \hat{f}_{i,j}(\rv_i,\rv_j) + \cdots + \hat{f}_{1,\ldots,d}(\rv_1,\ldots,\rv_d)

or more compactly

.. math:: f(\rv)=\sum_{\V{u}\subseteq\mathcal{D}}\hat{f}_{\V{u}}(\rv_{\V{u}})

where :math:`\hat{f}_\V{u}` quantifies the dependence of the function
:math:`f` on the variable dimensions :math:`i\in\V{u}` and
:math:`\V{u}=(u_1,\ldots,u_s)\subseteq\mathcal{D}=\{1,\ldots,d\}`.

The functions :math:`\hat{f}_\V{u}` can be obtained by integration,
specifically

.. math:: \hat{f}_\V{u}(\rv_\V{u}) = \int_{\rvdom_{\mathcal{D}\setminus\V{u}}}f(\rv)\dx{\pdf_{\mathcal{D} \setminus \V{u}}(\rv)}-\sum_{\V{v}\subset\V{u}}\hat{f}_\V{v}(\rv_\V{v}),

where :math:`\dx{\pdf_{\mathcal{D} \setminus \V{u}}(\rv)}=\prod_{j\notin\V{u}}\dx{\pdf_j(\rv)}` and :math:`\rvdom_{\mathcal{D} \setminus \V{u}}=\bigotimes_{j\notin\V{u}}\rvdom_j`.
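The recursion above can be evaluated by Monte Carlo for simple cases. The following sketch (not part of the original tutorial; the function and sample sizes are illustrative choices) estimates :math:`\hat{f}_0` and a point value of :math:`\hat{f}_1` for :math:`f(\rv_1,\rv_2)=\rv_1+\rv_1\rv_2` with independent uniform inputs on :math:`[-1,1]`, for which :math:`\hat{f}_0=0` and :math:`\hat{f}_1(\rv_1)=\rv_1`.

```python
import numpy as np

rng = np.random.default_rng(0)


def fun(z1, z2):
    # Test function with a known ANOVA decomposition:
    # f0 = 0, f1(z1) = z1, f2(z2) = 0, f12(z1, z2) = z1*z2
    return z1 + z1 * z2


nsamples = 100000
z1 = rng.uniform(-1, 1, nsamples)
z2 = rng.uniform(-1, 1, nsamples)

# f0 is the mean of f over all variables
f0 = fun(z1, z2).mean()

# f1 at a fixed point z1 = 0.5: integrate out z2, then subtract f0
z1_fixed = 0.5
f1_at_half = fun(np.full(nsamples, z1_fixed), z2).mean() - f0

print(f0)          # close to 0
print(f1_at_half)  # close to z1_fixed = 0.5 since f1(z1) = z1
```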
The first-order terms :math:`\hat{f}_{\V{u}}(\rv_i)`,
:math:`\lVert \V{u}\rVert_{0}=1`, represent the effect of a single variable
acting independently of all others. Similarly, the second-order terms
:math:`\lVert\V{u}\rVert_{0}=2` represent the contributions of two variables
acting together, and so on.

The terms of the ANOVA expansion are orthogonal, i.e. the weighted
:math:`L^2` inner product :math:`(\hat{f}_\V{u},\hat{f}_\V{v})_{L^2_\pdf}=0`
for :math:`\V{u}\neq\V{v}`. This orthogonality facilitates the following
decomposition of the variance of the function :math:`f`

.. math:: \var{f}=\sum_{\V{u}\subseteq\mathcal{D}}\var{\hat{f}_\V{u}}, \qquad \var{\hat{f}_\V{u}} = \int_{\rvdom_{\V{u}}} \hat{f}^2_{\V{u}} \dx{\pdf_\V{u}},

where :math:`\dx{\pdf_{\V{u}}(\rv)}=\prod_{j\in\V{u}}\dx{\pdf_j(\rv)}`.

The quantities :math:`\var{\hat{f}_\V{u}}/ \var{f}` are referred to as Sobol
indices [SMCS2001]_ and are frequently used to estimate the sensitivity of
:math:`f` to single, or combinations of, input parameters. Note that this is
a *global* sensitivity, reflecting a variance attribution over the range of
the input parameters, as opposed to the local sensitivity reflected by a
derivative.

Two popular measures of sensitivity are the main effect and total effect
indices, given respectively by

.. math:: S_{i} = \frac{\var{\hat{f}_{\V{e}_i}}}{\var{f}}, \qquad S^T_{i} = \frac{\sum_{\V{u}\in\mathcal{J}}\var{\hat{f}_{\V{u}}}}{\var{f}}

where :math:`\V{e}_i` is the unit vector with only one non-zero entry,
located at the :math:`i`-th element, and
:math:`\mathcal{J} = \{\V{u}: i\in\V{u}\}`. Sobol indices can be computed in
a number of different ways. In the following we will use polynomial chaos
expansions, as in [SRESS2008]_.
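Before turning to polynomial chaos, it is worth noting that these indices can also be estimated directly by sampling. The sketch below (an illustration, not the tutorial's method) uses the pick-freeze Monte Carlo estimators of Saltelli and Jansen on the same Ishigami benchmark (:math:`a=7`, :math:`b=0.1`, uniform inputs on :math:`[-\pi,\pi]^3`) that the tutorial studies next.

```python
import numpy as np

rng = np.random.default_rng(42)


def ishigami(x, a=7, b=0.1):
    # Ishigami test function; x has shape (nsamples, 3)
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2
            + b * x[:, 2]**4 * np.sin(x[:, 0]))


nsamples, nvars = 2**14, 3
# Two independent sample matrices on the Ishigami domain [-pi, pi]^3
A = rng.uniform(-np.pi, np.pi, (nsamples, nvars))
B = rng.uniform(-np.pi, np.pi, (nsamples, nvars))

fA, fB = ishigami(A), ishigami(B)
var_f = np.concatenate([fA, fB]).var()

main, total = np.empty(nvars), np.empty(nvars)
for i in range(nvars):
    AB = A.copy()
    AB[:, i] = B[:, i]  # replace column i of A with column i of B
    fAB = ishigami(AB)
    # Saltelli estimator of V_i and Jansen estimator of the total variance
    main[i] = np.mean(fB * (fAB - fA)) / var_f
    total[i] = np.mean((fA - fAB)**2) / (2 * var_f)

print(main)   # approx [0.31, 0.44, 0.00]
print(total)  # approx [0.56, 0.44, 0.24]
```

Each extra variable costs one additional model-evaluation matrix, so this approach needs :math:`n(d+2)` evaluations; surrogate-based estimates such as the polynomial chaos expansion used below are typically far cheaper for expensive models.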
.. GENERATED FROM PYTHON SOURCE LINES 40-58

.. code-block:: default

    import matplotlib.pyplot as plt
    from pyapprox.benchmarks import setup_benchmark
    from pyapprox.surrogates import approximate
    from pyapprox import analysis

    benchmark = setup_benchmark("ishigami", a=7, b=0.1)

    num_samples = 1000
    train_samples = benchmark.variable.rvs(num_samples)
    train_vals = benchmark.fun(train_samples)

    approx_res = approximate(
        train_samples, train_vals, 'polynomial_chaos',
        {'basis_type': 'hyperbolic_cross', 'variable': benchmark.variable,
         'options': {'max_degree': 8}})
    pce = approx_res.approx

    res = analysis.gpc_sobol_sensitivities(pce, benchmark.variable)

.. GENERATED FROM PYTHON SOURCE LINES 59-60

Now let's compare the estimated values with the exact values.

.. GENERATED FROM PYTHON SOURCE LINES 60-63

.. code-block:: default

    print(res.main_effects[:, 0])
    print(benchmark.main_effects[:, 0])

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [3.13751427e-01 4.42645977e-01 6.47617333e-08]
    [0.31390519 0.44241114 0.        ]

.. GENERATED FROM PYTHON SOURCE LINES 64-65

We can visualize the sensitivity indices using the following.

.. GENERATED FROM PYTHON SOURCE LINES 65-77

.. code-block:: default

    fig, axs = plt.subplots(1, 3, figsize=(3*8, 6))
    analysis.plot_main_effects(benchmark.main_effects, axs[0])
    analysis.plot_total_effects(benchmark.total_effects, axs[1])
    analysis.plot_interaction_values(
        benchmark.sobol_indices, benchmark.sobol_interaction_indices, axs[2])
    axs[0].set_title(r'$\mathrm{Main\;Effects}$')
    axs[1].set_title(r'$\mathrm{Total\;Effects}$')
    axs[2].set_title(r'$\mathrm{Sobol\;Indices}$')
    plt.show()

.. image-sg:: /auto_tutorials/analysis/images/sphx_glr_plot_sensitivity_analysis_001.png
   :alt: $\mathrm{Main\;Effects}$, $\mathrm{Total\;Effects}$, $\mathrm{Sobol\;Indices}$
   :srcset: /auto_tutorials/analysis/images/sphx_glr_plot_sensitivity_analysis_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 78-82

Morris One-at-a-time
--------------------
The Morris one-at-a-time method [MT1991]_ is a screening technique that
varies one input at a time and summarizes the resulting elementary effects.
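A simplified one-at-a-time sketch of the idea follows; it uses random base points with a fixed step rather than the full Morris trajectory design of [MT1991]_, and the step size and number of repetitions are illustrative choices. The mean absolute elementary effect :math:`\mu^*` flags influential inputs, while :math:`\sigma` flags nonlinearity or interactions.

```python
import numpy as np

rng = np.random.default_rng(7)


def fun(x):
    # Same Ishigami function as above (a=7, b=0.1); x has length 3
    return (np.sin(x[0]) + 7 * np.sin(x[1])**2
            + 0.1 * x[2]**4 * np.sin(x[0]))


nvars, nrepeats, delta = 3, 50, 0.5

ee = np.zeros((nrepeats, nvars))
for r in range(nrepeats):
    x = rng.uniform(-np.pi, np.pi - delta, nvars)  # random base point
    f0 = fun(x)
    for i in range(nvars):
        xp = x.copy()
        xp[i] += delta                      # perturb one input at a time
        ee[r, i] = (fun(xp) - f0) / delta   # elementary effect

mu_star = np.abs(ee).mean(axis=0)  # mean absolute elementary effect
sigma = ee.std(axis=0)             # spread: nonlinearity/interactions

print(mu_star)
print(sigma)
```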
.. GENERATED FROM PYTHON SOURCE LINES 85-90

References
^^^^^^^^^^
.. [SMCS2001] I.M. Sobol. Global sensitivity indices for nonlinear
   mathematical models and their Monte Carlo estimates. Mathematics and
   Computers in Simulation, 55(3): 271-280, 2001.

.. [SRESS2008] B. Sudret. Global sensitivity analysis using polynomial chaos
   expansions. Reliability Engineering & System Safety, 93(7): 964-979, 2008.

.. [MT1991] M.D. Morris. Factorial Sampling Plans for Preliminary
   Computational Experiments. Technometrics, 33(2): 161-174, 1991.

.. rst-class:: sphx-glr-timing

**Total running time of the script:** ( 0 minutes 2.777 seconds)


.. _sphx_glr_download_auto_tutorials_analysis_plot_sensitivity_analysis.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_sensitivity_analysis.py <plot_sensitivity_analysis.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_sensitivity_analysis.ipynb <plot_sensitivity_analysis.ipynb>`

.. only:: html

  .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_