setup_rosenbrock_function

pyapprox.benchmarks.benchmarks.setup_rosenbrock_function(nvars)

Set up the Rosenbrock function benchmark

f(z)=\sum_{i=1}^{d/2}\left[100(z_{2i-1}^2-z_{2i})^2+(z_{2i-1}-1)^2\right]
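
For reference, a minimal NumPy sketch of this sum (an illustrative reimplementation, not the library's code; it pairs the odd- and even-indexed coordinates exactly as in the formula above):

import numpy as np

def rosenbrock(z):
    # z : np.ndarray of shape (nvars, nsamples), with nvars even
    z_odd = z[::2, :]    # z_1, z_3, ... (the z_{2i-1} terms)
    z_even = z[1::2, :]  # z_2, z_4, ... (the z_{2i} terms)
    vals = np.sum(100.0*(z_odd**2 - z_even)**2 + (z_odd - 1.0)**2, axis=0)
    return vals[:, np.newaxis]  # shape (nsamples, 1)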

This benchmark can also be used to test Bayesian inference methods. Specifically, this benchmark returns the log likelihood

l(z)=-f(z)

which can be used to compute the posterior distribution

\pi_{\text{post}}(z)=\frac{\pi(y|z)\pi(z)}{\int \pi(y|z)\pi(z)\,dz}

where the prior is the tensor product of d independent and identically distributed uniform variables on [-2,2], i.e. \pi(z)=\frac{1}{4^d}, and the likelihood is given by

\pi(y|z)=\exp\left(l(z)\right)
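
As a sketch (assuming the uniform prior \pi(z)=\frac{1}{4^d} on [-2,2]^d and the log likelihood l(z)=-f(z) above; fun here stands for the benchmark's Rosenbrock function), the unnormalized log posterior for a single sample could be evaluated as:

import numpy as np

def unnormalized_log_posterior(z, fun):
    # z : np.ndarray of shape (nvars, 1)
    if np.any(np.abs(z) > 2):
        return -np.inf                     # zero prior density outside [-2, 2]^d
    log_prior = -z.shape[0]*np.log(4.0)    # log(1/4^d)
    log_like = -float(fun(z)[0, 0])        # l(z) = -f(z)
    return log_like + log_prior
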
Parameters
nvars : integer

The number of variables of the Rosenbrock function

Returns
benchmark : pya.Benchmark

Object containing the benchmark attributes documented below

fun : callable

The Rosenbrock function with signature

fun(z) -> np.ndarray

where z is a 2D np.ndarray with shape (nvars,nsamples) and the output is a 2D np.ndarray with shape (nsamples,1)

jac : callable

The Jacobian of fun with signature

jac(z) -> np.ndarray

where z is a 2D np.ndarray with shape (nvars,nsamples) and the output is a 2D np.ndarray with shape (nvars,1)

hessp : callable

The Hessian of fun times an arbitrary vector p, with signature

hessp(z, p) -> np.ndarray

where z is a 2D np.ndarray with shape (nvars,nsamples), p is an arbitrary vector with shape (nvars,1), and the output is a 2D np.ndarray with shape (nvars,1)

variable : pya.IndependentMultivariateRandomVariable

Object containing information about the joint density of the inputs z, which is the tensor product of d independent and identically distributed uniform variables on [-2,2].

mean : float

The mean of the Rosenbrock function with respect to the PDF of variable.

loglike : callable

The log likelihood of the Bayesian inference problem for inferring z, with the uniform prior specified by variable and the negative log likelihood given by the Rosenbrock function. loglike has the signature

loglike(z) -> np.ndarray

where z is a 2D np.ndarray with shape (nvars,nsamples) and the output is a 2D np.ndarray with shape (nsamples,1)

loglike_grad : callable

The gradient of loglike with the signature

loglike_grad(z) -> np.ndarray

where z is a 2D np.ndarray with shape (nvars,nsamples) and the output is a 2D np.ndarray with shape (nsamples,1)

References

DixonSzego1990

Dixon, L. C. W.; Mills, D. J. “Effect of Rounding Errors on the Variable Metric Method”. Journal of Optimization Theory and Applications, 80: 175–179, 1994.

Examples

>>> from pyapprox.benchmarks.benchmarks import setup_benchmark
>>> benchmark = setup_benchmark('rosenbrock', nvars=2)
>>> print(benchmark.keys())
dict_keys(['fun', 'jac', 'hessp', 'variable', 'mean', 'loglike', 'loglike_grad'])
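
Assuming the shape conventions documented above, the returned callables can be evaluated on samples drawn from the prior, for example:

>>> import numpy as np
>>> samples = np.random.uniform(-2, 2, (2, 10))  # (nvars, nsamples)
>>> values = benchmark.fun(samples)              # documented shape (10, 1)
>>> logl = benchmark.loglike(samples)            # documented shape (10, 1)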