The Bootstrap | Vose Software

# The Bootstrap

The Bootstrap (sometimes called Resampling) was introduced by Efron (1979) and is explored in great depth and very readably in both Efron and Tibshirani (1993) and Davison and Hinkley (1997). It is an extremely flexible technique belonging to the classical school of statistics.

This section presents a brief introduction that covers all of the important concepts. The Bootstrap has earned its place as a useful technique because:

1. It is easy to do and transparent;

2. It makes use of the increased power and ease of use of computers;

3. It corresponds well with traditional techniques where they are available, particularly when a large data set has been obtained; and

4. It offers an opportunity to assess the uncertainty about a parameter where more traditional classical statistics techniques are not available.

Bootstrapping is particularly relevant as a statistical technique here because the Monte Carlo simulation capability that simulation software adds to Excel makes Bootstrapping very easy to perform.

##### The Jackknife

The Jackknife is the precursor to the Bootstrap.

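As a rough illustration of the Jackknife idea (the function name, data values and choice of statistic below are hypothetical, not from this text): the statistic is recomputed with each observation left out in turn, and the spread of those leave-one-out values gives a standard error for the estimate.

```python
def jackknife_se(data, stat):
    """Jackknife standard error: recompute the statistic leaving out
    one observation at a time, then measure the spread of those
    leave-one-out values."""
    n = len(data)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = sum(loo) / n
    # Standard jackknife variance formula: (n-1)/n * sum of squared deviations
    var = (n - 1) / n * sum((v - mean_loo) ** 2 for v in loo)
    return var ** 0.5

# Illustrative data; for the sample mean, the jackknife SE reproduces
# the classical s / sqrt(n) exactly.
data = [2.1, 3.4, 1.9, 4.2, 3.3, 2.8, 3.7, 2.5]
se = jackknife_se(data, lambda xs: sum(xs) / len(xs))
```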
##### The non-parametric Bootstrap

The non-parametric Bootstrap is used to estimate parameters of a population or probability distribution when we do not know the distributional form, which is the most common situation.
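A minimal sketch of the non-parametric Bootstrap (data values, replicate count and seed below are illustrative assumptions): resample the observed data with replacement many times, recompute the statistic on each resample, and read the uncertainty about the parameter from the spread of the replicates.

```python
import random

def bootstrap(data, stat, B=2000, seed=1):
    """Non-parametric bootstrap: draw B resamples of the same size as
    the data, with replacement, and recompute the statistic each time."""
    rng = random.Random(seed)
    n = len(data)
    return [stat([rng.choice(data) for _ in range(n)]) for _ in range(B)]

# Illustrative data: uncertainty about the population mean.
data = [2.1, 3.4, 1.9, 4.2, 3.3, 2.8, 3.7, 2.5]
reps = sorted(bootstrap(data, lambda xs: sum(xs) / len(xs)))

# A simple 95% percentile interval from the bootstrap replicates.
ci = (reps[int(0.025 * len(reps))], reps[int(0.975 * len(reps))])
```

Note that no distributional form is assumed anywhere: the observed data themselves stand in for the population.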

##### The parametric Bootstrap

The parametric Bootstrap is used to estimate parameters of a population or probability distribution when we believe we know the distributional form (e.g. Normal, Lognormal, Gamma or Poisson).
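A sketch of the parametric version, assuming (for illustration only) that the data follow a Normal distribution: fit the distribution's parameters to the data, then repeatedly simulate new samples of the same size from the fitted distribution and refit, so the scatter of the refitted parameters describes their uncertainty.

```python
import math
import random

def parametric_bootstrap_normal(data, B=2000, seed=1):
    """Parametric bootstrap under an assumed Normal model: fit mu and
    sigma by maximum likelihood, simulate B new samples from the fitted
    Normal, and refit the parameters to each simulated sample."""
    rng = random.Random(seed)
    n = len(data)
    mu = sum(data) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in data) / n)  # MLE of sigma
    reps = []
    for _ in range(B):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        m = sum(sample) / n
        s = math.sqrt(sum((x - m) ** 2 for x in sample) / n)
        reps.append((m, s))
    return mu, sigma, reps

# Illustrative data.
data = [2.1, 3.4, 1.9, 4.2, 3.3, 2.8, 3.7, 2.5]
mu, sigma, reps = parametric_bootstrap_normal(data)
```

The extra distributional assumption buys smoother estimates when it is correct, but the result is only as good as the assumed form.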

##### Bootstrap likelihood function

The Bootstrap can also be used as a likelihood function in Bayesian inference.

##### Estimating parameters for multiple variables

We are sometimes interested in estimating parameters that describe relationships between variables, for example: regression parameters and rank correlation coefficients. The Bootstrap can provide uncertainty about these estimates in an intuitive way, by Bootstrapping the paired data values.
