Ensemble models may employ bootstrap sampling (“bootstrapping”) as a regularization strategy. In fact, bootstrapping is the foundational concept behind parallel ensembles such as bagging and random forests. Because these models take a (potentially weighted) average of their base learners, bootstrap sampling reduces the contribution of any single data point to the final prediction. The smaller the size of each bootstrap sample, the less probable it is that a particular outlier appears in any given sample; hence, smaller bootstrap samples tend to reduce variance but increase bias in the ensemble prediction.
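
To make that last point concrete: when a bootstrap sample of size m is drawn with replacement from n training points, the probability that a given point appears at least once is 1 − (1 − 1/n)^m, so shrinking m shrinks the chance that an outlier influences any particular base learner. A minimal NumPy sketch checking this empirically (n = 1000 and the sample sizes below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000          # training-set size
outlier_idx = 0   # index of a hypothetical outlier

for m in (1000, 500, 100):  # bootstrap sample sizes
    # Draw 10,000 bootstrap samples of size m (with replacement)
    # and count how often the outlier shows up in each.
    hits = sum(
        outlier_idx in rng.integers(0, n, size=m)
        for _ in range(10_000)
    )
    # Analytic probability that the point appears at least once.
    print(f"m={m:4d}  empirical={hits / 10_000:.3f}  "
          f"analytic={1 - (1 - 1 / n) ** m:.3f}")
```

With m = n = 1000 the inclusion probability is the familiar ≈ 0.632; halving m drops it to ≈ 0.39, and m = 100 drops it below 0.10.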

Bootstrap sampling (with replacement) is not typically supported for sequential ensembles such as gradient-boosted models, but subsampling is: each boosting stage is fit on a fraction of the training set drawn without replacement, a technique known as stochastic gradient boosting.
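
As a hedged sketch of the distinction in scikit-learn (one library that exposes both knobs; the parameter values here are arbitrary): `BaggingRegressor` draws bootstrap samples with replacement and lets `max_samples` shrink them, while `GradientBoostingRegressor` offers only `subsample`, which draws without replacement at each stage.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor, GradientBoostingRegressor

X, y = make_regression(n_samples=500, noise=10.0, random_state=0)

# Parallel ensemble: bootstrap=True resamples WITH replacement;
# max_samples=0.5 halves each bootstrap sample for stronger regularization.
bag = BaggingRegressor(
    n_estimators=100, bootstrap=True, max_samples=0.5, random_state=0
).fit(X, y)

# Sequential ensemble: subsample=0.5 fits each boosting stage on half the
# training set drawn WITHOUT replacement (stochastic gradient boosting).
gbm = GradientBoostingRegressor(
    n_estimators=100, subsample=0.5, random_state=0
).fit(X, y)

print(bag.score(X, y), gbm.score(X, y))
```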