Bagging (Bootstrap Aggregating) reduces variance without increasing bias.
- Bootstrap: draw n samples with replacement from the training set (n = training-set size), repeated B times → B datasets, each the same size as the original.
- Train: fit a high-variance, low-bias model (e.g., a deep decision tree) on each dataset → B predictors.
- Aggregate:
  - Regression → average predictions
  - Classification → majority vote
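The steps above can be sketched in plain Python. This is an illustrative toy, not a production implementation: the base learner here is a 1-D decision stump (a depth-1 tree, chosen for brevity; the notes suggest fuller decision trees), and the names `fit_stump` and `bagged_classifier` are made up for this example.

```python
import random
from collections import Counter

def fit_stump(xs, ys):
    """Fit a 1-D decision stump (threshold + direction) by exhaustive search."""
    best = None  # (error_count, threshold, sign)
    for t in sorted(set(xs)):
        for sign in (1, -1):
            preds = [1 if sign * (x - t) >= 0 else 0 for x in xs]
            err = sum(p != y for p, y in zip(preds, ys))
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: 1 if sign * (x - t) >= 0 else 0

def bagged_classifier(xs, ys, n_models=25, seed=0):
    """Train n_models stumps on bootstrap resamples; predict by majority vote."""
    rng = random.Random(seed)
    n = len(xs)
    models = []
    for _ in range(n_models):
        # Bootstrap: draw n indices with replacement from the training set.
        idx = [rng.randrange(n) for _ in range(n)]
        models.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    def predict(x):
        # Aggregate (classification): majority vote over the B predictors.
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]
    return predict

predict = bagged_classifier([1, 2, 3, 7, 8, 9], [0, 0, 0, 1, 1, 1])
```

For regression the only change is the aggregation step: replace the majority vote with the mean of the individual predictions.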