Ensemble methods build models by combining multiple weak learners (typically Decision Trees) into a strong predictive model (e.g. Random Forest). Popular gradient boosting implementations (a minimal usage sketch follows this list):
- XGBoost
- LightGBM
- CatBoost
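A minimal usage sketch, assuming the scikit-learn-style API that these libraries expose; shown here with XGBoost on a synthetic dataset, with illustrative (untuned) parameter values:

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

# Synthetic regression data, just to have something to fit
X, y = make_regression(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Illustrative hyperparameters: boosting rounds, shrinkage, tree depth
model = XGBRegressor(n_estimators=200, learning_rate=0.1, max_depth=3)
model.fit(X_train, y_train)
print(model.predict(X_test[:5]))
```

LightGBM and CatBoost offer near-identical fit/predict interfaces, so swapping libraries is mostly a matter of changing the estimator class.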
#How it works
- Minimize a loss function (e.g. MSE for regression, log loss for classification) via Gradient Descent, iteratively improving the model (a from-scratch sketch follows this list).
- Initialization: start with a simple model that predicts the mean value ( #ml/regression ) or uniform class probabilities ( #ml/classification ).
- Compute residuals: the difference between the observed values and the current model's predictions.
- Fit a weak learner: train it to predict the residuals from the previous step.
- Update the model: Add the predictions from the weak learner to the model, scaled by a learning rate.
- Iterate: repeat until a stopping criterion is met (e.g. maximum number of iterations, no further improvement on a validation set).
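A from-scratch sketch of the loop above for squared-error loss, where the negative gradient is exactly the residual. Function names and hyperparameter values are illustrative, and the stopping criterion is a fixed round budget for simplicity:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_rounds=100, learning_rate=0.1, max_depth=2):
    f0 = float(np.mean(y))          # step 1: constant model predicting the mean
    pred = np.full(len(y), f0)
    trees = []
    for _ in range(n_rounds):       # step 5: iterate (fixed budget here)
        residuals = y - pred        # step 2: residuals = negative gradient of MSE
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)      # step 3: weak learner fits the residuals
        pred += learning_rate * tree.predict(X)  # step 4: scaled additive update
        trees.append(tree)
    return f0, trees

def boosted_predict(X, f0, trees, learning_rate=0.1):
    # Sum the initial constant and all scaled tree corrections
    pred = np.full(len(X), f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred
```

With log loss ( #ml/classification ), the residuals would be replaced by the corresponding negative gradients of that loss; the structure of the loop stays the same.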
#Preconditions
- Hyperparameter tuning (learning rate, number of boosting rounds, tree depth)
- Regularization to prevent overfitting (e.g. shrinkage via the learning rate, subsampling); see the sketch below
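A sketch of the main tuning and regularization knobs, assuming scikit-learn's GradientBoostingRegressor; the values are illustrative, not tuned:

```python
from sklearn.ensemble import GradientBoostingRegressor

model = GradientBoostingRegressor(
    learning_rate=0.05,       # smaller steps need more trees but generalize better
    n_estimators=500,         # upper bound on boosting rounds
    max_depth=3,              # shallow trees keep the learners weak
    subsample=0.8,            # row subsampling (stochastic gradient boosting)
    validation_fraction=0.1,  # held-out split used for early stopping
    n_iter_no_change=10,      # stop when validation score stops improving
    random_state=0,
)
```

The learning rate and the number of rounds trade off against each other, which is why they are usually tuned together.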
#Evaluation
#Advantages
- High predictive accuracy
- Handles non-linear relationships
#Limitations
- Computationally expensive
- Sensitive to noisy data and outliers (each tree fits the previous model's errors) and to hyperparameter choices