Bias-Variance Tradeoff - Yousef's Notes
Bias-Variance Tradeoff

Hyperparameter Tuning controls two tradeoffs: the Precision-Recall Tradeoff and the Bias-Variance Tradeoff.

In practice, reducing variance tends to increase bias, and vice versa. In other words, fighting Overfitting pushes the model toward Underfitting, and the other way around.

The most important factor controlling this tradeoff is model complexity.

A sufficiently complex model will learn to memorize all training examples and their labels and, thus, will not make prediction errors when applied to the training data. It will have low bias. However, a model relying on memorization will not be able to correctly predict labels of previously unseen data. It will have high variance.
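A minimal sketch of this effect, using only NumPy and hypothetical toy data (noisy samples of sin(x); the variable names and degrees are illustrative, not from the notes): a degree-9 polynomial through 10 training points memorizes them almost exactly (low bias on training data), yet its error on held-out points is far larger (high variance).

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

# Toy data: noisy samples of sin(x) on [0, 3] (assumed for illustration).
x_train = np.linspace(0, 3, 10)
y_train = np.sin(x_train) + rng.normal(0, 0.3, size=x_train.shape)
x_test = np.linspace(0.1, 2.9, 50)
y_test = np.sin(x_test) + rng.normal(0, 0.3, size=x_test.shape)

def mse(model, x, y):
    """Mean squared error of a fitted Polynomial on (x, y)."""
    return float(np.mean((model(x) - y) ** 2))

# Simple model: degree-1 polynomial (high bias, underfits the sine shape).
p_simple = Polynomial.fit(x_train, y_train, deg=1)
# Complex model: degree-9 polynomial through 10 points -> memorizes the training set.
p_complex = Polynomial.fit(x_train, y_train, deg=9)

mse_simple_train = mse(p_simple, x_train, y_train)
mse_complex_train = mse(p_complex, x_train, y_train)
mse_complex_test = mse(p_complex, x_test, y_test)

print("simple  train MSE:", mse_simple_train)
print("complex train MSE:", mse_complex_train)   # near zero: memorization
print("complex test  MSE:", mse_complex_test)    # much larger: poor generalization
```

The complex model's training error collapses to nearly zero while its test error stays well above it, which is exactly the low-bias/high-variance signature described above.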

The “zone of solutions” is where both bias and variance are low. Once in this zone, you can do Hyperparameter Tuning to reach the needed Precision-Recall Tradeoff, or optimize another of the Model Performance Metrics appropriate for your problem.

#How to reach the zone of solutions

  • Move to the right by increasing the complexity of the model and, by doing so, reducing its bias.
  • Move to the left by regularizing the model to reduce variance by making the model simpler.
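The "move to the left" step can be sketched with ridge regression, assuming toy data and the standard closed-form solution (all names here are illustrative): the regularization strength lambda is the knob, and increasing it shrinks the weights toward a simpler model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 20 samples, 5 features, with one near-duplicate feature
# (near-collinearity makes unregularized weights unstable, i.e. high variance).
X = rng.normal(size=(20, 5))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=20)
y = X @ np.array([1.0, 1.0, 0.5, 0.0, 0.0]) + 0.1 * rng.normal(size=20)

def ridge(X, y, lam):
    """Closed-form ridge solution: w = (X^T X + lam*I)^(-1) X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

w_ols = ridge(X, y, 0.0)    # no regularization: fits training data hardest
w_reg = ridge(X, y, 10.0)   # strong regularization: smaller, more stable weights

print("||w|| unregularized:", np.linalg.norm(w_ols))
print("||w|| regularized:  ", np.linalg.norm(w_reg))
```

Shrinking the weight norm is what "making the model simpler" means concretely for linear models: the regularized solution reacts less to noise in any one training set, trading a little bias for lower variance.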

#Increasing Complexity

  • In Shallow models like Linear Regression, you can increase complexity by switching to a higher-order polynomial regression.
  • In a Decision Tree, increase the depth of the tree.
  • In an SVM, use a polynomial or RBF kernel instead of a linear kernel.
  • In a Neural Network, increase its size: number of units per layer, and number of layers.
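The first bullet (raising the polynomial degree in a shallow regression model) can be sketched as follows, on hypothetical toy data: because each higher-degree model contains all lower-degree ones, least-squares training error can only go down as complexity increases, which is the "move to the right" from the previous section.

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(2)

# Toy data: noisy samples of sin(3x) on [-1, 1] (assumed for illustration).
x = np.linspace(-1, 1, 30)
y = np.sin(3 * x) + 0.2 * rng.normal(size=x.shape)

# Training MSE for increasing polynomial degree (degree = model complexity knob).
degrees = [1, 3, 5, 9, 15]
train_mse = []
for deg in degrees:
    p = Polynomial.fit(x, y, deg)
    train_mse.append(float(np.mean((p(x) - y) ** 2)))

for deg, err in zip(degrees, train_mse):
    print(f"degree {deg:2d}: train MSE {err:.5f}")
```

Training error alone is a misleading guide here: it keeps falling even after the model starts fitting noise, which is why the zone of solutions must be judged on held-out data.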