
Ridge regression — why does the model only care to control large outliers?

One of the purposes of ridge regression is to curb the effect of outliers, which can cause the regression coefficients to become very large and hence produce a highly biased model.

That’s why the constraint $\sum_j \beta_j^2 < s$ is imposed, forcing the sum of the squared coefficients not to exceed a certain value $s$.
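For reference, the same constrained problem is usually stated in its equivalent penalized form (a larger $\lambda$ corresponds to a smaller bound $s$):

$$\hat{\beta}^{\text{ridge}} = \arg\min_{\beta} \; \sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^2 + \lambda \sum_{j=1}^{p} \beta_j^2, \qquad \lambda \ge 0.$$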

Here is my issue. An outlier could be a value that is either too large or too small. I think this should mean that outliers could cause the $\beta_j$’s to be either too small or too large. The formulation of this constraint inequality seems to care only about controlling those outliers which might make the $\beta_j$’s too large.
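For concreteness, here is a minimal sketch of the quantity the constraint bounds, $\sum_j \beta_j^2$, computed for an OLS fit and a ridge fit. The data are purely synthetic, $\lambda = 1$ is a hypothetical choice, and the closed-form ridge solution below assumes centered data with no intercept:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: 50 observations, 3 predictors
X = rng.normal(size=(50, 3))
beta_true = np.array([2.0, -3.0, 0.5])           # coefficients with mixed signs
y = X @ beta_true + rng.normal(scale=0.5, size=50)

lam = 1.0                                        # hypothetical penalty strength

# OLS: beta_hat = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge (penalized form): beta_hat = (X'X + lam * I)^{-1} X'y
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# The quantity the constraint bounds: the sum of squared coefficients
print("OLS   coefficients:", beta_ols,   "sum beta_j^2 =", np.sum(beta_ols ** 2))
print("Ridge coefficients:", beta_ridge, "sum beta_j^2 =", np.sum(beta_ridge ** 2))
```

The printed sums are the quantity that the ridge constraint (or, equivalently, the penalty $\lambda \sum_j \beta_j^2$) controls.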

I believe Ordinary Least Squares regression suffers equally from the effects of very small outliers.

Can someone please explain whether, and how, ridge regression controls the “small” outliers as well?

