
Is my understanding of regularized logistic regression correct?

I learned that regularized logistic regression helps prevent the model from over-fitting the data. I understand that the hypothesis is still technically a high-order polynomial, but regularization reduces the influence of the higher-order terms so the decision boundary looks more like a smooth curve. Here is the part I may be understanding incorrectly: the function is regularized by adding a term to the end of the cost function that penalizes large values of the parameters (typically all of them except the intercept).
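To make sure I'm describing the same thing, here is a minimal sketch of what I mean by "adding a term to the end" (my own illustration, not code from any particular course or library; `theta`, `X`, `y`, and `lam` are names I'm assuming for the parameters, design matrix, labels, and regularization strength):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def regularized_cost(theta, X, y, lam):
    m = len(y)
    h = sigmoid(X @ theta)                     # predicted probabilities
    # Ordinary logistic (cross-entropy) loss
    loss = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    # Penalty term appended to the cost: sum of squared parameters,
    # conventionally excluding the intercept theta[0]
    penalty = (lam / (2 * m)) * np.sum(theta[1:] ** 2)
    return loss + penalty
```

As I understand it, minimizing this combined cost forces the optimizer to trade fit against parameter size, which is what shrinks the higher-order terms.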

In a sense, the way I picture this is to imagine a sine wave with a very small coefficient, say 0.0001: viewed at the same scale, the graph of 0.0001 * sin(x) looks much like a straight line compared to the graph of sin(x).
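For example, a quick sketch of the shrinking effect I have in mind, using scikit-learn on a synthetic dataset (my own illustration; in scikit-learn, C is the inverse of the regularization strength, so smaller C means a stronger penalty):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

for C in (100.0, 1.0, 0.01):          # smaller C = stronger penalty
    model = LogisticRegression(C=C, penalty="l2").fit(X, y)
    print(f"C={C:>6}: coefficients = {model.coef_.round(3)}")
```

The printed coefficients shrink toward zero as the penalty grows, which seems analogous to multiplying the sine wave by a tiny coefficient.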

Is this the correct way to look at how regularized logistic regression works, or does the regularization follow some other principle?
