Machine Learning
Student’s Name
Institution Affiliation

Lasso and Ridge Regression
Lasso and ridge regression are regularization techniques used in machine learning. The primary difference between the two is that they use different penalty functions in their computations. Ridge regression can be viewed as a regularized linear regression model in which the regression coefficients are shrunk by imposing a penalty on their size. A standard assumption in ridge regression is that the predictors are standardized and the response is centered. Rodriguez (2013) notes that the ridge coefficients minimize a penalized residual sum of squares. As a result, the weights tend to have smaller, more evenly distributed absolute values and to lie close to zero. Although ridge regression shrinks the β coefficients, it does not force any of them to be exactly zero. Therefore, the technique does not eliminate irrelevant features; it only reduces their impact on the trained model. GeeksforGeeks.org (2020) notes that the resulting model has low variance, the trade-off being a small increase in bias.
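To make the shrinkage behaviour concrete, the following is a minimal sketch in Python with scikit-learn; the synthetic data and the penalty values are illustrative assumptions and not part of the cited sources. It shows that the ridge coefficients shrink toward zero as the penalty λ grows, but none of them becomes exactly zero.

# Minimal ridge regression sketch on synthetic data (illustrative values only).
# Ridge minimizes the penalized residual sum of squares:
#   sum_i (y_i - x_i . beta)^2 + lambda * sum_j beta_j^2
# Predictors are standardized and the response is centered, as assumed above.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
# Only the first three predictors truly influence y; the rest are noise.
beta_true = np.array([3.0, -2.0, 1.5] + [0.0] * (p - 3))
y = X @ beta_true + rng.normal(scale=0.5, size=n)

X_std = StandardScaler().fit_transform(X)
y_centered = y - y.mean()

for lam in [0.1, 10.0, 1000.0]:
    ridge = Ridge(alpha=lam)  # scikit-learn's alpha plays the role of lambda
    ridge.fit(X_std, y_centered)
    # Coefficients shrink toward zero as lambda grows, but none is exactly zero.
    print(lam, np.round(ridge.coef_, 3), "exact zeros:", int(np.sum(ridge.coef_ == 0.0)))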
Lasso stands for Least Absolute Shrinkage and Selection Operator. Unlike ridge regression, Lasso penalizes the sum of the absolute values of the coefficients. Lasso further differs from ridge regression in that there is no closed-form expression for computing β (Rodriguez, 2013). The Lasso technique is also capable of performing variable selection. Rodriguez (2013) highlights that as λ increases, more coefficients are set to zero and, in turn, fewer variables are selected. This is a significant advantage in machine learning, since the fitted model contains fewer features than the initial set. Lasso therefore tends to have an edge over ridge regression in terms of prediction error and interpretability.
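The variable-selection property can be illustrated with a similar sketch, again assuming synthetic data and illustrative penalty values. As λ increases, more coefficients are set exactly to zero, so fewer variables remain in the fitted model.

# Minimal Lasso sketch on synthetic data (illustrative values only).
# Lasso penalizes the sum of the absolute values of the coefficients, and a
# larger lambda drives more of them to exactly zero (variable selection).
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 100, 10
X = StandardScaler().fit_transform(rng.normal(size=(n, p)))
beta_true = np.array([3.0, -2.0, 1.5] + [0.0] * (p - 3))
y = X @ beta_true + rng.normal(scale=0.5, size=n)
y = y - y.mean()

for lam in [0.01, 0.1, 1.0]:
    lasso = Lasso(alpha=lam)  # scikit-learn's alpha plays the role of lambda
    lasso.fit(X, y)
    selected = np.flatnonzero(lasso.coef_)
    # As lambda increases, fewer variables are selected.
    print(f"lambda={lam}: {len(selected)} non-zero coefficients, indices {selected}")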
Nonetheless, Lasso has some limitations. GeeksforGeeks.org (2020) explains that it struggles with some types of data, especially when the number of predictors (p) is greater than the number of observations (n). In such a case, the technique selects at most n predictors as non-zero, even if more of them are relevant.
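The p > n limitation can be sketched in the same way; the data and penalty value below are synthetic and purely illustrative. Although 40 predictors truly affect the response, with only n = 20 observations Lasso keeps at most n of them in the model.

# Minimal sketch of the p > n limitation (synthetic, illustrative values only).
# With more predictors than observations, the Lasso solution keeps at most n
# coefficients non-zero, regardless of how many predictors are truly relevant.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p = 20, 100        # far more predictors than observations
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:40] = 1.0  # 40 relevant predictors, but only n = 20 observations
y = X @ beta_true + rng.normal(scale=0.1, size=n)

lasso = Lasso(alpha=0.1, max_iter=100_000)
lasso.fit(X, y)
n_selected = np.count_nonzero(lasso.coef_)
print(f"non-zero coefficients: {n_selected} out of {p} (n = {n})")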
