What is LASSO?
LASSO, short for Least Absolute Shrinkage and Selection Operator, is a statistical method whose main purposes are feature selection and the regularization of data models. The method was first introduced in 1996 by statistics professor Robert Tibshirani. LASSO adds a penalty on the sum of the absolute values of a model's coefficients, placing an upper bound on that sum that acts as a constraint keeping the coefficients within an allowable range.
The LASSO method regularizes model parameters by shrinking the regression coefficients, reducing some of them exactly to zero. Feature selection then occurs after the shrinkage: every feature with a non-zero coefficient is selected for use in the model. This method helps minimize the prediction errors that are common in statistical models.
LASSO offers models with high prediction accuracy. The accuracy improves because shrinking the coefficients reduces variance, at the cost of introducing some bias. It performs best when the number of observations is low and the number of features is high. It relies heavily on the parameter λ, which controls the strength of the shrinkage: the larger λ becomes, the more coefficients are forced to zero.
When λ equals zero, the model reduces to Ordinary Least Squares regression. As λ increases, the variance decreases significantly while the bias in the result increases. LASSO is also a useful tool for eliminating irrelevant variables that are not related to the response variable.
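The effect of λ on the coefficients can be sketched with scikit-learn, where the penalty strength is called `alpha`. The data and penalty values below are illustrative assumptions, not from the original text:

```python
# Sketch: how the penalty strength (alpha in scikit-learn, the λ above)
# drives LASSO coefficients to exactly zero. Data and alphas are illustrative.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))              # 50 observations, 10 features
beta = np.array([3.0, -2.0] + [0.0] * 8)   # only the first 2 features matter
y = X @ beta + rng.normal(scale=0.5, size=50)

for alpha in [0.01, 0.1, 1.0]:
    model = Lasso(alpha=alpha, max_iter=10_000).fit(X, y)
    n_zero = int(np.sum(model.coef_ == 0.0))
    print(f"alpha={alpha}: {n_zero} of 10 coefficients are exactly zero")
```

As `alpha` grows, more of the irrelevant coefficients are set exactly to zero, which is the behavior described above.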
LASSO in Statistical Linear Models
A statistical model is a mathematical representation of a real-life problem. The model should express the problem as faithfully as possible while remaining simple and easy to understand. A model is composed of explanatory and response variables.
The explanatory variable is an independent variable chosen at the discretion of the researcher. The independent variables are the inputs to the model that the researcher can measure to determine their effect on the model's results.
The response variable is a dependent variable that forms the main focus of the experiment. It is the outcome of the experiment: a single result in the case of univariate models, or multiple results in the case of multivariate models.
LASSO forms an integral part of the model-building process, especially through its feature selection. The feature selection phase helps in choosing the explanatory variables, which are the independent variables and, hence, the input variables in the model.
The input variables are important elements that determine the model's output, and the model measures their effect on the response variables. Choosing the right variables determines the accuracy of the model, and the feature selection phase of LASSO supports that choice.
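A minimal sketch of this selection step: fit a LASSO and keep only the features whose coefficients survive the shrinkage. The dataset and penalty value are illustrative assumptions:

```python
# Sketch: using the non-zero LASSO coefficients as a feature-selection mask.
# The data-generating process and alpha below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 8))
# Only features 0 and 3 actually influence the response.
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.3, size=100)

lasso = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_)  # indices of the surviving features
print("selected feature indices:", selected)
```

The surviving indices are then the explanatory variables carried forward into the final model.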
Estimation with LASSO
Statistical models rely on LASSO for accurate variable selection and regularization. In linear regression, for example, LASSO minimizes the residual sum of squares subject to an upper bound on the sum of the absolute values of the regression coefficients, hence reducing the errors present in the model. The LASSO estimator depends on the parameter λ.
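In symbols, the constrained problem and its equivalent penalized form can be written as follows, using the standard linear-model notation (n observations, p features):

```latex
\hat{\beta} \;=\; \arg\min_{\beta}\;
  \sum_{i=1}^{n}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
  \quad \text{subject to} \quad \sum_{j=1}^{p}\lvert\beta_j\rvert \le t,

\text{or equivalently,}

\hat{\beta} \;=\; \arg\min_{\beta}\;
  \Biggl\{\, \sum_{i=1}^{n}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
  \;+\; \lambda \sum_{j=1}^{p}\lvert\beta_j\rvert \,\Biggr\}.
```

The upper bound t in the constrained form and the penalty λ in the second form play inverse roles, which is the relationship described next.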
The parameter λ controls the strength of the shrinkage: an increase in λ results in an increase in shrinkage. The upper bound on the sum of the absolute values of the coefficients is inversely related to λ. When the upper bound increases in value, λ decreases; when the upper bound decreases, λ increases.
As the upper bound grows toward infinity, λ approaches zero, converting the problem into Ordinary Least Squares regression, where λ is always equal to zero. As the upper bound approaches zero, λ grows toward infinity.
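The λ → 0 limit can be checked numerically: a LASSO with a near-zero penalty recovers essentially the OLS coefficients. The data here is an illustrative assumption (scikit-learn discourages `alpha=0` exactly, so a tiny value stands in for it):

```python
# Sketch: as the penalty alpha (the λ above) approaches zero, the LASSO
# estimates approach the Ordinary Least Squares solution. Data is illustrative.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.2, size=60)

ols = LinearRegression().fit(X, y)
tiny = Lasso(alpha=1e-6, max_iter=100_000).fit(X, y)

gap = np.max(np.abs(ols.coef_ - tiny.coef_))
print("max coefficient gap between OLS and near-zero-penalty LASSO:", gap)
```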
When plotted, the LASSO constraint region forms a diamond shape. The diamond includes corners, unlike the circular shape formed by ridge regression. When the solution first touches the constraint region at a corner, the corresponding coefficient is exactly equal to zero.
The ridge regression constraint region, by contrast, forms a circular shape with no corners when plotted. The ridge regression coefficients are therefore shrunk toward zero but are generally never exactly equal to zero.
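This geometric difference shows up directly in fitted coefficients: on the same data, LASSO produces exact zeros while ridge only shrinks. The data and penalty values below are illustrative assumptions:

```python
# Sketch comparing sparsity: with the same data, LASSO sets some coefficients
# exactly to zero while ridge regression only shrinks them toward zero.
# Data and penalty values are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(7)
X = rng.normal(size=(80, 12))
y = 4.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.5, size=80)

lasso = Lasso(alpha=0.5).fit(X, y)
ridge = Ridge(alpha=0.5).fit(X, y)

print("LASSO coefficients exactly zero:", int(np.sum(lasso.coef_ == 0.0)))
print("Ridge coefficients exactly zero:", int(np.sum(ridge.coef_ == 0.0)))
```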
Weighted LASSO results from penalizing the regression coefficients individually. Instead of applying a common penalty parameter λ to all coefficients, each coefficient is penalized with its own parameter.
The weights can be determined by a LASSO algorithm that assigns them appropriately for accurate modeling. A related weighting of regression coefficients is the cooperative LASSO, where the coefficients are penalized in groups that are deemed similar.