14.10.2020 • Mathematics

It is well known that ridge regression tends to give similar coefficient values to correlated variables, whereas the lasso may give quite different coefficient values to correlated variables. We will now explore this property in a very simple setting. Assume n is the number of training samples, p the number of dimensions, x the input, and y the output. Suppose that n = 2, p = 2, x11 = x12, and x21 = x22. Furthermore, suppose that y1 + y2 = 0, x11 + x21 = 0, and x12 + x22 = 0, so that the estimate for the intercept in a least squares, ridge regression, or lasso model is zero: β̂0 = 0.

(a) Write out the ridge regression optimization problem in this setting.
(b) Argue that in this setting, the ridge coefficient estimates satisfy β̂1 = β̂2.
(c) Write out the lasso optimization problem in this setting.
(d) Argue that in this setting, the lasso coefficients β̂1 and β̂2 are not unique; in other words, there are many possible solutions to the optimization problem. Describe these solutions.
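As a rough numerical illustration of this setting (not the formal argument the exercise asks for), here is a minimal sketch assuming NumPy and scikit-learn. The specific data values are hypothetical choices that satisfy the stated constraints, the regularization strengths are arbitrary, and the models are fit without an intercept to reflect β̂0 = 0.

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

# Hypothetical data satisfying the exercise's constraints:
# x11 = x12, x21 = x22, x11 + x21 = 0, x12 + x22 = 0, y1 + y2 = 0.
X = np.array([[1.0, 1.0],
              [-1.0, -1.0]])
y = np.array([2.0, -2.0])

# Ridge: with the two (here identical) columns, the penalized fit
# assigns them equal coefficients.
ridge = Ridge(alpha=1.0, fit_intercept=False).fit(X, y)
print("ridge coefficients:", ridge.coef_)   # expect beta1 == beta2

# Lasso: the optimum is not unique; the coordinate-descent solver returns
# one particular solution, but other coefficient pairs with the same sum
# (and the same signs) achieve the same objective value in this setting.
lasso = Lasso(alpha=0.5, fit_intercept=False).fit(X, y)
print("lasso coefficients:", lasso.coef_)
```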
