Can I implement ridge regression in terms of OLS regression? Is it even possible?
I am interested because scikit-learn supports non-negative least squares (NNLS), but not non-negative ridge regression. So I'd like to transform my data so as to be able to call the underlying NNLS solver while still getting ridge regression behaviour.
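A minimal sketch of what I have in mind, assuming the standard data-augmentation identity ||y - Xb||^2 + lambda*||b||^2 = ||[y; 0] - [X; sqrt(lambda)*I] b||^2 (the function name ridge_nnls and the use of scipy.optimize.nnls are my own choices, not anything scikit-learn provides):

```python
import numpy as np
from scipy.optimize import nnls

def ridge_nnls(X, y, lam):
    """Non-negative ridge regression via NNLS on augmented data (sketch).

    Stacks sqrt(lam)*I under X and zeros under y, so ordinary NNLS on the
    augmented system minimises ||y - X b||^2 + lam*||b||^2 subject to b >= 0.
    """
    p = X.shape[1]
    X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])
    y_aug = np.concatenate([y, np.zeros(p)])
    coef, _ = nnls(X_aug, y_aug)
    return coef
```

Setting lam = 0 should reproduce plain NNLS, which gives a quick sanity check of the augmentation.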
One option is lmridge_nnls = function (X, Y, lambda) nnls(A = crossprod(X) + lambda*diag(ncol(X)), b = crossprod(X, Y))$x (which I think is the correct solution), and another option is lmridge_nnls_rbind = function (X, Y, lambda) nnls(A = rbind(X, sqrt(lambda)*diag(ncol(X))), b = c(Y, rep(0, ncol(X))))$x (which I think is not quite correct for the nonnegativity-constrained case; in the unconstrained case the two would be equivalent). – Tom Wenseleers Aug 29 '19 at 19:06
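For comparison, the first R option in the comment above maps to roughly the following Python sketch (the name lmridge_nnls_normal_eq is mine, and scipy.optimize.nnls stands in for the R nnls function). It applies NNLS to the ridge normal equations rather than to the augmented data; as the comment notes, without the nonnegativity constraint the two formulations give the same ridge solution, since X'X + lambda*I is nonsingular.

```python
import numpy as np
from scipy.optimize import nnls

def lmridge_nnls_normal_eq(X, y, lam):
    # NNLS applied to the ridge normal equations (X'X + lam*I) b = X'y,
    # mirroring the first R option from the comment above.
    p = X.shape[1]
    coef, _ = nnls(X.T @ X + lam * np.eye(p), X.T @ y)
    return coef
```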