I'm trying to fit a ridge regression model with a single predictor. However, when I try to do so in three different R packages, I get the following three errors:
Error in colMeans(X[, -Inter]) :
'x' must be an array of at least two dimensions
Error in if (is.null(np) | (np[2] <= 1)) stop("x should be a matrix with 2 or more columns") :
argument is of length zero
Error in colMeans(x[, -Inter]) :
'x' must be an array of at least two dimensions
The bottom line from these errors is that x needs to have at least two dimensions. Why is this necessary for ridge regression? Does this mean that I can't use ridge regression with a single predictor? It just seems weird that I couldn't use ridge regression to get regularization for something like a t-test.
Here is my code:
library(lmridge)
library(glmnet)
library(ridge)
# data
set.seed(100)
y <- rnorm(100)
x <- rbinom(100, 1, .5)
z <- rbinom(100, 1, .5)
data <- cbind.data.frame(y, x, z)
# ridge
linearRidge(y ~ x, data = data)
# glmnet
glmnet(data$x, data$y, nlambda = 25, alpha = 0, family = 'gaussian', lambda = .5)
# lmridge
lmridge(y ~ x, data = data, scaling = "sc", K = seq(0, 1, 0.001))
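For what it's worth, nothing in the ridge formula itself seems to require two predictors. A minimal sketch computing the closed-form estimate (X'X + kI)^{-1} X'y by hand runs fine on a one-column matrix (the penalty k = 0.5 here is an arbitrary choice for illustration):

# hand-rolled ridge estimate for a single centered/scaled predictor;
# the closed form (X'X + kI)^{-1} X'y needs no minimum column count
set.seed(100)
y <- rnorm(100)
x <- rbinom(100, 1, .5)
X  <- scale(matrix(x, ncol = 1))   # keep x as a one-column matrix
yc <- y - mean(y)                  # center the response
k  <- 0.5                          # arbitrary ridge penalty
solve(t(X) %*% X + k * diag(ncol(X))) %*% t(X) %*% yc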
The colMeans examples should probably use drop = FALSE, and the middle example seems to think | short-circuits when it actually does not, so the wrong error message is returned; probably because a single column is not expected behavior. – Chris Haug Apr 20 '22 at 14:00

With a matrix X you can refer to the columns of the matrix like X[,1]. But if instead of a matrix you have only a single vector, then X[,1] gives an error. If you would want to improve the function, then this means that you need to work with an exception (for the case of only one single variable). – Sextus Empiricus Apr 21 '22 at 22:48
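A quick sketch of the drop = FALSE point from the first comment (the toy X below is mine, not from any of the packages):

X <- cbind(1:5, 6:10)            # toy two-column matrix
X[, -1]                          # removing a column drops the result to a plain vector
colMeans(X[, -1])                # so this fails: 'x' must be an array of at least two dimensions
colMeans(X[, -1, drop = FALSE])  # drop = FALSE keeps the one-column matrix, so this works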