My question is related to this thread (Cause of singularity in matrix for quantile regression), but I am not able to solve my specific issue. My data set consists of 2000 observations (price returns) that look like this:
My goal is to estimate the conditional $\tau$-quantile of $y_t$ given its own lag $y_{t-1}$ in a non-parametric way. I adjusted the local polynomial quantile regression (lprq) from the quantreg package in R so that it estimates the conditional quantile at the sample points themselves instead of on an equally spaced grid of x values. For observation $t=786$, however, the resulting weights vector ($wx$) contains many exact zeros, and I think this is what causes the "Singular design matrix" error. Please see the code snippet with my current approach:
library(quantreg)  # for rq()

test_lprq2 <- function(x, y, h, tau) # modified from quantreg::lprq
{
  xx <- x          # evaluate at the sample points themselves
                   # (lprq uses seq(min(x), max(x), length = m))
  fv <- xx         # fitted conditional quantiles
  dv <- xx         # fitted local slopes
  for (i in 1:length(xx)) {
    cat("Observation: ", i, "\n")   # progress indicator
    z  <- x - xx[i]                 # centre the regressor at xx[i]
    wx <- dnorm(z / h)              # Gaussian kernel weights
    r  <- rq(y ~ z, weights = wx, tau = tau, ci = FALSE)
    fv[i] <- r$coef[1]              # local intercept = conditional quantile
    dv[i] <- r$coef[2]              # local slope
  }
  list(xx = xx, fv = fv, dv = dv)
}
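For completeness, this is roughly how I call the function. The series below is simulated only to make the snippet self-contained; the bandwidth and quantile level are just example values, not the ones I actually use.

# illustrative call: a simulated return series stands in for my actual data
set.seed(1)
returns <- rnorm(2000, sd = 0.01)
y <- returns[-1]                     # y_t
x <- returns[-length(returns)]       # its lag y_{t-1}
fit <- test_lprq2(x, y, h = bw.nrd0(x), tau = 0.95)
plot(fit$xx, fit$fv)                 # estimated conditional quantile at each sample point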
Some solutions suggest increasing the bandwidth parameter ($h$), but I am not comfortable with that approach because it changes the results considerably. I would rather solve the issue by dithering the vectors, but only after they have been weighted by $wx$. I tried running $rq$ without a weights argument and weighting the vectors myself, but the results are different, which suggests that $rq$ does something with the weights vector under the hood that I cannot pin down (see the sketch below for roughly what I tried).
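Here is a minimal sketch of what I mean by "weighting the vectors myself". The data are simulated and the bandwidth is arbitrary; the point is only that the two coefficient vectors come out different, which is the behaviour I cannot explain.

# minimal sketch: weights argument vs. scaling the vectors myself
set.seed(2)
x  <- rnorm(500)
y  <- 0.3 * x + rnorm(500)
h  <- bw.nrd0(x)
z  <- x - x[1]                    # local window around the first sample point
wx <- dnorm(z / h)                # Gaussian kernel weights

r_weighted <- rq(y ~ z, weights = wx, tau = 0.95, ci = FALSE)   # weights argument
r_manual   <- rq(I(wx * y) ~ I(wx * z), tau = 0.95, ci = FALSE) # scaling y and z myself

coef(r_weighted)   # these two do not agree, which is what puzzles me
coef(r_manual)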
Can anyone help me with this issue, or does anyone know of a better approach?
Thanks in advance!
