I have to use GP regression on a complex time series, and the kernel function is not known in closed form. I have found a numerical approximation based on generalized Gauss-Laguerre quadrature. It takes the following form:
$$ C(\tau) = C(t_1 - t_2) = \exp(j C_1 \tau) \sum_{i = 1}^n w_i(\eta) f(x_i, \tau, \Lambda) , $$
where $\Theta = [\eta, \Lambda]$ are my hyperparameters, $w_i$ are the weights of the generalized Gauss-Laguerre quadrature rule, $x_i$ are the zeros of the generalized Gauss-Laguerre polynomial of order $n$, and $j = \sqrt{-1}$.
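For concreteness, here is a minimal sketch of the kind of evaluation I mean, assuming $\eta$ plays the role of the generalization parameter of the Laguerre weight and using a placeholder integrand `f` (both are illustrative assumptions, not my actual model):

```python
import numpy as np
from scipy.special import roots_genlaguerre

def f(x, tau, Lam):
    # Placeholder integrand; the real f(x_i, tau, Lambda) comes from the problem.
    return np.exp(-Lam * x) * np.cos(x * tau)

def covariance(tau, C1, eta, Lam, n=32):
    # Nodes x_i and weights w_i of the generalized Gauss-Laguerre rule of order n,
    # taking eta as the generalization parameter alpha (an assumption).
    x, w = roots_genlaguerre(n, eta)
    # C(tau) = exp(j*C1*tau) * sum_i w_i f(x_i, tau, Lambda)
    return np.exp(1j * C1 * tau) * np.sum(w * f(x, tau, Lam))
```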
Is there any literature on estimating hyper-parameters when the covariance function is not an explicit function of the hyper-parameters, but is only available through a numerical approximation in which the hyper-parameters are hidden?
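To make clear what I mean by "hidden hyper-parameters": the approach I have in mind is to treat the quadrature formula as a black-box covariance and maximize the GP log marginal likelihood with a gradient-free optimizer, since $\partial K / \partial \Theta$ is not available in closed form. A rough sketch (the packing of the hyper-parameters as `[C1, eta, Lam]` and the use of the real part of $C$ as a valid PSD kernel are assumptions on my side):

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marginal_likelihood(theta, t, y, cov_fn, jitter=1e-8):
    # theta = [C1, eta, Lam] (hypothetical packing); cov_fn(tau, C1, eta, Lam) returns C(tau).
    C1, eta, Lam = theta
    tau = t[:, None] - t[None, :]
    # Build the covariance matrix; only the real part is used, assuming it is PSD.
    K = np.real(np.vectorize(lambda d: cov_fn(d, C1, eta, Lam))(tau))
    K += jitter * np.eye(len(t))              # small jitter for numerical stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # Standard GP negative log marginal likelihood.
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * len(t) * np.log(2 * np.pi)

# Gradient-free optimization, since the kernel gradients are not available:
# res = minimize(neg_log_marginal_likelihood, x0=[1.0, 0.5, 1.0],
#                args=(t, y, covariance), method="Nelder-Mead")
```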
EDIT:
I have asked a similar question here. In my case, the log-likelihood is extremely flat: its gradient is almost zero over a wide range of the parameter space. I believe the parameters do not affect the log-likelihood very strongly, and this can be verified with my model. That is why I have now also tried to formulate the same problem as a GP.