I want to fit the parameters $\omega$ of a time-dependent model $f(t,\omega)$ to a data set $y$. The main problem I'm facing is that the data are left censored (below a threshold $c$), so I need a suitable error function for my fitting algorithm.
My idea is to use the likelihood function (or its log version) as the objective, assuming that the data points $y_i$ are normally distributed around a time-dependent mean $\mu(t)=f(t,\omega)$ and that the standard deviation $\sigma$ does not depend on $t$. Let $y_i$ for $i\in\{1,\dots,m\}$ be the censored data points and $y_i$ for $i\in\{m+1,\dots,n\}$ the uncensored ones. Then
$L(\omega;\sigma)=\prod_{i=1}^m\Phi\!\left(\frac{c-f(t_i,\omega)}{\sigma}\right)\prod_{i=m+1}^n\frac{1}{\sigma}\,\phi\!\left(\frac{y_i-f(t_i,\omega)}{\sigma}\right)$, (1)
where $\Phi$ is the cumulative distribution function and $\phi$ the density function of the standard normal distribution. In the fully uncensored case one could ignore $\sigma$, since it does not change which $\omega$ maximises the likelihood (the fit reduces to least squares). In the censored case, however, $\sigma$ does matter, so the question is which $\sigma$ to take. My idea would be to take the one that maximises equation (1), i.e.
$\tilde{\sigma}=\arg\max_\sigma L(\omega;\sigma)$.
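For concreteness, here is a minimal sketch of how I currently picture evaluating and maximising the log version of (1) numerically, treating $\sigma$ (on the log scale) as one extra parameter next to $\omega$. The model function `f`, the toy data and the optimiser settings are just placeholders for illustration:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def f(t, omega):
    # Placeholder model; in my real problem this is the time-dependent model f(t, omega).
    return omega[0] * np.exp(-omega[1] * t)

def neg_log_likelihood(params, t, y, c):
    # params = (omega_1, ..., omega_k, log_sigma); sigma is parameterised
    # on the log scale so the optimiser cannot make it non-positive.
    omega, sigma = params[:-1], np.exp(params[-1])
    mu = f(t, omega)
    cens = y <= c  # left-censored observations
    ll = norm.logcdf((c - mu[cens]) / sigma).sum()                 # Phi terms
    ll += norm.logpdf(y[~cens], loc=mu[~cens], scale=sigma).sum()  # (1/sigma) * phi terms
    return -ll

# Toy data, left censored at c, only to make the sketch runnable.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 100)
y = np.maximum(2.0 * np.exp(-0.7 * t) + rng.normal(0.0, 0.1, t.size), 0.2)
c = 0.2

res = minimize(neg_log_likelihood, x0=[1.0, 1.0, np.log(0.5)], args=(t, y, c))
omega_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
```

This just hands everything, including $\sigma$, to a generic numerical optimiser, which brings me to my questions: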
Is there an efficient way to derive $\tilde\sigma$ analytically, or do I need to use numerical methods? If the latter, which methods would be efficient for this problem?
Is there a better choice for $\tilde \sigma$?
Thanks :)