Let $f(n)$ be the worst-case running time of a problem on inputs of size $n$. Let us make the problem a bit weird by fixing $f(n) = n^2$ when $n$ is even ($n = 2k$) but $f(n) = n$ when $n$ is odd ($n = 2k+1$).
So, what is the lower bound of the problem? The way I understood it, this is just the lower bound of $f(n)$. But $f(n) = \Omega(n^2)$ would mean that there exist constants $k$ and $n_0$ such that $f(n) \ge kn^2$ for all $n > n_0$, which is not true here. So it seems we can only say $f(n) = \Omega(n)$. Yet usually we would say the problem has a lower bound of $\Omega(n^2)$, right?
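As a quick numerical sanity check of the claim that no constant $k$ can work, here is a minimal sketch (the function `f` below just encodes the hypothetical even/odd running time defined above):

```python
def f(n):
    # Hypothetical worst-case running time from the question:
    # n^2 for even n, n for odd n.
    return n * n if n % 2 == 0 else n

# For f(n) = Omega(n^2) we would need f(n) >= k * n^2 for all large n.
# On odd n, however, f(n) / n^2 = 1/n, which tends to 0, so no fixed
# positive constant k can lower-bound it for all large n.
odd_ratios = [f(n) / n**2 for n in range(1001, 1011, 2)]
print(max(odd_ratios))  # all ratios are below 1/1000 already
```

So for any proposed $k > 0$, every odd $n > 1/k$ is a counterexample, which is why only $\Omega(n)$ holds under the "for all $n > n_0$" definition.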
Now assume $g(n) = \Omega(n^2)$, meaning there exist constants $k$ and $n_0$ such that $g(n) \ge kn^2$ for all $n > n_0$, and assume a problem has running time $g(n)$. If we can reduce this problem, for all prime input sizes $n$, to another problem (with the same input size), can we say the running time of the other problem has a lower bound of $\Omega(n^2)$?
But in case I want to make the distinction explicit, is there any notation I can use other than spelling it out?
– Wei Yu Mar 19 '11 at 18:17