Is there a complexity degree that is bigger than $O(n)$ and smaller than $O(n \log n)$?
-
I think perhaps this question would fit better on the Computer Science Stack Exchange? – LKlevin Jun 02 '14 at 10:06
-
@LKlevin: Agreed. – Geoff Oxberry Jun 02 '14 at 21:23
-
The computer science stack exchange is not very friendly towards basic questions like this. – Nick Alger Dec 09 '14 at 05:44
3 Answers
$n \log\log n$ is between $n$ and $n \log n$, and is a relatively common one to find in the wild.
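To see just how slowly $\log \log n$ grows, here is a small illustrative Python sketch (not part of the original answer): even for $n = 10^{12}$, the doubly iterated logarithm stays below 6.

```python
import math

# Sketch: log2(log2(n)) stays below 6 even for n = 10^12,
# which is why n log log n is barely above linear in practice.
for n in (10**3, 10**6, 10**9, 10**12):
    print(f"n = {n:>13}: log2(log2(n)) = {math.log2(math.log2(n)):.2f}")
```

This is also the substance of the comment below: for any input that fits in memory, $\log \log n$ behaves like a small constant.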
-
Though, depending on the asker's motivation, this may not be a relevant distinction: for all practical purposes $\log \log n$ is just a small constant factor. – Eamon Nerbonne Jun 01 '14 at 14:39
-
Yeah, though that's true for $\log n$ as well if $n$ is small enough! – Bill Barth Jun 01 '14 at 15:07
-
@BillBarth Yes, but it's exponentially less constant than the $\log \log n$ constant! – Pål GD Jun 01 '14 at 18:16
On top of $O(n\log\log n)$, there's also $O(n \log^* n)$, where $\log^*$ (the iterated logarithm) is the number of times the logarithm must be applied before the result is at most 1.
For instance, if you already know an Euclidean minimum spanning tree, the Delaunay triangulation may be discovered in $O(n\log^*(n))$ time.
Going even further, one can look at the inverse Ackermann function $\alpha(n,n)$, which appears in the analysis of several algorithms of complexity $O(n\,\alpha(n,n))$. There's a good introduction here.
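For concreteness, here is a minimal Python sketch of the iterated logarithm (base 2 assumed; the exact count depends on the base, but the asymptotics do not):

```python
import math

def log_star(n: float) -> int:
    """Iterated logarithm (base 2): the number of times log2 must be
    applied before the value drops to 1 or below."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

# log* grows absurdly slowly: log_star(65536) == 4,
# and log*(2^65536) would be only 5.
```

So for every input size that fits in physical memory, $O(n \log^* n)$ is at most a single-digit constant factor above linear.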
-
Don't forget the glory that is $\alpha^*(n)$, the iterated inverse Ackermann function! – Alexis Beingessner Jun 01 '14 at 17:50
There are infinitely many, since $O(n(\log n)^\alpha) \subsetneq O(n(\log n)^\beta)$ for any $\alpha<\beta$. So, in particular, $O(n) = O(n(\log n)^0) \subsetneq O(n(\log n)^\alpha) \subsetneq O(n\log n)$ for any $\alpha\in (0,1)$.
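To see the strict containments numerically (an illustrative sketch, taking $\alpha = 1/2$): the ratio $n(\log n)^{1/2}/n = (\log n)^{1/2}$ grows without bound, while $n(\log n)^{1/2}/(n\log n) = (\log n)^{-1/2}$ tends to 0.

```python
import math

# Sketch: (log n)^{1/2} diverges while (log n)^{-1/2} vanishes, so
# n (log n)^{1/2} lies strictly between O(n) and O(n log n).
for n in (10**2, 10**6, 10**12):
    r_lower = math.log(n) ** 0.5    # n(log n)^{1/2} / n
    r_upper = math.log(n) ** -0.5   # n(log n)^{1/2} / (n log n)
    print(f"n = {n:>13}: vs n: {r_lower:.3f}, vs n log n: {r_upper:.3f}")
```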