In connection with this question, it occurred to me to wonder: what is the time complexity for a single-tape, single-head Turing machine to compute the length of its input? To be specific, let's say the tape alphabet is $\{0,1,b\}$ (with $b$ the blank), the input is a string in $(0+1)^*$ surrounded by blanks, the machine starts at the leftmost input symbol, and it must halt at the leftmost symbol of a string in $(0+1)^*$ (again surrounded by blanks) giving the binary representation of the input length. This can also be thought of as the problem of converting a number from unary to binary.
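To pin down the output convention, here is a trivial reference function (a Python sketch; the name is mine) for what the machine must leave on the tape:

```python
def expected_output(w: str) -> str:
    # Binary representation of the input length, most significant
    # bit first with no leading zeros, e.g. expected_output("11010")
    # returns "101" since the input has length 5.
    return format(len(w), "b")
```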
It's easy to solve this on a two-tape machine or a two-head machine in linear time: just scan the input with one head while using the other head to repeatedly increment a binary counter; incrementing is an amortized constant-time operation. But the best single-head solutions I can come up with take $O(n\log n)$ time, e.g., repeatedly increment a counter and then shift it by one position along the tape; each of the $n$ increments also drags the $O(\log n)$-bit counter one cell, for $O(n\log n)$ in total. Is there a matching lower bound?
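To back up the amortized-constant-increment claim, here is a minimal sketch (plain Python, not a faithful TM simulation) that counts the cells a head would touch while incrementing a least-significant-bit-first binary counter $n$ times; the total stays below $2n$:

```python
def increment(bits):
    """Increment an LSB-first binary counter in place;
    return how many cells the head touches."""
    i = 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0          # carry ripples past each trailing 1
        i += 1
    if i == len(bits):
        bits.append(1)       # counter grows by one bit
    else:
        bits[i] = 1
    return i + 1

def count_to(n):
    """Total cells touched while counting from 0 up to n."""
    bits, work = [], 0
    for _ in range(n):
        work += increment(bits)
    return work

for n in (10, 1000, 10**6):
    print(n, count_to(n), count_to(n) / n)  # ratio stays below 2
```

On a two-tape machine the counter head can sit at the low-order end, so these touches translate directly into head moves and the whole pass is linear; a single head instead has to drag the $O(\log n)$-bit counter along (or shuttle back to a fixed counter), which is where the extra $\log n$ factor in the single-head solutions comes from.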
I tried some searches, but phrases like "one head" and "input length" are so common that it's hard to find known results on this problem in the literature.
I'm curious whether there's a relation between a lower bound for this problem and a lower bound for oblivious TM simulation. (Any TM solving this problem would, by definition, be oblivious, or else contain unnecessary code.)
– Daniel Apon Sep 14 '13 at 03:12