
Hi, I want to learn how to find the time complexity of programs. I went to YouTube and searched for it, and I found some tutorials explaining the time complexity of nested loops. For example, if I have this code

#include <stdio.h>

int main(void) {
    int a = 1;
    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 3; j++) {
            a = a + 1;
            printf("%d", a);
        }
    }
    return 0;
}

so in this case the time complexity would be 3 x 3 = 9 (I'm still not sure about that), but I do not know what the time complexity of recursive functions and other programs would be; this was just a simple program :( I actually want to figure this out completely, so that whenever I write my own program it is easy for me to work out, especially since job tests ask about this a lot :(

Umair Ali
    Does this answer your question? [How to find time complexity of an algorithm](https://stackoverflow.com/questions/11032015/how-to-find-time-complexity-of-an-algorithm) – Alexander Freyr Feb 25 '20 at 12:07
  • I am still confused about how to find the complexity of a recursive function :( – Umair Ali Feb 25 '20 at 13:37

1 Answer


First, you need to write out an expression for the runtime T of the algorithm as a function of some variables; a single variable n representing problem size is typical, but even better is a variable representing the size in bits of the input.
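
For instance, generalizing the nested loop from the question so that each loop runs n times, the inner body executes SUM(i=0...n-1)(SUM(j=0...n-1)(1)) = n * n times. Here is a minimal C sketch (the counter ops and the bound n are my additions, just for illustration) that counts this empirically:

#include <stdio.h>

int main(void) {
    int n = 3;       /* problem size; the question's code uses 3 */
    long ops = 0;    /* counts executions of the inner loop body */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            ops++;   /* one unit of work per inner iteration */
    printf("n = %d, inner body ran %ld times (n^2 = %d)\n", n, ops, n * n);
    return 0;
}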

Next, you need to reduce that expression to a closed-form expression if it is not already in a closed form. This means evaluating summations (which you get from loops, including nested loops) and solving recurrences (instances where T(n) is defined in terms of T(f(n)) for some function f).
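
For instance, here is a small recursive C function (my own illustration, not from the question) whose runtime is naturally described by a recurrence:

/* Sums arr[lo..hi-1] by splitting the range in half (requires hi > lo).
   Each call does constant work c plus two recursive calls on halves, so
   T(n) = 2T(n/2) + c with T(1) = c, which unrolls to T(n) = (2n - 1)c = O(n)
   for n a power of 2. */
int range_sum(const int *arr, int lo, int hi) {
    if (hi - lo == 1) return arr[lo];    /* base case: T(1) = c */
    int mid = lo + (hi - lo) / 2;
    return range_sum(arr, lo, mid)       /* T(n/2) */
         + range_sum(arr, mid, hi);      /* T(n/2) */
}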

There are lots of techniques for evaluating summations and for solving recurrences. Some of the most useful ones are outlined below.

Let us write SUM(i=a...b)(f(i)) for the sum of f(i) where i varies from a to b, inclusive. Some known results are the following:

  1. SUM(i=a...b)(c) = (b - a + 1)c, where c is a constant independent of i. This just says that adding up N = b - a + 1 copies of a constant c gives Nc.

  2. SUM(i=a...b)(f(i)+g(i)) = SUM(i=a...b)(f(i)) + SUM(i=a...b)(g(i)). That is, to sum the sum of two functions, you can sum each function separately and then add the two results.

  3. SUM(i=a...b)(c * f(i)) = c * SUM(i=a...b)(f(i)). That is, when summing a function multiplied by a constant, you can pull the constant out of the summation and multiply it by the summation of the function itself.

  4. SUM(i=a...b)(i) = b(b+1)/2 - a(a-1)/2. The summation 1 + 2 + … + k can be rearranged as 1 + k + 2 + (k - 1) + … + (k/2) + (k/2 + 1); each pair of terms sums to k + 1, and there are k/2 pairs, so 1 + 2 + … + k = k(k+1)/2. Subtracting the terms below a, namely 1 + 2 + … + (a - 1) = a(a-1)/2, gives the general formula.

  5. SUM(i=a...b)(2^i) = 2^(b+1) - 2^a. The summation 1 + 2 + … + 2^k has binary representation 11...1 (k+1 ones), which is exactly one less than 1 followed by k+1 zeros, which is 2^(k+1); so 1 + 2 + … + 2^k = 2^(k+1) - 1. Subtracting the terms below 2^a, namely 2^a - 1, gives the general formula.

There are lots of other summation formulas that can be derived, but these are some of the most widely encountered. Some summations are telescoping: consecutive terms cancel, so all that remains after the cancellation is (part of) the first and last terms.
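
These identities are also easy to sanity-check numerically. Here is a small C sketch (my own; the bounds a = 2 and b = 10 are arbitrary) that verifies rules 4 and 5:

#include <stdio.h>

int main(void) {
    long a = 2, b = 10, sum_i = 0, sum_pow = 0;
    for (long i = a; i <= b; i++) {
        sum_i   += i;          /* rule 4: SUM(i=a...b)(i)   */
        sum_pow += 1L << i;    /* rule 5: SUM(i=a...b)(2^i) */
    }
    printf("rule 4: %ld == %ld\n", sum_i, b * (b + 1) / 2 - a * (a - 1) / 2);
    printf("rule 5: %ld == %ld\n", sum_pow, (1L << (b + 1)) - (1L << a));
    return 0;
}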

For recurrences, in general you can try to understand how the function changes, make an educated guess about the form of the function, and then prove that guess correct. Proof by mathematical induction is not a bad way to approach this. To get the guess, you can write out a few terms and then see how the difference between successive terms changes over time. You can even check how the difference of successive differences changes, and so on. For instance, consider something like

T(n) = 2T(n/3) + n log n

We can write out some terms:

 n    T(n)
 -    ----
 1    c
 3    2c + 3log3
 9    4c + 6log3 + 9log9 = 4c + 24log3
27    8c + 48log3 + 27log27 = 8c + 129log3

It looks like the general form of this, for n = 3^k a power of three, is

T(n) = (2^(log_3 n))c + log(3) * SUM(j=1...log_3 n)(j * 3^j * 2^(log_3 n - j))

since level j of the recursion has 2^(log_3 n - j) subproblems of size 3^j, each doing 3^j * log(3^j) = 3^j * j * log(3) work. You can check this against the table: for n = 9, the summation is 1*3*2 + 2*9*1 = 24, matching 4c + 24 log 3. The first term, (2^(log_3 n))c = n^(log_3 2) c ≈ n^0.631 c, grows more slowly than n. The summation is an arithmetico-geometric series whose terms grow by a factor of 3/2, so it is dominated by its largest (last) term, log_3(n) * n; the whole summation is therefore Θ(n log n), and so T(n) = Θ(n log n).
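
If you want to sanity-check that conclusion, a short C program (my own sketch; it arbitrarily takes c = 1 and uses the natural logarithm) can evaluate the recurrence exactly at powers of 3 and print the ratio T(n) / (n log n); the ratio settles toward a constant (about 3 with these choices), as a Θ(n log n) runtime predicts:

#include <stdio.h>
#include <math.h>

/* Evaluates T(n) = 2T(n/3) + n*log(n) exactly at powers of 3, with T(1) = c = 1. */
double T(long n) {
    if (n == 1) return 1.0;
    return 2.0 * T(n / 3) + n * log((double) n);
}

int main(void) {
    for (long n = 3; n <= 1594323; n *= 3)   /* n = 3^1 ... 3^13 */
        printf("n = %8ld   T(n)/(n log n) = %.4f\n", n, T(n) / (n * log((double) n)));
    return 0;
}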

If you're interested in getting asymptotic bounds only, and not necessarily the exact expression for a recurrence relation, you can look into the Master theorem. It gives a way of getting asymptotic bounds directly without getting the closed-form expression.
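
Applied to the recurrence above: a = 2 subproblems, the problem size is divided by b = 3, and f(n) = n log n. Then

n^(log_b a) = n^(log_3 2) ≈ n^0.631
f(n) = n log n = Omega(n^(0.631 + ε)) for any small ε > 0
a * f(n/b) = 2 * (n/3) * log(n/3) <= (2/3) * f(n)   (the regularity condition)

so case 3 of the Master theorem applies and T(n) = Θ(n log n), agreeing with the direct calculation above.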

In general, the problem of finding the runtime of an algorithm is undecidable: no Turing machine can do it and, if the Church-Turing thesis is right, no other means of effective computation (including human beings) can solve the problem in all instances. The reason is that knowing the runtime of an algorithm on all (or all but a finite number of) inputs would, in particular, tell you whether the algorithm halts on a given input, and that is the halting problem: given an algorithm and an input, determine whether the algorithm halts on the input or runs forever.

Patrick87