COMPLEXITY OF ALGORITHMS


It is convenient to classify algorithms by the relative amount of time or space they require, and to specify the growth of these requirements as a function of the input size. Thus we have the notions of

Time Complexity : Running time of the program as a function of the size of the input.

Space Complexity : Amount of computer memory required during the program execution.
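To make the distinction concrete, here is a minimal sketch (Python; the function names are illustrative, not from the notes) of two ways to sum the integers 1..n: both take linear time, but one uses constant extra space while the other uses space proportional to n.

```python
def sum_constant_space(n):
    # O(n) time, O(1) space: only a running total is stored.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_linear_space(n):
    # O(n) time, O(n) space: all n values are materialised in a list first.
    values = [i for i in range(1, n + 1)]
    return sum(values)
```

Both return the same answer; they differ only in how much memory they need as n grows.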

BIG Oh NOTATION :
        A convenient way of describing the growth rate of a function and hence the time complexity of an algorithm.

Let n be the size of the input and f(n) , g(n) be positive functions of n.

DEF : f(n) is O(g(n)) if and only if there exist a real, positive constant C and a positive integer m such that
                        f(n) ≤ C × g(n)  for all n > m
1) Note that O(g(n)) is a class of functions.
2) The big Oh notation specifies asymptotic upper bounds.
3) Common classes:
   O(1) refers to constant time.
   O(n) indicates linear time.
   O(n^k)  (k fixed) refers to polynomial time.
   O(log n) is called logarithmic time.
   O(2^n) refers to exponential time, etc.
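A rough sense of how differently these classes grow can be had by tabulating each one at a few input sizes. The sketch below (Python; the chosen sample sizes are arbitrary) evaluates each class at n = 2, 8, 16:

```python
import math

# Growth of the common complexity classes at a few sample input sizes.
classes = {
    "O(1)":     lambda n: 1,
    "O(log n)": lambda n: math.log2(n),
    "O(n)":     lambda n: n,
    "O(n^2)":   lambda n: n ** 2,
    "O(2^n)":   lambda n: 2 ** n,
}

table = {name: [round(f(n)) for n in (2, 8, 16)] for name, f in classes.items()}
for name, values in table.items():
    print(f"{name:10} {values}")
```

Even at n = 16, the exponential class has already reached 65536 while the logarithmic class is still at 4, which is why the distinction between polynomial and exponential time matters so much in practice.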
Examples :
       Let f(n) = n² + n + 5
                then f(n) is O(n²)
                        f(n) is O(n³)
                        f(n) is not O(n)
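The claims above can be checked numerically against the definition. One witness pair (my choice, not the only one) is C = 7, m = 0: since n² + n + 5 ≤ n² + n² + 5n² = 7n² for all n ≥ 1, the bound holds. No constant works for O(n), because f(n)/n = n + 1 + 5/n grows without bound.

```python
def f(n):
    return n ** 2 + n + 5

# Witness for f(n) being O(n^2): with C = 7 and m = 0,
# f(n) <= 7 * n^2 for all n > 0, since n^2 + n + 5 <= 7n^2 when n >= 1.
C, m = 7, 0
assert all(f(n) <= C * n ** 2 for n in range(m + 1, 1000))

# f(n) is not O(n): the ratio f(n) / n grows without bound,
# so it eventually exceeds any fixed constant, e.g. 1000.
assert any(f(n) > 1000 * n for n in range(1, 2000))
```

The spot check over a finite range does not replace the algebraic argument, but it is a quick sanity test of the chosen constants.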
