It is very convenient to classify algorithms by the relative amount of time or space they require, and to specify the growth of these time / space requirements as a function of the input size. Thus we have the notions of

– Time Complexity: the running time of the program as a function of the size of the input.

– Space Complexity: the amount of computer memory required during the program's execution.
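To make the time/space distinction concrete, here is a small illustrative sketch (not from the notes themselves): two ways to test membership in a collection of n items, trading extra space for faster repeated lookups. The function names are invented for this example.

```python
def contains_linear(items, target):
    # Time: grows linearly with n -- scans every element in the worst case.
    # Extra space: constant -- no additional storage grows with n.
    for x in items:
        if x == target:
            return True
    return False

def contains_with_set(items, targets):
    # Time: one pass to build the set, then each lookup is fast on average.
    # Extra space: grows linearly with n -- the set stores every element.
    lookup = set(items)
    return [t in lookup for t in targets]
```

The first version uses minimal memory but re-scans the list on every query; the second spends O(n) extra space up front to answer each later query quickly.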

BIG Oh NOTATION:
A convenient way of describing the growth rate of a function and hence the time complexity of an algorithm.

Let n be the size of the input and f(n), g(n) be positive functions of n.

DEF: f(n) is O(g(n)) if and only if there exist a real, positive constant C and a positive integer m such that
f(n) ≤ C × g(n) for all n > m.
1) Note that O(g(n)) is a class of functions.
2) The big Oh notation specifies asymptotic upper bounds.
3) O(1) refers to constant time.
O(n) indicates linear time.
O(n^k) (k fixed) refers to polynomial time.
O(log n) is called logarithmic time.
O(2^n) refers to exponential time, etc.

Examples: Let f(n) = n² + n + 5,
then f(n) is O(n²)
f(n) is O(n³)
f(n) is not O(n)
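These claims can be sanity-checked numerically against the definition. In the sketch below, the witnesses C = 7 and m = 1 are choices made for this illustration (any sufficiently large constants would do): with them, f(n) ≤ C × n² holds for all n > m, matching f(n) being O(n²); by contrast, f(n)/n grows without bound, so no single constant C can work for O(n).

```python
def f(n):
    # The example function from the notes: f(n) = n^2 + n + 5.
    return n * n + n + 5

# Witnesses for f(n) = O(n^2): C = 7, m = 1 (chosen for illustration).
C, m = 7, 1
assert all(f(n) <= C * n * n for n in range(m + 1, 10_000))

# f(n) is not O(n): the ratio f(n) / n keeps growing, so no fixed C
# can bound f(n) by C * n for all large n.
print(f(10) / 10, f(1000) / 1000)  # the ratio increases with n
```

Checking a finite range does not prove the bound, of course, but it is a quick way to see the inequality in action; the proof itself is the algebra n² + n + 5 ≤ 7n² for all n ≥ 2.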

