To learn the analysis of algorithms and to dive into the study of algorithmic complexity, a few mathematical concepts are required; they are used in almost all cases.

I am presenting a few of these concepts in a simple manner, so that a beginner can grasp them easily and use them in the further study of algorithms.


**Asymptotic notations:**

**Big O notation (middle image in above figure):**

O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0}.

It specifies that g(n) gives an upper bound on the complexity of the algorithm: whatever the input, the running time never exceeds a constant multiple of g(n) once n is sufficiently large, so the bound does not depend on any particular condition.

Whenever we write f(n) = O(g(n)), f(n) is upper bounded by g(n); informally (though not technically precise), we can think of it as f(n) ≤ g(n) up to a constant factor.
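As a quick sketch of the definition, we can check a Big-O claim empirically for a concrete function. The choices of f, g, c, and n0 below are my own illustrative examples, not from the text: for f(n) = 3n + 5, the constants c = 4 and n0 = 5 witness that f(n) = O(n).

```python
# Illustrative check of the Big-O definition:
# claim f(n) = 3n + 5 is O(n), witnessed by c = 4 and n0 = 5.

def f(n):
    return 3 * n + 5

def g(n):
    return n

c, n0 = 4, 5  # 3n + 5 <= 4n holds exactly when n >= 5

# Verify 0 <= f(n) <= c * g(n) for a range of n >= n0.
holds = all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print(holds)  # True
```

Note that the inequality is allowed to fail for small n (here, f(1) = 8 > 4 = c·g(1)); the definition only requires it beyond the threshold n0.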

**Θ notation (first image in above figure):**

Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0}.

The function f(n) is sandwiched between two constant multiples of g(n): for some constants c1 and c2 (e.g. c1 = 1/2 and c2 = 3/2), c1·g(n) is smaller than f(n) while c2·g(n) is greater than f(n), for all sufficiently large n.

**Θ notation is stronger than O notation, and we can write Θ(g(n)) ⊆ O(g(n)).** The definition of Θ(g(n)) requires that every member f(n) ∈ Θ(g(n)) be asymptotically non-negative, that is, that f(n) be non-negative whenever n is sufficiently large. (An asymptotically positive function is one that is positive for all sufficiently large n.) This is all rooted in set theory: each notation denotes a set of functions, and complexity is defined only for asymptotically non-negative functions.
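The sandwich can also be checked numerically. Again the function and constants are my own illustrative choices: for f(n) = 2n² + 3n, the constants c1 = 2, c2 = 3, n0 = 3 witness that f(n) = Θ(n²).

```python
# Illustrative check of the Θ definition:
# claim f(n) = 2n^2 + 3n is Θ(n^2), with c1 = 2, c2 = 3, n0 = 3.

def f(n):
    return 2 * n**2 + 3 * n

def g(n):
    return n**2

c1, c2, n0 = 2, 3, 3  # 2n^2 <= 2n^2 + 3n <= 3n^2 once n >= 3

# Verify 0 <= c1*g(n) <= f(n) <= c2*g(n) for a range of n >= n0.
holds = all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print(holds)  # True
```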

**Ω notation (last image in above figure):**

Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0}.

It means f(n) is bounded below by g(n): for a suitable constant c, f(n) is at least c·g(n) for all sufficiently large n.
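The lower bound works the same way. The function and constants below are again illustrative choices of mine: for f(n) = n²/2 − n, the constants c = 1/4 and n0 = 4 witness that f(n) = Ω(n²).

```python
# Illustrative check of the Ω definition:
# claim f(n) = n^2/2 - n is Ω(n^2), with c = 1/4 and n0 = 4.

def f(n):
    return n**2 / 2 - n

def g(n):
    return n**2

c, n0 = 0.25, 4  # (1/4)n^2 <= n^2/2 - n holds exactly when n >= 4

# Verify 0 <= c*g(n) <= f(n) for a range of n >= n0.
holds = all(0 <= c * g(n) <= f(n) for n in range(n0, 10_000))
print(holds)  # True
```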

A few facts:

- The constants (c1, c2, etc.) mentioned in each case must be plain numerical constants; they cannot be exponential in n or otherwise depend on n. For example, 2^(n+1) = O(2^n) but 2^(2n) ≠ O(2^n): we can write 2^(n+1) as 2·2^n and take 2 as the constant c, but 2^(2n) = 2^n·2^n, and here 2^n cannot be treated as a constant, so 2^(2n) is not O(2^n).
- f(n) = Θ(g(n)) **iff** f(n) = O(g(n)) and f(n) = Ω(g(n)), because only then is f(n) both upper bounded and lower bounded by constant multiples of g(n).
- We generally neglect lower-order terms when analyzing the complexity of an algorithm, e.g. an² + bn + c can be written as O(n²).
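The first fact above can be seen numerically: the ratio 2^(n+1)/2^n stays at the constant 2, while 2^(2n)/2^n = 2^n grows without bound, so no constant c can satisfy 2^(2n) ≤ c·2^n for all large n.

```python
# The ratio against 2^n reveals which factor is a true constant.
for n in (1, 5, 10, 20):
    print(n, 2**(n + 1) / 2**n, 2**(2 * n) / 2**n)
# The middle column is always 2.0; the last column equals 2^n,
# which is why 2^(2n) is not O(2^n).
```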

floor(*x*) is the largest integer not greater than *x*, and ceiling(*x*) is the smallest integer not less than *x*. For example, floor(9.6) = 9 and ceiling(9.6) = 10.
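Both functions are available directly in Python's standard library:

```python
import math

print(math.floor(9.6))   # 9
print(math.ceil(9.6))    # 10
# Watch the sign: the largest integer not greater than -9.6 is -10.
print(math.floor(-9.6))  # -10
```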