What is big ‘O’ and omega notation? Explain the growth rate of n log n, n^2 and 2^n functions with n.
Subject Algorithm Design Set: 2.(a) Marks: 5 Year: 2009
Big-O notation gives an asymptotic upper bound: f(n) = O(g(n)) means the running time grows no faster than a constant multiple of g(n) for large enough inputs. Conversely, when the running time is \Omega(f(n)), we say it is "big-\Omega of f(n)". We use big-\Omega notation for asymptotic lower bounds, since it bounds the growth of the running time from below for large enough input sizes.
Just as \Theta(f(n)) automatically implies O(f(n)), it also automatically implies \Omega(f(n)). So we can say that the worst-case running time of binary search is \Omega(\log_2 n).

Example: show that 8n^3 + 2n^2 = \Theta(n^3), where

f(n) = 8n^3 + 2n^2

g(n) = n^3

By definition, f(n) = \Theta(g(n)) if and only if f(n) = O(g(n)) and f(n) = \Omega(g(n)).

So we need to show both f(n) = O(g(n)) and f(n) = \Omega(g(n)).

f(n) = O(g(n)): there exist positive constants c and n_0 such that f(n) <= c g(n) for all n >= n_0.

We need 8n^3 + 2n^2 <= c n^3.

With c = 9 and n_0 = 2: for all n >= n_0, 8n^3 + 2n^2 <= 9n^3, since 2n^2 <= n^3 whenever n >= 2.

Therefore, f(n) = O(g(n))

f(n) = \Omega(g(n)): there exist positive constants c and n_0 such that f(n) >= c g(n) for all n >= n_0.

We need 8n^3 + 2n^2 >= c n^3.

With c = 8 and n_0 = 1: for all n >= n_0, 8n^3 + 2n^2 >= 8n^3, since 2n^2 >= 0.

Therefore, f(n) = \Omega(g(n)).

Therefore, 8n^3 + 2n^2 = \Theta(n^3).
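The two witness pairs used in the proof can be spot-checked numerically; a minimal Python sketch (using c = 9, n_0 = 2 for the upper bound and c = 8, n_0 = 1 for the lower bound, as above):

```python
# Verify the inequalities from the proof over a range of n:
#   8n^3 + 2n^2 <= 9 n^3  for all n >= 2   (big-O witness)
#   8n^3 + 2n^2 >= 8 n^3  for all n >= 1   (big-Omega witness)
def f(n):
    return 8 * n**3 + 2 * n**2

def g(n):
    return n**3

assert all(f(n) <= 9 * g(n) for n in range(2, 1000))
assert all(f(n) >= 8 * g(n) for n in range(1, 1000))
print("Both witness inequalities hold on n = 1..999")
```

Note that at n = 2 the upper bound is tight: f(2) = 72 = 9 * g(2), which is why n_0 cannot be smaller than 2 with c = 9.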