Let $f(n)$ and $g(n)$ be asymptotically nonnegative functions. Using the basic definition of $\Theta$-notation, prove that $\max(f(n), g(n)) = \Theta(f(n) + g(n))$.
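One possible sketch, arguing directly from the definition of $\Theta$: for all $n$ large enough that $f(n) \ge 0$ and $g(n) \ge 0$,

$$\frac{1}{2}\big(f(n) + g(n)\big) \le \max\big(f(n), g(n)\big) \le f(n) + g(n),$$

since the maximum of two nonnegative terms is at least their average and at most their sum. The definition of $\Theta(f(n) + g(n))$ is then satisfied with $c_1 = 1/2$, $c_2 = 1$, and $n_0$ chosen so that both functions are nonnegative for $n \ge n_0$.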
Show that for any real constants $a$ and $b$, where $b > 0$, $(n + a)^b = \Theta(n^b)$.
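A sketch of one standard argument: for $n \ge \max(1, 2|a|)$ we have $n/2 \le n - |a| \le n + a \le n + |a| \le 2n$, and since $b > 0$, raising to the $b$-th power preserves these inequalities:

$$\left(\frac{1}{2}\right)^b n^b \le (n + a)^b \le 2^b\, n^b.$$

So the constants $c_1 = (1/2)^b$, $c_2 = 2^b$, and $n_0 = \max(1, 2|a|)$ witness $(n + a)^b = \Theta(n^b)$.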
Explain why the statement, "The running time of algorithm $A$ is at least $O(n^2)$," is meaningless.
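One way to see it: $O$-notation promises only an upper bound, so the statement asserts merely that

$$T(n) \ge f(n) \quad \text{for some } f(n) \in O(n^2).$$

But $O(n^2)$ contains arbitrarily small functions, for example $f(n) = 0$, so the assertion holds for every algorithm and therefore carries no information.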
Is $2^{n+1} = O(2^n)$? Is $2^{2n} = O(2^n)$?

- $2^{n+1} = O(2^n)$?
- $2^{2n} = O(2^n)$?
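A sketch of the usual answers: the first holds, the second does not. For the first,

$$2^{n+1} = 2 \cdot 2^n \le c \cdot 2^n \quad \text{with } c = 2,$$

so $2^{n+1} = O(2^n)$. For the second, $2^{2n} = 4^n$, and $4^n \le c \cdot 2^n$ would require $2^n \le c$ for all sufficiently large $n$, which no constant $c$ satisfies; hence $2^{2n} \ne O(2^n)$.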
Prove Theorem 3.1.
Theorem 3.1
For any two functions $f(n)$ and $g(n)$, we have $f(n) = \Theta(g(n))$ if and only if $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$.
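A sketch of both directions, straight from the definitions. If $f(n) = \Theta(g(n))$, there are positive constants $c_1$, $c_2$, $n_0$ with

$$0 \le c_1 g(n) \le f(n) \le c_2 g(n) \quad \text{for all } n \ge n_0;$$

the right inequality alone gives $f(n) = O(g(n))$ and the left alone gives $f(n) = \Omega(g(n))$. Conversely, if $f(n) \le c_2 g(n)$ for all $n \ge n_1$ and $c_1 g(n) \le f(n)$ for all $n \ge n_2$, then both hold simultaneously for $n \ge n_0 = \max(n_1, n_2)$, which is exactly the definition of $f(n) = \Theta(g(n))$.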
Prove that the running time of an algorithm is $\Theta(g(n))$ if and only if its worst-case running time is $O(g(n))$ and its best-case running time is $\Omega(g(n))$.
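One possible sketch, writing $T_I(n)$ for the time on input $I$ of size $n$, $W(n)$ for the worst-case time and $B(n)$ for the best-case time (notation assumed here, not fixed by the exercise): saying the running time is $\Theta(g(n))$ means

$$c_1 g(n) \le B(n) \le T_I(n) \le W(n) \le c_2 g(n) \quad \text{for every input } I \text{ of size } n \ge n_0.$$

The outer inequalities give $W(n) = O(g(n))$ and $B(n) = \Omega(g(n))$ directly. Conversely, $W(n) \le c_2 g(n)$ and $B(n) \ge c_1 g(n)$ sandwich every $T_I(n)$ between $c_1 g(n)$ and $c_2 g(n)$, which is what $\Theta(g(n))$ requires.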
Prove that $o(g(n)) \cap \omega(g(n))$ is the empty set.
There is no function that belongs to both $o(g(n))$ and $\omega(g(n))$.
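A sketch by contradiction: suppose $f(n) \in o(g(n)) \cap \omega(g(n))$. Taking $c = 1$ in both definitions gives thresholds $n_1$ and $n_2$ with

$$f(n) < g(n) \ \text{ for all } n \ge n_1 \qquad \text{and} \qquad f(n) > g(n) \ \text{ for all } n \ge n_2,$$

and both cannot hold at any $n \ge \max(n_1, n_2)$. Hence the intersection is empty.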
We can extend our notation to the case of two parameters $n$ and $m$ that can go to infinity independently at different rates. For a given function $g(n, m)$, we denote by $O(g(n, m))$ the set of functions

$O(g(n, m)) = \{f(n, m) :$ there exist positive constants $c$, $n_0$, and $m_0$ such that $0 \le f(n, m) \le c\,g(n, m)$ for all $n \ge n_0$ or $m \ge m_0\}$.

Give corresponding definitions for $\Omega(g(n, m))$ and $\Theta(g(n, m))$.
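One possible reading, mirroring the given $O$ definition (including its quantification "for all $n \ge n_0$ or $m \ge m_0$"):

$$\Omega(g(n, m)) = \{f(n, m) : \text{there exist positive constants } c, n_0, m_0 \text{ such that } 0 \le c\,g(n, m) \le f(n, m) \text{ for all } n \ge n_0 \text{ or } m \ge m_0\},$$

$$\Theta(g(n, m)) = \{f(n, m) : \text{there exist positive constants } c_1, c_2, n_0, m_0 \text{ such that } 0 \le c_1\,g(n, m) \le f(n, m) \le c_2\,g(n, m) \text{ for all } n \ge n_0 \text{ or } m \ge m_0\}.$$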