3.1 Asymptotic notation
3.1-1
Let \(f(n)\) and \(g(n)\) be asymptotically nonnegative functions. Using the basic definition of \(\Theta\)-notation, prove that \(\max(f(n), g(n)) = \Theta(f(n) + g(n))\).
For asymptotically nonnegative functions \(f(n)\) and \(g(n)\), we know that there exist \(n_1\) and \(n_2\) such that

\[f(n) \ge 0 \text{ for } n \ge n_1 \quad\text{and}\quad g(n) \ge 0 \text{ for } n \ge n_2.\]

Let \(n_0 = \max(n_1, n_2)\). The inequalities below hold for all \(n \ge n_0\), because a maximum of two values is at least their average and, when both values are nonnegative, at most their sum:

\[\frac{f(n) + g(n)}{2} \le \max(f(n), g(n)), \qquad \max(f(n), g(n)) \le f(n) + g(n).\]

Combining the two inequalities, we get

\[0 \le \frac{1}{2}(f(n) + g(n)) \le \max(f(n), g(n)) \le f(n) + g(n) \text{ for all } n \ge n_0,\]

which is the definition of \(\Theta(f(n) + g(n))\) with \(c_1 = \frac{1}{2}\) and \(c_2 = 1\).
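As a quick numeric sanity check (not part of the proof), here is a minimal Python sketch with assumed sample functions \(f(n) = n \ln n\) and \(g(n) = 10n\):

```python
# Minimal numeric sanity check of the sandwich inequality from the proof,
# using assumed sample functions f(n) = n ln n and g(n) = 10n (hypothetical).
import math

def f(n): return n * math.log(n)
def g(n): return 10.0 * n

for n in range(2, 10**5, 999):
    lo = 0.5 * (f(n) + g(n))          # c1 = 1/2
    hi = f(n) + g(n)                  # c2 = 1
    assert lo <= max(f(n), g(n)) <= hi
print("1/2 (f + g) <= max(f, g) <= f + g holds on all sampled n")
```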
3.1-2
Show that for any real constants \(a\) and \(b\), where \(b > 0\),
\[(n + a)^b = \Theta(n^b). \tag{3.2}\]
Since \(b > 0\), the function \(x^b\) is increasing for \(x \ge 0\), so it suffices to bound \(n + a\) between constant multiples of \(n\). For all \(n \ge 2|a|\) we have \(|a| \le \frac{n}{2}\), hence

\[\frac{n}{2} \le n + a \le 2n.\]

Raising all three parts to the power \(b\),

\[\left(\frac{1}{2}\right)^b n^b \le (n + a)^b \le 2^b n^b.\]

Thus \((n + a)^b = \Theta(n^b)\) with \(c_1 = \left(\frac{1}{2}\right)^b\), \(c_2 = 2^b\), and \(n_0 = \max(2|a|, 1)\).
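A minimal numeric sketch of these bounds, assuming sample values \(a = -5\) and \(b = 2.5\) (any real \(a\) and any real \(b > 0\) would do):

```python
# Numeric sketch of the bounds used above, with assumed values a = -5, b = 2.5.
a, b = -5.0, 2.5                       # assumed sample values: any real a, b > 0
n0 = int(max(2 * abs(a), 1)) + 1       # from the proof: n >= 2|a| (+1 for float safety)

for n in range(n0, 10**5, 997):
    lo = (0.5 ** b) * n ** b           # c1 = (1/2)^b
    hi = (2.0 ** b) * n ** b           # c2 = 2^b
    assert lo <= (n + a) ** b <= hi
print("(1/2)^b n^b <= (n + a)^b <= 2^b n^b holds on all sampled n")
```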
3.1-3
Explain why the statement, "The running time of algorithm \(A\) is at least \(O(n^2)\)," is meaningless.
Let \(T(n)\) be the running time of algorithm \(A\). A statement about running time is useful only if it bounds \(T(n)\) from above or from below.

The statement: \(T(n)\) is at least \(O(n^2)\).

- Upper bound: since the statement says \(T(n)\) is "at least" something, it gives no information about an upper bound on \(T(n)\).
- Lower bound: suppose \(f(n) = O(n^2)\), so the statement reads \(T(n) \ge f(n)\). But \(f(n)\) could be any function that grows no faster than \(n^2\), for example a constant or \(n\), so the statement gives no conclusion about a lower bound on \(T(n)\) either.
Therefore, the statement, "The running time of algorithm \(A\) is at least \(O(n^2)\)," is meaningless.
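To see concretely why the lower-bound reading is vacuous, here is a small sketch; `in_big_o_of_n_squared` is a hypothetical helper that samples the defining inequality of \(O(n^2)\):

```python
# "At least O(n^2)" is vacuous because even the constant function f(n) = 1
# is in O(n^2) (take c = 1, n0 = 1), so "T(n) >= f(n) for some f in O(n^2)"
# holds for any nonnegative running time T.
def in_big_o_of_n_squared(f, c=1.0, n0=1, n_max=10**6):
    """Empirically check 0 <= f(n) <= c * n^2 on sampled n in [n0, n_max]."""
    return all(0 <= f(n) <= c * n * n for n in range(n0, n_max + 1, 997))

print(in_big_o_of_n_squared(lambda n: 1))  # True: constants are O(n^2)
print(in_big_o_of_n_squared(lambda n: n))  # True: n is O(n^2)
```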
3.1-4
Is \(2^{n + 1} = O(2^n)\)? Is \(2^{2n} = O(2^n)\)?
- True. Note that \(2^{n + 1} = 2 \cdot 2^n\). We can choose \(c = 2\) and \(n_0 = 0\), such that \(0 \le 2^{n + 1} \le c \cdot 2^n\) for all \(n \ge n_0\). By definition, \(2^{n + 1} = O(2^n)\).
- False. Note that \(2^{2n} = 2^n \cdot 2^n\). If \(0 \le 2^{2n} \le c \cdot 2^n\) held for all \(n \ge n_0\), then dividing by \(2^n\) would give \(2^n \le c\) for all \(n \ge n_0\), which no constant \(c\) can satisfy because \(2^n\) is unbounded. Hence \(2^{2n} \ne O(2^n)\).
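A short numeric illustration of the difference: the ratio \(2^{n+1} / 2^n\) is the constant 2, while \(2^{2n} / 2^n = 2^n\) grows without bound:

```python
# The first ratio stays at the constant 2 (so c = 2 works), while the second
# ratio equals 2^n, so no constant c satisfies 2^(2n) <= c * 2^n for large n.
for n in [1, 10, 20, 40]:
    print(n, 2 ** (n + 1) / 2 ** n, 2 ** (2 * n) / 2 ** n)
# ratios: 2.0, 2.0, 2.0, 2.0   vs   2.0, 1024.0, ~1.05e6, ~1.10e12
```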
3.1-5
Prove Theorem 3.1.
The theorem states:
For any two functions \(f(n)\) and \(g(n)\), we have \(f(n) = \Theta(g(n))\) if and only if \(f(n) = O(g(n))\) and \(f(n) = \Omega(g(n))\).
From \(f(n) = \Theta(g(n))\), we have that there exist positive constants \(c_1\), \(c_2\), and \(n_0\) such that

\[0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0.\]

Taking \(c = c_2\) with the same \(n_0\) satisfies the definition of \(f(n) = O(g(n))\), and taking \(c = c_1\) satisfies the definition of \(f(n) = \Omega(g(n))\), so both hold.

Conversely, from \(f(n) = \Omega(g(n))\) and \(f(n) = O(g(n))\), we have that there exist positive constants \(c_1\), \(n_1\) and \(c_2\), \(n_2\) such that

\[0 \le c_1 g(n) \le f(n) \text{ for all } n \ge n_1 \quad\text{and}\quad 0 \le f(n) \le c_2 g(n) \text{ for all } n \ge n_2.\]

If we let \(n_3 = \max(n_1, n_2)\) and merge the inequalities, we get

\[0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_3,\]

which is the definition of \(f(n) = \Theta(g(n))\).
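As an optional sanity check with assumed example functions \(f(n) = 3n^2 + 2n\) and \(g(n) = n^2\), the same constants that witness \(\Theta\) also witness \(O\) and \(\Omega\), as the theorem states:

```python
# Minimal numeric sketch (assumed example functions): for f(n) = 3n^2 + 2n
# and g(n) = n^2, the Theta constants c1 = 3, c2 = 5 also witness
# f = Omega(g) (using c1) and f = O(g) (using c2).
def f(n): return 3 * n * n + 2 * n
def g(n): return n * n

c1, c2, n0 = 3, 5, 1  # 3n^2 + 2n <= 5n^2 once 2n <= 2n^2, i.e. n >= 1
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10**5, 97))
print("f = Theta(g): Omega via c1 =", c1, "and O via c2 =", c2)
```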
3.1-6
Prove that the running time of an algorithm is \(\Theta(g(n))\) if and only if its worst-case running time is \(O(g(n))\) and its best-case running time is \(\Omega(g(n))\).
(\(\Leftarrow\)) If \(T_w(n)\) is the worst-case running time and \(T_b(n)\) is the best-case running time, then from \(T_w(n) = O(g(n))\) and \(T_b(n) = \Omega(g(n))\) we know that

\[0 \le c_1 g(n) \le T_b(n) \text{ for all } n \ge n_1 \quad\text{and}\quad 0 \le T_w(n) \le c_2 g(n) \text{ for all } n \ge n_2.\]

Any running time \(T(n)\) satisfies \(T_b(n) \le T(n) \le T_w(n)\), so combining the inequalities for \(n \ge n_0 = \max(n_1, n_2)\) we get

\[0 \le c_1 g(n) \le T(n) \le c_2 g(n),\]

which is the definition of \(T(n) = \Theta(g(n))\).

(\(\Rightarrow\)) Conversely, if the running time is \(\Theta(g(n))\), then \(c_1 g(n) \le T(n) \le c_2 g(n)\) holds for every input of size \(n \ge n_0\). In particular \(T_w(n) \le c_2 g(n)\), so the worst-case running time is \(O(g(n))\), and \(T_b(n) \ge c_1 g(n)\), so the best-case running time is \(\Omega(g(n))\).
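As an illustration (insertion sort is an assumed example here, not part of the exercise), counting key comparisons shows a best case of \(\Theta(n)\) on sorted input and a worst case of \(\Theta(n^2)\) on reverse-sorted input, so the overall running time is \(O(n^2)\) and \(\Omega(n)\) but not \(\Theta\) of either:

```python
# Hedged sketch: count key comparisons made by insertion sort to contrast
# its best case (sorted input) with its worst case (reverse-sorted input).
def insertion_sort_comparisons(a):
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] <= key:
                break
            a[j + 1] = a[j]           # shift larger element right
            j -= 1
        a[j + 1] = key
    return comparisons

n = 1000
print(insertion_sort_comparisons(range(n)))         # n - 1: best case, Theta(n)
print(insertion_sort_comparisons(range(n, 0, -1)))  # n(n-1)/2: worst case, Theta(n^2)
```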
3.1-7
Prove that \(o(g(n)) \cap \omega(g(n))\) is the empty set.
Suppose \(f(n) \in o(g(n)) \cap \omega(g(n))\). By the definitions of \(o\)-notation and \(\omega\)-notation, for any constants \(c_1 > 0\) and \(c_2 > 0\) there exist \(n_1\) and \(n_2\) such that

\[0 \le f(n) < c_1 g(n) \text{ for all } n \ge n_1 \quad\text{and}\quad 0 \le c_2 g(n) < f(n) \text{ for all } n \ge n_2.\]

If we pick \(n_0 = \max(n_1, n_2)\) and let \(c_1 = c_2 = c\), then for all \(n \ge n_0\) we get

\[cg(n) < f(n) < cg(n),\]

which has no solution. Hence no such \(f(n)\) exists, and the intersection is the empty set.
3.1-8
We can extend our notation to the case of two parameters \(n\) and \(m\) that can go to infinity independently at different rates. For a given function \(g(n, m)\), we denote by \(O(g(n, m))\) the set of functions:
\[\begin{aligned} O(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le f(n, m) \le cg(n, m) \\ & \text{ for all } n \ge n_0 \text{ or } m \ge m_0\}. \end{aligned}\]

Give corresponding definitions for \(\Omega(g(n, m))\) and \(\Theta(g(n, m))\).
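The corresponding definitions follow by direct analogy with the definition of \(O(g(n, m))\) above:

\[\begin{aligned} \Omega(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le cg(n, m) \le f(n, m) \\ & \text{ for all } n \ge n_0 \text{ or } m \ge m_0\}. \end{aligned}\]

\[\begin{aligned} \Theta(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c_1, c_2, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le c_1 g(n, m) \le f(n, m) \le c_2 g(n, m) \\ & \text{ for all } n \ge n_0 \text{ or } m \ge m_0\}. \end{aligned}\]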